A Simple Computer-Aided Three-Dimensional Molecular Modeling for the Octant Rule
ERIC Educational Resources Information Center
Kang, Yinan; Kang, Fu-An
2011-01-01
The Moffitt-Woodward-Moscowitz-Klyne-Djerassi octant rule is one of the most successful empirical rules in organic chemistry. However, the lack of a simple effective modeling method for the octant rule in the past 50 years has posed constant difficulties for researchers, teachers, and students, particularly the young generations, to learn and…
An evaluation of rise time characterization and prediction methods
NASA Technical Reports Server (NTRS)
Robinson, Leick D.
1994-01-01
One common method of extrapolating sonic boom waveforms from aircraft to ground is to calculate the nonlinear distortion and then add a rise time to each shock by a simple empirical rule. One common rule is the '3 over P' rule, which calculates the rise time in milliseconds as three divided by the shock amplitude in psf. This rule was compared with the results of ZEPHYRUS, a comprehensive algorithm that calculates sonic boom propagation and extrapolation with the combined effects of nonlinearity, attenuation, dispersion, geometric spreading, and refraction in a stratified atmosphere. It is shown that the simple empirical rule considerably overestimates the rise time. In addition, the empirical rule does not account for variations in the rise time due to humidity variation or propagation history. It is also demonstrated that the rise time is only an approximate indicator of perceived loudness. Three waveforms with identical characteristics (shock placement, amplitude, and rise time), but with different shock shapes, are shown to give different calculated loudness values. This paper is based in part on work performed at the Applied Research Laboratories, the University of Texas at Austin, and supported by NASA Langley.
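The '3 over P' rule quoted above is a one-line computation; a minimal sketch (function name is illustrative, not from the paper):

```python
def rise_time_ms(shock_amplitude_psf):
    """Empirical '3 over P' rule: rise time in milliseconds equals
    three divided by the shock amplitude in psf."""
    if shock_amplitude_psf <= 0:
        raise ValueError("shock amplitude must be positive")
    return 3.0 / shock_amplitude_psf

# A 1.5 psf shock is assigned a 2 ms rise time under this rule.
print(rise_time_ms(1.5))  # 2.0
```

As the abstract notes, ZEPHYRUS simulations suggest this estimate is considerably too high and ignores humidity and propagation history.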
A simple threshold rule is sufficient to explain sophisticated collective decision-making.
Robinson, Elva J H; Franks, Nigel R; Ellis, Samuel; Okuda, Saki; Marshall, James A R
2011-01-01
Decision-making animals can use slow-but-accurate strategies, such as making multiple comparisons, or opt for simpler, faster strategies to find a 'good enough' option. Social animals make collective decisions about many group behaviours including foraging and migration. The key to the collective choice lies with individual behaviour. We present a case study of a collective decision-making process (house-hunting ants, Temnothorax albipennis), in which a previously proposed decision strategy involved both quality-dependent hesitancy and direct comparisons of nests by scouts. An alternative possible decision strategy is that scouting ants use a very simple quality-dependent threshold rule to decide whether to recruit nest-mates to a new site or search for alternatives. We use analytical and simulation modelling to demonstrate that this simple rule is sufficient to explain empirical patterns from three studies of collective decision-making in ants, and can account parsimoniously for apparent comparison by individuals and apparent hesitancy (recruitment latency) effects, when available nests differ strongly in quality. This highlights the need to carefully design experiments to detect individual comparison. We present empirical data strongly suggesting that best-of-n comparison is not used by individual ants, although individual sequential comparisons are not ruled out. However, by using a simple threshold rule, decision-making groups are able to effectively compare options, without relying on any form of direct comparison of alternatives by individuals. This parsimonious mechanism could promote collective rationality in group decision-making.
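The quality-dependent threshold rule can be illustrated with a small Monte Carlo sketch (the threshold, noise level, and quality scale are hypothetical, not taken from the paper):

```python
import random

def scout_recruits(nest_quality, threshold, noise=0.1, rng=None):
    """Threshold rule: a scout recruits nest-mates iff its noisy assessment
    of nest quality exceeds a fixed threshold; no nest is compared to any other."""
    rng = rng or random
    return nest_quality + rng.gauss(0.0, noise) >= threshold

def fraction_recruiting(nest_quality, threshold, n_scouts=10000, seed=0):
    """Fraction of a scout population that recruits to a nest of given quality."""
    rng = random.Random(seed)
    return sum(scout_recruits(nest_quality, threshold, rng=rng)
               for _ in range(n_scouts)) / n_scouts

# No individual compares nests, yet the colony 'chooses' the better one:
good = fraction_recruiting(nest_quality=0.8, threshold=0.5)
poor = fraction_recruiting(nest_quality=0.4, threshold=0.5)
print(good, poor)  # far more recruitment to the high-quality nest
```

The aggregate difference in recruitment rates is what produces apparent collective comparison without any direct comparison by individuals.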
Weiss, Volker C
2010-07-22
One of Guggenheim's many corresponding-states rules for simple fluids implies that the molar enthalpy of vaporization (determined at the temperature at which the pressure reaches 1/50th of its critical value, which approximately coincides with the normal boiling point) divided by the critical temperature has a value of roughly 5.2R, where R is the universal gas constant. For more complex fluids, such as strongly polar and ionic fluids, one must expect deviations from Guggenheim's rule. Such a deviation has far-reaching consequences for other empirical rules related to the vaporization of fluids, namely Guldberg's rule and Trouton's rule. We evaluate these characteristic quantities for simple fluids, polar fluids, hydrogen-bonding fluids, simple inorganic molten salts, and room temperature ionic liquids (RTILs). For the ionic fluids, the critical parameters are not accessible to direct experimental observation; therefore, suitable extrapolation schemes have to be applied. For the RTILs [1-n-alkyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imides, where the alkyl chain is ethyl, butyl, hexyl, or octyl], the critical temperature is estimated by extrapolating the surface tension to zero using Guggenheim's and Eotvos' rules; the critical density is obtained using the linear-diameter rule. It is shown that the RTILs adhere to Guggenheim's master curve for the reduced surface tension of simple and moderately polar fluids, but that they deviate significantly from his rule for the reduced enthalpy of vaporization of simple fluids. Consequences for evaluating the Trouton constant of RTILs, the value of which has been discussed controversially in the literature, are indicated.
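Guggenheim's rule as stated above reduces to a one-line estimate, dHvap ≈ 5.2·R·Tc; a numerical sketch, using argon as a worked example of a simple fluid:

```python
R = 8.314  # universal gas constant, J/(mol K)

def guggenheim_vaporization_enthalpy(critical_temperature_K):
    """Molar enthalpy of vaporization (J/mol) implied by Guggenheim's rule
    for simple fluids: dHvap ~ 5.2 * R * Tc."""
    return 5.2 * R * critical_temperature_K

# Argon (Tc = 150.7 K): the rule predicts ~6.5 kJ/mol, close to the
# measured value near the normal boiling point (~6.4 kJ/mol).
print(guggenheim_vaporization_enthalpy(150.7) / 1000)
```

Strongly polar and ionic fluids deviate from this value, which is precisely the deviation the study quantifies.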
Complex dynamics and empirical evidence (Invited Paper)
NASA Astrophysics Data System (ADS)
Delli Gatti, Domenico; Gaffeo, Edoardo; Giulioni, Gianfranco; Gallegati, Mauro; Kirman, Alan; Palestrini, Antonio; Russo, Alberto
2005-05-01
Standard macroeconomics, based on a reductionist approach centered on the representative agent, is badly equipped to explain the empirical evidence where heterogeneity and industrial dynamics are the rule. In this paper we show that a simple agent-based model of heterogeneous financially fragile agents is able to replicate a large number of scaling type stylized facts with a remarkable degree of statistical precision.
Quantitative genetic versions of Hamilton's rule with empirical applications
McGlothlin, Joel W.; Wolf, Jason B.; Brodie, Edmund D.; Moore, Allen J.
2014-01-01
Hamilton's theory of inclusive fitness revolutionized our understanding of the evolution of social interactions. Surprisingly, an incorporation of Hamilton's perspective into the quantitative genetic theory of phenotypic evolution has been slow, despite the popularity of quantitative genetics in evolutionary studies. Here, we discuss several versions of Hamilton's rule for social evolution from a quantitative genetic perspective, emphasizing its utility in empirical applications. Although evolutionary quantitative genetics offers methods to measure each of the critical parameters of Hamilton's rule, empirical work has lagged behind theory. In particular, we lack studies of selection on altruistic traits in the wild. Fitness costs and benefits of altruism can be estimated using a simple extension of phenotypic selection analysis that incorporates the traits of social interactants. We also discuss the importance of considering the genetic influence of the social environment, or indirect genetic effects (IGEs), in the context of Hamilton's rule. Research in social evolution has generated an extensive body of empirical work focusing—with good reason—almost solely on relatedness. We argue that quantifying the roles of social and non-social components of selection and IGEs, in addition to relatedness, is now timely and should provide unique additional insights into social evolution. PMID:24686930
Less can be more: How to make operations more flexible and robust with fewer resources
NASA Astrophysics Data System (ADS)
Haksöz, Çağrı; Katsikopoulos, Konstantinos; Gigerenzer, Gerd
2018-06-01
We review empirical evidence from practice and general theoretical conditions, under which simple rules of thumb can help to make operations flexible and robust. An operation is flexible when it responds adaptively to adverse events such as natural disasters; an operation is robust when it is less affected by adverse events in the first place. We illustrate the relationship between flexibility and robustness in the context of supply chain risk. In addition to increasing flexibility and robustness, simple rules simultaneously reduce the need for resources such as time, money, information, and computation. We illustrate the simple-rules approach with an easy-to-use graphical aid for diagnosing and managing supply chain risk. More generally, we recommend a four-step process for determining the amount of resources that decision makers should invest in so as to increase flexibility and robustness.
Digit Reversal in Children's Writing: A Simple Theory and Its Empirical Validation
ERIC Educational Resources Information Center
Fischer, Jean-Paul
2013-01-01
This article presents a simple theory according to which the left-right reversal of single digits by 5- and 6-year-old children is mainly due to the application of an implicit right-writing or -orienting rule. A number of nontrivial predictions can be drawn from this theory. First, left-oriented digits (1, 2, 3, 7, and 9) will be reversed more…
Regression Analysis by Example. 5th Edition
ERIC Educational Resources Information Center
Chatterjee, Samprit; Hadi, Ali S.
2012-01-01
Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. "Regression Analysis by Example, Fifth Edition" has been expanded and thoroughly…
Rule-governed behavior: teaching a preliminary repertoire of rule-following to children with autism.
Tarbox, Jonathan; Zuckerman, Carrie K; Bishop, Michele R; Olive, Melissa L; O'Hora, Denis P
2011-01-01
Rule-governed behavior is generally considered an integral component of complex verbal repertoires but has rarely been the subject of empirical research. In particular, little or no previous research has attempted to establish rule-governed behavior in individuals who do not already display the repertoire. This study consists of two experiments that evaluated multiple exemplar training procedures for teaching a simple component skill, which may be necessary for developing a repertoire of rule-governed behavior. In both experiments, children with autism were taught to respond to simple rules that specified antecedents and the behaviors that should occur in their presence. In the first study, participants were taught to respond to rules containing "if/then" statements, where the antecedent was specified before the behavior. The second experiment was a replication and extension of the first. It involved a variation on the manner in which rules were presented. Both experiments eventually demonstrated generalization to novel rules for all participants; however, variations to the standard procedure were required for several participants. Results suggest that rule-following can be analyzed and taught as generalized operant behavior; implications for future research are discussed.
Empirical likelihood-based tests for stochastic ordering
BARMI, HAMMOU EL; MCKEAGUE, IAN W.
2013-01-01
This paper develops an empirical likelihood approach to testing for the presence of stochastic ordering among univariate distributions based on independent random samples from each distribution. The proposed test statistic is formed by integrating a localized empirical likelihood statistic with respect to the empirical distribution of the pooled sample. The asymptotic null distribution of this test statistic is found to have a simple distribution-free representation in terms of standard Brownian bridge processes. The approach is used to compare the lengths of rule of Roman Emperors over various historical periods, including the “decline and fall” phase of the empire. In a simulation study, the power of the proposed test is found to improve substantially upon that of a competing test due to El Barmi and Mukerjee. PMID:23874142
Simple heuristics and rules of thumb: where psychologists and behavioural biologists might meet.
Hutchinson, John M C; Gigerenzer, Gerd
2005-05-31
The Centre for Adaptive Behaviour and Cognition (ABC) has hypothesised that much human decision-making can be described by simple algorithmic process models (heuristics). This paper explains this approach and relates it to research in biology on rules of thumb, which we also review. As an example of a simple heuristic, consider the lexicographic strategy of Take The Best for choosing between two alternatives: cues are searched in turn until one discriminates, then search stops and all other cues are ignored. Heuristics consist of building blocks, and building blocks exploit evolved or learned abilities such as recognition memory; it is the complexity of these abilities that allows the heuristics to be simple. Simple heuristics have an advantage in making decisions fast and with little information, and in avoiding overfitting. Furthermore, humans are observed to use simple heuristics. Simulations show that the statistical structures of different environments affect which heuristics perform better, a relationship referred to as ecological rationality. We contrast ecological rationality with the stronger claim of adaptation. Rules of thumb from biology provide clearer examples of adaptation because animals can be studied in the environments in which they evolved. The range of examples is also much more diverse. To investigate them, biologists have sometimes used similar simulation techniques to ABC, but many examples depend on empirically driven approaches. ABC's theoretical framework can be useful in connecting some of these examples, particularly the scattered literature on how information from different cues is integrated. Optimality modelling is usually used to explain less detailed aspects of behaviour but might more often be redirected to investigate rules of thumb.
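The Take The Best heuristic described above is easy to state as code; a minimal sketch with binary cues (the cue names and validity ordering are invented for illustration):

```python
def take_the_best(cues_a, cues_b, cue_order):
    """Lexicographic Take The Best: check cues in order of validity;
    the first cue that discriminates decides, search stops, and all
    remaining cues are ignored."""
    for cue in cue_order:
        va, vb = cues_a.get(cue, 0), cues_b.get(cue, 0)
        if va != vb:
            return 'a' if va > vb else 'b'
    return 'guess'  # no cue discriminates

# Which of two cities is larger? Cues sorted by assumed validity.
order = ['is_capital', 'has_airport', 'has_university']
city_a = {'is_capital': 0, 'has_airport': 1, 'has_university': 1}
city_b = {'is_capital': 0, 'has_airport': 0, 'has_university': 1}
print(take_the_best(city_a, city_b, order))  # 'a': the airport cue decides
```

Note that the shared `has_university` cue is never consulted: stopping after the first discriminating cue is what makes the heuristic fast and frugal.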
Linsenmeyer, Katherine; Strymish, Judith; Gupta, Kalpana
2015-12-01
The emergence of multidrug-resistant (MDR) uropathogens is making the treatment of urinary tract infections (UTIs) more challenging. We sought to evaluate the accuracy of empiric therapy for MDR UTIs and the utility of prior culture data in improving the accuracy of the therapy chosen. The electronic health records from three U.S. Department of Veterans Affairs facilities were retrospectively reviewed for the treatments used for MDR UTIs over 4 years. An MDR UTI was defined as an infection caused by a uropathogen resistant to three or more classes of drugs and identified by a clinician to require therapy. Previous data on culture results, antimicrobial use, and outcomes were captured from records from inpatient and outpatient settings. Among 126 patient episodes of MDR UTIs, the choices of empiric therapy against the index pathogen were accurate in 66 (52%) episodes. For the 95 patient episodes for which prior microbiologic data were available, when empiric therapy was concordant with the prior microbiologic data, the rate of accuracy of the treatment against the uropathogen improved from 32% to 76% (odds ratio, 6.9; 95% confidence interval, 2.7 to 17.1; P < 0.001). Genitourinary tract (GU)-directed agents (nitrofurantoin or sulfa agents) were equally as likely as broad-spectrum agents to be accurate (P = 0.3). Choosing an agent concordant with previous microbiologic data significantly increased the chance of accuracy of therapy for MDR UTIs, even if the previous uropathogen was a different species. Also, GU-directed or broad-spectrum therapy choices were equally likely to be accurate. The accuracy of empiric therapy could be improved by the use of these simple rules. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
A simple model of bipartite cooperation for ecological and organizational networks.
Saavedra, Serguei; Reed-Tsochas, Felix; Uzzi, Brian
2009-01-22
In theoretical ecology, simple stochastic models that satisfy two basic conditions about the distribution of niche values and feeding ranges have proved successful in reproducing the overall structural properties of real food webs, using species richness and connectance as the only input parameters. Recently, more detailed models have incorporated higher levels of constraint in order to reproduce the actual links observed in real food webs. Here, building on previous stochastic models of consumer-resource interactions between species, we propose a highly parsimonious model that can reproduce the overall bipartite structure of cooperative partner-partner interactions, as exemplified by plant-animal mutualistic networks. Our stochastic model of bipartite cooperation uses simple specialization and interaction rules, and only requires three empirical input parameters. We test the bipartite cooperation model on ten large pollination data sets that have been compiled in the literature, and find that it successfully replicates the degree distribution, nestedness and modularity of the empirical networks. These properties are regarded as key to understanding cooperation in mutualistic networks. We also apply our model to an extensive data set of two classes of company engaged in joint production in the garment industry. Using the same metrics, we find that the network of manufacturer-contractor interactions exhibits similar structural patterns to plant-animal pollination networks. This surprising correspondence between ecological and organizational networks suggests that the simple rules of cooperation that generate bipartite networks may be generic, and could prove relevant in many different domains, ranging from biological systems to human society.
A simple rule reduces costs of extragroup parasitism in a communally breeding bird.
Riehl, Christina
2010-10-26
How do cooperatively breeding groups resist invasion by parasitic "cheaters," which dump their eggs in the communal nest but provide no parental care [1,2]? Here I show that Greater Anis (Crotophaga major), Neotropical cuckoos that nest in social groups containing several breeding females [3], use a simple rule based on the timing of laying to recognize and reject eggs laid by extragroup parasites. I experimentally confirmed that Greater Anis cannot recognize parasitic eggs based on the appearance of host egg phenotypes or on the number of eggs in the clutch. However, they can discriminate between freshly laid eggs and those that have already been incubated, and they accordingly eject asynchronous eggs. This mechanism is reliable in naturally parasitized nests, because group members typically lay their eggs in tight synchrony, whereas the majority of parasitic eggs are laid several days later. Rejection of asynchronous eggs therefore provides a rare empirical example of a complex, group-level behavior that arises through relatively simple "rules of thumb" without requiring advanced cognitive mechanisms such as learning, counting, or individual recognition. Copyright © 2010 Elsevier Ltd. All rights reserved.
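The timing rule reported above can be sketched as a trivial predicate (day numbering and names are illustrative, not from the paper):

```python
def eject_egg(lay_day, incubation_start_day):
    """Timing-based rejection: an egg appearing after incubation has begun
    is treated as parasitic and ejected; synchronous, freshly laid eggs
    are accepted. No counting or individual recognition is required."""
    return lay_day > incubation_start_day

# Group members lay in tight synchrony (day 0); a parasite lays on day 3.
clutch_lay_days = [0, 0, 0, 3]
ejected = [d for d in clutch_lay_days
           if eject_egg(d, incubation_start_day=0)]
print(ejected)  # only the late, parasitic egg is ejected
```

The rule works in natural nests because the cue (laying asynchrony) reliably separates group eggs from parasitic ones.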
A simple rule of thumb for elegant prehension.
Mon-Williams, M; Tresilian, J R
2001-07-10
Reaching out to grasp an object (prehension) is a deceptively elegant and skilled behavior. The movement prior to object contact can be described as having two components, the movement of the hand to an appropriate location for gripping the object, the "transport" component, and the opening and closing of the aperture between the fingers as they prepare to grip the target, the "grasp" component. The grasp component is sensitive to the size of the object, so that a larger grasp aperture is formed for wider objects; the maximum grasp aperture (MGA) is a little wider than the width of the target object and occurs later in the movement for larger objects. We present a simple model that can account for the temporal relationship between the transport and grasp components. We report the results of an experiment providing empirical support for our "rule of thumb." The model provides a simple, but plausible, account of a neural control strategy that has been the center of debate over the last two decades.
Economic modelling with low-cognition agents
NASA Astrophysics Data System (ADS)
Ormerod, Paul
2006-10-01
The standard socio-economic model (SSSM) postulates very considerable cognitive powers on the part of its agents. They are able to gather all relevant information in any given situation, and to take the optimal decision on the basis of it, given their tastes and preferences. This behavioural rule is postulated to be universal. The concept of bounded rationality relaxes this somewhat, by permitting agents to have access to only limited amounts of information. But agents still optimise subject to their information set and tastes. Empirical work in economics over the past 20 years or so has shown that in general these behavioural postulates lack empirical validity. Instead, agents appear to have limited ability to gather information, and use simple rules of thumb to process the information which they have in order to take decisions. Building theoretical models on these realistic foundations which give better accounts of empirical phenomena than does the SSSM is an important challenge to both economists and econophysicists. Considerable progress has already been made in a short space of time, and examples are given in this paper.
Simple spatial scaling rules behind complex cities.
Li, Ruiqi; Dong, Lei; Zhang, Jiang; Wang, Xinran; Wang, Wen-Xu; Di, Zengru; Stanley, H Eugene
2017-11-28
Although most wealth and innovation have been the result of human interaction and cooperation, we are not yet able to quantitatively predict the spatial distributions of three main elements of cities: population, roads, and socioeconomic interactions. Using a simple model based mainly on spatial attraction and matching growth mechanisms, we reveal that the spatial scaling rules of these three elements fit within a consistent framework, which allows us to use any single observation to infer the others. All numerical and theoretical results are consistent with empirical data from ten representative cities. In addition, our model can also provide a general explanation of the origins of the universal super- and sub-linear aggregate scaling laws and accurately predict kilometre-level socioeconomic activity. Our work opens a new avenue for uncovering the evolution of cities in terms of the interplay among urban elements, and it has a broad range of applications.
Emergence of a coherent and cohesive swarm based on mutual anticipation
Murakami, Hisashi; Niizato, Takayuki; Gunji, Yukio-Pegio
2017-01-01
Collective behavior emerging out of self-organization is one of the most striking properties of an animal group. Typically, it is hypothesized that each individual in an animal group tends to align its direction of motion with those of its neighbors. Most previous models for collective behavior assume an explicit alignment rule, by which an agent matches its velocity with that of neighbors in a certain neighborhood, to reproduce a collective order pattern through simple interactions. Recent empirical studies, however, suggest that there is no evidence for explicit matching of velocity, and that collective polarization arises from interactions other than those that follow the explicit alignment rule. We here propose a new lattice-based computational model that does not incorporate the explicit alignment rule but is based instead on mutual anticipation and asynchronous updating. Moreover, we show that this model can produce dense collective motion with high polarity. Furthermore, we focus on the behavior of a pair of individuals, and find that the turning response changes drastically with the distance between the two individuals rather than with their relative heading, consistent with the empirical observations. Therefore, the present results suggest that our approach provides an alternative model for collective behavior. PMID:28406173
How children perceive fractals: Hierarchical self-similarity and cognitive development
Martins, Maurício Dias; Laaha, Sabine; Freiberger, Eva Maria; Choi, Soonja; Fitch, W. Tecumseh
2014-01-01
The ability to understand and generate hierarchical structures is a crucial component of human cognition, available in language, music, mathematics and problem solving. Recursion is a particularly useful mechanism for generating complex hierarchies by means of self-embedding rules. In the visual domain, fractals are recursive structures in which simple transformation rules generate hierarchies of infinite depth. Research on how children acquire these rules can provide valuable insight into the cognitive requirements and learning constraints of recursion. Here, we used fractals to investigate the acquisition of recursion in the visual domain, and probed for correlations with grammar comprehension and general intelligence. We compared second (n = 26) and fourth graders (n = 26) in their ability to represent two types of rules for generating hierarchical structures: Recursive rules, on the one hand, which generate new hierarchical levels; and iterative rules, on the other hand, which merely insert items within hierarchies without generating new levels. We found that the majority of fourth graders, but not second graders, were able to represent both recursive and iterative rules. This difference was partially accounted for by second graders’ impairment in detecting hierarchical mistakes, and correlated with between-grade differences in grammar comprehension tasks. Empirically, recursion and iteration also differed in at least one crucial aspect: While the ability to learn recursive rules seemed to depend on the previous acquisition of simple iterative representations, the opposite was not true, i.e., children were able to acquire iterative rules before they acquired recursive representations. These results suggest that the acquisition of recursion in vision follows learning constraints similar to the acquisition of recursion in language, and that both domains share cognitive resources involved in hierarchical processing. PMID:24955884
Digit reversal in children's writing: a simple theory and its empirical validation.
Fischer, Jean-Paul
2013-06-01
This article presents a simple theory according to which the left-right reversal of single digits by 5- and 6-year-old children is mainly due to the application of an implicit right-writing or -orienting rule. A number of nontrivial predictions can be drawn from this theory. First, left-oriented digits (1, 2, 3, 7, and 9) will be reversed more frequently than the other asymmetrical digits (4, 5, and 6). Second, for some pairs of digits, the correct writing of the preceding digit will statistically predict the reversal of the current digit and vice versa. Third, writing hand will have little effect on the frequency of reversals, and the relative frequencies with which children reverse the asymmetrical digits will be similar regardless of children's preferred writing hand. Fourth, children who reverse the left-oriented digits the most are also those who reverse the other asymmetrical digits the least. An empirical study involving 367 5- and 6-year-olds confirmed these predictions. Copyright © 2013 Elsevier Inc. All rights reserved.
Expectations for inflationary observables: simple or natural?
NASA Astrophysics Data System (ADS)
Musoke, Nathan; Easther, Richard
2017-12-01
We describe the general inflationary dynamics that can arise with a single, canonically coupled field where the inflaton potential is a fourth-order polynomial. This scenario yields a wide range of combinations of the empirical spectral observables, ns, r and αs. However, not all combinations are possible and next-generation cosmological experiments have the ability to rule out all inflationary scenarios based on this potential. Further, we construct inflationary priors for this potential based on physically motivated choices for its free parameters. These can be used to determine the degree of tuning associated with different combinations of ns, r and αs and will facilitate treatments of the inflationary model selection problem. Finally, we comment on the implications of these results for the naturalness of the overall inflationary paradigm. We argue that ruling out all simple, renormalizable potentials would not necessarily imply that the inflationary paradigm itself was unnatural, but that this eventuality would increase the importance of building inflationary scenarios in the context of broader paradigms of ultra-high energy physics.
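For context, the standard single-field slow-roll formulas connect any such potential to the observables ns and r; a sketch for the pure quartic special case (the paper's full fourth-order polynomial and its priors are not reproduced here):

```python
def slow_roll_observables(V, dV, d2V, phi, Mp=1.0):
    """Textbook slow-roll predictions: eps = (Mp^2/2)(V'/V)^2,
    eta = Mp^2 * V''/V, n_s = 1 - 6*eps + 2*eta, r = 16*eps."""
    eps = 0.5 * Mp**2 * (dV(phi) / V(phi))**2
    eta = Mp**2 * d2V(phi) / V(phi)
    return 1.0 - 6.0 * eps + 2.0 * eta, 16.0 * eps

# Pure quartic V = lam*phi^4/4, evaluated N = 60 e-folds before the end
# of inflation (phi^2 = 8*(N+1)*Mp^2 for this potential):
lam, N = 1e-13, 60
phi = (8.0 * (N + 1)) ** 0.5
ns, r = slow_roll_observables(lambda p: lam * p**4 / 4,
                              lambda p: lam * p**3,
                              lambda p: 3 * lam * p**2, phi)
print(round(ns, 3), round(r, 3))  # ~0.951, ~0.262
```

The large predicted tensor-to-scalar ratio is why the pure quartic case is already disfavoured; the general fourth-order polynomial spans a much wider region of the (ns, r, αs) space.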
Making sense of information in noisy networks: human communication, gossip, and distortion.
Laidre, Mark E; Lamb, Alex; Shultz, Susanne; Olsen, Megan
2013-01-21
Information from others can be unreliable. Humans nevertheless act on such information, including gossip, to make various social calculations, thus raising the question of whether individuals can sort through social information to identify what is, in fact, true. Inspired by empirical literature on people's decision-making when considering gossip, we built an agent-based simulation model to examine how well simple decision rules could make sense of information as it propagated through a network. Our simulations revealed that a minimalistic decision rule, 'Bit-wise mode', which compared information from multiple sources and then sought a consensus majority for each component bit within the message, was consistently the most successful at converging upon the truth. This decision rule attained high relative fitness even in maximally noisy networks, composed entirely of nodes that distorted the message. The rule was also superior to other decision rules regardless of its frequency in the population. Simulations carried out with variable agent memory constraints, different numbers of observers who initiated information propagation, and a variety of network types suggested that the single most important factor in making sense of information was the number of independent sources that agents could consult. Broadly, our model suggests that despite the distortion information is subject to in the real world, it is nevertheless possible to make sense of it based on simple Darwinian computations that integrate multiple sources. Copyright © 2012 Elsevier Ltd. All rights reserved.
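The 'Bit-wise mode' rule lends itself to a compact sketch: treat each message as a string of bits and take a per-position majority across the consulted sources (the three-source example below is invented):

```python
from collections import Counter

def bitwise_mode(messages):
    """'Bit-wise mode' rule: for each bit position, adopt the majority
    value across all consulted sources; ties fall to the first value seen."""
    return ''.join(Counter(bits).most_common(1)[0][0]
                   for bits in zip(*messages))

# Three noisy copies of the true message '1011', each with one flipped
# bit; a per-bit majority recovers the original exactly.
print(bitwise_mode(['1011', '1111', '0011']))  # '1011'
```

This illustrates the paper's central finding: even when every individual copy is distorted, integrating enough independent sources bit by bit can reconstruct the truth.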
A simple rule for the costs of vigilance: empirical evidence from a social forager.
Cowlishaw, Guy; Lawes, Michael J.; Lightbody, Margaret; Martin, Alison; Pettifor, Richard; Rowcliffe, J. Marcus
2004-01-01
It is commonly assumed that anti-predator vigilance by foraging animals is costly because it interrupts food searching and handling time, leading to a reduction in feeding rate. When food handling does not require visual attention, however, a forager may handle food while simultaneously searching for the next food item or scanning for predators. We present a simple model of this process, showing that when the length of such compatible handling time Hc is long relative to search time S, specifically Hc/S > 1, it is possible to perform vigilance without a reduction in feeding rate. We test three predictions of this model regarding the relationships between feeding rate, vigilance and the Hc/S ratio, with data collected from a wild population of social foragers (samango monkeys, Cercopithecus mitis erythrarchus). These analyses consistently support our model, including our key prediction: as Hc/S increases, the negative relationship between feeding rate and the proportion of time spent scanning becomes progressively shallower. This pattern is more strongly driven by changes in median scan duration than scan frequency. Our study thus provides a simple rule that describes the extent to which vigilance can be expected to incur a feeding rate cost. PMID:15002768
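The model's headline condition is a simple ratio test; a minimal sketch (variable names are illustrative):

```python
def vigilance_is_free(compatible_handling_time, search_time):
    """Condition from the model: when compatible handling time Hc is long
    relative to search time S (Hc/S > 1), scanning for predators can be
    overlapped with food handling without reducing feeding rate."""
    return compatible_handling_time / search_time > 1.0

print(vigilance_is_free(4.0, 2.0))  # True: Hc/S = 2 > 1
print(vigilance_is_free(1.0, 2.0))  # False: Hc/S = 0.5
```

The empirical analyses in the paper test graded versions of this prediction: as Hc/S grows, the feeding-rate cost of vigilance becomes progressively shallower.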
Against the empirical viability of the Deutsch-Wallace-Everett approach to quantum mechanics
NASA Astrophysics Data System (ADS)
Dawid, Richard; Thébault, Karim P. Y.
2014-08-01
The subjective Everettian approach to quantum mechanics presented by Deutsch and Wallace fails to constitute an empirically viable theory of quantum phenomena. The decision theoretic implementation of the Born rule realized in this approach provides no basis for rejecting Everettian quantum mechanics in the face of empirical data that contradicts the Born rule. The approach of Greaves and Myrvold, which provides a subjective implementation of the Born rule as well but derives it from empirical data rather than decision theoretic arguments, avoids the problem faced by Deutsch and Wallace and is empirically viable. However, there is good reason to cast doubts on its scientific value.
Improving Predictions of Multiple Binary Models in ILP
2014-01-01
Despite the success of ILP systems in learning first-order rules from a small number of examples and from complex, structured data in various domains, they struggle with multiclass problems. In most cases they reduce a multiclass problem to multiple black-box binary problems using the one-versus-one or one-versus-rest binarisation technique and learn a theory for each one. When evaluating the learned theories of multiclass problems in the one-versus-rest paradigm in particular, the default rule introduces a bias toward the negative classes, leading to unrealistically high performance, besides the lack of prediction integrity between the theories. Here we discuss the problem of using the one-versus-rest binarisation technique when evaluating multiclass data and propose several methods to remedy this problem. We also illustrate the methods and highlight their link to binary trees and Formal Concept Analysis (FCA). Our methods allow learning of a simple, consistent, and reliable multiclass theory by combining the rules of the multiple one-versus-rest theories into one rule-list or rule-set theory. Empirical evaluation over a number of data sets shows that our proposed methods produce coherent and accurate rule models from the rules learned by the ILP system Aleph. PMID:24696657
Rates of profit as correlated sums of random variables
NASA Astrophysics Data System (ADS)
Greenblatt, R. E.
2013-10-01
Profit realization is the dominant feature of market-based economic systems, determining their dynamics to a large extent. Rather than attaining an equilibrium, profit rates vary widely across firms, and the variation persists over time. Differing definitions of profit result in differing empirical distributions. To study the statistical properties of profit rates, I used data from a publicly available database for the US Economy for 2009-2010 (Risk Management Association). For each of three profit rate measures, the sample space consists of 771 points. Each point represents aggregate data from a small number of US manufacturing firms of similar size and type (NAICS code of principal product). When comparing the empirical distributions of profit rates, significant ‘heavy tails’ were observed, corresponding principally to a number of firms with larger profit rates than would be expected from simple models. An apparently novel correlated sum of random variables statistical model was used to model the data. In the case of operating and net profit rates, a number of firms show negative profits (losses), ruling out simple gamma or lognormal distributions as complete models for these data.
ERIC Educational Resources Information Center
OAH Magazine of History, 2002
2002-01-01
Summarizes a teaching document that is part of "Teaching the JAH" (Journal of American History) which corresponds to the article, "Empires, Exceptions, and Anglo-Saxons: Race and Rule between the British and United States Empires, 1880-1910" (Paul A. Kramer). Provides the Web site address for the complete installment. (CMK)
Correlated pay-offs are key to cooperation
Frommen, Joachim G.; Riehl, Christina
2016-01-01
The general belief that cooperation and altruism in social groups result primarily from kin selection has recently been challenged, not least because results from cooperatively breeding insects and vertebrates have shown that groups may be composed mainly of non-relatives. This allows testing predictions of reciprocity theory without the confounding effect of relatedness. Here, we review complementary and alternative evolutionary mechanisms to kin selection theory and provide empirical examples of cooperative behaviour among unrelated individuals in a wide range of taxa. In particular, we focus on the different forms of reciprocity and on their underlying decision rules, asking about evolutionary stability, the conditions selecting for reciprocity and the factors constraining reciprocal cooperation. We find that neither the cognitive requirements of reciprocal cooperation nor the often sequential nature of interactions are insuperable stumbling blocks for the evolution of reciprocity. We argue that simple decision rules such as ‘help anyone if helped by someone’ should get more attention in future research, because empirical studies show that animals apply such rules, and theoretical models find that they can create stable levels of cooperation under a wide range of conditions. Owing to its simplicity, behaviour based on such a heuristic may in fact be ubiquitous. Finally, we argue that the evolution of exchange and trading of service and commodities among social partners needs greater scientific focus. PMID:26729924
Simple cellular automaton model for traffic breakdown, highway capacity, and synchronized flow.
Kerner, Boris S; Klenov, Sergey L; Schreckenberg, Michael
2011-10-01
We present a simple cellular automaton (CA) model for two-lane roads explaining the physics of traffic breakdown, highway capacity, and synchronized flow. The model consists of the rules "acceleration," "deceleration," "randomization," and "motion" of the Nagel-Schreckenberg CA model as well as "overacceleration through lane changing to the faster lane," "comparison of vehicle gap with the synchronization gap," and "speed adaptation within the synchronization gap" of Kerner's three-phase traffic theory. We show that these few rules of the CA model can appropriately simulate fundamental empirical features of traffic breakdown and highway capacity found in traffic data measured over years in different countries, like characteristics of synchronized flow, the existence of the spontaneous and induced breakdowns at the same bottleneck, and associated probabilistic features of traffic breakdown and highway capacity. Single-vehicle data derived in model simulations show that synchronized flow first occurs and then self-maintains due to a spatiotemporal competition between speed adaptation to a slower speed of the preceding vehicle and passing of this slower vehicle. We find that the application of simple dependences of randomization probability and synchronization gap on driving situation allows us to explain the physics of moving synchronized flow patterns and the pinch effect in synchronized flow as observed in real traffic data.
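The first four rules named above are those of the base Nagel-Schreckenberg model. A minimal single-lane sketch of those base rules might look like the following; Kerner's three-phase extensions (overacceleration, synchronization gap, speed adaptation), which the abstract only names, are not reproduced here:

```python
import random

def nasch_step(pos, vel, L, vmax=5, p=0.3, rng=random):
    """One parallel update of the base Nagel-Schreckenberg rules on a
    circular single-lane road of L cells.

    pos: car positions (cell indices, sorted ascending); vel: matching speeds.
    vmax and the randomization probability p are conventional values, not
    parameters taken from the paper.
    """
    n = len(pos)
    new_vel = []
    for i in range(n):
        gap = (pos[(i + 1) % n] - pos[i] - 1) % L  # empty cells to next car
        v = min(vel[i] + 1, vmax)                  # rule 1: acceleration
        v = min(v, gap)                            # rule 2: deceleration
        if v > 0 and rng.random() < p:             # rule 3: randomization
            v -= 1
        new_vel.append(v)
    new_pos = [(x + v) % L for x, v in zip(pos, new_vel)]  # rule 4: motion
    order = sorted(range(n), key=new_pos.__getitem__)      # keep lists sorted
    return [new_pos[i] for i in order], [new_vel[i] for i in order]
```

Because rule 2 caps each speed at the gap ahead and all cars update in parallel, cars can never collide or overtake, which is what makes the model safe to iterate indefinitely.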
Meys, Evelyne; Rutten, Iris; Kruitwagen, Roy; Slangen, Brigitte; Lambrechts, Sandrina; Mertens, Helen; Nolting, Ernst; Boskamp, Dieuwke; Van Gorp, Toon
2017-12-01
To analyze how well untrained examiners - without experience in the use of International Ovarian Tumor Analysis (IOTA) terminology or simple ultrasound-based rules (simple rules) - are able to apply IOTA terminology and simple rules and to assess the level of agreement between non-experts and an expert. This prospective multicenter cohort study enrolled women with ovarian masses. Ultrasound was performed by non-expert examiners and an expert. Ultrasound features were recorded using IOTA nomenclature, and used for classifying the mass by simple rules. Interobserver agreement was evaluated with Fleiss' kappa and percentage agreement between observers. 50 consecutive women were included. We observed 46 discrepancies in the description of ovarian masses when non-experts utilized IOTA terminology. Tumor type was misclassified often (n = 22), resulting in poor interobserver agreement between the non-experts and the expert (kappa = 0.39, 95 %-CI 0.244 - 0.529, percentage of agreement = 52.0 %). Misinterpretation of simple rules by non-experts was observed 57 times, resulting in an erroneous diagnosis in 15 patients (30 %). The agreement for classifying the mass as benign, malignant or inconclusive by simple rules was only moderate between the non-experts and the expert (kappa = 0.50, 95 %-CI 0.300 - 0.704, percentage of agreement = 70.0 %). The level of agreement for all 10 simple rules features varied greatly (kappa index range: -0.08 - 0.74, percentage of agreement 66 - 94 %). Although simple rules are useful to distinguish benign from malignant adnexal masses, they are not that simple for untrained examiners. Training with both IOTA terminology and simple rules is necessary before simple rules can be introduced into guidelines and daily clinical practice. © Georg Thieme Verlag KG Stuttgart · New York.
IOTA simple rules in differentiating between benign and malignant ovarian tumors.
Tantipalakorn, Charuwan; Wanapirak, Chanane; Khunamornpong, Surapan; Sukpan, Kornkanok; Tongsong, Theera
2014-01-01
To evaluate the diagnostic performance of IOTA simple rules in differentiating between benign and malignant ovarian tumors. A study of diagnostic performance was conducted on women scheduled for elective surgery due to ovarian masses between March 2007 and March 2012. All patients underwent ultrasound examination for IOTA simple rules within 24 hours of surgery. All examinations were performed by the authors, who had no clinical information about the patients, to differentiate between benign and malignant adnexal masses using IOTA simple rules. Gold standard diagnosis was based on pathological or operative findings. A total of 398 adnexal masses, in 376 women, were available for analysis. Of them, the IOTA simple rules could be applied in 319 (80.1%), including 212 (66.5%) benign tumors and 107 (33.6%) malignant tumors. The simple rules yielded inconclusive results in 79 (19.9%) masses. In the 319 masses for which the IOTA simple rules could be applied, sensitivity was 82.9% and specificity 95.3%. The IOTA simple rules have high diagnostic performance in differentiating between benign and malignant adnexal masses. Nevertheless, inconclusive results are relatively common.
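Sensitivity and specificity of the kind reported here come straight from a 2x2 table of test results against the surgical/pathological gold standard. A small helper illustrates the arithmetic; the counts in the usage note are hypothetical, not the study's data:

```python
def diagnostic_performance(tp, fp, fn, tn):
    """Sensitivity, specificity and overall accuracy from a 2x2 table.

    tp/fn: malignant masses classified malignant/benign;
    fp/tn: benign masses classified malignant/benign.
    """
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fp + fn + tn)
    return sens, spec, acc
```

For example, 80 true positives, 20 false negatives, 10 false positives and 190 true negatives give sensitivity 0.80, specificity 0.95 and accuracy 0.90.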
Reasoning with alternative explanations in physics: The cognitive accessibility rule
NASA Astrophysics Data System (ADS)
Heckler, Andrew F.; Bogdan, Abigail M.
2018-06-01
A critical component of scientific reasoning is the consideration of alternative explanations. Recognizing that decades of cognitive psychology research have demonstrated that relative cognitive accessibility, or "what comes to mind," strongly affects how people reason in a given context, we articulate a simple "cognitive accessibility rule", namely that alternative explanations are considered less frequently when an explanation with relatively high accessibility is offered first. In a series of four experiments, we test the cognitive accessibility rule in the context of consideration of alternative explanations for six physical scenarios commonly found in introductory physics curricula. First, we administer free recall and recognition tasks to operationally establish and distinguish between the relative accessibility and availability of common explanations for the physical scenarios. Then, we offer either high or low accessibility explanations for the physical scenarios and determine the extent to which students consider alternatives to the given explanations. We find two main results consistent across algebra- and calculus-based university level introductory physics students for multiple answer formats. First, we find evidence that, at least for some contexts, most explanatory factors are cognitively available to students but not cognitively accessible. Second, we empirically verify the cognitive accessibility rule and demonstrate that the rule is strongly predictive, accounting for up to 70% of the variance of the average student consideration of alternative explanations across scenarios. 
Overall, we find that cognitive accessibility can help to explain biases in the consideration of alternatives in reasoning about simple physical scenarios, and these findings lend support to the growing number of science education studies demonstrating that tasks relevant to science education curricula often involve rapid, automatic, and potentially predictable processes and outcomes.
Mad cows, terrorism and junk food: should public policy reflect perceived or objective risks?
Johansson-Stenman, Olof
2008-03-01
Empirical evidence suggests that people's risk-perceptions are often systematically biased. This paper develops a simple framework to analyse public policy when this is the case. Expected utility (well-being) is shown to depend on both objective and perceived risks (beliefs). The latter are important because of the fear associated with the risk and as a basis for corrective taxation and second-best adjustments. Optimality rules for public provision of risk-reducing investments, "internality-correcting" taxation (e.g. fat taxes) and provision of costly information to reduce people's risk-perception bias are presented.
Eisenhardt, K M; Sull, D N
2001-01-01
The success of Yahoo!, eBay, Enron, and other companies that have become adept at morphing to meet the demands of changing markets can't be explained using traditional thinking about competitive strategy. These companies have succeeded by pursuing constantly evolving strategies in market spaces that were considered unattractive according to traditional measures. In this article--the third in an HBR series by Kathleen Eisenhardt and Donald Sull on strategy in the new economy--the authors ask, what are the sources of competitive advantage in high-velocity markets? The secret, they say, is strategy as simple rules. The companies know that the greatest opportunities for competitive advantage lie in market confusion, but they recognize the need for a few crucial strategic processes and a few simple rules. In traditional strategy, advantage comes from exploiting resources or stable market positions. In strategy as simple rules, advantage comes from successfully seizing fleeting opportunities. Key strategic processes, such as product innovation, partnering, or spinout creation, place the company where the flow of opportunities is greatest. Simple rules then provide the guidelines within which managers can pursue such opportunities. Simple rules, which grow out of experience, fall into five broad categories: how-to rules, boundary conditions, priority rules, timing rules, and exit rules. Companies with simple-rules strategies must follow the rules religiously and avoid the temptation to change them too frequently. A consistent strategy helps managers sort through opportunities and gain short-term advantage by exploiting the attractive ones. In stable markets, managers rely on complicated strategies built on detailed predictions of the future. But when business is complicated, strategy should be simple.
Understanding the complex dynamics of stock markets through cellular automata
NASA Astrophysics Data System (ADS)
Qiu, G.; Kandhai, D.; Sloot, P. M. A.
2007-04-01
We present a cellular automaton (CA) model for simulating the complex dynamics of stock markets. Within this model, a stock market is represented by a two-dimensional lattice, of which each vertex stands for a trader. According to typical trading behavior in real stock markets, agents of only two types are adopted: fundamentalists and imitators. Our CA model is based on local interactions, adopting simple rules for representing the behavior of traders and a simple rule for price updating. This model can reproduce, in a simple and robust manner, the main characteristics observed in empirical financial time series. Heavy-tailed return distributions due to large price variations can be generated through the imitating behavior of agents. In contrast to other microscopic simulation (MS) models, our results suggest that it is not necessary to assume a certain network topology in which agents group together, e.g., a random graph or a percolation network. That is, long-range interactions can emerge from local interactions. Volatility clustering, which also leads to heavy tails, seems to be related to the combined effect of a fast and a slow process: the evolution of the influence of news and the evolution of agents’ activity, respectively. In a general sense, these causes of heavy tails and volatility clustering appear to be common among some notable MS models that can confirm the main characteristics of financial markets.
Transport Properties of Complex Oxides: New Ideas and Insights from Theory and Simulation
NASA Astrophysics Data System (ADS)
Benedek, Nicole
Complex oxides are one of the largest and most technologically important materials families. The ABO3 perovskite oxides in particular display an unparalleled variety of physical properties. The microscopic origin of these properties (how they arise from the structure of the material) is often complicated, but in many systems previous research has identified simple guidelines or 'rules of thumb' that link structure and chemistry to the physics of interest. For example, the tolerance factor is a simple empirical measure that relates the composition of a perovskite to its tendency to adopt a distorted structure. First-principles calculations have shown that the tendency towards ferroelectricity increases systematically as the tolerance factor of the perovskite decreases. Can we uncover a similar set of simple guidelines to yield new insights into the ionic and thermal transport properties of perovskites? I will discuss recent research from my group on the link between crystal structure and chemistry, soft phonons and ionic transport in a family of layered perovskite oxides, the Ln2NiO4+δ Ruddlesden-Popper phases. In particular, we show how the lattice dynamical properties of these materials (their tendency to undergo certain structural distortions) can be correlated with oxide ion transport properties. Ultimately, we seek new ways to understand the microscopic origins of complex transport processes and to develop first-principles-based design rules for new materials based on our understanding.
Optimal two-phase sampling design for comparing accuracies of two binary classification rules.
Xu, Huiping; Hui, Siu L; Grannis, Shaun
2014-02-10
In this paper, we consider the design for comparing the performance of two binary classification rules, for example, two record linkage algorithms or two screening tests. Statistical methods are well developed for comparing these accuracy measures when the gold standard is available for every unit in the sample, or in a two-phase study when the gold standard is ascertained only in the second phase in a subsample using a fixed sampling scheme. However, these methods do not attempt to optimize the sampling scheme to minimize the variance of the estimators of interest. In comparing the performance of two classification rules, the parameters of primary interest are the difference in sensitivities, specificities, and positive predictive values. We derived the analytic variance formulas for these parameter estimates and used them to obtain the optimal sampling design. The efficiency of the optimal sampling design is evaluated through an empirical investigation that compares the optimal sampling with simple random sampling and with proportional allocation. Results of the empirical study show that the optimal sampling design is similar for estimating the difference in sensitivities and in specificities, and both achieve a substantial amount of variance reduction with an over-sample of subjects with discordant results and under-sample of subjects with concordant results. A heuristic rule is recommended when there is no prior knowledge of individual sensitivities and specificities, or the prevalence of the true positive findings in the study population. The optimal sampling is applied to a real-world example in record linkage to evaluate the difference in classification accuracy of two matching algorithms. Copyright © 2013 John Wiley & Sons, Ltd.
Empirical evaluation of interest-level criteria
NASA Astrophysics Data System (ADS)
Sahar, Sigal; Mansour, Yishay
1999-02-01
Efficient association rule mining algorithms already exist; however, as the size of databases increases, the number of patterns mined by the algorithms increases to such an extent that their manual evaluation becomes impractical. Automatic evaluation methods are, therefore, required in order to sift through the initial list of rules that the data-mining algorithm outputs. These evaluation methods, or criteria, rank the association rules mined from the dataset. We empirically examined several such statistical criteria: new criteria, as well as previously known ones. The empirical evaluation was conducted using several databases, including a large real-life dataset acquired from an order-by-phone grocery store, a dataset composed of WWW proxy logs, and several datasets from the UCI repository. We were interested in discovering whether the ranking performed by the various criteria is similar or easily distinguishable. Our evaluation detected, when significant differences exist, three patterns of behavior in the eight criteria we examined. There is an obvious dilemma in determining how many association rules to choose (in accordance with support and confidence parameters). The tradeoff is between having stringent parameters and, therefore, few rules, or lenient parameters and, thus, a multitude of rules. In many cases, our empirical evaluation revealed that most of the rules found with the comparatively strict parameters ranked highly according to the interestingness criteria when using lax parameters (which produce significantly more association rules). Finally, we discuss the association rules that ranked highest, explain why these results are sound, and how they direct future research.
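The support and confidence parameters mentioned above, together with lift (one common interest-level criterion), can be computed per rule as follows. This is a generic sketch with names of my choosing; the paper's eight criteria are not reproduced here:

```python
def rule_metrics(transactions, antecedent, consequent):
    """Support, confidence and lift for the rule antecedent -> consequent.

    transactions: list of item sets; antecedent/consequent: item sets.
    """
    n = len(transactions)
    a = sum(antecedent <= t for t in transactions)              # has antecedent
    c = sum(consequent <= t for t in transactions)              # has consequent
    both = sum((antecedent | consequent) <= t for t in transactions)
    support = both / n
    confidence = both / a if a else 0.0
    lift = confidence / (c / n) if c else 0.0
    return support, confidence, lift
```

On four toy baskets, the rule {bread} -> {milk} gets support 0.5 and confidence 2/3; a lift below 1 would indicate the rule is weaker than the consequent's base rate, which is one way such criteria re-rank rules mined under lax support/confidence thresholds.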
An equation of state for the financial markets: connecting order flow to price formation.
NASA Astrophysics Data System (ADS)
Gerig, Austin; Mike, Szabolcs; Farmer, J. Doyne
2006-03-01
Many of the peculiarities of price formation in the financial marketplace can be understood as the result of a few regularities in the placement and removal of trading orders. Based on a large data set from the London Stock Exchange we show that the distribution of prices where people place orders to buy or sell follows a surprisingly simple functional form that depends on the current best prices. In addition, whether or not an order is to buy or sell is described by a long-memory process, and the cancellation of orders can be described by a few simple rules. When these results are combined, simply by following the rules of the continuous double auction, the resulting simulation model produces good predictions for the distribution of price changes and transaction costs without any adjustment of parameters. We use the model to empirically derive equations of state relating order flow and the statistical properties of prices. In contrast to previous conjectures, our results demonstrate that these distributions are not universal, but rather depend on parameters of individual markets. They also show that factors other than supply and demand play an important role in price formation.
Tinnangwattana, Dangcheewan; Vichak-Ururote, Linlada; Tontivuthikul, Paponrad; Charoenratana, Cholaros; Lerthiranwong, Thitikarn; Tongsong, Theera
2015-01-01
To evaluate the diagnostic performance of IOTA simple rules in predicting malignant adnexal tumors by non-expert examiners. Five obstetric/gynecologic residents, who had never performed gynecologic ultrasound examination by themselves before, were trained in IOTA simple rules by an experienced examiner. One trained resident performed ultrasound examinations including IOTA simple rules on 100 women, who were scheduled for surgery due to ovarian masses, within 24 hours of surgery. The gold standard diagnosis was based on pathological or operative findings. The five trained residents performed IOTA simple rules on 30 patients for evaluation of inter-observer variability. A total of 100 patients underwent ultrasound examination for the IOTA simple rules. Of them, IOTA simple rules could be applied in 94 (94%) masses, including 71 (71.0%) benign masses and 29 (29.0%) malignant masses. The diagnostic performance of IOTA simple rules showed sensitivity of 89.3% (95% CI, 77.8%; 100.7%) and specificity of 83.3% (95% CI, 74.3%; 92.3%). Inter-observer variability was analyzed using Cohen's kappa coefficient. Kappa indices of the four pairs of raters are 0.713-0.884 (0.722, 0.827, 0.713, and 0.884). IOTA simple rules have high diagnostic performance in discriminating adnexal masses even when applied by non-expert sonographers, though a training course may be required. Nevertheless, they should be further tested by a greater number of general practitioners before widespread use.
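Cohen's kappa, used above to quantify inter-observer agreement, corrects raw percentage agreement for the agreement expected by chance. A minimal two-rater implementation (my own sketch, with toy labels in the test rather than the study's ratings):

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters labelling the same items.

    ratings_a, ratings_b: equal-length lists of labels.
    """
    n = len(ratings_a)
    labels = set(ratings_a) | set(ratings_b)
    po = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n  # observed agreement
    pe = sum(                                                    # chance agreement
        (ratings_a.count(l) / n) * (ratings_b.count(l) / n) for l in labels
    )
    return (po - pe) / (1 - pe)
```

For two raters agreeing on 3 of 4 items with the marginal frequencies below, kappa is 0.5: the 75% raw agreement is discounted because half of it would be expected by chance alone.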
NASA Astrophysics Data System (ADS)
Vaucouleur, Sebastien
2011-02-01
We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study of practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but rather domain experts. Hence, they require a simple language to express custom rules.
Bröder, A
2000-09-01
The boundedly rational 'Take-The-Best' heuristic (TTB) was proposed by G. Gigerenzer, U. Hoffrage, and H. Kleinbölting (1991) as a model of fast and frugal probabilistic inferences. Although the simple lexicographic rule proved to be successful in computer simulations, direct empirical demonstrations of its adequacy as a psychological model are lacking because of several methodological problems. In 4 experiments with a total of 210 participants, this question was addressed. Whereas Experiment 1 showed that TTB is not valid as a universal hypothesis about probabilistic inferences, up to 28% of participants in Experiment 2 and 53% of participants in Experiment 3 were classified as TTB users. Experiment 4 revealed that investment costs for information seem to be a relevant factor leading participants to switch to a noncompensatory TTB strategy. The observed individual differences in strategy use imply the recommendation of an idiographic approach to decision-making research.
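The Take-The-Best lexicographic rule itself is easy to state in code. The sketch below uses my own naming; in the original framework, cue validities would be estimated from the environment, and TTB guesses when no cue discriminates:

```python
def take_the_best(cues_a, cues_b, validities):
    """Decide which of two options is larger on some criterion.

    cues_a, cues_b: dicts mapping cue name -> 0/1 (cue absent/present);
    validities: dict mapping cue name -> validity in (0.5, 1].
    Cues are inspected in descending validity; the first discriminating cue
    decides ('a' or 'b'). None signals that the heuristic would guess.
    """
    for cue in sorted(validities, key=validities.get, reverse=True):
        va, vb = cues_a.get(cue, 0), cues_b.get(cue, 0)
        if va != vb:  # first discriminating cue: stop search, decide
            return 'a' if va > vb else 'b'
    return None
```

The noncompensatory character is visible in the loop: once a high-validity cue discriminates, any number of lower-validity cues pointing the other way is ignored.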
An integrated theory of the mind.
Anderson, John R; Bothell, Daniel; Byrne, Michael D; Douglass, Scott; Lebiere, Christian; Qin, Yulin
2004-10-01
Adaptive control of thought-rational (ACT-R; J. R. Anderson & C. Lebiere, 1998) has evolved into a theory that consists of multiple modules but also explains how these modules are integrated to produce coherent cognition. The perceptual-motor modules, the goal module, and the declarative memory module are presented as examples of specialized systems in ACT-R. These modules are associated with distinct cortical regions. These modules place chunks in buffers where they can be detected by a production system that responds to patterns of information in the buffers. At any point in time, a single production rule is selected to respond to the current pattern. Subsymbolic processes serve to guide the selection of rules to fire as well as the internal operations of some modules. Much of learning involves tuning of these subsymbolic processes. A number of simple and complex empirical examples are described to illustrate how these modules function singly and in concert. © 2004 APA, all rights reserved.
Visual conspicuity: a new simple standard, its reliability, validity and applicability.
Wertheim, A H
2010-03-01
A general standard for quantifying conspicuity is described. It derives from a simple and easy method to quantitatively measure the visual conspicuity of an object. The method stems from the theoretical view that the conspicuity of an object is not a property of that object, but describes the degree to which the object is perceptually embedded in, i.e. laterally masked by, its visual environment. First, three variations of a simple method to measure the strength of such lateral masking are described and empirical evidence for its reliability and its validity is presented, as are several tests of predictions concerning the effects of viewing distance and ambient light. It is then shown how this method yields a conspicuity standard, expressed as a number, which can be made part of a rule of law, and which can be used to test whether or not, and to what extent, the conspicuity of a particular object, e.g. a traffic sign, meets a predetermined criterion. An additional feature is that, when used under different ambient light conditions, the method may also yield an index of the amount of visual clutter in the environment. Taken together the evidence illustrates the method's applicability in both the laboratory and real-life situations. STATEMENT OF RELEVANCE: This paper concerns a proposal for a new method to measure visual conspicuity, yielding a numerical index that can be used in a rule of law. It is of importance to ergonomists and human factor specialists who are asked to measure the conspicuity of an object, such as a traffic or rail-road sign, or any other object. The new method is simple and circumvents the need to perform elaborate (search) experiments and thus has great relevance as a simple tool for applied research.
Leadership in moving human groups.
Boos, Margarete; Pritz, Johannes; Lange, Simon; Belz, Michael
2014-04-01
How is movement of individuals coordinated as a group? This is a fundamental question of social behaviour, encompassing phenomena such as bird flocking, fish schooling, and the innumerable activities in human groups that require people to synchronise their actions. We have developed an experimental paradigm, the HoneyComb computer-based multi-client game, to empirically investigate human movement coordination and leadership. Using economic games as a model, we set monetary incentives to motivate players on a virtual playfield to reach goals via players' movements. We asked (I) whether humans coordinate their movements when information is limited to an individual group member's observation of adjacent group member motion, (II) whether an informed group minority can lead an uninformed group majority to the minority's goal, and, if so, (III) how this minority exerts its influence. We showed that in a human group--on the basis of movement alone--a minority can successfully lead a majority. Minorities lead successfully when (a) their members choose similar initial steps towards their goal field and (b) they are among the first in the whole group to make a move. Using our approach, we empirically demonstrate that the rules of swarming behaviour apply to humans. Even complex human behaviour, such as leadership and directed group movement, follows simple rules that are based on visual perception of local movement.
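The minority-leads-majority finding can be illustrated with a deliberately crude simulation. Every name, parameter, and update rule below is invented for illustration and is not the HoneyComb paradigm itself: a few informed agents step early and consistently toward a goal, while the rest only follow perceived group movement.

```python
import random

random.seed(1)

# Crude sketch (invented parameters): an informed minority heads for a goal
# early and consistently; the uninformed majority reacts only to perceived
# group movement plus noise.
N, INFORMED, GOAL, STEPS = 10, 3, 50.0, 200

pos = [0.0] * N
informed = set(range(INFORMED))

for _ in range(STEPS):
    center = sum(pos) / N              # crude proxy for "adjacent member motion"
    new_pos = []
    for i, x in enumerate(pos):
        if i in informed:
            step = 1.0 if x < GOAL else 0.0   # similar, early steps toward the goal
        else:
            step = 0.9 * (center - x) + random.uniform(-0.25, 0.25)  # follow the group
        new_pos.append(x + step)
    pos = new_pos

mean_final = sum(pos) / N
print(round(mean_final, 1))  # the whole group ends up near the minority's goal
```

Despite the followers never seeing the goal, the group converges on it, echoing the abstract's point that directed group movement can emerge from simple local rules.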
NASA Astrophysics Data System (ADS)
Murakami, Hisashi; Gunji, Yukio-Pegio
2017-07-01
Although foraging patterns have long been predicted to adapt optimally to environmental conditions, empirical evidence for this has emerged only in recent years. This evidence suggests that the search strategy of animals is open to change, so that animals can flexibly respond to their environment. In this study, we began with a simple computational model that possesses the principal features of an intermittent strategy, i.e., careful local searches separated by longer relocation steps, in which an agent follows a rule to switch between the two phases but can misapply it, i.e., the agent follows an ambiguous switching rule. Owing to this ambiguity, the agent's foraging strategy can continuously change. First, we demonstrate that our model can exhibit an optimal change of strategy from Brownian-type to Lévy-type depending on the prey density, and we investigate the distribution of time intervals for switching between the phases. Moreover, we show that the model can display higher search efficiency than a correlated random walk.
Huang, Li; Yuan, Jiamin; Yang, Zhimin; Xu, Fuping; Huang, Chunhua
2015-01-01
Background. In this study, we use association rules to explore the latent rules and patterns of prescribing and adjusting the ingredients of herbal decoctions based on an empirical herbal formula of Chinese Medicine (CM). Materials and Methods. The consideration and development of CM prescriptions based on the knowledge of CM doctors are analyzed. The study contained three stages. The first stage is to identify the chief symptoms for a specific empirical herbal formula, which can serve as the key indication for herb addition and cancellation. The second stage is to conduct a case study on the empirical CM herbal formula for insomnia, in which doctors add extra ingredients or cancel some of them according to CM syndrome diagnosis. The last stage is to divide the observed cases into an effective group and an ineffective group based on the clinical effect assessed by doctors. The patterns arising during diagnosis and treatment, and the relations between clinical symptoms or indications and herb-choosing principles, are selected by the association rules algorithm. Results. In total, 40 patients were observed in this study: 28 patients were considered effective after treatment and the remaining 12 were ineffective. 206 patterns related to clinical indications of Chinese Medicine were checked and screened against each observed case. In the analysis of the effective group, we used the association rules algorithm to select combinations between 28 herbal adjustment strategies of the empirical herbal formula and the 190 patterns of individual clinical manifestations. During this stage, 11 common patterns were eliminated and 5 major symptoms for insomnia remained. 12 association rules were identified which included 5 herbal adjustment strategies. Conclusion. The association rules method is an effective algorithm for exploring the latent relations between clinical indications and herbal adjustment strategies in the study of empirical herbal formulas.
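The core support/confidence computation behind association rule mining can be sketched as follows. The transactions, symptom names, and herb-adjustment labels are invented toy data, not the study's records:

```python
from itertools import combinations

# Toy transactions (hypothetical, not the study's data): each record lists
# observed clinical indications together with the herbal adjustment made.
transactions = [
    {"early_waking", "palpitations", "add_suanzaoren"},
    {"early_waking", "vivid_dreams", "add_suanzaoren"},
    {"early_waking", "palpitations", "add_suanzaoren"},
    {"vivid_dreams", "dry_mouth", "add_zhimu"},
    {"early_waking", "dry_mouth", "add_zhimu"},
]

def support(itemset):
    # fraction of records containing every item in the set
    return sum(itemset <= t for t in transactions) / len(transactions)

def rules(min_support=0.4, min_confidence=0.8):
    items = sorted(set().union(*transactions))
    found = []
    for a, b in combinations(items, 2):
        for lhs, rhs in ((a, b), (b, a)):
            s = support({lhs, rhs})
            if s >= min_support and s / support({lhs}) >= min_confidence:
                found.append((lhs, rhs, s))
    return found

for lhs, rhs, s in rules():
    print(f"{lhs} -> {rhs} (support={s:.2f})")
```

Rules such as "palpitations -> add_suanzaoren" surface because the pair is both frequent (support) and reliable (confidence), which is the same screening logic the study applies to its 206 clinical patterns.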
PMID:26495415
Predicting the stability of nanodevices
NASA Astrophysics Data System (ADS)
Lin, Z. Z.; Yu, W. F.; Wang, Y.; Ning, X. J.
2011-05-01
A simple model based on the statistics of single atoms is developed to predict the stability or lifetime of nanodevices without empirical parameters. Under certain conditions, the model reproduces the Arrhenius law and the Meyer-Neldel compensation rule. Compared with classical molecular-dynamics simulations for predicting the stability of a monatomic carbon chain at high temperature, the model proves to be much more accurate than transition state theory. Based on ab initio calculations of the static potential, the model yields corrected lifetimes of monatomic carbon and gold chains at higher temperatures, and predicts that the monatomic chains are very stable at room temperature.
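For orientation, the Arrhenius law the model reproduces relates lifetime to an activation barrier and temperature. The barrier height and attempt period below are assumed illustrative values, not the paper's computed numbers:

```python
import math

# Back-of-envelope Arrhenius estimate: tau = tau0 * exp(Ea / (kB * T)).
# EA and TAU0 are assumed values for a monatomic chain, purely illustrative.
KB_EV = 8.617e-5     # Boltzmann constant, eV/K
EA = 1.5             # assumed activation barrier, eV
TAU0 = 1e-13         # assumed attempt period, s (~inverse phonon frequency)

def lifetime(T):
    return TAU0 * math.exp(EA / (KB_EV * T))

room, hot = lifetime(300.0), lifetime(1500.0)
print(room > 3.15e7)   # longer than a year at room temperature
print(hot < 1.0)       # far less than a second when hot
```

The exponential dependence is why a chain can be effectively permanent at 300 K yet decay almost instantly at high temperature, exactly the contrast the abstract draws.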
Oosterman, Joukje M; Heringa, Sophie M; Kessels, Roy P C; Biessels, Geert Jan; Koek, Huiberdina L; Maes, Joseph H R; van den Berg, Esther
2017-04-01
Rule induction tests such as the Wisconsin Card Sorting Test require executive control processes, but also the learning and memorization of simple stimulus-response rules. In this study, we examined the contribution of diminished learning and memorization of simple rules to complex rule induction test performance in patients with amnestic mild cognitive impairment (aMCI) or Alzheimer's dementia (AD). Twenty-six aMCI patients, 39 AD patients, and 32 control participants were included. A task was used in which the memory load and the complexity of the rules were independently manipulated. This task consisted of three conditions: a simple two-rule learning condition (Condition 1), a simple four-rule learning condition (inducing an increase in memory load, Condition 2), and a complex biconditional four-rule learning condition-inducing an increase in complexity and, hence, executive control load (Condition 3). Performance of AD patients declined disproportionately when the number of simple rules that had to be memorized increased (from Condition 1 to 2). An additional increment in complexity (from Condition 2 to 3) did not, however, disproportionately affect performance of the patients. Performance of the aMCI patients did not differ from that of the control participants. In the patient group, correlation analysis showed that memory performance correlated with Condition 1 performance, whereas executive task performance correlated with Condition 2 performance. These results indicate that the reduced learning and memorization of underlying task rules explains a significant part of the diminished complex rule induction performance commonly reported in AD, although results from the correlation analysis suggest involvement of executive control functions as well. Taken together, these findings suggest that care is needed when interpreting rule induction task performance in terms of executive function deficits in these patients.
Bayesian learning and the psychology of rule induction
Endress, Ansgar D.
2014-01-01
In recent years, Bayesian learning models have been applied to an increasing variety of domains. While such models have been criticized on theoretical grounds, the underlying assumptions and predictions are rarely made concrete and tested experimentally. Here, I use Frank and Tenenbaum's (2011) Bayesian model of rule-learning as a case study to spell out the underlying assumptions, and to confront them with the empirical results Frank and Tenenbaum (2011) propose to simulate, as well as with novel experiments. While rule-learning is arguably well suited to rational Bayesian approaches, I show that their models are neither psychologically plausible nor ideal observer models. Further, I show that their central assumption is unfounded: humans do not always preferentially learn more specific rules, but, at least in some situations, those rules that happen to be more salient. Even when granting the unsupported assumptions, I show that all of the experiments modeled by Frank and Tenenbaum (2011) either contradict their models, or have a large number of more plausible interpretations. I provide an alternative account of the experimental data based on simple psychological mechanisms, and show that this account both describes the data better, and is easier to falsify. I conclude that, despite the recent surge in Bayesian models of cognitive phenomena, psychological phenomena are best understood by developing and testing psychological theories rather than models that can be fit to virtually any data. PMID:23454791
FAF-Drugs2: free ADME/tox filtering tool to assist drug discovery and chemical biology projects.
Lagorce, David; Sperandio, Olivier; Galons, Hervé; Miteva, Maria A; Villoutreix, Bruno O
2008-09-24
Drug discovery and chemical biology are exceedingly complex and demanding enterprises. In recent years there has been increasing awareness of the importance of predicting/optimizing the absorption, distribution, metabolism, excretion and toxicity (ADMET) properties of small chemical compounds throughout the search process rather than at the final stages. Fast methods for evaluating ADMET properties of small molecules often involve applying a set of simple empirical rules (educated guesses), so that property profiling of compound collections can be performed in silico. Clearly, these rules cannot assess the full complexity of the human body but can provide valuable information and assist decision-making. This paper presents FAF-Drugs2, a free adaptable tool for ADMET filtering of electronic compound collections. FAF-Drugs2 is a command line utility program, written in Python and based on the open source chemistry toolkit OpenBabel, which performs various physicochemical calculations and identifies key functional groups as well as some toxic and unstable molecules/functional groups. In addition to filtered collections, FAF-Drugs2 can provide, via Gnuplot, several distribution diagrams of major physicochemical properties of the screened compound libraries. We have developed FAF-Drugs2 to facilitate compound collection preparation, prior to (or after) experimental screening or virtual screening computations. Users can select to apply various filtering thresholds and add rules as needed for a given project. As it stands, FAF-Drugs2 implements numerous filtering rules (23 physicochemical rules and 204 substructure searching rules) that can be easily tuned.
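Rule-based ADMET filtering of this kind can be sketched in miniature. The sketch below uses the classic Lipinski rule-of-five thresholds as a stand-in, not FAF-Drugs2's full set of 23 physicochemical and 204 substructure rules, and assumes property values have been precomputed (e.g. by OpenBabel):

```python
# Minimal rule-based filter in the spirit of FAF-Drugs2, using Lipinski
# rule-of-five thresholds as an illustrative rule set (not the tool's rules).
RULES = {
    "molecular_weight": lambda v: v <= 500,
    "logp":             lambda v: v <= 5,
    "h_bond_donors":    lambda v: v <= 5,
    "h_bond_acceptors": lambda v: v <= 10,
}

def passes(compound, max_violations=1):
    # collect the names of violated rules; tolerate up to max_violations
    violations = [name for name, ok in RULES.items() if not ok(compound[name])]
    return len(violations) <= max_violations, violations

aspirin = {"molecular_weight": 180.2, "logp": 1.2,
           "h_bond_donors": 1, "h_bond_acceptors": 4}
greasy  = {"molecular_weight": 720.0, "logp": 7.9,
           "h_bond_donors": 2, "h_bond_acceptors": 3}

print(passes(aspirin))  # (True, [])
print(passes(greasy))   # (False, ['molecular_weight', 'logp'])
```

Because each rule is an independent predicate, thresholds can be tuned or new rules added per project, which mirrors the adaptability the abstract emphasizes.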
Measuring 'virtue' in medicine.
Kotzee, Ben; Ignatowicz, Agnieszka
2016-06-01
Virtue-approaches to medical ethics are becoming ever more influential. Virtue theorists advocate redefining right or good action in medicine in terms of the character of the doctor performing the action (rather than adherence to rules or principles). In medical education, too, calls are growing to reconceive medical education as a form of character formation (rather than instruction in rules or principles). Empirical studies of doctors' ethics from a virtue-perspective, however, are few and far between. In this respect, theoretical and empirical study of medical ethics are out of alignment. In this paper, we survey the empirical study of medical ethics and find that most studies of doctors' ethics are rules- or principles-based and not virtue-based. We outline the challenges that exist for studying medical ethics empirically from a virtue-based perspective and canvas the runners and riders in the effort to find virtue-based assessments of medical ethics.
Sonographic Diagnosis of Tubal Cancer with IOTA Simple Rules Plus Pattern Recognition
Tongsong, Theera; Wanapirak, Chanane; Tantipalakorn, Charuwan; Tinnangwattana, Dangcheewan
2017-01-01
Objective: To evaluate diagnostic performance of IOTA simple rules plus pattern recognition in predicting tubal cancer. Methods: Secondary analysis was performed on prospective database of our IOTA project. The patients recruited in the project were those who were scheduled for pelvic surgery due to adnexal masses. The patients underwent ultrasound examinations within 24 hours before surgery. On ultrasound examination, the masses were evaluated using the well-established IOTA simple rules plus pattern recognition (sausage-shaped appearance, incomplete septum, visible ipsilateral ovaries) to predict tubal cancer. The gold standard diagnosis was based on histological findings or operative findings. Results: A total of 482 patients, including 15 cases of tubal cancer, were evaluated by ultrasound preoperatively. The IOTA simple rules plus pattern recognition gave a sensitivity of 86.7% (13 in 15) and specificity of 97.4%. Sausage-shaped appearance was identified in nearly all cases (14 in 15). Incomplete septa and normal ovaries could be identified in 33.3% and 40%, respectively. Conclusion: IOTA simple rules plus pattern recognition is relatively effective in predicting tubal cancer. Thus, we propose the simple scheme in diagnosis of tubal cancer as follows. First of all, the adnexal masses are evaluated with IOTA simple rules. If the B-rules could be applied, tubal cancer is reliably excluded. If the M-rules could be applied or the result is inconclusive, careful delineation of the mass with pattern recognition should be performed. PMID:29172273
A detailed comparison of optimality and simplicity in perceptual decision-making
Shen, Shan; Ma, Wei Ji
2017-01-01
Two prominent ideas in the study of decision-making have been that organisms behave near-optimally, and that they use simple heuristic rules. These principles might be operating in different types of tasks, but this possibility cannot be fully investigated without a direct, rigorous comparison within a single task. Such a comparison was lacking in most previous studies, because a) the optimal decision rule was simple; b) no simple suboptimal rules were considered; c) it was unclear what was optimal, or d) a simple rule could closely approximate the optimal rule. Here, we used a perceptual decision-making task in which the optimal decision rule is well-defined and complex, and makes qualitatively distinct predictions from many simple suboptimal rules. We find that all simple rules tested fail to describe human behavior, that the optimal rule accounts well for the data, and that several complex suboptimal rules are indistinguishable from the optimal one. Moreover, we found evidence that the optimal model is close to the true model: first, the better the trial-to-trial predictions of a suboptimal model agree with those of the optimal model, the better that suboptimal model fits; second, our estimate of the Kullback-Leibler divergence between the optimal model and the true model is not significantly different from zero. When observers receive no feedback, the optimal model still describes behavior best, suggesting that sensory uncertainty is implicitly represented and taken into account. Beyond the task and models studied here, our results have implications for best practices of model comparison. PMID:27177259
Identifying and quantifying interactions in a laboratory swarm
NASA Astrophysics Data System (ADS)
Puckett, James; Kelley, Douglas; Ouellette, Nicholas
2013-03-01
Emergent collective behavior, such as in flocks of birds or swarms of bees, is exhibited throughout the animal kingdom. Many models have been developed to describe swarming and flocking behavior using systems of self-propelled particles obeying simple rules or interacting via various potentials. However, due to experimental difficulties and constraints, little empirical data exists for characterizing the exact form of the biological interactions. We study laboratory swarms of flying Chironomus riparius midges, using stereoimaging and particle tracking techniques to record three-dimensional trajectories for all the individuals in the swarm. We describe methods to identify and quantify interactions by examining these trajectories, and report results on interaction magnitude, frequency, and mutuality.
ERIC Educational Resources Information Center
Martz, Carlton
2001-01-01
This issue of "Bill of Rights in Action" explores issues raised by empires and imperial law. The first article, "Clash of Empires: The Fight for North America," looks at the clash of empires and the fight for North America during the 18th century. The second article, "When Roman Law Ruled the Western World," examines…
Performance of technical trading rules: evidence from Southeast Asian stock markets.
Tharavanij, Piyapas; Siraprapasiri, Vasan; Rajchamaha, Kittichai
2015-01-01
This paper examines the profitability of technical trading rules in five Southeast Asian stock markets. The data cover a period of 14 years, from January 2000 to December 2013. The instruments investigated are five Southeast Asian stock market indices: the SET index (Thailand), the FTSE Bursa Malaysia KLC index (Malaysia), the FTSE Straits Times index (Singapore), the JSX Composite index (Indonesia), and the PSE composite index (the Philippines). Trading strategies investigated include the Relative Strength Index, Stochastic oscillator, Moving Average Convergence-Divergence, Directional Movement Indicator and On Balance Volume. Performance is compared to a simple buy-and-hold strategy, and statistical tests are performed. Our empirical results show a strong performance of technical trading rules in the emerging stock market of Thailand but not in the more mature stock market of Singapore. The technical trading rules also generate statistically significant returns in the Malaysian, Indonesian and Philippine markets. However, after taking transaction costs into account, most technical trading rules do not generate net returns. This fact suggests different levels of market efficiency among Southeast Asian stock markets. This paper offers three new insights. First, technical indicators do not help much in terms of market timing; traders cannot expect to buy at a relative low and sell at a relative high by just using technical trading rules. Second, technical trading rules can be beneficial to individual investors because they help counter the behavioral bias called the disposition effect, the tendency to sell winning stocks too soon and hold on to losing stocks too long. Third, even profitable strategies could not reliably predict subsequent market directions; they make money from having a higher average profit on profitable trades than the average loss on unprofitable ones.
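A minimal backtest of one rule family against buy-and-hold, with a flat proportional transaction cost, can be sketched as follows. A moving-average crossover stands in for the indicator rules above; the prices and cost level are synthetic:

```python
# Illustrative backtest skeleton (synthetic prices, simplified costs), not the
# paper's methodology: a moving-average crossover rule versus buy-and-hold.

def sma(prices, n, t):
    # simple moving average of the n prices ending at index t
    return sum(prices[t - n + 1 : t + 1]) / n

def crossover_return(prices, fast=3, slow=6, cost=0.001):
    wealth, holding = 1.0, False
    for t in range(slow - 1, len(prices) - 1):
        signal = sma(prices, fast, t) > sma(prices, slow, t)
        if signal != holding:          # position change: pay transaction cost
            wealth *= 1.0 - cost
            holding = signal
        if holding:                    # earn next period's return while long
            wealth *= prices[t + 1] / prices[t]
    return wealth - 1.0

prices = [100, 101, 99, 102, 104, 103, 106, 108, 107, 110, 112, 111]
buy_and_hold = prices[-1] / prices[0] - 1.0
print(round(buy_and_hold, 3))
print(round(crossover_return(prices), 3))
```

Even this toy shows the paper's key mechanic: every crossing of the averages triggers a trade whose cost is deducted from wealth, so gross rule profits can evaporate once costs are included.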
Using an empirical and rule-based modeling approach to map cause of disturbance in U.S
Todd A. Schroeder; Gretchen G. Moisen; Karen Schleeweis; Chris Toney; Warren B. Cohen; Zhiqiang Yang; Elizabeth A. Freeman
2015-01-01
Recently completing over a decade of research, the NASA/NACP funded North American Forest Dynamics (NAFD) project has led to several important advancements in the way U.S. forest disturbance dynamics are mapped at regional and continental scales. One major contribution has been the development of an empirical and rule-based modeling approach which addresses two of the...
Empirical Analysis and Refinement of Expert System Knowledge Bases
1988-08-31
Both a simulated case generation program and a random rule "basher" were developed to enhance rule refinement experimentation, and the second fiscal year 1988 objective was fully met. (System diagram: rule refinement system, simulated rule basher, case generator, stored cases, expert system knowledge base.) Cases are generated until the rule is satisfied, and may be randomly generated for a given rule or hypothesis.
SIRE: A Simple Interactive Rule Editor for NICBES
NASA Technical Reports Server (NTRS)
Bykat, Alex
1988-01-01
To support evolution of domain expertise, and its representation in an expert system knowledge base, a user-friendly rule base editor is mandatory. The Nickel Cadmium Battery Expert System (NICBES), a prototype expert system for the Hubble Space Telescope power storage management system, does not provide such an editor. In the following, a Simple Interactive Rule Base Editor (SIRE) for NICBES is described. The SIRE provides a consistent internal representation of the NICBES knowledge base. It supports knowledge presentation and provides a user-friendly, code-language-independent medium for rule addition and modification. The SIRE is integrated with NICBES via an interface module. This module provides translation of the internal representation to Prolog-type rules (Horn clauses), rule assertion, and a simple mechanism for selecting rules for the Prolog inference engine.
Principal Effects of Axial Load on Moment-Distribution Analysis of Rigid Structures
NASA Technical Reports Server (NTRS)
James, Benjamin Wylie
1935-01-01
This thesis presents the method of moment distribution modified to include the effect of axial load upon the bending moments. This modification makes it possible to analyze accurately complex structures, such as rigid fuselage trusses, that heretofore had to be analyzed by approximate formulas and empirical rules. The method is simple enough to be practicable even for complex structures, and it gives a means of analysis for continuous beams that is simpler than the extended three-moment equation now in common use. When the effect of axial load is included, it is found that the basic principles of moment distribution remain unchanged, the only difference being that the factors used, instead of being constants for a given member, become functions of the axial load. Formulas have been developed for these factors, and curves plotted, so that their application requires no more work than moment distribution without axial load. Simple problems have been included to illustrate the use of the curves.
Ranking structures and rank-rank correlations of countries: The FIFA and UEFA cases
NASA Astrophysics Data System (ADS)
Ausloos, Marcel; Cloots, Rudi; Gadomski, Adam; Vitanov, Nikolay K.
2014-04-01
Ranking of agents competing with each other in complex systems may lead to paradoxes, depending on the measures chosen. A discussion is presented of such rank-rank correlations, similar or not, based on the case of European countries ranked by UEFA and FIFA from different soccer competitions. The first question to be answered is whether a simple empirical law is obtained for such (self-)organizations of complex sociological systems under such different measuring schemes. It is found that the power law form is not the best description, contrary to many modern expectations; the stretched exponential is much more adequate. Moreover, it is found that the measuring rules lead to some inner structures in both cases.
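The central comparison, whether a power law or a stretched exponential better describes a ranking curve, can be reproduced in miniature on synthetic data. The fitting procedure below (a grid search over the stretching exponent plus linearized least squares) is an illustrative choice, not the authors' method:

```python
import math

# Synthetic rank-size data drawn from a stretched exponential
# y = a * exp(-b * r**c); we then fit both candidate forms and compare errors.
ranks = list(range(1, 31))
true = [100.0 * math.exp(-0.4 * r ** 0.7) for r in ranks]

def sse(model):
    return sum((model(r) - y) ** 2 for r, y in zip(ranks, true))

n = len(ranks)
ly = [math.log(y) for y in true]
my = sum(ly) / n

# power-law fit y = a * r**(-k): linear regression in log-log coordinates
lx = [math.log(r) for r in ranks]
mx = sum(lx) / n
k = -sum((u - mx) * (v - my) for u, v in zip(lx, ly)) / sum((u - mx) ** 2 for u in lx)
power = lambda r, k=k, a=math.exp(my + k * mx): a * r ** (-k)

# stretched-exponential fit: grid over exponent c, regress log y on r**c
best = None
for c in [i / 20 for i in range(4, 21)]:
    x = [r ** c for r in ranks]
    mx2 = sum(x) / n
    b = -sum((u - mx2) * (v - my) for u, v in zip(x, ly)) / sum((u - mx2) ** 2 for u in x)
    model = lambda r, b=b, c=c, a=math.exp(my + b * mx2): a * math.exp(-b * r ** c)
    if best is None or sse(model) < sse(best):
        best = model

print(sse(best) < sse(power))  # the stretched exponential fits far better
```

On data that genuinely follow a stretched exponential, the power-law fit leaves a visible systematic error, which is the diagnostic the paper applies to the UEFA and FIFA rankings.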
Tongsong, Theera; Tinnangwattana, Dangcheewan; Vichak-Ururote, Linlada; Tontivuthikul, Paponrad; Charoenratana, Cholaros; Lerthiranwong, Thitikarn
2016-01-01
To compare diagnostic performance in differentiating benign from malignant ovarian masses between IOTA (the International Ovarian Tumor Analysis) simple rules and subjective sonographic assessment. Women scheduled for elective surgery because of ovarian masses were recruited into the study and underwent ultrasound examination within 24 hours of surgery, to apply the IOTA simple rules by general gynecologists and to record video clips for subjective assessment by an experienced sonographer. The diagnostic performance of the IOTA rules and subjective assessment for differentiation between benign and malignant masses was compared. The gold standard diagnosis was pathological or operative findings. A total of 150 ovarian masses were included, comprising 105 (70%) benign and 45 (30%) malignant. The IOTA simple rules could be applied in 119 (79.3%) and were inconclusive in 31 (20.7%), whereas subjective assessment could be applied in all cases (100%). The sensitivity and specificity of the IOTA simple rules and subjective assessment were not significantly different: 82.9% vs 86.7% and 94.0% vs 94.3%, respectively. The agreement of the two methods was high, with a kappa index of 0.835. Both techniques had a high diagnostic performance in differentiating benign from malignant ovarian masses, but the IOTA rules had a relatively high rate of inconclusive results. The IOTA rules can be used as an effective screening technique by general gynecologists, but when the results are inconclusive they should consult experienced sonographers.
Overview and extensions of a system for routing directed graphs on SIMD architectures
NASA Technical Reports Server (NTRS)
Tomboulian, Sherryl
1988-01-01
Many problems can be described in terms of directed graphs that contain a large number of vertices where simple computations occur using data from adjacent vertices. A method is given for parallelizing such problems on an SIMD machine model that uses only nearest neighbor connections for communication, and has no facility for local indirect addressing. Each vertex of the graph will be assigned to a processor in the machine. Rules for a labeling are introduced that support the use of a simple algorithm for movement of data along the edges of the graph. Additional algorithms are defined for addition and deletion of edges. Modifying or adding a new edge takes the same time as parallel traversal. This combination of architecture and algorithms defines a system that is relatively simple to build and can do fast graph processing. All edges can be traversed in parallel in time O(T), where T is empirically proportional to the average path length in the embedding times the average degree of the graph. Additionally, researchers present an extension to the above method which allows for enhanced performance by allowing some broadcasting capabilities.
Predicting nuclear gene coalescence from mitochondrial data: the three-times rule.
Palumbi, S R; Cipriano, F; Hare, M P
2001-05-01
Coalescence theory predicts when genetic drift at nuclear loci will result in fixation of sequence differences to produce monophyletic gene trees. However, the theory is difficult to apply to particular taxa because it hinges on genetically effective population size, which is generally unknown. Neutral theory also predicts that evolution of monophyly will be four times slower in nuclear than in mitochondrial genes primarily because genetic drift is slower at nuclear loci. Variation in mitochondrial DNA (mtDNA) within and between species has been studied extensively, but can these mtDNA data be used to predict coalescence in nuclear loci? Comparison of neutral theories of coalescence of mitochondrial and nuclear loci suggests a simple rule of thumb. The "three-times rule" states that, on average, most nuclear loci will be monophyletic when the branch length leading to the mtDNA sequences of a species is three times longer than the average mtDNA sequence diversity observed within that species. A test using mitochondrial and nuclear intron data from seven species of whales and dolphins suggests general agreement with predictions of the three-times rule. We define the coalescence ratio as the mitochondrial branch length for a species divided by intraspecific mtDNA diversity. We show that species with high coalescence ratios show nuclear monophyly, whereas species with low ratios have polyphyletic nuclear gene trees. As expected, species with intermediate coalescence ratios show a variety of patterns. Especially at very high or low coalescence ratios, the three-times rule predicts nuclear gene patterns that can help detect the action of selection. The three-times rule may be useful as an empirical benchmark for evaluating evolutionary processes occurring at multiple loci.
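The rule itself is easy to state as a computation: nuclear monophyly is predicted when the species' mtDNA branch length is at least three times the within-species mtDNA diversity. The numbers below are hypothetical, not the whale and dolphin data:

```python
# Sketch of the "three-times rule" as a computation (field names and values
# are illustrative, not the study's data).

def coalescence_ratio(mtdna_branch_length, within_species_diversity):
    # mtDNA branch length for the species divided by intraspecific diversity
    return mtdna_branch_length / within_species_diversity

def predict_nuclear_monophyly(ratio, threshold=3.0):
    # the three-times rule: ratio above ~3 predicts monophyletic nuclear loci
    return ratio > threshold

# hypothetical per-species numbers (substitutions/site)
species = {
    "species_A": (0.060, 0.005),   # long branch, low diversity
    "species_B": (0.010, 0.008),   # short branch, high diversity
}

for name, (branch, diversity) in species.items():
    r = coalescence_ratio(branch, diversity)
    print(name, round(r, 1), predict_nuclear_monophyly(r))
```

Species with high ratios (here species_A, ratio 12) are predicted monophyletic at most nuclear loci, while low-ratio species (species_B, ratio 1.25) are expected to show polyphyletic nuclear gene trees, matching the pattern the abstract reports.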
Rule-Governed Behavior: Teaching a Preliminary Repertoire of Rule-Following to Children with Autism
ERIC Educational Resources Information Center
Tarbox, Jonathan; Zuckerman, Carrie K.; Bishop, Michele R.; Olive, Melissa L.; O'Hora, Denis P.
2011-01-01
Rule-governed behavior is generally considered an integral component of complex verbal repertoires but has rarely been the subject of empirical research. In particular, little or no previous research has attempted to establish rule-governed behavior in individuals who do not already display the repertoire. This study consists of two experiments…
Timescale analysis of rule-based biochemical reaction networks
Klinke, David J.; Finley, Stacey D.
2012-01-01
The flow of information within a cell is governed by a series of protein-protein interactions that can be described as a reaction network. Mathematical models of biochemical reaction networks can be constructed by repetitively applying specific rules that define how reactants interact and what new species are formed upon reaction. To aid in understanding the underlying biochemistry, timescale analysis is one method developed to prune the size of the reaction network. In this work, we extend the methods associated with timescale analysis to reaction rules instead of the species contained within the network. To illustrate this approach, we applied timescale analysis to a simple receptor-ligand binding model and a rule-based model of Interleukin-12 (IL-12) signaling in naïve CD4+ T cells. The IL-12 signaling pathway includes multiple protein-protein interactions that collectively transmit information; however, the level of mechanistic detail sufficient to capture the observed dynamics has not been justified based upon the available data. The analysis correctly predicted that reactions associated with JAK2 and TYK2 binding to their corresponding receptor exist at a pseudo-equilibrium. In contrast, reactions associated with ligand binding and receptor turnover regulate cellular response to IL-12. An empirical Bayesian approach was used to estimate the uncertainty in the timescales. This approach complements existing rank- and flux-based methods that can be used to interrogate complex reaction networks. Ultimately, timescale analysis of rule-based models is a computational tool that can be used to reveal the biochemical steps that regulate signaling dynamics. PMID:21954150
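The basic timescale-pruning idea can be sketched as follows. The rate constants, concentrations, and thresholds are invented for illustration and are not the IL-12 model's fitted values:

```python
# Toy timescale pruning for reversible binding rules: each rule's relaxation
# time is roughly 1/(kon*L + koff); rules much faster than the observation
# window are candidates for a pseudo-equilibrium approximation.
# All rates and names below are illustrative assumptions.

OBSERVATION_WINDOW = 60.0      # seconds

reactions = {                   # (kon [1/(nM*s)], koff [1/s], ligand [nM])
    "JAK2_binding":      (10.0, 1.0, 1.0),
    "ligand_binding":    (1e-4, 1e-3, 1.0),
    "receptor_turnover": (0.0, 5e-4, 0.0),   # effectively first-order decay
}

def relaxation_time(kon, koff, conc):
    return 1.0 / (kon * conc + koff)

fast, slow = [], []
for name, (kon, koff, conc) in reactions.items():
    tau = relaxation_time(kon, koff, conc)
    (fast if tau < 0.01 * OBSERVATION_WINDOW else slow).append(name)

print(fast)   # candidates for a pseudo-equilibrium approximation
print(slow)   # dynamics that must be modeled explicitly
```

The separation mirrors the paper's finding: rapid receptor-kinase binding relaxes almost instantly and can be treated as equilibrated, while slow ligand binding and receptor turnover set the observable response.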
More sound of church bells: Authors' correction
NASA Astrophysics Data System (ADS)
Vogt, Patrik; Kasper, Lutz; Burde, Jan-Philipp
2016-01-01
In the recently published article "The Sound of Church Bells: Tracking Down the Secret of a Traditional Arts and Crafts Trade," the bell frequencies have been erroneously oversimplified. The problem affects Eqs. (2) and (3), which were derived from the elementary "coffee mug model" and in which we used the speed of sound in air. However, this does not make sense from a physical point of view, since air only acts as a sound carrier, not as a sound source in the case of bells. Due to the excellent fit of the theoretical model with the empirical data, we unfortunately failed to notice this error before publication. However, all other equations, e.g., the introduction of the correction factor in Eq. (4) and the estimation of the mass in Eqs. (5) and (6), are not affected by this error, since they represent empirical models. Still, it is unfortunate to introduce the speed of sound in air as a constant in Eqs. (4) and (6). Instead, we suggest the following simple rule of thumb for relating the radius of a church bell R to its humming frequency f_hum:
Local Structure Theory for Cellular Automata.
NASA Astrophysics Data System (ADS)
Gutowitz, Howard Andrew
The local structure theory (LST) is a generalization of the mean field theory for cellular automata (CA). The mean field theory makes the assumption that iterative application of the rule does not introduce correlations between the states of cells in different positions. This assumption allows the derivation of a simple formula for the limit density of each possible state of a cell. The most striking feature of CA is that they may well generate correlations between the states of cells as they evolve. The LST takes the generation of correlation explicitly into account. It thus has the potential to describe statistical characteristics in detail. The basic assumption of the LST is that though correlation may be generated by CA evolution, this correlation decays with distance. This assumption allows the derivation of formulas for the estimation of the probability of large blocks of states in terms of smaller blocks of states. Given the probabilities of blocks of size n, probabilities may be assigned to blocks of arbitrary size such that these probability assignments satisfy the Kolmogorov consistency conditions and hence may be used to define a measure on the set of all possible (infinite) configurations. Measures defined in this way are called finite (or n-) block measures. A function called the scramble operator of order n maps a measure to an approximating n-block measure. The action of a CA on configurations induces an action on measures on the set of all configurations. The scramble operator is combined with the CA map on measures to form the local structure operator (LSO). The LSO of order n maps the set of n-block measures into itself. It is hypothesised that the LSO applied to n-block measures approximates the rule itself on general measures, and does so increasingly well as n increases. The fundamental advantage of the LSO is that its action is explicitly computable from a finite system of rational recursion equations.
Empirical study of a number of CA rules demonstrates the potential of the LST to describe the statistical features of CA. The behavior of some simple rules is derived analytically. Other rules have more complex, chaotic behavior. Even for these rules, the LST yields an accurate portrait of both small and large time statistics.
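The mean-field theory that the LST generalizes can itself be written down in a few lines: assuming no correlations between cells, the density of 1s after one step of an elementary CA is a polynomial in the current density, summed over all neighbourhoods the rule maps to 1. A minimal sketch; the rule number is an arbitrary illustration.

```python
def mean_field_step(p, rule=22):
    """One mean-field iteration: probability that a cell is 1 next step,
    assuming all three neighbourhood cells are independently 1 with prob. p."""
    p_next = 0.0
    for n in range(8):                # the 8 possible (left, centre, right) triples
        if (rule >> n) & 1:           # rule maps neighbourhood n to state 1
            prob = 1.0
            for i in range(3):
                bit = (n >> i) & 1
                prob *= p if bit else (1.0 - p)
            p_next += prob
    return p_next

# Iterate the mean-field map to its fixed point: the predicted limit density.
p = 0.3
for _ in range(200):
    p = mean_field_step(p)
print(p)
```

For rule 22 the map reduces to p' = 3p(1-p)^2, whose stable fixed point is the mean-field estimate of the limit density; higher-order LST operators refine this by tracking block probabilities instead of a single density.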
Complexity, Training Paradigm Design, and the Contribution of Memory Subsystems to Grammar Learning
Ettlinger, Marc; Wong, Patrick C. M.
2016-01-01
Although there is variability in nonnative grammar learning outcomes, the contributions of training paradigm design and memory subsystems are not well understood. To examine this, we presented learners with an artificial grammar that formed words via simple and complex morphophonological rules. Across three experiments, we manipulated training paradigm design and measured subjects' declarative, procedural, and working memory subsystems. Experiment 1 demonstrated that passive, exposure-based training boosted learning of both simple and complex grammatical rules, relative to no training. Additionally, procedural memory correlated with simple rule learning, whereas declarative memory correlated with complex rule learning. Experiment 2 showed that presenting corrective feedback during the test phase did not improve learning. Experiment 3 revealed that structuring the order of training so that subjects are first exposed to the simple rule and then the complex improved learning. The cumulative findings shed light on the contributions of grammatical complexity, training paradigm design, and domain-general memory subsystems in determining grammar learning success. PMID:27391085
Ameye, Lieveke; Fischerova, Daniela; Epstein, Elisabeth; Melis, Gian Benedetto; Guerriero, Stefano; Van Holsbeke, Caroline; Savelli, Luca; Fruscio, Robert; Lissoni, Andrea Alberto; Testa, Antonia Carla; Veldman, Joan; Vergote, Ignace; Van Huffel, Sabine; Bourne, Tom; Valentin, Lil
2010-01-01
Objectives To prospectively assess the diagnostic performance of simple ultrasound rules to predict benignity/malignancy in an adnexal mass and to test the performance of the risk of malignancy index, two logistic regression models, and subjective assessment of ultrasonic findings by an experienced ultrasound examiner in adnexal masses for which the simple rules yield an inconclusive result. Design Prospective temporal and external validation of simple ultrasound rules to distinguish benign from malignant adnexal masses. The rules comprised five ultrasonic features (including shape, size, solidity, and results of colour Doppler examination) to predict a malignant tumour (M features) and five to predict a benign tumour (B features). If one or more M features were present in the absence of a B feature, the mass was classified as malignant. If one or more B features were present in the absence of an M feature, it was classified as benign. If both M features and B features were present, or if none of the features was present, the simple rules were inconclusive. Setting 19 ultrasound centres in eight countries. Participants 1938 women with an adnexal mass examined with ultrasound by the principal investigator at each centre with a standardised research protocol. Reference standard Histological classification of the excised adnexal mass as benign or malignant. Main outcome measures Diagnostic sensitivity and specificity. Results Of the 1938 patients with an adnexal mass, 1396 (72%) had benign tumours, 373 (19.2%) had primary invasive tumours, 111 (5.7%) had borderline malignant tumours, and 58 (3%) had metastatic tumours in the ovary. The simple rules yielded a conclusive result in 1501 (77%) masses, for which they resulted in a sensitivity of 92% (95% confidence interval 89% to 94%) and a specificity of 96% (94% to 97%). The corresponding sensitivity and specificity of subjective assessment were 91% (88% to 94%) and 96% (94% to 97%). 
In the 357 masses for which the simple rules yielded an inconclusive result and with available results of CA-125 measurements, the sensitivities were 89% (83% to 93%) for subjective assessment, 50% (42% to 58%) for the risk of malignancy index, 89% (83% to 93%) for logistic regression model 1, and 82% (75% to 87%) for logistic regression model 2; the corresponding specificities were 78% (72% to 83%), 84% (78% to 88%), 44% (38% to 51%), and 48% (42% to 55%). Use of the simple rules as a triage test and subjective assessment for those masses for which the simple rules yielded an inconclusive result gave a sensitivity of 91% (88% to 93%) and a specificity of 93% (91% to 94%), compared with a sensitivity of 90% (88% to 93%) and a specificity of 93% (91% to 94%) when subjective assessment was used in all masses. Conclusions The use of the simple rules has the potential to improve the management of women with adnexal masses. In adnexal masses for which the rules yielded an inconclusive result, subjective assessment of ultrasonic findings by an experienced ultrasound examiner was the most accurate diagnostic test; the risk of malignancy index and the two regression models were not useful. PMID:21156740
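The decision logic of the simple rules described above reduces to a small function, together with the triage strategy the study evaluated (defer to subjective expert assessment when the rules are inconclusive). The feature counts stand in for the ten IOTA ultrasound criteria; this is a sketch of the logic only, not a clinical tool.

```python
def classify(m_present: int, b_present: int) -> str:
    """Apply the simple rules given counts of M- and B-features observed."""
    if m_present >= 1 and b_present == 0:
        return "malignant"
    if b_present >= 1 and m_present == 0:
        return "benign"
    return "inconclusive"          # both kinds present, or neither

def triage(m_present: int, b_present: int, expert_opinion: str) -> str:
    """Triage strategy from the study: fall back on subjective assessment
    by an experienced examiner when the simple rules are inconclusive."""
    result = classify(m_present, b_present)
    return expert_opinion if result == "inconclusive" else result
```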
Role of Prefrontal Cortex in Learning and Generalizing Hierarchical Rules in 8-Month-Old Infants.
Werchan, Denise M; Collins, Anne G E; Frank, Michael J; Amso, Dima
2016-10-05
Recent research indicates that adults and infants spontaneously create and generalize hierarchical rule sets during incidental learning. Computational models and empirical data suggest that, in adults, this process is supported by circuits linking prefrontal cortex (PFC) with striatum and their modulation by dopamine, but the neural circuits supporting this form of learning in infants are largely unknown. We used near-infrared spectroscopy to record PFC activity in 8-month-old human infants during a simple audiovisual hierarchical-rule-learning task. Behavioral results confirmed that infants adopted hierarchical rule sets to learn and generalize spoken object-label mappings across different speaker contexts. Infants had increased activity over right dorsal lateral PFC when rule sets switched from one trial to the next, a neural marker related to updating rule sets into working memory in the adult literature. Infants' eye blink rate, a possible physiological correlate of striatal dopamine activity, also increased when rule sets switched from one trial to the next. Moreover, the increase in right dorsolateral PFC activity in conjunction with eye blink rate also predicted infants' generalization ability, providing exploratory evidence for frontostriatal involvement during learning. These findings provide evidence that PFC is involved in rudimentary hierarchical rule learning in 8-month-old infants, an ability that was previously thought to emerge later in life in concert with PFC maturation. Hierarchical rule learning is a powerful learning mechanism that allows rules to be selected in a context-appropriate fashion and transferred or reused in novel contexts. Data from computational models and adults suggests that this learning mechanism is supported by dopamine-innervated interactions between prefrontal cortex (PFC) and striatum. 
Here, we provide evidence that PFC also supports hierarchical rule learning during infancy, challenging the current dogma that PFC is an underdeveloped brain system until adolescence. These results add new insights into the neurobiological mechanisms available to support learning and generalization in very early postnatal life, providing evidence that PFC and the frontostriatal circuitry are involved in organizing learning and behavior earlier in life than previously known. Copyright © 2016 the authors 0270-6474/16/3610314-09$15.00/0.
Evaluation of IOTA Simple Ultrasound Rules to Distinguish Benign and Malignant Ovarian Tumours.
Garg, Sugandha; Kaur, Amarjit; Mohi, Jaswinder Kaur; Sibia, Preet Kanwal; Kaur, Navkiran
2017-08-01
IOTA stands for the International Ovarian Tumour Analysis group. Ovarian cancer is one of the common cancers in women and is diagnosed at a late stage in the majority of cases. The limiting factor for early diagnosis is the lack of standardized terms and procedures in gynaecological sonography. The introduction of the IOTA rules has provided some consistency in defining the morphological features of ovarian masses through a standardized examination technique. The aim was to evaluate the efficacy of the IOTA simple ultrasound rules in distinguishing benign from malignant ovarian tumours and to establish their use as a tool in the early diagnosis of ovarian malignancy. A hospital-based, prospective case-control study was conducted. Patients with suspected ovarian pathology were evaluated using the IOTA ultrasound rules and designated as benign or malignant, and the findings were correlated with histopathological findings. The collected data were statistically analysed using the chi-square test and the kappa statistic. Of the initial 55 patients, the 50 who underwent surgery were included in the final analysis. The IOTA simple rules were applicable in 45 of these 50 patients (90%). Where the simple rules were applicable, the sensitivity for the detection of malignancy was 91.66% and the specificity was 84.84%; accuracy was 86.66%. Classifying inconclusive cases as malignant, the sensitivity and specificity were 93% and 80%, respectively. A high level of agreement was found between ultrasound and histopathological diagnosis, with a kappa value of 0.323. The IOTA simple ultrasound rules were highly sensitive and specific in predicting ovarian malignancy preoperatively, while being reproducible and easy to learn and use.
Interpreting Self-Directed Search Profiles: Validity of the "Rule of Eight"
ERIC Educational Resources Information Center
Glavin, Kevin W.; Savickas, Mark L.
2011-01-01
Based on the standard error of measurement, Holland (1985) suggested the "rule of eight" for determining the meaningfulness of differences between two summary scores on the Self Directed Search. The present study empirically examined the rule's validity for practice. The participants were 2397 (1497 females and 900 males) undergraduate…
ERIC Educational Resources Information Center
Nahavandi, Naemeh; Mukundan, Jayakaran
2013-01-01
The present study investigated the impact of textual input enhancement and explicit rule presentation on 93 Iranian EFL learners' intake of simple past tense. Three intact general English classes in Tabriz Azad University were randomly assigned to: 1) a control group; 2) a TIE group; and 3) a TIE plus explicit rule presentation group. All…
NASA Astrophysics Data System (ADS)
Kupka, Teobald
1997-12-01
IR studies were performed to determine possible transition metal ion binding sites of penicillin. The observed changes in the spectral position and shape of characteristic IR bands of cloxacillin in the presence of transition metal ions (both in solution and in the solid state) indicate formation of M-L complexes involving the -COO⁻ and/or -CONH- functional groups. The small shift of νCO towards higher frequencies rules out direct M-L interaction via the β-lactam carbonyl. PM3 calculations on simple model compounds (substituted formamide, cyclic ketones, lactams, and substituted monocyclic β-lactams) have been performed. All structures were fully optimized, and the calculated bond lengths, angles, heats of formation, and CO stretching frequencies were discussed to determine the β-lactam binding sites and to explain its susceptibility towards nucleophilic attack (hydrolysis in vitro) and biological activity. The relative changes of the calculated values were critically compared with available experimental data, and some correlation between structural parameters and in vivo activity was shown.
Density dependence in demography and dispersal generates fluctuating invasion speeds
Li, Bingtuan; Miller, Tom E. X.
2017-01-01
Density dependence plays an important role in population regulation and is known to generate temporal fluctuations in population density. However, the ways in which density dependence affects spatial population processes, such as species invasions, are less understood. Although classical ecological theory suggests that invasions should advance at a constant speed, empirical work is illuminating the highly variable nature of biological invasions, which often exhibit nonconstant spreading speeds, even in simple, controlled settings. Here, we explore endogenous density dependence as a mechanism for inducing variability in biological invasions with a set of population models that incorporate density dependence in demographic and dispersal parameters. We show that density dependence in demography at low population densities—i.e., an Allee effect—combined with spatiotemporal variability in population density behind the invasion front can produce fluctuations in spreading speed. The density fluctuations behind the front can arise from either overcompensatory population growth or density-dependent dispersal, both of which are common in nature. Our results show that simple rules can generate complex spread dynamics and highlight a source of variability in biological invasions that may aid in ecological forecasting. PMID:28442569
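A toy simulation in the spirit of these models shows how per-generation spreading speed need not be constant: overcompensatory (Ricker) growth damped at low density by an Allee-type factor, followed by nearest-neighbour dispersal. All functional forms and parameter values are illustrative assumptions, not the models analysed in the paper.

```python
import math

def step(n, r=2.8, a=0.1, d=0.3):
    """One generation: Ricker growth with Allee damping, then dispersal."""
    grown = [x * math.exp(r * (1.0 - x)) * x / (x + a) for x in n]
    out = [0.0] * len(n)
    for i, x in enumerate(grown):
        out[i] += (1.0 - d) * x           # fraction (1-d) stays put
        if i > 0:
            out[i - 1] += 0.5 * d * x     # half the dispersers go left
        if i < len(n) - 1:
            out[i + 1] += 0.5 * d * x     # half go right
    return out

n = [0.0] * 200
n[0] = 1.0                                # seed the invasion at the left edge
fronts = []
for t in range(60):
    n = step(n)
    # Front position = rightmost patch above a detection threshold
    fronts.append(max((i for i, x in enumerate(n) if x > 0.05), default=0))

speeds = [b - a for a, b in zip(fronts, fronts[1:])]
print(speeds[-20:])   # per-generation speed; with overcompensation it may fluctuate
```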
Minimizing Significant Figure Fuzziness.
ERIC Educational Resources Information Center
Fields, Lawrence D.; Hawkes, Stephen J.
1986-01-01
Addresses the principles and problems associated with the use of significant figures. Explains uncertainty, the meaning of significant figures, the Simple Rule, the Three Rule, and the 1-5 Rule. Also provides examples of the Rules. (ML)
Simple modification of Oja rule limits L1-norm of weight vector and leads to sparse connectivity.
Aparin, Vladimir
2012-03-01
This letter describes a simple modification of the Oja learning rule, which asymptotically constrains the L1-norm of an input weight vector instead of the L2-norm as in the original rule. This constraining is local as opposed to commonly used instant normalizations, which require the knowledge of all input weights of a neuron to update each one of them individually. The proposed rule converges to a weight vector that is sparser (has more zero weights) than the vector learned by the original Oja rule with or without the zero bound, which could explain the developmental synaptic pruning.
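A minimal numerical sketch, assuming the modification replaces the weight vector with its element-wise sign in the decay term of the Oja update (a plausible reading of the letter; the exact form there may differ): the original rule drives the L2-norm of the weight vector toward 1, while the sign-based variant drives the L1-norm toward 1, and both updates remain local to each weight.

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated, zero-mean 2-D inputs for a single linear neuron
X = rng.normal(size=(5000, 2)) @ np.array([[1.0, 0.8], [0.0, 0.6]])

def train(X, eta=0.01, l1=False):
    w = np.array([0.5, 0.5])
    for x in X:
        y = w @ x                          # neuron output
        decay = np.sign(w) if l1 else w    # assumed L1 variant: sign(w) for w
        w = w + eta * y * (x - y * decay)  # Oja-style local update
    return w

w2 = train(X)             # original Oja rule: ||w||_2 approaches 1
w1 = train(X, l1=True)    # modified rule: ||w||_1 approaches 1
print(np.linalg.norm(w2), np.abs(w1).sum())
```

At a fixed point of the modified rule, C w = E[y²] sign(w) with C the input covariance, and left-multiplying by wᵀ gives ||w||₁ = 1, which is what makes small weights cheap to push to exactly zero (sparse connectivity).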
Niklas, Karl J
2006-02-01
Life forms as diverse as unicellular algae, zooplankton, vascular plants, and mammals appear to obey quarter-power scaling rules. Among the most famous of these rules is Kleiber's (i.e., basal metabolic rates scale as the three-quarters power of body mass), which has a botanical analogue (i.e., annual plant growth rates scale as the three-quarters power of total body mass). Numerous theories have tried to explain why these rules exist, but each has been heavily criticized on either conceptual or empirical grounds. Recent models predicting growth rates on the basis of how total cell, tissue, or organism nitrogen and phosphorus are allocated, respectively, to protein and rRNA contents may provide the answer, particularly in light of the observation that annual plant growth rates scale linearly with respect to standing leaf mass and that total leaf mass scales isometrically with respect to nitrogen but as the three-quarters power of leaf phosphorus. For example, when these relationships are juxtaposed with other allometric trends, a simple N,P-stoichiometric model successfully predicts the relative growth rates of 131 diverse C3 and C4 species. The melding of allometric and N,P-stoichiometric theoretical insights provides a robust modelling approach that conceptually links the subcellular 'machinery' of protein/ribosomal metabolism to observed growth rates of uni- and multicellular organisms. Because the operation of this 'machinery' is basic to the biology of all life forms, its allometry may provide a mechanistic explanation for the apparent ubiquity of quarter-power scaling rules.
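Kleiber's rule itself is simple to state numerically; the snippet below only illustrates the three-quarters exponent, with an arbitrary normalization constant (the constant, unlike the exponent, is taxon-specific).

```python
def metabolic_rate(mass, b0=1.0, exponent=0.75):
    """Kleiber-type allometry: B = b0 * M**(3/4); b0 is arbitrary here."""
    return b0 * mass ** exponent

# Doubling body mass multiplies whole-organism metabolic rate by 2**0.75
# (about 1.68x), so mass-specific rate B/M declines in larger organisms.
ratio = metabolic_rate(2.0) / metabolic_rate(1.0)
print(ratio)
```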
This Ad is for You: Targeting and the Effect of Alcohol Advertising on Youth Drinking.
Molloy, Eamon
2016-02-01
Endogenous targeting of alcohol advertisements presents a challenge for empirically identifying a causal effect of advertising on drinking. Drinkers prefer particular media; firms recognize this and target alcohol advertising at those media. This paper overcomes this challenge by utilizing novel data with detailed individual measures of media viewing and alcohol consumption and three separate empirical techniques, which represent significant improvements over previous methods. First, controls for the average audience characteristics of the media an individual views account for attributes of magazines and television programs that alcohol firms may consider when deciding where to target advertising. A second specification directly controls for each television program and magazine a person views. The third method exploits variation in advertising exposure due to a 2003 change in an industry-wide rule that governs where firms may advertise. Although the unconditional correlation between advertising and drinking by youth (ages 18-24) is strong, models that include simple controls for targeting imply, at most, a modest advertising effect. Although the coefficients are estimated less precisely, estimates from models including more rigorous controls for targeting indicate no significant effect of advertising on youth drinking. Copyright © 2015 John Wiley & Sons, Ltd.
Ten simple rules for Lightning and PechaKucha presentations.
NASA Astrophysics Data System (ADS)
Lortie, C. J.
2016-12-01
An interesting opportunity has emerged that bridges the gap between lengthy, detailed presentations of scientific findings and `sound bites' appropriate for media reporting - very short presentations often presented in sets. Lightning or Ignite (20 slides @15 seconds each) and PechaKucha (20 slides @20 seconds each) presentations are common formats for short, rapid communications at scientific conferences and public events. The simple rules for making good presentations also apply, but these presentation formats provide both unique communication opportunities and novel challenges. In the spirit of light, quick, and exact (but without the fox), here are ten simple rules for presentation formats that do not wait for the speaker.
Hyper-heuristic Evolution of Dispatching Rules: A Comparison of Rule Representations.
Branke, Jürgen; Hildebrandt, Torsten; Scholz-Reiter, Bernd
2015-01-01
Dispatching rules are frequently used for real-time, online scheduling in complex manufacturing systems. Design of such rules is usually done by experts in a time consuming trial-and-error process. Recently, evolutionary algorithms have been proposed to automate the design process. There are several possibilities to represent rules for this hyper-heuristic search. Because the representation determines the search neighborhood and the complexity of the rules that can be evolved, a suitable choice of representation is key for a successful evolutionary algorithm. In this paper we empirically compare three different representations, both numeric and symbolic, for automated rule design: A linear combination of attributes, a representation based on artificial neural networks, and a tree representation. Using appropriate evolutionary algorithms (CMA-ES for the neural network and linear representations, genetic programming for the tree representation), we empirically investigate the suitability of each representation in a dynamic stochastic job shop scenario. We also examine the robustness of the evolved dispatching rules against variations in the underlying job shop scenario, and visualize what the rules do, in order to get an intuitive understanding of their inner workings. Results indicate that the tree representation using an improved version of genetic programming gives the best results if many candidate rules can be evaluated, closely followed by the neural network representation that already leads to good results for small to moderate computational budgets. The linear representation is found to be competitive only for extremely small computational budgets.
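The linear rule representation compared above can be sketched directly: a job's priority is a weighted sum of its attributes, and the rule dispatches the highest-priority waiting job. The attribute names and weights below are invented for illustration; in the paper, such weights would be tuned by an evolutionary algorithm such as CMA-ES.

```python
def priority(job, weights):
    """Linear dispatching rule: priority = weighted sum of job attributes."""
    return sum(w * job[attr] for attr, w in weights.items())

def dispatch(queue, weights):
    """Pick the next job for a free machine under the linear rule."""
    return max(queue, key=lambda job: priority(job, weights))

# Illustrative rule favouring short processing times and tight due dates
weights = {"proc_time": -1.0, "due_date": -0.5, "work_remaining": -0.2}
queue = [
    {"proc_time": 5, "due_date": 20, "work_remaining": 12},
    {"proc_time": 2, "due_date": 15, "work_remaining": 9},
]
print(dispatch(queue, weights))
```

The neural-network and tree representations play the same role as `priority` here, trading the linear form's small search space for the ability to express attribute interactions.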
Hektner, Joel M; Brennan, Alison L; Brotherson, Sean E
2013-09-01
The Nurtured Heart Approach to parenting (NHA; Glasser & Easley, 2008) is summarized and evaluated in terms of its alignment with current theoretical perspectives and empirical evidence in family studies and developmental science. Originally conceived and promoted as a behavior management approach for parents of difficult children (i.e., with behavior disorders), NHA is increasingly offered as a valuable strategy for parents of any children, despite a lack of published empirical support. Parents using NHA are trained to minimize attention to undesired behaviors, provide positive attention and praise for compliance with rules, help children be successful by scaffolding and shaping desired behavior, and establish a set of clear rules and consequences. Many elements of the approach have strong support in the theoretical and empirical literature; however, some of the assumptions are more questionable, such as that negative child behavior can always be attributed to unintentional positive reinforcement by parents responding with negative attention. On balance, NHA appears to promote effective and validated parenting practices, but its effectiveness now needs to be tested empirically. © FPI, Inc.
SAW based systems for mobile communications satellites
NASA Technical Reports Server (NTRS)
Peach, R. C.; Miller, N.; Lee, M.
1993-01-01
Modern mobile communications satellites, such as INMARSAT 3, EMS, and ARTEMIS, use advanced onboard processing to make efficient use of the available L-band spectrum. In all of these cases, high performance surface acoustic wave (SAW) devices are used. SAW filters can provide high selectivity (100-200 kHz transition widths), combined with flat amplitude and linear phase characteristics; their simple construction and radiation hardness also makes them especially suitable for space applications. An overview of the architectures used in the above systems, describing the technologies employed, and the use of bandwidth switchable SAW filtering (BSSF) is given. The tradeoffs to be considered when specifying a SAW based system are analyzed, using both theoretical and experimental data. Empirical rules for estimating SAW filter performance are given. Achievable performance is illustrated using data from the INMARSAT 3 engineering model (EM) processors.
The simple rules of social contagion.
Hodas, Nathan O; Lerman, Kristina
2014-03-11
It is commonly believed that information spreads between individuals like a pathogen, with each exposure by an informed friend potentially resulting in a naive individual becoming infected. However, empirical studies of social media suggest that individual response to repeated exposure to information is far more complex. As a proxy for intervention experiments, we compare user responses to multiple exposures on two different social media sites, Twitter and Digg. We show that the position of exposing messages on the user-interface strongly affects social contagion. Accounting for this visibility significantly simplifies the dynamics of social contagion. The likelihood an individual will spread information increases monotonically with exposure, while explicit feedback about how many friends have previously spread it increases the likelihood of a response. We provide a framework for unifying information visibility, divided attention, and explicit social feedback to predict the temporal dynamics of user behavior.
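A toy model in the spirit of this framework: response probability rises monotonically with the number of exposures, but each exposure counts only through a visibility factor that decays with the message's position in the user interface. The functional forms and constants are assumptions for illustration, not the fitted model from the paper.

```python
import math

def visibility(position, decay=0.2):
    """Messages further down the feed are less likely to be seen."""
    return math.exp(-decay * position)

def response_probability(exposure_positions, base=0.1):
    """Each visible exposure independently fails to trigger a response;
    responding is the complement of missing every exposure."""
    p_miss = 1.0
    for pos in exposure_positions:
        p_miss *= 1.0 - base * visibility(pos)
    return 1.0 - p_miss

# More exposures -> higher response probability (monotonic), but exposures
# buried deep in the feed (high position) contribute almost nothing.
p1 = response_probability([0])
p3 = response_probability([0, 3, 7])
print(p1, p3)
```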
A Preliminary Investigation of the Effect of Rules on Employee Performance
ERIC Educational Resources Information Center
Squires, James; Wilder, David A.
2010-01-01
The way in which rules impact workplace performance has been a topic of discussion in the Organizational Behavior Management literature for some time. Despite this interest, there is a dearth of empirical research on the topic. The purpose of this study was to examine the effect of rules and goal setting in the workplace. Participants included two…
ERIC Educational Resources Information Center
Frenette, Micheline
Trying to change the predictive rule for the sinking and floating phenomena, students have a great difficulty in understanding density and they are insensitive to empirical counter-examples designed to challenge their own rule. The purpose of this study is to examine the process whereby students from sixth and seventh grades relinquish their…
Hierarchy-associated semantic-rule inference framework for classifying indoor scenes
NASA Astrophysics Data System (ADS)
Yu, Dan; Liu, Peng; Ye, Zhipeng; Tang, Xianglong; Zhao, Wei
2016-03-01
Typically, the initial task of classifying indoor scenes is challenging, because the spatial layout and decoration of a scene can vary considerably. Recent efforts at classifying object relationships commonly depend on the results of scene annotation and predefined rules, making classification inflexible. Furthermore, annotation results are easily affected by external factors. Inspired by human cognition, a scene-classification framework was proposed using the empirically based annotation (EBA) and a match-over rule-based (MRB) inference system. The semantic hierarchy of images is exploited by EBA to construct rules empirically for MRB classification. The problem of scene classification is divided into low-level annotation and high-level inference from a macro perspective. Low-level annotation involves detecting the semantic hierarchy and annotating the scene with a deformable-parts model and a bag-of-visual-words model. In high-level inference, hierarchical rules are extracted to train the decision tree for classification. The categories of testing samples are generated from the parts to the whole. Compared with traditional classification strategies, the proposed semantic hierarchy and corresponding rules reduce the effect of a variable background and improve the classification performance. The proposed framework was evaluated on a popular indoor scene dataset, and the experimental results demonstrate its effectiveness.
Shimp, Charles P
2004-06-30
Research on categorization has changed over time, and some of these changes resemble how Wittgenstein's views changed from his Tractatus Logico-Philosophicus to his Philosophical Investigations. Wittgenstein initially focused on unambiguous, abstract, parsimonious, logical propositions and rules, and on independent, static, "atomic facts." This approach subsequently influenced the development of logical positivism and thereby may have indirectly influenced method and theory in research on categorization: much animal research on categorization has focused on learning simple, static, logical rules unambiguously interrelating small numbers of independent features. Wittgenstein later rejected logical simplicity and rigor and focused instead on Gestalt ideas about figure-ground reversals and context, the ambiguity of family resemblance, and the function of details of everyday language. Contemporary contextualism has been influenced by this latter position, some features of which appear in contemporary empirical research on categorization. These developmental changes are illustrated by research on avian local and global levels of visual perceptual analysis, categorization of rectangles and moving objects, and artificial grammar learning. Implications are described for peer review of quantitative theory in which ambiguity, logical rigor, simplicity, or dynamics are designed to play important roles.
Learning temporal rules to forecast instability in continuously monitored patients
Dubrawski, Artur; Wang, Donghan; Hravnak, Marilyn; Clermont, Gilles; Pinsky, Michael R
2017-01-01
Inductive machine learning, and in particular extraction of association rules from data, has been successfully used in multiple application domains, such as market basket analysis, disease prognosis, fraud detection, and protein sequencing. The appeal of rule extraction techniques stems from their ability to handle intricate problems yet produce models based on rules that can be comprehended by humans, and are therefore more transparent. Human comprehension is a factor that may improve adoption and use of data-driven decision support systems clinically via face validity. In this work, we explore whether we can reliably and informatively forecast cardiorespiratory instability (CRI) in step-down unit (SDU) patients utilizing data from continuous monitoring of physiologic vital sign (VS) measurements. We use a temporal association rule extraction technique in conjunction with a rule fusion protocol to learn how to forecast CRI in continuously monitored patients. We detail our approach and present and discuss encouraging empirical results obtained using continuous multivariate VS data from the bedside monitors of 297 SDU patients spanning 29 346 hours (3.35 patient-years) of observation. We present example rules that have been learned from data to illustrate potential benefits of comprehensibility of the extracted models, and we analyze the empirical utility of each VS as a potential leading indicator of an impending CRI event. PMID:27274020
Building gene expression profile classifiers with a simple and efficient rejection option in R.
Benso, Alfredo; Di Carlo, Stefano; Politano, Gianfranco; Savino, Alessandro; Hafeezurrehman, Hafeez
2011-01-01
The collection of gene expression profiles from DNA microarrays and their analysis with pattern recognition algorithms is a powerful technology applied to several biological problems. Common pattern recognition systems classify samples by assigning them to a set of known classes. However, in a clinical diagnostics setup, novel and unknown classes (new pathologies) may appear, and one must be able to reject those samples that do not fit the trained model. The problem of implementing a rejection option in a multi-class classifier has not been widely addressed in the statistical literature. Gene expression profiles represent a critical case study, since they suffer from the curse of dimensionality, which negatively affects the reliability of both traditional rejection models and more recent approaches such as one-class classifiers. This paper presents a set of empirical decision rules that can be used to implement a rejection option in a set of multi-class classifiers widely used for the analysis of gene expression profiles. In particular, we focus on the classifiers implemented in the R Language and Environment for Statistical Computing (R for short in the remainder of this paper). The main contribution of the proposed rules is their simplicity, which enables easy integration with available data analysis environments. Since tuning the parameters involved in defining a rejection model is often a complex and delicate task, in this paper we exploit an evolutionary strategy to automate this process. This allows the final user to maximize the rejection accuracy with minimum manual intervention. This paper shows how simple decision rules can support the use of complex machine learning algorithms in real experimental setups.
The proposed approach is almost completely automated and therefore a good candidate for being integrated in data analysis flows in labs where the machine learning expertise required to tune traditional classifiers might not be available.
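The core of such a rejection option — accept the top class only when its score clears a confidence threshold — can be sketched in Python. The threshold value and function name here are illustrative assumptions, not the paper's tuned rules, which are defined for the R classifiers it studies:

```python
def classify_with_rejection(class_probs, threshold=0.7):
    """Return the predicted class index, or -1 (reject) when the
    highest class probability falls below the confidence threshold.
    The threshold of 0.7 is an arbitrary illustrative value."""
    best = max(range(len(class_probs)), key=lambda i: class_probs[i])
    return best if class_probs[best] >= threshold else -1

# A confident profile is assigned to a class; an ambiguous one is rejected.
print(classify_with_rejection([0.05, 0.90, 0.05]))  # -> 1
print(classify_with_rejection([0.40, 0.35, 0.25]))  # -> -1
```

In the paper's setting, the evolutionary strategy would search for threshold values that maximize rejection accuracy rather than fixing them by hand.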
Fateen, Seif-Eddeen K; Khalil, Menna M; Elnabawy, Ahmed O
2013-03-01
The Peng-Robinson equation of state is widely used with the classical van der Waals mixing rules to predict vapor-liquid equilibria for systems containing hydrocarbons and related compounds. This model requires good values of the binary interaction parameter kij. In this work, we developed a semi-empirical correlation for kij partly based on the Huron-Vidal mixing rules. We obtained values for the adjustable parameters of the developed formula for over 60 binary systems and over 10 categories of components. The predictions of the new equation system were slightly better than the constant-kij model in most cases, except for 10 systems whose predictions were considerably improved with the new correlation.
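For reference, the classical van der Waals one-fluid mixing rules that kij enters can be sketched as follows. The parameter values in the example are illustrative, not fitted values from the paper:

```python
import math

def vdw_mixing(a, b, x, kij):
    """Classical van der Waals one-fluid mixing rules for a cubic EOS.
    a, b : pure-component attraction and covolume parameters
    x    : mole fractions
    kij  : symmetric matrix of binary interaction parameters.
    Cross attraction: a_ij = sqrt(a_i * a_j) * (1 - k_ij)."""
    n = len(x)
    a_mix = sum(x[i] * x[j] * math.sqrt(a[i] * a[j]) * (1.0 - kij[i][j])
                for i in range(n) for j in range(n))
    b_mix = sum(x[i] * b[i] for i in range(n))
    return a_mix, b_mix

# Two-component example with an illustrative kij = 0.05 (not a fitted value).
a_mix, b_mix = vdw_mixing([2.0, 3.0], [0.05, 0.08], [0.4, 0.6],
                          [[0.0, 0.05], [0.05, 0.0]])
```

The correlation developed in the paper would supply kij as a function of the component pair instead of a constant.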
A Left-Hand Rule for Faraday's Law
ERIC Educational Resources Information Center
Salu, Yehuda
2014-01-01
A left-hand rule for Faraday's law is presented here. This rule provides a simple and quick way of finding directional relationships between variables of Faraday's law without using Lenz's rule.
Simple and accurate sum rules for highly relativistic systems
NASA Astrophysics Data System (ADS)
Cohen, Scott M.
2005-03-01
In this paper, I consider the Bethe and Thomas-Reiche-Kuhn sum rules, which together form the foundation of Bethe's theory of energy loss from fast charged particles to matter. For nonrelativistic target systems, the use of closure leads directly to simple expressions for these quantities. In the case of relativistic systems, on the other hand, the calculation of sum rules is fraught with difficulties. Various perturbative approaches have been used over the years to obtain relativistic corrections, but these methods fail badly when the system in question is very strongly bound. Here, I present an approach that leads to relatively simple expressions yielding accurate sums, even for highly relativistic many-electron systems. I also offer an explanation for the difference between relativistic and nonrelativistic sum rules in terms of the Zitterbewegung of the electrons.
Optimal Sequential Rules for Computer-Based Instruction.
ERIC Educational Resources Information Center
Vos, Hans J.
1998-01-01
Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…
Scaling rules for the final decline to extinction
Griffen, Blaine D.; Drake, John M.
2009-01-01
Space–time scaling rules are ubiquitous in ecological phenomena. Current theory postulates three scaling rules that describe the duration of a population's final decline to extinction, although these predictions have not previously been empirically confirmed. We examine these scaling rules across a broader set of conditions, including a wide range of density-dependent patterns in the underlying population dynamics. We then report on tests of these predictions from experiments using the cladoceran Daphnia magna as a model. Our results support two predictions that: (i) the duration of population persistence is much greater than the duration of the final decline to extinction and (ii) the duration of the final decline to extinction increases with the logarithm of the population's estimated carrying capacity. However, our results do not support a third prediction that the duration of the final decline scales inversely with population growth rate. These findings not only support the current standard theory of population extinction but also introduce new empirical anomalies awaiting a theoretical explanation. PMID:19141422
Exploring Empirical Rank-Frequency Distributions Longitudinally through a Simple Stochastic Process
Finley, Benjamin J.; Kilkki, Kalevi
2014-01-01
The frequent appearance of empirical rank-frequency laws, such as Zipf’s law, in a wide range of domains reinforces the importance of understanding and modeling these laws and rank-frequency distributions in general. In this spirit, we utilize a simple stochastic cascade process to simulate several empirical rank-frequency distributions longitudinally. We focus especially on limiting the process’s complexity to increase accessibility for non-experts in mathematics. The process provides a good fit for many empirical distributions because the stochastic multiplicative nature of the process leads to an often observed concave rank-frequency distribution (on a log-log scale) and the finiteness of the cascade replicates real-world finite size effects. Furthermore, we show that repeated trials of the process can roughly simulate the longitudinal variation of empirical ranks. However, we find that the empirical variation is often less than the average simulated process variation, likely due to longitudinal dependencies in the empirical datasets. Finally, we discuss the process limitations and practical applications. PMID:24755621
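A minimal sketch of such a multiplicative cascade, assuming a binary split at each level — the depth, total mass, and seed are illustrative choices, not the paper's settings:

```python
import random

def cascade(total=1_000_000, depth=12, seed=42):
    """Split a total count into leaf frequencies by repeatedly dividing
    each node's mass into two parts with a uniform random fraction.
    Returns the positive leaf frequencies sorted by rank."""
    random.seed(seed)
    masses = [float(total)]
    for _ in range(depth):
        nxt = []
        for m in masses:
            f = random.random()
            nxt.extend([m * f, m * (1.0 - f)])
        masses = nxt
    freqs = sorted((round(m) for m in masses), reverse=True)
    return [f for f in freqs if f > 0]

freqs = cascade()
# Plotting rank vs. frequency on log-log axes shows the concave,
# Zipf-like shape the abstract describes.
```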
Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.
Fennell, John; Baddeley, Roland
2012-10-01
Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects; overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
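The qualitative effect described above — a prior pulling stated probabilities toward its mean, inflating small ones and deflating large ones — can be illustrated with a toy shrinkage weighting function. The pseudo-count parameterization is an assumption for illustration, not the authors' full Bayesian model:

```python
def weight(p, prior_mean=0.5, prior_strength=1.0, evidence=4.0):
    """Treat a stated probability p as evidence worth `evidence`
    pseudo-trials and shrink it toward a prior mean, Beta-style.
    The result overweights small p and underweights large p,
    matching the qualitative shape of empirical weighting functions."""
    return (evidence * p + prior_strength * prior_mean) / (evidence + prior_strength)

assert weight(0.01) > 0.01   # small probabilities are inflated
assert weight(0.99) < 0.99   # large probabilities are deflated
```

The authors' actual proposal goes further, combining informative and ignorance priors via Bayesian model comparison; this sketch only reproduces the basic over/underweighting pattern.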
Rule Systems for Runtime Verification: A Short Tutorial
NASA Astrophysics Data System (ADS)
Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex
In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
ERIC Educational Resources Information Center
Isik, Hasan
2016-01-01
Arab and Turkish people lived together for nearly four hundred years under the rule of the Ottoman Empire, during which time both sides inevitably adopted a certain kind of attitude and view toward the other. This study is an attempt to explore Arab people's views toward Turks, the Ottoman Empire, and the Republic of Turkey. Through a case study…
Qualitative Discovery in Medical Databases
NASA Technical Reports Server (NTRS)
Maluf, David A.
2000-01-01
Implication rules have been used in uncertainty reasoning systems to confirm and draw hypotheses or conclusions. However a major bottleneck in developing such systems lies in the elicitation of these rules. This paper empirically examines the performance of evidential inferencing with implication networks generated using a rule induction tool called KAT. KAT utilizes an algorithm for the statistical analysis of empirical case data, and hence reduces the knowledge engineering efforts and biases in subjective implication certainty assignment. The paper describes several experiments in which real-world diagnostic problems were investigated; namely, medical diagnostics. In particular, it attempts to show that: (1) with a limited number of case samples, KAT is capable of inducing implication networks useful for making evidential inferences based on partial observations, and (2) observation driven by a network entropy optimization mechanism is effective in reducing the uncertainty of predicted events.
NASA Astrophysics Data System (ADS)
Guo, Lei; Obot, Ime Bassey; Zheng, Xingwen; Shen, Xun; Qiang, Yujie; Kaya, Savaş; Kaya, Cemal
2017-06-01
Steel is an important material in industry. Adding heterocyclic organic compounds has proved to be very efficient for steel protection. There exists an empirical rule that the general trend in the inhibition efficiencies of molecules containing heteroatoms is such that O < N < S. However, an atomic-level insight into the inhibition mechanism is still lacking. Thus, in this work, density functional theory calculations were used to investigate the adsorption of three typical heterocyclic molecules, i.e., pyrrole, furan, and thiophene, on the Fe(110) surface. The approach is illustrated by carrying out geometric optimization of the inhibitors on the stable and most exposed plane of α-Fe. Some salient features, such as the charge density difference, changes of work function, and density of states, were described in detail. The present study is helpful for understanding the afore-mentioned empirical rule.
How Harvard Rules: Reason in the Service of Empire.
ERIC Educational Resources Information Center
Trumpbour, John, Ed.
This collection of 26 essays examines the historical position of Harvard University as one of the nation's most influential institutions. Included are: (1) "Introducing Harvard: A Social, Philosophical, and Political Profile" (John Trumpbour); (2) "How Harvard is Ruled: Administration and Governance at the Corporate University"…
Tanaka, Takuma; Aoyagi, Toshio; Kaneko, Takeshi
2012-10-01
We propose a new principle for replicating receptive field properties of neurons in the primary visual cortex. We derive a learning rule for a feedforward network, which maintains a low firing rate for the output neurons (resulting in temporal sparseness) and allows only a small subset of the neurons in the network to fire at any given time (resulting in population sparseness). Our learning rule also sets the firing rates of the output neurons at each time step to near-maximum or near-minimum levels, resulting in neuronal reliability. The learning rule is simple enough to be written in spatially and temporally local forms. After the learning stage is performed using input image patches of natural scenes, output neurons in the model network are found to exhibit simple-cell-like receptive field properties. When the outputs of these simple-cell-like neurons are input to another model layer using the same learning rule, the second-layer output neurons after learning become less sensitive to the phase of gratings than the simple-cell-like input neurons. In particular, some of the second-layer output neurons become completely phase invariant, owing to the convergence of the connections from first-layer neurons with similar orientation selectivity to second-layer neurons in the model network. We examine the parameter dependencies of the receptive field properties of the model neurons after learning and discuss their biological implications. We also show that the localized learning rule is consistent with experimental results concerning neuronal plasticity and can replicate the receptive fields of simple and complex cells.
Procedures for Empirical Determination of En-Route Criterion Levels.
ERIC Educational Resources Information Center
Moncrief, Michael H.
En-route Criterion Levels (ECLs) are defined as decision rules for predicting pupil readiness to advance through an instructional sequence. This study investigated the validity of present ELCs in an individualized mathematics program and tested procedures for empirically determining optimal ECLs. Retest scores and subsequent progress were…
Molecular-dynamics simulation of mutual diffusion in nonideal liquid mixtures
NASA Astrophysics Data System (ADS)
Rowley, R. L.; Stoker, J. M.; Giles, N. F.
1991-05-01
The mutual-diffusion coefficients, D12, of n-hexane, n-heptane, and n-octane in chloroform were modeled using equilibrium molecular-dynamics (MD) simulations of simple Lennard-Jones (LJ) fluids. Pure-component LJ parameters were obtained by comparison of simulations to experimental self-diffusion coefficients. While values of “effective” LJ parameters are not expected to simulate accurately diverse thermophysical properties over a wide range of conditions, it was recently shown that effective parameters obtained from pure self-diffusion coefficients can accurately model mutual diffusion in ideal, liquid mixtures. In this work, similar simulations are used to model diffusion in nonideal mixtures. The same combining rules used in the previous study for the cross-interaction parameters were found to be adequate to represent the composition dependence of D12. The effect of alkane chain length on D12 is also correctly predicted by the simulations. A commonly used assumption in empirical correlations of D12, that its kinetic portion is a simple, compositional average of the intradiffusion coefficients, is inconsistent with the simulation results. In fact, the value of the kinetic portion of D12 was often outside the range of values bracketed by the two intradiffusion coefficients for the nonideal system modeled here.
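The abstract does not spell out which combining rules were used; the common Lorentz-Berthelot form, shown here as an assumption, is a typical choice for Lennard-Jones cross-interaction parameters:

```python
import math

def lorentz_berthelot(sigma_i, sigma_j, eps_i, eps_j):
    """Standard Lorentz-Berthelot combining rules for Lennard-Jones
    cross-interaction parameters: arithmetic mean for the size
    parameter sigma, geometric mean for the well depth epsilon."""
    sigma_ij = 0.5 * (sigma_i + sigma_j)
    eps_ij = math.sqrt(eps_i * eps_j)
    return sigma_ij, eps_ij

# Illustrative (not paper-specific) parameter values.
sigma_ij, eps_ij = lorentz_berthelot(3.4, 3.8, 1.0, 4.0)
```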
A Simple Demonstration of a General Rule for the Variation of Magnetic Field with Distance
ERIC Educational Resources Information Center
Kodama, K.
2009-01-01
We describe a simple experiment demonstrating the variation in magnitude of a magnetic field with distance. The method described requires only an ordinary magnetic compass and a permanent magnet. The proposed graphical analysis illustrates a unique method for deducing a general rule of magnetostatics. (Contains 1 table and 6 figures.)
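The graphical analysis the abstract alludes to — reading the exponent n in B ∝ 1/r^n off the slope of a log-log plot — can be sketched numerically. The synthetic dipole-like data below are for illustration only, not measurements from the experiment:

```python
import math

def power_law_exponent(distances, fields):
    """Estimate n in B ∝ 1/r^n as minus the least-squares slope of
    log(B) versus log(r)."""
    xs = [math.log(r) for r in distances]
    ys = [math.log(b) for b in fields]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Synthetic data following B = 1/r**3; the fit recovers n = 3.
r = [1.0, 2.0, 3.0, 4.0]
B = [1.0 / ri**3 for ri in r]
n_est = power_law_exponent(r, B)
```

With real compass-deflection data the recovered exponent would indicate whether the field falls off as a dipole (n = 3) or otherwise.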
78 FR 49721 - Petition for Rulemaking To Adopt Revised Competitive Switching Rules
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-15
... DEPARTMENT OF TRANSPORTATION Surface Transportation Board 49 CFR Chapter X [Docket No. EP 711] Petition for Rulemaking To Adopt Revised Competitive Switching Rules AGENCY: Surface Transportation Board... Board sought empirical information about the impact of the proposal if it were to be adopted. The Board...
An Inductive Logic Programming Approach to Validate Hexose Binding Biochemical Knowledge.
Nassif, Houssam; Al-Ali, Hassan; Khuri, Sawsan; Keirouz, Walid; Page, David
2010-01-01
Hexoses are simple sugars that play a key role in many cellular pathways, and in the regulation of development and disease mechanisms. Current protein-sugar computational models are based, at least partially, on prior biochemical findings and knowledge. They incorporate different parts of these findings in predictive black-box models. We investigate the empirical support for biochemical findings by comparing Inductive Logic Programming (ILP) induced rules to actual biochemical results. We mine the Protein Data Bank for a representative data set of hexose binding sites, non-hexose binding sites and surface grooves. We build an ILP model of hexose-binding sites and evaluate our results against several baseline machine learning classifiers. Our method achieves an accuracy similar to that of other black-box classifiers while providing insight into the discriminating process. In addition, it confirms wet-lab findings and reveals a previously unreported Trp-Glu amino acids dependency.
A Variational Statistical-Field Theory for Polar Liquid Mixtures
NASA Astrophysics Data System (ADS)
Zhuang, Bilin; Wang, Zhen-Gang
Using a variational field-theoretic approach, we derive a molecularly-based theory for polar liquid mixtures. The resulting theory consists of simple algebraic expressions for the free energy of mixing and the dielectric constant as functions of mixture composition. Using only the dielectric constants and the molar volumes of the pure liquid constituents, the theory evaluates the mixture dielectric constants in good agreement with the experimental values for a wide range of liquid mixtures, without using adjustable parameters. In addition, the theory predicts that liquids with similar dielectric constants and molar volumes dissolve well in each other, while sufficient disparity in these parameters results in phase separation. The calculated miscibility map on the dielectric constant-molar volume axes agrees well with known experimental observations for a large number of liquid pairs. Thus the theory provides a quantification for the well-known empirical ``like-dissolves-like'' rule. Bz acknowledges the A-STAR fellowship for the financial support.
Pyro-synthesis of functional nanocrystals.
Gim, Jihyeon; Mathew, Vinod; Lim, Jinsub; Song, Jinju; Baek, Sora; Kang, Jungwon; Ahn, Docheon; Song, Sun-Ju; Yoon, Hyeonseok; Kim, Jaekook
2012-01-01
Despite nanomaterials with unique properties playing a vital role in scientific and technological advancements of various fields, including chemical and electrochemical applications, the scope for exploration of nano-scale applications is still wide open. The intimate correlation between material properties and synthesis, in combination with the urgency to enhance the empirical understanding of nanomaterials, demands the evolution of new strategies for producing promising materials. Herein we introduce a rapid pyro-synthesis that produces highly crystalline functional nanomaterials under reaction times of a few seconds in open-air conditions. The versatile technique may facilitate the development of a variety of nanomaterials and, in particular, carbon-coated metal phosphates with appreciable physico-chemical properties benefiting energy storage applications. The present strategy may present opportunities to develop "design rules" not only to produce nanomaterials for various applications but also to realize cost-effective and simple nanomaterial production beyond lab-scale limitations.
NASA Astrophysics Data System (ADS)
Mbaye, A.
2016-02-01
Fishery resources in Senegal have long been under administrative management, with the state holding a monopoly over it while confronting the supposed irrationality of artisanal fishermen. The well-established state rules, synonymous with a denial of local populations' knowledge of management and with the expropriation of their fishing territories, came into conflict with existing rules, thereby weakening the traditional management system. However, aware of the threats to their survival posed by the limitations of state rules and by a technicist perception of management, some fishing communities have tried to organize and implement management measures. These measures are implemented on the basis of their own knowledge of the environment. This is the case in Kayar, Nianing, and Bétenty, where local management initiatives have begun to bear fruit despite some difficulties. These examples of successful local management have prompted the Senegalese administration to give more consideration to the knowledge and know-how of fishermen and to be open to co-management of fisheries resources. This communication shows how this new co-management approach is implemented in the governance of Senegalese artisanal fisheries through the consideration of fishermen's empirical knowledge.
Consensus on diagnosis and empiric antibiotic therapy of febrile neutropenia
Giurici, Nagua; Zanazzo, Giulio A.
2011-01-01
Controversial issues on the management of empiric therapy and diagnosis of febrile neutropenia (FN) were faced by a Consensus Group of the Italian Association of Pediatric Hematology-Oncology (AIEOP). In this paper we report the suggestions of the consensus process regarding the role of aminoglycosides, glycopeptides and oral antibiotics in empiric therapy of FN, the rules for changing or discontinuing the therapy as well as the timing of the blood cultures. PMID:21647277
Electricity and Empire in 1920s Palestine under British Rule.
Shamir, Ronen
2016-12-01
This article examines some techno-political aspects of the early years of electrification in British-ruled 1920s Palestine. It emphasizes the importance of local technical, topographical and hydrological forms of knowledge for understanding the dynamics of electrification. Situating the analysis in a general colonial context of electrification, the study shows that British colonial rulers lagged behind both German firms and local entrepreneurs in understanding the specific conditions pertaining to electrification in Palestine. Subsequently, the study shows that the British had limited control over the actual electrification process and its professed developmental purposes, thereby complicating assumptions about electrification as a tool of empire. Finding some similarities between the cases of electrifying Palestine and India, the article's findings may shed further light on the importance of the micro-politics of knowledge for understanding the trajectory of electrification in the colonies.
Cross-Field Comparison of Ethics Education: Golden Rules and Particulars.
Mulhearn, Tyler J; Watts, Logan L; Torrence, Brett S; Todd, E Michelle; Turner, Megan R; Connelly, Shane; Mumford, Michael D
2017-01-01
Research misconduct negatively impacts the scientific community and society in general. Providing training in the responsible conduct of research (RCR) to researchers is one viable approach to minimizing research misconduct. Although recent evidence suggests ethics training can indeed be effective, little empirical work has examined the similarities and differences across fields. In the present study, we analyzed 62 empirical studies in engineering, biomedical science, social science, and mixed fields. The findings suggest certain instructional principles, or "golden rules," apply generally to all fields. These golden rules include maintaining a field-specific or field-general approach and emphasizing processes in training. The findings also suggest that content areas contributing to instructional effectiveness vary as a function of field. Generally, it appears that all fields may benefit from taking a multi-pronged approach to ethics education wherein the salient field issues are covered. Implications for RCR education are discussed.
Crawford, E D; Batuello, J T; Snow, P; Gamito, E J; McLeod, D G; Partin, A W; Stone, N; Montie, J; Stock, R; Lynch, J; Brandt, J
2000-05-01
The current study assesses artificial intelligence methods to identify prostate carcinoma patients at low risk for lymph node spread. If patients can be assigned accurately to a low risk group, unnecessary lymph node dissections can be avoided, thereby reducing morbidity and costs. A rule-derivation technology for simple decision-tree analysis was trained and validated using patient data from a large database (4,133 patients) to derive low risk cutoff values for Gleason sum and prostate specific antigen (PSA) level. An empiric analysis was used to derive a low risk cutoff value for clinical TNM stage. These cutoff values then were applied to 2 additional, smaller databases (227 and 330 patients, respectively) from separate institutions. The decision-tree protocol derived cutoff values of < or = 6 for Gleason sum and < or = 10.6 ng/mL for PSA. The empiric analysis yielded a clinical TNM stage low risk cutoff value of < or = T2a. When these cutoff values were applied to the larger database, 44% of patients were classified as being at low risk for lymph node metastases (0.8% false-negative rate). When the same cutoff values were applied to the smaller databases, between 11 and 43% of patients were classified as low risk with a false-negative rate of between 0.0 and 0.7%. The results of the current study indicate that a population of prostate carcinoma patients at low risk for lymph node metastases can be identified accurately using a simple decision algorithm that considers preoperative PSA, Gleason sum, and clinical TNM stage. The risk of lymph node metastases in these patients is < or = 1%; therefore, pelvic lymph node dissection may be avoided safely. The implications of these findings in surgical and nonsurgical treatment are significant.
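The three cutoff values reported in the abstract can be combined into a trivially simple screen. The sketch below is our own illustration, not the authors' code; the function name, the set-based representation of clinical stage, and the use of booleans are assumptions, while the thresholds (Gleason sum ≤ 6, PSA ≤ 10.6 ng/mL, stage ≤ T2a) come from the abstract.

```python
# Hypothetical sketch of the low-risk screen described above; only the
# numeric cutoffs come from the abstract, everything else is illustrative.

# Clinical T stages at or below T2a (represented as set membership here).
LOW_RISK_STAGES = {"T1a", "T1b", "T1c", "T2a"}

def is_low_risk(gleason_sum: int, psa_ng_ml: float, clinical_stage: str) -> bool:
    """Return True if the patient falls in the low-risk group for
    lymph node metastases under the derived cutoffs."""
    return (
        gleason_sum <= 6
        and psa_ng_ml <= 10.6
        and clinical_stage in LOW_RISK_STAGES
    )
```

A patient must pass all three criteria; failing any one places them outside the low-risk group.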
Aktipis, C. Athena
2011-01-01
The evolution of cooperation through partner choice mechanisms is often thought to involve relatively complex cognitive abilities. Using agent-based simulations I model a simple partner choice rule, the ‘Walk Away’ rule, where individuals stay in groups that provide higher returns (by virtue of having more cooperators), and ‘Walk Away’ from groups providing low returns. Implementing this conditional movement rule in a public goods game leads to a number of interesting findings: 1) cooperators have a selective advantage when thresholds are high, corresponding to low tolerance for defectors, 2) high thresholds lead to high initial rates of movement and low final rates of movement (after selection), and 3) as cooperation is selected, the population undergoes a spatial transition from high migration (and many small, ephemeral groups) to low migration (and large, stable groups). These results suggest that the very simple ‘Walk Away’ rule of leaving uncooperative groups can favor the evolution of cooperation, and that cooperation can evolve in populations in which individuals are able to move in response to local social conditions. A diverse array of organisms are able to leave degraded physical or social environments. The ubiquitous nature of conditional movement suggests that ‘Walk Away’ dynamics may play an important role in the evolution of social behavior in both cognitively complex and cognitively simple organisms. PMID:21666771
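The core of the ‘Walk Away’ rule can be sketched in a few lines. This is a minimal illustration under our own assumptions (the payoff function, benefit multiplier, and data layout are not the paper's exact model): each round, agents in groups whose per-capita return falls below a threshold leave for a randomly chosen group.

```python
import random

# Minimal 'Walk Away' sketch: agents stay while the per-capita
# public-goods return meets their threshold, otherwise they move
# to a randomly chosen group. Details are illustrative assumptions.

def group_return(group, benefit=3.0):
    """Per-capita return: each cooperator contributes one unit,
    multiplied by `benefit` and shared equally."""
    if not group:
        return 0.0
    n_coop = sum(1 for a in group if a["cooperator"])
    return benefit * n_coop / len(group)

def walk_away_step(groups, threshold, rng):
    """One round: all agents in low-return groups walk away to a
    random group (possibly re-entering the one they left)."""
    movers = []
    for g in groups:
        if group_return(g) < threshold:
            movers.extend(g)
            g.clear()
    for agent in movers:
        rng.choice(groups).append(agent)
    return groups
```

With a high threshold, defector-heavy groups empty out each round while cooperative groups retain their members, which is the mechanism behind the paper's reported transition to large, stable groups.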
Ten simple rules for making research software more robust
2017-01-01
Software produced for research, published and otherwise, suffers from a number of common problems that make it difficult or impossible to run outside the original institution or even off the primary developer’s computer. We present ten simple rules to make such software robust enough to be run by anyone, anywhere, and thereby delight your users and collaborators. PMID:28407023
ERIC Educational Resources Information Center
Keane, Terence M.; And Others
1984-01-01
Developed empirically based criteria for use of the Minnesota Multiphasic Personality Inventory (MMPI) to aid in the assessment and diagnosis of Posttraumatic Stress Disorder (PTSD) in patients (N=200). Analysis based on an empirically derived decision rule correctly classified 74 percent of the patients in each group. (LLL)
ERIC Educational Resources Information Center
Bae, Hyunhoe
2012-01-01
Recently, there has been a surge in environmental regulations that require information disclosure. However, existing empirical evidence is limited to certain applications and has yet to generalize the effectiveness of this approach as a policy strategy to reduce environmental risks. This study evaluates the disclosure rule of the residential lead…
Division I Student-Athlete Degree Choice Assessment
ERIC Educational Resources Information Center
Terrell, Tony
2012-01-01
Though the NCAA has established rules that require student-athletes to complete their college degree in an expeditious manner, the 40/60/80% rule may impinge on student-athlete academic decisions (i.e., degree choice). Yet limited empirical data exist regarding the nature and prevalence of student-athlete degree impingement. The purpose of this…
Portable design rules for bulk CMOS
NASA Technical Reports Server (NTRS)
Griswold, T. W.
1982-01-01
It is pointed out that for the past several years, one school of IC designers has used a simplified set of nMOS geometric design rules (GDR) which is 'portable', in that it can be used by many different nMOS manufacturers. The present investigation is concerned with a preliminary set of design rules for bulk CMOS which has been verified for simple test structures. The GDR are defined in terms of Caltech Intermediate Form (CIF), which is a geometry-description language that defines simple geometrical objects in layers. The layers are abstractions of physical mask layers. The design rules do not presume the existence of any particular design methodology. Attention is given to p-well and n-well CMOS processes, bulk CMOS and CMOS-SOS, CMOS geometric rules, and a description of the advantages of CMOS technology.
Learning temporal rules to forecast instability in continuously monitored patients.
Guillame-Bert, Mathieu; Dubrawski, Artur; Wang, Donghan; Hravnak, Marilyn; Clermont, Gilles; Pinsky, Michael R
2017-01-01
Inductive machine learning, and in particular extraction of association rules from data, has been successfully used in multiple application domains, such as market basket analysis, disease prognosis, fraud detection, and protein sequencing. The appeal of rule extraction techniques stems from their ability to handle intricate problems yet produce models based on rules that can be comprehended by humans, and are therefore more transparent. Human comprehension is a factor that may improve adoption and use of data-driven decision support systems clinically via face validity. In this work, we explore whether we can reliably and informatively forecast cardiorespiratory instability (CRI) in step-down unit (SDU) patients utilizing data from continuous monitoring of physiologic vital sign (VS) measurements. We use a temporal association rule extraction technique in conjunction with a rule fusion protocol to learn how to forecast CRI in continuously monitored patients. We detail our approach and present and discuss encouraging empirical results obtained using continuous multivariate VS data from the bedside monitors of 297 SDU patients spanning 29,346 hours (3.35 patient-years) of observation. We present example rules that have been learned from data to illustrate potential benefits of comprehensibility of the extracted models, and we analyze the empirical utility of each VS as a potential leading indicator of an impending CRI event.
Perceptions of Rule-Breaking Related to Marine Ecosystem Health
Slater, Matthew J.; Mgaya, Yunus D.; Stead, Selina M.
2014-01-01
Finding effective solutions to manage marine resources is high on political and conservation agendas worldwide. This is made more urgent by the rate of increase in the human population and concomitant resource pressures in coastal areas. This paper links empirical socio-economic data about perceptions of marine resource health to the breaking of marine management rules, using fisheries as a case study. The relationship between perceived rule-breaking (non-compliance with regulations controlling fishing) and perceived health of inshore marine environments was investigated through face-to-face interviews with 299 heads of households in three Tanzanian coastal communities in November and December 2011. Awareness of rules controlling fishing activity was high among all respondents. Fishers were able to describe more specific rules controlling fishing practices than non-fishers (t = 3.5, df = 297, p<0.01). Perceived breaking of fishing regulations was reported by nearly half of all respondents, saying “some” (32% of responses) or “most” (15% of responses) people break fishing rules. Ordinal regression modelling revealed a significant linkage (z = −3.44, p<0.001) in the relationship between respondents' perceptions of deteriorating marine health and their perception of increased rule-breaking. In this paper, inferences from an empirical study are used to identify and argue the potential for using perceptions of ecosystem health and level of rule-breaking as a means to guide management measures. When considering different management options (e.g. Marine Protected Areas), policy makers are advised to take account of and utilise likely egoistic or altruistic decision-making factors used by fishers to determine their marine activities. PMID:24586558
Universal rule for the symmetric division of plant cells
Besson, Sébastien; Dumais, Jacques
2011-01-01
The division of eukaryotic cells involves the assembly of complex cytoskeletal structures to exert the forces required for chromosome segregation and cytokinesis. In plants, empirical evidence suggests that tensional forces within the cytoskeleton cause cells to divide along the plane that minimizes the surface area of the cell plate (Errera’s rule) while creating daughter cells of equal size. However, exceptions to Errera’s rule cast doubt on whether a broadly applicable rule can be formulated for plant cell division. Here, we show that the selection of the plane of division involves a competition between alternative configurations whose geometries represent local area minima. We find that the probability of observing a particular division configuration increases inversely with its relative area according to an exponential probability distribution known as the Gibbs measure. Moreover, a comparison across land plants and their most recent algal ancestors confirms that the probability distribution is widely conserved and independent of cell shape and size. Using a maximum entropy formulation, we show that this empirical division rule is predicted by the dynamics of the tense cytoskeletal elements that lead to the positioning of the preprophase band. Based on the fact that the division plane is selected from the sole interaction of the cytoskeleton with cell shape, we posit that the new rule represents the default mechanism for plant cell division when internal or external cues are absent. PMID:21383128
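The Gibbs-measure rule described above — each candidate division plane becomes less likely as its area grows, exponentially so — can be illustrated numerically. In this sketch the function name and the inverse-temperature-like constant `beta` are our own placeholders, not values from the paper; areas are normalised by the smallest candidate so only relative area matters.

```python
import math

# Illustrative Gibbs-measure rule: probability of a candidate division
# plane decays exponentially with its relative area. `beta` is an
# assumed placeholder constant, not a value from the study.

def division_probabilities(areas, beta=20.0):
    """Map candidate plane areas to normalised Gibbs probabilities."""
    a_min = min(areas)
    weights = [math.exp(-beta * a / a_min) for a in areas]
    total = sum(weights)
    return [w / total for w in weights]
```

The smallest-area plane dominates, but alternative local minima retain nonzero probability, matching the paper's observation that exceptions to Errera's rule occur at predictable rates.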
ERIC Educational Resources Information Center
Minkiewicz, Piotr; Darewicz, Malgorzata; Iwaniak, Anna
2018-01-01
A simple equation to calculate the oxidation states (oxidation numbers) of individual atoms in molecules and ions may be introduced instead of rules associated with words alone. The equation includes two of three categories of bonds, classified as proposed by Goodstein: number of bonds with more electronegative atoms and number of bonds with less…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-18
... solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR..., all tied hedge transactions (regardless of whether the option order is a simple or complex order) are... simple order the execution of the option leg of a tied hedge transaction does not qualify it for any NBBO...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-27
... solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR.... Purpose The purpose of the proposed rule change is to increase certain Simple Order Fees for Removing... market. Section I Amendments The Exchange proposes to amend the Simple Order fees in Section I, Part A of...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-16
... comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR 240.19b-4... applicable to simple orders in the options class under Exchange Rule 6.42--Minimum Increments of Bids and..., with the increment of trading being the standard trading increment applicable to simple orders in the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-01
... to solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2... recently filed a rule change to amend its transaction fees and rebates for simple,\\6\\ non-complex orders.... \\6\\ C2 defines simple orders to exclude ETFs and indexes. \\7\\ See Securities Exchange Act Release No...
NASA Astrophysics Data System (ADS)
Gehrmann, Andreas; Nagai, Yoshimitsu; Yoshida, Osamu; Ishizu, Syohei
Because management decision-making has become complex and the decision-maker's preferences frequently become inconsistent, multi-attribute decision-making problems have been studied. To represent inconsistent preference relations, the concept of an evaluation structure was introduced; evaluation structures allow simple rules representing inconsistent preference relations to be generated. Rough set theory for preference relations has also been studied, and the concept of approximation introduced. One main aim of this paper is to introduce the concept of a rough evaluation structure for representing inconsistent preference relations. We apply rough set theory to the evaluation structure and develop a method for generating simple rules for inconsistent preference relations. We introduce the concepts of a totally ordered information system, similarity classes of preference relations, and upper and lower approximations of preference relations. We also show the properties of the rough evaluation structure and provide a simple example. As an application of the rough evaluation structure, we analyze a questionnaire survey of customer preferences for audio players.
Profitability of simple technical trading rules of Chinese stock exchange indexes
NASA Astrophysics Data System (ADS)
Zhu, Hong; Jiang, Zhi-Qiang; Li, Sai-Ping; Zhou, Wei-Xing
2015-12-01
Although technical trading rules have been widely used by practitioners in financial markets, their profitability still remains controversial. We here investigate the profitability of moving average (MA) and trading range break (TRB) rules by using the Shanghai Stock Exchange Composite Index (SHCI) from May 21, 1992 through December 31, 2013 and Shenzhen Stock Exchange Component Index (SZCI) from April 3, 1991 through December 31, 2013. The t-test is adopted to check whether the mean returns which are conditioned on the trading signals are significantly different from unconditioned returns and whether the mean returns conditioned on the buy signals are significantly different from the mean returns conditioned on the sell signals. We find that TRB rules outperform MA rules and short-term variable moving average (VMA) rules outperform long-term VMA rules. By applying White's Reality Check test and accounting for the data snooping effects, we find that the best trading rule outperforms the buy-and-hold strategy when transaction costs are not taken into consideration. Once transaction costs are included, trading profits will be eliminated completely. Our analysis suggests that simple trading rules like MA and TRB cannot beat the standard buy-and-hold strategy for the Chinese stock exchange indexes.
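A minimal version of the moving-average (MA) rule studied above can be written as a crossover detector: buy when the short MA crosses above the long MA, sell on the opposite cross. The window lengths and signal convention here are illustrative choices, not the paper's exact specification.

```python
# Sketch of an MA crossover trading rule (illustrative parameters):
# buy when the short moving average crosses above the long one,
# sell when it crosses below.

def moving_average(prices, window):
    """Trailing moving average; None until the window fills."""
    out = []
    for i in range(len(prices)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(prices[i + 1 - window:i + 1]) / window)
    return out

def ma_signals(prices, short=2, long=4):
    """Return a list of (index, 'buy'|'sell') crossover signals."""
    s, l = moving_average(prices, short), moving_average(prices, long)
    signals = []
    for i in range(1, len(prices)):
        if None in (s[i - 1], l[i - 1], s[i], l[i]):
            continue
        if s[i - 1] <= l[i - 1] and s[i] > l[i]:
            signals.append((i, "buy"))
        elif s[i - 1] >= l[i - 1] and s[i] < l[i]:
            signals.append((i, "sell"))
    return signals
```

On a price series that peaks and then recovers, the rule emits a sell near the peak and a buy on the rebound; evaluating such signals net of transaction costs is exactly what erases the profits in the study above.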
On the predictability of land surface fluxes from meteorological variables
NASA Astrophysics Data System (ADS)
Haughton, Ned; Abramowitz, Gab; Pitman, Andy J.
2018-01-01
Previous research has shown that land surface models (LSMs) are performing poorly when compared with relatively simple empirical models over a wide range of metrics and environments. Atmospheric driving data appear to provide information about land surface fluxes that LSMs are not fully utilising. Here, we further quantify the information available in the meteorological forcing data that are used by LSMs for predicting land surface fluxes, by interrogating FLUXNET data, and extending the benchmarking methodology used in previous experiments. We show that substantial performance improvement is possible for empirical models using meteorological data alone, with no explicit vegetation or soil properties, thus setting lower bounds on a priori expectations on LSM performance. The process also identifies key meteorological variables that provide predictive power. We provide an ensemble of empirical benchmarks that are simple to reproduce and provide a range of behaviours and predictive performance, acting as a baseline benchmark set for future studies. We reanalyse previously published LSM simulations and show that there is more diversity between LSMs than previously indicated, although it remains unclear why LSMs are broadly performing so much worse than simple empirical models.
NASA Technical Reports Server (NTRS)
Lamers, H. J. G. L. M.; Gathier, R.; Snow, T. P.
1980-01-01
From a study of the UV lines in the spectra of 25 stars from O4 to B1, the empirical relations between the mean density in the wind and the ionization fractions of O VI, N V, Si IV, and the excited C III (2p 3P0) level were derived. Using these empirical relations, a simple relation was derived between the mass-loss rate and the column density of any of these four ions. This relation can be used for a simple determination of the mass-loss rate of O4 to B1 stars.
An Axiomatic Theory of Cognition and Writing.
ERIC Educational Resources Information Center
Grunig, James E.; And Others
Noting that although a great deal of empirical research has been done to investigate the writing rules commonly taught, this paper points out that no one has yet constructed a deep theory of the relationship between cognition and writing that confirms the writing rules and explains how they work. The paper then uses theories and research in the…
Emotional display rules as work unit norms: a multilevel analysis of emotional labor among nurses.
Diefendorff, James M; Erickson, Rebecca J; Grandey, Alicia A; Dahling, Jason J
2011-04-01
Emotional labor theory has conceptualized emotional display rules as shared norms governing the expression of emotions at work. Using a sample of registered nurses working in different units of a hospital system, we provided the first empirical evidence that display rules can be represented as shared, unit-level beliefs. Additionally, controlling for the influence of dispositional affectivity, individual-level display rule perceptions, and emotion regulation, we found that unit-level display rules are associated with individual-level job satisfaction. We also showed that unit-level display rules relate to burnout indirectly through individual-level display rule perceptions and emotion regulation strategies. Finally, unit-level display rules also interacted with individual-level dispositional affectivity to predict employee use of emotion regulation strategies. We discuss how future research on emotional labor and display rules, particularly in the health care setting, can build on these findings.
On the Discriminant Analysis in the 2-Populations Case
NASA Astrophysics Data System (ADS)
Rublík, František
2008-01-01
The empirical Bayes Gaussian rule, which in the normal case yields good values of the probability of total error, may yield high values of the maximum probability of error. From this point of view the presented modified version of the classification rule of Broffitt, Randles and Hogg appears to be superior. The modification included in this paper is termed the WR method, and the choice of its weights is discussed. These methods are also compared with the K-nearest-neighbours classification rule.
Ontogeny of collective behavior reveals a simple attraction rule.
Hinz, Robert C; de Polavieja, Gonzalo G
2017-02-28
The striking patterns of collective animal behavior, including ant trails, bird flocks, and fish schools, can result from local interactions among animals without centralized control. Several of these rules of interaction have been proposed, but it has proven difficult to discriminate which ones are implemented in nature. As a method to better discriminate among interaction rules, we propose to follow the slow birth of a rule of interaction during animal development. Specifically, we followed the development of zebrafish, Danio rerio, and found that larvae turn toward each other from 7 days postfertilization and increase the intensity of interactions until 3 weeks. This developmental dataset allows testing the parameter-free predictions of a simple rule in which animals attract each other part of the time, with attraction defined as turning toward another animal chosen at random. This rule makes each individual likely to move to a high density of conspecifics, and moving groups naturally emerge. Development of attraction strength corresponds to an increase in the time spent in attraction behavior. Adults were found to follow the same attraction rule, suggesting a potential significance for adults of other species.
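The attraction rule described above reduces to a single heading update: part of the time, turn toward one randomly chosen conspecific. The sketch below is our own illustration (the update scheme, turn fraction, and names are assumptions, not the paper's fitted model):

```python
import math
import random

# Illustrative attraction rule: with probability p_attract, turn a
# fraction of the way toward one randomly chosen neighbour; otherwise
# keep the current heading. Parameter names are our own assumptions.

def attraction_turn(position, heading, others, p_attract, rng,
                    turn_fraction=0.5):
    """Return the updated heading (radians)."""
    if not others or rng.random() >= p_attract:
        return heading
    tx, ty = rng.choice(others)
    target = math.atan2(ty - position[1], tx - position[0])
    # shortest signed angular difference, then turn a fraction of it
    diff = (target - heading + math.pi) % (2 * math.pi) - math.pi
    return heading + turn_fraction * diff
```

Iterating this update over many individuals makes each one drift toward regions dense with conspecifics, so cohesive moving groups emerge without any explicit alignment or repulsion term.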
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Application of 10-percent shareholder test to interest paid to a simple trust or grantor trust. Whether interest paid to a simple trust or grantor trust and distributed to or included in the gross income of a... 26 Internal Revenue 9 2010-04-01 2010-04-01 false Rules relating to repeal of tax on interest of...
Calendar methods of fertility regulation: a rule of thumb.
Colombo, B; Scarpa, B
1996-01-01
"[Many] illiterate women, particularly in the third world, find [it] difficult to apply usual calendar methods for the regulation of fertility. Some of them are even unable to make simple subtractions. In this paper we are therefore trying to evaluate the applicability and the efficiency of an extremely simple rule which entails only [the ability to count] a number of days, and always the same way." (SUMMARY IN ITA) excerpt
Silvestre, Liliane; Martins, Wellington P; Candido-Dos-Reis, Francisco J
2015-07-29
This study describes the accuracy of three-dimensional power Doppler (3D-PD) angiography as a secondary method for the differential diagnosis of ovarian tumors. Seventy-five women scheduled for surgical removal of adnexal masses were assessed by transvaginal ultrasound. Ovarian tumors were classified by the IOTA simple rules and two three-dimensional blocks were recorded. In a second-step analysis, a 4-cm³ spherical sample was obtained from the most highly vascularized solid area of each stored block. The vascularization index (VI), flow index (FI) and vascularization-flow index (VFI) were calculated. Repeatability was assessed by the concordance correlation coefficient (CCC) and limits of agreement (LoA), and diagnostic accuracy by the area under the ROC curve. The IOTA simple rules classified 26 cases as benign, nine as inconclusive and 40 as malignant. There were eight false positives and no false negatives. Among the masses classified as inconclusive or malignant by the IOTA simple rules, the CCCs were 0.91 for VI, 0.70 for FI, and 0.86 for VFI. The areas under the ROC curve were 0.82 for VI, 0.67 for FI and 0.81 for VFI. 3D-PD angiography presented considerable intraobserver variability and low accuracy for identifying false-positive results of the IOTA simple rules.
Seismic Safety Of Simple Masonry Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guadagnuolo, Mariateresa; Faella, Giuseppe
2008-07-08
Several masonry buildings comply with the rules for simple buildings provided by seismic codes. For these buildings explicit safety verifications are not compulsory if specific code rules are fulfilled: it is assumed that fulfilling them ensures a suitable seismic behaviour of buildings and thus adequate safety under earthquakes. Italian and European seismic codes differ in their requirements for simple masonry buildings, mostly concerning the building typology, the building geometry and the acceleration at the site. Obviously, a wide percentage of the buildings that codes deem simple should satisfy the numerical safety verification, so as not to create confusion and uncertainty for the designers who must use the codes. This paper aims at evaluating the seismic response of some simple unreinforced masonry buildings that comply with the provisions of the new Italian seismic code. Two-story buildings of different geometry are analysed, and results from nonlinear static analyses performed by varying the acceleration at the site are presented and discussed. Indications are supplied on the congruence between the code rules and the results of numerical analyses performed according to the code itself; in this context, the obtained results can contribute to improving the seismic code requirements.
ERIC Educational Resources Information Center
Kupperman, Joel J.
1978-01-01
Explores the use of the concept of inhibition in moral philosophy. Argues that there are strong practical reasons for basing moral teaching on simple moral rules and for inculcating inhibitions about breaking these rules. (Author)
New QCD sum rules based on canonical commutation relations
NASA Astrophysics Data System (ADS)
Hayata, Tomoya
2012-04-01
A new derivation of QCD sum rules from canonical commutators is developed. It is a simple and straightforward generalization of the Thomas-Reiche-Kuhn sum rule, based on the Kugo-Ojima operator formalism of a non-abelian gauge theory and a suitable subtraction of UV divergences. By applying the method to the vector and axial-vector currents in QCD, the exact Weinberg sum rules are examined. Vector-current sum rules and new fractional-power sum rules are also discussed.
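For reference, the Thomas-Reiche-Kuhn sum rule that the abstract generalizes can be stated in standard quantum-mechanics notation (one-dimensional particle of mass m, energy eigenstates |n⟩ with energies E_n):

```latex
% Thomas--Reiche--Kuhn sum rule: energy-weighted dipole matrix
% elements from the ground state sum to a constant fixed by m.
\sum_{n} \left( E_n - E_0 \right)
  \left| \langle n | \hat{x} | 0 \rangle \right|^2
  = \frac{\hbar^2}{2m}
```

It follows from evaluating the double commutator ⟨0|[x̂,[Ĥ,x̂]]|0⟩ in two ways, which is the kind of canonical-commutator manipulation the abstract carries over to QCD currents.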
VanderWeele, Tyler J.; Staudt, Nancy
2014-01-01
In this paper we introduce methodology—causal directed acyclic graphs—that empirical researchers can use to identify causation, avoid bias, and interpret empirical results. This methodology has become popular in a number of disciplines, including statistics, biostatistics, epidemiology and computer science, but has yet to appear in the empirical legal literature. Accordingly we outline the rules and principles underlying this new methodology and then show how it can assist empirical researchers through both hypothetical and real-world examples found in the extant literature. While causal directed acyclic graphs are certainly not a panacea for all empirical problems, we show they have potential to make the most basic and fundamental tasks, such as selecting covariate controls, relatively easy and straightforward. PMID:25685055
A study of some nine-element decision rules. [for multispectral recognition of remote sensing
NASA Technical Reports Server (NTRS)
Richardson, W.
1974-01-01
A nine-element rule is one that makes a classification decision for each pixel based on data from that pixel and its eight immediate neighbors. Three such rules, all fast and simple to use, are defined and tested. All performed substantially better on field interiors than the best one-point rule. Qualitative results indicate that fine detail and contradictory testimony tend to be overlooked by the rules.
Models of Quantitative Estimations: Rule-Based and Exemplar-Based Processes Compared
ERIC Educational Resources Information Center
von Helversen, Bettina; Rieskamp, Jorg
2009-01-01
The cognitive processes underlying quantitative estimations vary. Past research has identified task-contingent changes between rule-based and exemplar-based processes (P. Juslin, L. Karlsson, & H. Olsson, 2008). B. von Helversen and J. Rieskamp (2008), however, proposed a simple rule-based model--the mapping model--that outperformed the…
ERIC Educational Resources Information Center
Lareau, Annette; Adia Evans, Shani; Yee, April
2016-01-01
Empirical research on cultural and social capital has generally ignored the key role of institutions in setting standards that determine the contingent value of this capital. Furthermore, many studies presume that the yielding of profit from cultural, social, and economic capital is automatic. Bourdieu's concept of field highlights the ''rules of…
Defying Intuition: Demonstrating the Importance of the Empirical Technique.
ERIC Educational Resources Information Center
Kohn, Art
1992-01-01
Describes a classroom activity featuring a simple stay-switch probability game. Contends that the exercise helps students see the importance of empirically validating beliefs. Includes full instructions for conducting and discussing the exercise. (CFR)
Adding Only One Priority Rule Allows Extending CIP Rules to Supramolecular Systems.
Alkorta, Ibon; Elguero, José; Cintas, Pedro
2015-05-01
There are frequent situations both in supramolecular chemistry and in crystallography that result in stereogenic centers, whose absolute configuration needs to be specified. With this aim we propose the inclusion of one simple additional rule to the Cahn-Ingold-Prelog (CIP) system of priority rules stating that noncovalent interactions have a fictitious number between 0 and 1. © 2015 Wiley Periodicals, Inc.
Congestion relaxation due to density-dependent junction rules in TASEP network
NASA Astrophysics Data System (ADS)
Tannai, Takahiro; Nishinari, Katsuhiro
2017-09-01
We consider a small network module of the Totally Asymmetric Simple Exclusion Process (TASEP) with branching and aggregation points, in which the junction rules depend on the densities of the module's segments. We focus on the interaction among branching and aggregation junctions. With density-dependent rules this interaction is more complex than with the density-independent rules studied in previous papers. We confirm that density-dependent rules enable vehicles to move more efficiently than density-independent rules.
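The elementary TASEP update underlying such models can be sketched as follows. The ring geometry, random-sequential update, and function name are illustrative assumptions; the paper's density-dependent junction rules (which would modulate the hop probability at branching and merging sites) are not reproduced.

```python
import random

def tasep_step(sites, p=1.0, rng=random):
    """One random-sequential-update sweep of TASEP on a ring.

    sites: list of 0/1 occupations. A randomly chosen site is examined
    L times per sweep; a particle hops one site clockwise with
    probability p only if the target site is empty (exclusion rule).
    """
    L = len(sites)
    for _ in range(L):
        i = rng.randrange(L)
        j = (i + 1) % L
        if sites[i] == 1 and sites[j] == 0 and rng.random() < p:
            sites[i], sites[j] = 0, 1
    return sites
```

A junction version would replace the fixed `p` with a function of the measured densities of the segments meeting at the junction.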
Myths and legends in learning classification rules
NASA Technical Reports Server (NTRS)
Buntine, Wray
1990-01-01
A discussion is presented of machine learning theory on empirically learning classification rules. Six myths current in the machine learning community are examined, addressing issues of bias, learning as search, computational learning theory, Occam's razor, universal learning algorithms, and interactive learning. Some of the problems raised are also addressed from a Bayesian perspective. Questions are suggested that machine learning researchers should address both theoretically and experimentally.
On the Link Between Kolmogorov Microscales and Friction in Wall-Bounded Flow of Viscoplastic Fluids
NASA Astrophysics Data System (ADS)
Ramos, Fabio; Anbarlooei, Hamid; Cruz, Daniel; Silva Freire, Atila; Santos, Cecilia M.
2017-11-01
Most discussions in the literature on the friction coefficient of turbulent flows of fluids with complex rheology are empirical. As a rule, theoretical frameworks are not available even for some relatively simple constitutive models. In this work, we present a new family of formulas for the evaluation of the friction coefficient of turbulent flows of a large family of viscoplastic fluids. The developments combine a unified analysis for the description of the Kolmogorov microscales with the phenomenological turbulence model of Gioia and Chakraborty. The resulting Blasius-type friction equation has only Blasius' constant as a parameter, and tests against experimental data show excellent agreement over a significant range of Hedstrom and Reynolds numbers. The limits of the proposed model are also discussed. We also comment on the role of the new formula as a possible benchmark test for the convergence of DNS simulations of viscoplastic flows. The friction formula also provides limits for Maximum Drag Reduction (MDR) in viscoplastic flows, which resemble the MDR asymptote for viscoelastic flows.
Zhang, Ziyu; Yuan, Lang; Lee, Peter D; Jones, Eric; Jones, Julian R
2014-01-01
Bone augmentation implants are porous to allow cellular growth, bone formation and fixation. However, the design of the pores is currently based on simple empirical rules, such as minimum pore and interconnect sizes. We present a three-dimensional (3D) transient model of cellular growth based on the Navier–Stokes equations that simulates the body fluid flow and stimulation of bone precursor cellular growth, attachment, and proliferation as a function of local flow shear stress. The model's effectiveness is demonstrated for two additive manufactured (AM) titanium scaffold architectures. The results demonstrate that there is a complex interaction of flow rate and strut architecture, resulting in partially randomized structures having a preferential impact on stimulating cell migration in 3D porous structures for higher flow rates. This novel result demonstrates the potential new insights that can be gained via the modeling tool developed, and how the model can be used to perform what-if simulations to design AM structures to specific functional requirements. PMID:24664988
Fractal dimension and universality in avascular tumor growth
NASA Astrophysics Data System (ADS)
Ribeiro, Fabiano L.; dos Santos, Renato Vieira; Mata, Angélica S.
2017-04-01
For years, the comprehension of the tumor growth process has been intriguing scientists. New research has been constantly required to better understand the complexity of this phenomenon. In this paper, we propose a mathematical model that describes the properties, already known empirically, of avascular tumor growth. We present, from an individual-level (microscopic) framework, an explanation of some phenomenological (macroscopic) aspects of tumors, such as their spatial form and the way they develop. Our approach is based on competitive interaction between the cells. This simple rule makes the model able to reproduce evidence observed in real tumors, such as exponential growth in their early stage followed by power-law growth. The model also reproduces (i) the fractal-space distribution of tumor cells and (ii) the universal growth behavior observed in both animals and tumors. Our analyses suggest that the universal similarity between tumor and animal growth comes from the fact that both can be described by the same dynamic equation—the Bertalanffy-Richards model—even if they do not necessarily share the same biological properties.
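The growth law invoked above can be illustrated by Euler-integrating one common form of the Bertalanffy-Richards equation, dV/dt = aV^g - bV; the parameter names and this particular form are assumptions for illustration, not necessarily the exact equation used in the paper.

```python
def bertalanffy_richards(v0, a, b, gamma, dt=0.01, steps=20000):
    """Euler-integrate dV/dt = a*V**gamma - b*V.

    With 0 < gamma < 1 this gives near-power-law early growth that
    saturates at the fixed point V* = (a/b)**(1/(1-gamma)).
    """
    v = v0
    for _ in range(steps):
        v += dt * (a * v**gamma - b * v)
    return v
```

For example, with a = 1, b = 0.5 and gamma = 2/3 the trajectory saturates at V* = (1/0.5)^3 = 8, reproducing the sigmoid shape that both tumors and animal growth curves share.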
Effect of social group dynamics on contagion
NASA Astrophysics Data System (ADS)
Zhao, Zhenyuan; Calderón, J. P.; Xu, Chen; Zhao, Guannan; Fenn, Dan; Sornette, Didier; Crane, Riley; Hui, Pak Ming; Johnson, Neil F.
2010-05-01
Despite the many works on contagion phenomena in both well-mixed systems and heterogeneous networks, there is still a lack of understanding of the intermediate regime where social group structures evolve on a similar time scale to individual-level transmission. We address this question by considering the process of transmission through a model population comprising social groups which follow simple dynamical rules for growth and breakup. Despite the simplicity of our model, the profiles produced bear a striking resemblance to a wide variety of real-world examples—in particular, empirical data that we have obtained for social (i.e., YouTube), financial (i.e., currency markets), and biological (i.e., colds in schools) systems. The observation of multiple resurgent peaks and abnormal decay times is qualitatively reproduced within the model simply by varying the time scales for group coalescence and fragmentation. We provide an approximate analytic treatment of the system and highlight a novel transition which arises as a result of the social group dynamics.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-04
... solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR... Executive Officer in Rule 2(c) represents a simple oversight in the 2006 amendments and seeks to correct it... investors and the public interest by allowing CHX to amend its rules to permit any Officer of the Exchange...
DOT National Transportation Integrated Search
1979-01-01
Presented is a relatively simple empirical equation that reasonably approximates the relationship between mesoscale carbon monoxide (CO) concentrations, areal vehicular CO emission rates, and the meteorological factors of wind speed and mixing height...
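A relationship of the kind described is often written in simple box-model form. The generic expression below is an assumption for illustration and is not the report's fitted equation:

```latex
C_{\mathrm{CO}} \;\approx\; \frac{Q}{u\,H},
```

where \(C_{\mathrm{CO}}\) is the mesoscale CO concentration, \(Q\) the areal vehicular CO emission rate, \(u\) the wind speed, and \(H\) the mixing height.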
Forty years of Clar's aromatic π-sextet rule
Solà, Miquel
2013-01-01
In 1972 Erich Clar formulated his aromatic π-sextet rule that allows discussing qualitatively the aromatic character of benzenoid species. Now, 40 years later, Clar's aromatic π-sextet rule is still a source of inspiration for many chemists. This simple rule has been validated both experimentally and theoretically. In this review, we select some particular examples to highlight the achievement of Clar's aromatic π-sextet rule in many situations and we discuss two recent successful cases of its application. PMID:24790950
Borg, Johan; Bergman, Anna-Karin; Östergren, Per-Olof
2013-11-15
Legal empowerment of the poor is highly relevant to public health as it aims to relieve income poverty, a main determinant of health. The Commission on Legal Empowerment of the Poor (CLEP) has proposed legal empowerment measures in the following four domains: access to justice and the rule of law, property, labor, and business rights. Despite people with disabilities being overrepresented among the poor, CLEP has not explicitly considered their situation. Objectives: To examine the empirical evidence for the relevance of the CLEP legal empowerment measures to people with disabilities in low- and lower middle-income countries, and to evaluate the extent to which the Convention on the Rights of Persons with Disabilities (CRPD) addresses those measures. Methods: Critical literature review of empirical studies and a checklist assessment of the CRPD. Results: Fourteen included articles confirm that people with disabilities experience problems in the domains of access to justice and the rule of law, labor rights, and business rights. No texts on property rights were found. Evidence for the effectiveness of the proposed measures is insufficient. Overall, the CRPD fully or partially supports two-thirds of the proposed measures (seven of nine measures for access to justice and the rule of law, none of the five measures for property rights, all seven measures for labor rights, and six of nine measures for business rights). Conclusions: Although most of the domains of the CLEP legal empowerment measures are relevant to people with disabilities from both empirical and normative perspectives, it is uncertain whether the devised measures are of immediate relevance to them. Further research is warranted in this regard.
Neutron matter within QCD sum rules
NASA Astrophysics Data System (ADS)
Cai, Bao-Jun; Chen, Lie-Wen
2018-05-01
The equation of state (EOS) of pure neutron matter (PNM) is studied in QCD sum rules (QCDSRs). It is found that the QCDSR results on the EOS of PNM are in good agreement with predictions by current advanced microscopic many-body theories. Moreover, the higher-order density terms in quark condensates are shown to be important to describe the empirical EOS of PNM in the density region around and above nuclear saturation density although they play a minor role at subsaturation densities. The chiral condensates in PNM are also studied, and our results indicate that the higher-order density terms in quark condensates, which are introduced to reasonably describe the empirical EOS of PNM at suprasaturation densities, tend to hinder the appearance of chiral symmetry restoration in PNM at high densities.
Multistate modelling extended by behavioural rules: An application to migration.
Klabunde, Anna; Zinn, Sabine; Willekens, Frans; Leuchter, Matthias
2017-10-01
We propose to extend demographic multistate models by adding a behavioural element: behavioural rules explain intentions and thus transitions. Our framework is inspired by the Theory of Planned Behaviour. We exemplify our approach with a model of migration from Senegal to France. Model parameters are determined using empirical data where available; parameters for which no empirical correspondence exists are determined by calibration. Age- and period-specific migration rates are used for model validation. Our approach adds to the toolkit of demographic projection by allowing for shocks and social influence, which alter behaviour in non-linear ways, while keeping to the general framework of multistate modelling. Our simulations show that higher income growth in Senegal leads to higher emigration rates in the medium term, while a decrease in fertility yields lower emigration rates.
Learning Problem-Solving Rules as Search through a Hypothesis Space
ERIC Educational Resources Information Center
Lee, Hee Seung; Betts, Shawn; Anderson, John R.
2016-01-01
Learning to solve a class of problems can be characterized as a search through a space of hypotheses about the rules for solving these problems. A series of four experiments studied how different learning conditions affected the search among hypotheses about the solution rule for a simple computational problem. Experiment 1 showed that a problem…
Cerebellar Deep Nuclei Involvement in Cognitive Adaptation and Automaticity
ERIC Educational Resources Information Center
Callu, Delphine; Lopez, Joelle; El Massioui, Nicole
2013-01-01
To determine the role of the interpositus nuclei of cerebellum in rule-based learning and optimization processes, we studied (1) successive transfers of an initially acquired response rule in a cross maze and (2) behavioral strategies in learning a simple response rule in a T maze in interpositus lesioned rats (neurotoxic or electrolytic lesions).…
Versloot, Judith; Grudniewicz, Agnes; Chatterjee, Ananda; Hayden, Leigh; Kastner, Monika; Bhattacharyya, Onil
2015-06-01
We present simple formatting rules, derived from an extensive literature review, that can improve the format of clinical practice guidelines (CPGs) and potentially increase the likelihood of their being used. We recently conducted a review of the literature from medicine, psychology, design, and human factors engineering on characteristics of guidelines that are associated with their use in practice, covering both the creation and communication of content. The formatting rules described in this article are derived from that review. They are grouped into three categories that can be easily applied to CPGs: Vivid (make it stand out), Intuitive (match it to the audience's expectations), and Visual (use alternatives to text). We highlight rules supported by our broad literature review and provide specific 'how to' recommendations for individuals and groups developing evidence-based materials for clinicians. The way text documents are formatted influences their accessibility and usability. Optimizing the formatting of CPGs is a relatively inexpensive intervention that can facilitate the dissemination of evidence in healthcare. Applying simple formatting principles to make documents more vivid, intuitive, and visual is a practical approach with the potential to influence how extensively guidelines are read, remembered, and used in practice.
ERIC Educational Resources Information Center
Coloma, Roland Sintos
2009-01-01
The article brings together the fields of curriculum studies, history of education, and ethnic studies to chart a transnational history of race, empire, and curriculum. Drawing from a larger study on the history of education in the Philippines under U.S. rule in the early 1900s, it argues that race played a pivotal role in the discursive…
Hidden patterns of reciprocity.
Syi
2014-03-21
Reciprocity can help the evolution of cooperation. To model both types of reciprocity, we need the concept of strategy. In the case of direct reciprocity there are four second-order action rules (Simple Tit-for-tat, Contrite Tit-for-tat, Pavlov, and Grim Trigger) that are able to promote cooperation. In the case of indirect reciprocity the key component of cooperation is the assessment rule. There are, again, four elementary second-order assessment rules (Image Scoring, Simple Standing, Stern Judging, and Shunning). The eight concepts can be formalized in an ontologically thin way: we need only an action predicate and a value function, two agent concepts, and the constant of goodness. The formalism helps us to discover that the action and assessment rules can be paired, and that they show the same patterns. The logic of these patterns can be interpreted with the concept of punishment, which has an inherently paradoxical nature. Copyright © 2013 Elsevier Ltd. All rights reserved.
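The four second-order action rules named above can be written as pure functions of the previous round's two moves. This encoding (in particular, treating Grim Trigger as a function of the last round only, where mutual defection is self-sustaining) is an illustrative simplification, not the paper's formalism.

```python
C, D = 1, 0  # cooperate / defect

def tit_for_tat(my_last, partner_last):
    return partner_last  # simply copy the partner's previous move

def contrite_tft(my_last, partner_last):
    # Defect only when I cooperated and was defected against; if I
    # defected last round, accept punishment and return to cooperation.
    return D if (my_last == C and partner_last == D) else C

def pavlov(my_last, partner_last):
    # Win-stay, lose-shift: repeat after matching moves, switch otherwise.
    return C if my_last == partner_last else D

def grim_trigger(my_last, partner_last):
    # Cooperate only from mutual cooperation; any defection locks both
    # players into defection under this memory-one encoding.
    return C if (my_last == C and partner_last == C) else D
```

The paired assessment rules (Image Scoring, Simple Standing, Stern Judging, Shunning) have the same truth-table structure with "cooperate" replaced by "assign good standing".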
Competency-Based Curriculum Development: A Pragmatic Approach
ERIC Educational Resources Information Center
Broski, David; And Others
1977-01-01
Examines the concept of competency-based education, describes an experience-based model for its development, and discusses some empirically derived rules-of-thumb for its application in allied health. (HD)
2008-11-01
T or more words, where T is a threshold that is empirically set to 300 in the experiment. The second rule aims to remove pornographic documents... Some blog documents are embedded with pornographic words to attract search traffic. We identify a list of pornographic words. Given a blog document, all... document, this document is considered pornographic spam, and is discarded. The third rule removes documents written in foreign languages. We count the
Matsumoto, David; Yoo, Seung Hee; Hirayama, Satoko; Petrova, Galina
2005-03-01
As one component of emotion regulation, display rules, which reflect the regulation of expressive behavior, have been the topic of many studies. Despite their theoretical and empirical importance, however, to date there is no measure of display rules that assesses a full range of behavioral responses that are theoretically possible when emotion is elicited. This article reports the development of a new measure of display rules that surveys 5 expressive modes: expression, deamplification, amplification, qualification, and masking. Two studies provide evidence for its internal and temporal reliability and for its content, convergent, discriminant, external, and concurrent predictive validity. Additionally, Study 1, involving American, Russian, and Japanese participants, demonstrated predictable cultural differences on each of the expressive modes. Copyright 2005 APA, all rights reserved.
Compactness Aromaticity of Atoms in Molecules
Putz, Mihai V.
2010-01-01
A new aromaticity definition is advanced as the compactness formulation, through the ratio between the atoms-in-molecule and molecular-orbital facets of the same chemical reactivity property around the pre- and post-bonding stabilization limit, respectively. The geometrical reactivity index of polarizability was assumed to provide the benchmark aromaticity scale owing to its observable character; on this occasion a new hydrogenic polarizability quantum formula that recovers the exact value of 4.5 a0^3 for hydrogen is provided, where a0 is the Bohr radius. A polarizability-based aromaticity scale enables the introduction of five referential aromatic rules (Aroma 1 to 5 Rules). With the help of these aromatic rules, aromaticity scales based on the energetic reactivity indices of electronegativity and chemical hardness were computed and analyzed within the major semi-empirical and ab initio quantum chemical methods. Results show that chemical-hardness-based aromaticity agrees better with polarizability-based aromaticity than the electronegativity-based scale does, while the most favorable computational environment appears to be quantum semi-empirical for the former and quantum ab initio for the latter. PMID:20480020
Foraging Ecology Predicts Learning Performance in Insectivorous Bats
Clarin, Theresa M. A.; Ruczyński, Ireneusz; Page, Rachel A.
2013-01-01
Bats are unusual among mammals in showing great ecological diversity even among closely related species and are thus well suited for studies of adaptation to the ecological background. Here we investigate whether behavioral flexibility and simple- and complex-rule learning performance can be predicted by foraging ecology. We predict faster learning and higher flexibility in animals hunting in more complex, variable environments than in animals hunting in more simple, stable environments. To test this hypothesis, we studied three closely related insectivorous European bat species of the genus Myotis that belong to three different functional groups based on foraging habitats: M. capaccinii, an open water forager, M. myotis, a passive listening gleaner, and M. emarginatus, a clutter specialist. We predicted that M. capaccinii would show the least flexibility and slowest learning reflecting its relatively unstructured foraging habitat and the stereotypy of its natural foraging behavior, while the other two species would show greater flexibility and more rapid learning reflecting the complexity of their natural foraging tasks. We used a purposefully unnatural and thus species-fair crawling maze to test simple- and complex-rule learning, flexibility and re-learning performance. We found that M. capaccinii learned a simple rule as fast as the other species, but was slower in complex rule learning and was less flexible in response to changes in reward location. We found no differences in re-learning ability among species. Our results corroborate the hypothesis that animals’ cognitive skills reflect the demands of their ecological niche. PMID:23755146
Bayesian design of decision rules for failure detection
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The formulation of the decision-making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. Because the optimal Bayes rule is not computable, a methodology based on the Bayesian approach and aimed at reduced computational requirements is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The results of applying this design methodology to an example show that the approach is potentially a useful one.
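As a concrete example of a computationally cheap sequential decision rule, Wald's sequential probability ratio test (SPRT) for a shift in the mean of Gaussian residuals can stand in for the suboptimal rules designed in the paper; the Gaussian model, thresholds, and function name are assumptions for illustration.

```python
import math

def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    """Wald's SPRT: decide between H0 (mean mu0, no failure) and
    H1 (mean mu1, failure) from a stream of Gaussian residuals.

    Accumulates the log-likelihood ratio and stops at Wald's
    approximate thresholds; returns 'fail', 'ok', or 'continue'.
    """
    upper = math.log((1 - beta) / alpha)   # accept H1 (declare failure)
    lower = math.log(beta / (1 - alpha))   # accept H0 (declare no failure)
    llr = 0.0
    for x in samples:
        # Gaussian log-likelihood-ratio increment for one observation
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma**2
        if llr >= upper:
            return 'fail'
        if llr <= lower:
            return 'ok'
    return 'continue'
```

Like the suboptimal Bayes rules discussed above, the SPRT trades the uncomputable optimal stopping boundary for fixed thresholds that are trivial to evaluate online.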
ERIC Educational Resources Information Center
Endress, Ansgar D.; Hauser, Marc D.
2011-01-01
Rules, and exceptions to such rules, are ubiquitous in many domains, including language. Here we used simple artificial grammars to investigate the influence of 2 factors on the acquisition of rules and their exceptions, namely type frequency (the relative numbers of different exceptions to different regular items) and token frequency (the number…
A novel methodology for building robust design rules by using design based metrology (DBM)
NASA Astrophysics Data System (ADS)
Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan
2013-03-01
This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules has used a simulation tool and a simple-pattern spider mask. At the early stage of a device, the estimates of the simulation tool are poor, and the evaluation of the simple-pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a huge number of pattern situations including various 1D and 2D design structures. To overcome the difficulty of inspecting many types of patterns, we introduced the Design Based Metrology (DBM) of Nano Geometry Research, Inc., with which these mass patterns could be inspected at high speed. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules. All of the test patterns were inspected within a few hours, and the mass silicon data were handled not by personal judgment but by statistical processing. From the results, robust design rules are successfully verified and extracted. Finally, we found that our methodology is appropriate for building robust design rules.
A Theoretical Exploration of Lawrence of Arabia’s Inner Meanings on Guerrilla Warfare
2011-07-05
surpassed perhaps only by China. After the fall of Constantinople in 1453, the Ottoman Empire extended from Baku in Azerbaijan to Algiers in North Africa... control Arabia, but as they had only a hundred thousand they were destined to fail. Consequently, in 1... For over four centuries, the Ottoman Empire had ruled the Arab Middle-East stretching its influence from Constantinople to Mecca and Yemen. But
Rands, Sean A.
2011-01-01
Functional explanations of behaviour often propose optimal strategies for organisms to follow. These ‘best’ strategies could be difficult to perform given biological constraints such as neural architecture and physiological constraints. Instead, simple heuristics or ‘rules-of-thumb’ that approximate these optimal strategies may instead be performed. From a modelling perspective, rules-of-thumb are also useful tools for considering how group behaviour is shaped by the behaviours of individuals. Using simple rules-of-thumb reduces the complexity of these models, but care needs to be taken to use rules that are biologically relevant. Here, we investigate the similarity between the outputs of a two-player dynamic foraging game (which generated optimal but complex solutions) and a computational simulation of the behaviours of the two members of a foraging pair, who instead followed a rule-of-thumb approximation of the game's output. The original game generated complex results, and we demonstrate here that the simulations following the much-simplified rules-of-thumb also generate complex results, suggesting that the rule-of-thumb was sufficient to make some of the model outcomes unpredictable. There was some agreement between both modelling techniques, but some differences arose – particularly when pair members were not identical in how they gained and lost energy. We argue that exploring how rules-of-thumb perform in comparison to their optimal counterparts is an important exercise for biologically validating the output of agent-based models of group behaviour. PMID:21765938
An empirical and model study on automobile market in Taiwan
NASA Astrophysics Data System (ADS)
Tang, Ji-Ying; Qiu, Rong; Zhou, Yueping; He, Da-Ren
2006-03-01
We have carried out an empirical investigation of the automobile market in Taiwan, including the development of the companies' possession rates in the market from 1979 to 2003, the development of the largest possession rate, and so on. A dynamic model describing the competition between the companies is suggested on the basis of the empirical study. In the model, each company is given a long-term competition factor (such as technology, capital, and scale) and a short-term competition factor (such as management, service, and advertisement). The companies then play games to obtain a larger possession rate in the market under certain rules. Numerical simulations based on the model display a competition development process that agrees qualitatively and quantitatively with our empirical investigation results.
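The competition dynamic described can be caricatured as follows. The proportional allocation rule, the weight `w`, and all names are illustrative assumptions, not the authors' model.

```python
import random

def simulate_market(long_term, rounds=1000, w=0.3, rng=random):
    """Toy market-share competition.

    Each company has a fixed long-term competition factor and a
    short-term factor redrawn every round; possession rates are then
    allocated in proportion to the combined score.
    long_term: list of per-company long-term factors.
    Returns the final round's possession rates (summing to 1).
    """
    shares = None
    for _ in range(rounds):
        scores = [lt + w * rng.random() for lt in long_term]
        total = sum(scores)
        shares = [s / total for s in scores]
    return shares

# e.g. three companies with unequal long-term strength:
# shares = simulate_market([1.0, 0.8, 0.5])
```

Even in this caricature, a large enough short-term weight lets a weaker company transiently overtake a stronger one, which is the kind of possession-rate fluctuation the empirical series exhibits.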
29 CFR 1206.8 - Amendment or rescission of rules in this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
... interested person may petition the Board, in writing, for the issuance, amendment, or repeal of a rule or... denied in whole or in part, prompt notice shall be given of the denial, accompanied by a simple statement...
Balázs, P
2006-03-01
According to standard textbooks, the last episode of the European plague pandemic of the modern age died out by 1720 in Marseilles. Despite this claim, the pandemic continued in well-documented new outbreaks that attacked and devastated Central and Eastern Europe throughout the first half of the 18th century. At first, military campaigns spread the infection out of the Ottoman Empire; later, commercial goods took over this role, arriving by land or sea from Asia or the eastern Mediterranean region. Finally, the plague in Europe, except in Russia and the Ottoman Empire, virtually "died out" by the end of the 18th century. Several scientific explanations have been suggested for this: (1) Oriental rat fleas, the main vectors of infection, could no longer tolerate European weather conditions (although there has been no actual climate change in the last 300 years); (2) black rats, which lived in close proximity to man, were displaced by brown rats, which live largely outside human habitats; (3) less virulent Yersinia strains emerged and caused natural human immunisation. Whatever these factors may have contributed, it was joint civil and military health authorities that actually blocked the plague, as a result of disciplined and relentless law enforcement. In Hungary, and in the Hapsburg Empire generally, well-advised health legislation backed up the effectiveness of the local authorities. Following the last great devastation in 1738-1740, the General Norm of Health Service, a voluminous decree, summed up by 1770 all the time-honoured empirical rules of the foregoing centuries. It can be convincingly demonstrated how closely these empirical rules anticipated the scientific facts of physiology and microbiology discovered a century later.
26 CFR 1.401(k)-4 - SIMPLE 401(k) plan requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 26 Internal Revenue 5 2012-04-01 2011-04-01 true SIMPLE 401(k) plan requirements. 1.401(k)-4 Section 1.401(k)-4 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED...(k)-4 SIMPLE 401(k) plan requirements. (a) General rule. A cash or deferred arrangement satisfies the...
Crises and Collective Socio-Economic Phenomena: Simple Models and Challenges
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe
2013-05-01
Financial and economic history is strewn with bubbles and crashes, booms and busts, crises and upheavals of all sorts. Understanding the origin of these events is arguably one of the most important problems in economic theory. In this paper, we review recent efforts to include heterogeneities and interactions in models of decision. We argue that the so-called Random Field Ising model (RFIM) provides a unifying framework to account for many collective socio-economic phenomena that lead to sudden ruptures and crises. We discuss different models that can capture potentially destabilizing self-referential feedback loops, induced either by herding, i.e. reference to peers, or trending, i.e. reference to the past, and that account for some of the phenomenology missing in the standard models. We discuss some empirically testable predictions of these models, for example robust signatures of RFIM-like herding effects, or the logarithmic decay of spatial correlations of voting patterns. One of the most striking results, inspired by statistical physics methods, is that Adam Smith's invisible hand can fail badly at solving simple coordination problems. We also stress the issue of time scales, which can be extremely long in some cases and prevent socially optimal equilibria from being reached. As a theoretical challenge, the study of decision rules that violate so-called "detailed balance" is needed to decide whether conclusions based on current models (which all assume detailed balance) are indeed robust and generic.
Opinion evolution based on cellular automata rules in small world networks
NASA Astrophysics Data System (ADS)
Shi, Xiao-Ming; Shi, Lun; Zhang, Jie-Fang
2010-03-01
In this paper, we apply cellular automata rules, which can be given by a truth table, to human memory. We design each memory as a tracking survey mode that keeps the most recent three opinions. Each cellular automata rule, as a personal mechanism, gives the final ruling in one time period based on the data stored in one's memory. The key focus of the paper is the evolution of people's attitudes to the same question. Based on a great number of empirical observations from computer simulations, all the rules can be classified into 20 groups. We highlight the fact that the phenomenon shown by some rules belonging to the same group will be altered within several steps by rules in different groups. Strikingly, when the simulation results obtained from our model are compared with the record of past presidential elections in America, the eras of important events in American history coincide with them.
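The memory-plus-truth-table mechanism described above can be sketched in a few lines. The sketch below is an illustrative toy, not the paper's exact model: opinions are binary, each agent keeps its last three observations as a 3-bit memory, and a truth table indexed like an elementary-CA rule number maps memory to the agent's next opinion. The ring topology and the choice of rule 232 (3-bit majority) are assumptions made for the demo.

```python
import random

def make_rule(rule_number):
    """Truth table over the 8 possible 3-bit memories (indexed like an elementary CA rule)."""
    return {m: (rule_number >> m) & 1 for m in range(8)}

def step(opinions, memories, rule, neighbors):
    new_opinions = []
    for i, mem in enumerate(memories):
        # observe the majority opinion among this agent's neighbors (ties count as 1)
        obs = int(2 * sum(opinions[j] for j in neighbors[i]) >= len(neighbors[i]))
        memories[i] = ((mem << 1) | obs) & 0b111  # keep only the most recent three bits
        new_opinions.append(rule[memories[i]])
    return new_opinions

random.seed(1)
n = 20
neighbors = [[(i - 1) % n, (i + 1) % n] for i in range(n)]  # ring; rewire links for a small world
opinions = [random.randint(0, 1) for _ in range(n)]
memories = [random.randint(0, 7) for _ in range(n)]
rule = make_rule(232)  # rule 232 is the 3-bit majority truth table
for _ in range(30):
    opinions = step(opinions, memories, rule, neighbors)
```

Swapping `make_rule(232)` for any other rule number in 0..255 changes the "personal mechanism" while keeping the memory structure fixed.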
NASA Astrophysics Data System (ADS)
Fuchs, Christopher A.; Schack, Rüdiger
2013-10-01
In the quantum-Bayesian interpretation of quantum theory (or QBism), the Born rule cannot be interpreted as a rule for setting measurement-outcome probabilities from an objective quantum state. But if not, what is the role of the rule? In this paper, it is argued that the rule should be seen as an empirical addition to Bayesian reasoning itself. In particular, it is shown how to view the Born rule as a normative rule in addition to the usual Dutch-book coherence. It is a rule that takes into account how one should assign probabilities to the consequences of various intended measurements on a physical system, but explicitly in terms of prior probabilities for, and conditional probabilities consequent upon, the imagined outcomes of a special counterfactual reference measurement. This interpretation is exemplified by representing quantum states in terms of probabilities for the outcomes of a fixed, fiducial symmetric informationally complete measurement. The extent to which the general form of the new normative rule implies the full state-space structure of quantum mechanics is explored.
Molnets: An Artificial Chemistry Based on Neural Networks
NASA Technical Reports Server (NTRS)
Colombano, Silvano; Luk, Johnny; Segovia-Juarez, Jose L.; Lohn, Jason; Clancy, Daniel (Technical Monitor)
2002-01-01
The fundamental problem in the evolution of matter is to understand how structure-function relationships are formed and increase in complexity from the molecular level all the way up to a genetic system. We have created a system in which structure-function relationships arise naturally, without the need for ad hoc function assignments to given structures. The idea was inspired by neural networks, where the structure of the net embodies specific computational properties. In this system, networks interact with other networks to create connections between the inputs of one net and the outputs of another. The newly created net then recomputes its own synaptic weights based on anti-Hebbian rules. As a result, some connections may be cut, and multiple nets can emerge as products of a 'reaction'. The idea is to study emergent reaction behaviors based on simple rules that constitute a pseudophysics of the system. These simple rules are parameterized to produce behaviors that emulate chemical reactions, and we find that they lead to a gradual increase in the size and complexity of molecules. We have been building a virtual artificial-chemistry laboratory for discovering interesting reactions and for testing further ideas on the evolution of primitive molecules, including the potential effect of membranes and of selective diffusion according to molecular size.
ERIC Educational Resources Information Center
Mitchell, Paul; Kemp, Nenagh; Bryant, Peter
2011-01-01
The purpose of this research was to examine whether adults rely on morphemic spelling rules or word-specific knowledge when spelling simple words. We examined adults' knowledge of two of the simplest and most reliable rules in English spelling concerning the morphological word ending -s. This spelling is required for regular plural nouns (e.g.,…
Some properties of the two-body effective interaction in the /sup 208/Pb region
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groleau, R.
The (³He,d) and (⁴He,t) single-proton transfer reactions on ²⁰⁸Pb and ²⁰⁹Bi were studied using 30 and 40 MeV He beams from the Princeton Cyclotron Laboratory. The outgoing d and t were detected by a position-sensitive proportional counter in the focal plane of a Q-3D spectrometer. The resolution varied between 10 and 14 keV (FWHM). Using the ratio of the cross-sections for the (³He,d) and (⁴He,t) reactions to determine the magnitude of the angular momentum transfers, the spectroscopic factors for the reaction on ²⁰⁹Bi have been measured relative to the transitions to the single-particle states in these reactions on ²⁰⁸Pb. Sum rules as developed by Bansal and French are used to study the configurations |h9/2 × h9/2>, |h9/2 × f7/2>, |h9/2 × i13/2>, |h9/2 × f5/2>, and part of |h9/2 × p3/2> and |h9/2 × p1/2>. Using the linear energy-weighted sum rule, the diagonal matrix elements of the effective interaction between valence protons around the ²⁰⁸Pb core are deduced. The matrix elements obtained from a simple empirical interaction V_I^(T=1) of a pure Wigner type are compared to the extracted matrix elements. The interaction is characterized by an attractive short-range (0.82 fm) and a repulsive long-range (8.2 fm) potential: V_I^(T=1)(r) [MeV] = -96 exp[-(r/0.82)²] + 0.51 exp[-(r/8.2)²]. The core polarization is studied using the experimental static electric quadrupole and magnetic dipole moments of the nuclei in the ²⁰⁸Pb region. In general, the magnetic moments of multiple-valence-nucleon nuclei are well predicted by simple rules of Racah algebra. The three- and four-valence-proton spectra (²¹¹At and ²¹²Rn) calculated with the experimental two-particle matrix elements agree well with the experimental spectra.
Using Paperclips to Explain Empirical Formulas to Students
ERIC Educational Resources Information Center
Nassiff, Peter; Czerwinski, Wendy A.
2014-01-01
Early in their chemistry education, students learn to do empirical formula calculations by rote without an understanding of the historical context behind them or the reason why their calculations work. In these activities, students use paperclip "atoms", construct a series of simple compounds representing real molecules, and discover,…
NASA Astrophysics Data System (ADS)
Colaiori, Francesca; Castellano, Claudio; Cuskley, Christine F.; Loreto, Vittorio; Pugliese, Martina; Tria, Francesca
2015-01-01
Empirical evidence shows that the rate of irregular usage of English verbs exhibits discontinuity as a function of their frequency: the most frequent verbs tend to be totally irregular. We aim to qualitatively understand the origin of this feature by studying simple agent-based models of language dynamics, where each agent adopts an inflectional state for a verb and may change it upon interaction with other agents. At the same time, agents are replaced at some rate by new agents adopting the regular form. In models with only two inflectional states (regular and irregular), we observe that either all verbs regularize irrespective of their frequency, or a continuous transition occurs between a low-frequency state, where the lemma becomes fully regular, and a high-frequency one, where both forms coexist. Introducing a third (mixed) state, wherein agents may use either form, we find that a third, qualitatively different behavior may emerge, namely, a discontinuous transition in frequency. We introduce and solve analytically a very general class of three-state models that allows us to fully understand these behaviors in a unified framework. Realistic sets of interaction rules, including the well-known naming game (NG) model, result in a discontinuous transition, in agreement with recent empirical findings. We also point out that the distinction between speaker and hearer in the interaction has no effect on the collective behavior. The results for the general three-state model, although discussed in terms of language dynamics, are widely applicable.
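A minimal toy version of such agent-based dynamics can be sketched as below. This is a deliberate simplification with only two inflectional states and made-up parameters, not the authors' three-state model: agents imitate one another at a rate standing in for verb frequency, while turnover steadily injects regular-form newcomers.

```python
import random

def simulate(n=200, freq=0.1, replace_rate=0.01, steps=20000, seed=0):
    """Toy two-state model: 1 = irregular form, 0 = regular form.
    Each step, with probability `freq` (standing in for verb usage frequency),
    a random hearer copies a random speaker's form; independently, with
    probability `replace_rate` one agent is replaced by a regular-form newcomer."""
    rng = random.Random(seed)
    state = [1] * n  # the population starts fully irregular
    for _ in range(steps):
        if rng.random() < freq:
            speaker, hearer = rng.randrange(n), rng.randrange(n)
            state[hearer] = state[speaker]  # hearer aligns with speaker
        if rng.random() < replace_rate:
            state[rng.randrange(n)] = 0  # turnover pushes toward the regular form
    return sum(state) / n  # final fraction of irregular users

irregular_fraction = simulate(freq=0.5, replace_rate=0.01)
```

Adding a third, mixed state (agents willing to use either form) is what, per the abstract, turns the smooth frequency dependence into a discontinuous one.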
Adaptive group coordination and role differentiation.
Roberts, Michael E; Goldstone, Robert L
2011-01-01
Many real world situations (potluck dinners, academic departments, sports teams, corporate divisions, committees, seminar classes, etc.) involve actors adjusting their contributions in order to achieve a mutually satisfactory group goal, a win-win result. However, the majority of human group research has involved situations where groups perform poorly because task constraints promote either individual maximization behavior or diffusion of responsibility, and even successful tasks generally involve the propagation of one correct solution through a group. Here we introduce a group task that requires complementary actions among participants in order to reach a shared goal. Without communication, group members submit numbers in an attempt to collectively sum to a randomly selected target number. After receiving group feedback, members adjust their submitted numbers until the target number is reached. For all groups, performance improves with task experience, and group reactivity decreases over rounds. Our empirical results provide evidence for adaptive coordination in human groups, and as the coordination costs increase with group size, large groups adapt through spontaneous role differentiation and self-consistency among members. We suggest several agent-based models with different rules for agent reactions, and we show that the empirical results are best fit by a flexible, adaptive agent strategy in which agents decrease their reactions when the group feedback changes. The task offers a simple experimental platform for studying the general problem of group coordination while maximizing group returns, and we distinguish the task from several games in behavioral game theory.
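One plausible reading of the best-fit strategy above (agents decreasing their reactions when the group feedback changes) can be sketched as follows; the initial contributions, reaction values, and halving factor are arbitrary assumptions chosen for illustration.

```python
def coordinate(n_agents=5, target=17.0, rounds=40):
    """Agents repeatedly submit numbers, trying to sum to `target`,
    guided only by the shared feedback signal (target minus group sum).
    Adaptive rule: an agent halves its reaction whenever the feedback
    changes sign, i.e. whenever the group has overshot."""
    contributions = [1.0] * n_agents
    reactions = [1.5] * n_agents  # deliberately over-reactive at the start
    prev_feedback = None
    errors = []
    for _ in range(rounds):
        feedback = target - sum(contributions)  # the only information agents share
        errors.append(abs(feedback))
        for i in range(n_agents):
            if prev_feedback is not None and feedback * prev_feedback < 0:
                reactions[i] *= 0.5  # overshoot detected: become less reactive
            contributions[i] += reactions[i] * feedback / n_agents
        prev_feedback = feedback
    return errors

errors = coordinate()
```

With the over-reactive start, the group overshoots once, damps its reactions, and then converges; a fixed high reaction would keep oscillating.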
Spin-dependent sum rules connecting real and virtual Compton scattering verified
NASA Astrophysics Data System (ADS)
Lensky, Vadim; Pascalutsa, Vladimir; Vanderhaeghen, Marc; Kao, Chung Wen
2017-04-01
We present a detailed derivation of the two sum rules relating the spin polarizabilities measured in real, virtual, and doubly virtual Compton scattering. For example, the polarizability δ_LT, accessed in inclusive electron scattering, is related to the spin polarizability γ_E1E1 and the slope of the generalized polarizabilities P^(M1,M1)1 - P^(L1,L1)1, measured in, respectively, real and virtual Compton scattering. We verify these sum rules in different variants of chiral perturbation theory, discuss their empirical verification for the proton, and consider prospects for their use in studies of the nucleon spin structure.
Olympic Scoring of English Compositions
ERIC Educational Resources Information Center
Follman, John; Panther, Edward
1974-01-01
Examines empirically the efficacy of utilizing Olympic diving and gymnastic scoring systems for grading graduate students' English compositions. Results indicated that such scoring rules do not produce ratings different in reliability or in level from conventional letter grades. (ED)
A rule-based software test data generator
NASA Technical Reports Server (NTRS)
Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II
1991-01-01
Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests show that even the primitive rule-based test data generation prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.
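The contrast between rule-based and random generation can be illustrated with a toy generator. This is not the Ada prototype from the paper; it assumes a hypothetical rule set of standard boundary-value heuristics applied to a declared integer range.

```python
import random

def rule_based_cases(spec):
    """Each 'rule' proposes values from a parameter's declared (lo, hi) range:
    both boundaries, the just-outside values, and zero when it lies in range."""
    lo, hi = spec
    cases = {lo, hi, lo - 1, hi + 1}
    if lo <= 0 <= hi:
        cases.add(0)
    return sorted(cases)

def random_cases(spec, n=5, seed=0):
    """Baseline: draw n values uniformly from the declared range."""
    lo, hi = spec
    rng = random.Random(seed)
    return [rng.randint(lo, hi) for _ in range(n)]

boundary_tests = rule_based_cases((1, 10))  # hits 0, 1, 10, 11
random_tests = random_cases((1, 10))
```

The rule-based cases deterministically cover the boundary and off-by-one inputs that random sampling reaches only by chance, which is the intuition behind the coverage advantage reported above.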
A SIMPLE CELLULAR AUTOMATON MODEL FOR HIGH-LEVEL VEGETATION DYNAMICS
We have produced a simple two-dimensional (ground-plan) cellular automata model of vegetation dynamics specifically to investigate high-level community processes. The model is probabilistic, with individual plant behavior determined by physiologically-based rules derived from a w...
Beyond Molecular Codes: Simple Rules to Wire Complex Brains
Hassan, Bassem A.; Hiesinger, P. Robin
2015-01-01
Molecular codes, like postal zip codes, are generally considered a robust way to ensure the specificity of neuronal target selection. However, a code capable of unambiguously generating complex neural circuits is difficult to conceive. Here, we re-examine the notion of molecular codes in the light of developmental algorithms. We explore how molecules and mechanisms that have been considered part of a code may alternatively implement simple pattern formation rules sufficient to ensure wiring specificity in neural circuits. This analysis delineates a pattern-based framework for circuit construction that may contribute to our understanding of brain wiring. PMID:26451480
The Empirical Attitude, Material Practice and Design Activities
ERIC Educational Resources Information Center
Apedoe, Xornam; Ford, Michael
2010-01-01
This article is an argument about something that is both important and severely underemphasized in most current science curricula. The empirical attitude, fundamental to science since Galileo, is a habit of mind that motivates an active search for feedback on our ideas from the material world. Although more simple views of science manifest the…
Making the Cut: Lattice Kirigami Rules
NASA Astrophysics Data System (ADS)
Castle, Toen; Cho, Yigil; Gong, Xingting; Jung, Euiyeon; Sussman, Daniel M.; Yang, Shu; Kamien, Randall D.
2014-12-01
In this Letter we explore and develop a simple set of rules that apply to cutting, pasting, and folding honeycomb lattices. We consider origami-like structures that are extrinsically flat away from zero-dimensional sources of Gaussian curvature and one-dimensional sources of mean curvature, and our cutting and pasting rules maintain the intrinsic bond lengths on both the lattice and its dual lattice. We find that a small set of rules is allowed providing a framework for exploring and building kirigami—folding, cutting, and pasting the edges of paper.
Bureaucracy, Safety and Software: a Potentially Lethal Cocktail
NASA Astrophysics Data System (ADS)
Hatton, Les
This position paper identifies a potential problem with the evolution of software-controlled safety-critical systems. It observes that the rapid growth of bureaucracy in society quickly spills over into rules for behaviour. In many cases it is unclear whether the need for a rule comes first or a bureaucrat simply anticipates the need for one. Many such rules impose draconian restrictions and often make the existing situation worse through unintended consequences, as will be shown with a number of examples.
Semi-empirical master curve concept describing the rate capability of lithium insertion electrodes
NASA Astrophysics Data System (ADS)
Heubner, C.; Seeba, J.; Liebmann, T.; Nickol, A.; Börner, S.; Fritsch, M.; Nikolowski, K.; Wolter, M.; Schneider, M.; Michaelis, A.
2018-03-01
A simple semi-empirical master curve concept describing the rate capability of porous insertion electrodes for lithium-ion batteries is proposed. The model is based on evaluating the time constants of lithium diffusion in the liquid electrolyte and in the solid active material. This theoretical approach is successfully verified by comprehensive experimental investigations of the rate capability of a large number of porous insertion electrodes with various active materials and design parameters. It turns out that the rate capability of all investigated electrodes follows a simple master curve governed by the time constant of the rate-limiting process. We demonstrate that the master curve concept can be used to determine optimum design criteria meeting specific requirements in terms of maximum gravimetric capacity for a desired rate capability. The model further reveals practical limits of electrode design, confirming the empirically well-known and inevitable trade-off between energy and power density.
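The underlying time-constant comparison can be sketched as follows. The diffusion-time formula tau = L²/D and the numerical values below are illustrative assumptions, not the paper's fitted parameters.

```python
def time_constants(electrode_thickness, d_eff_electrolyte, particle_radius, d_solid):
    """Characteristic diffusion times tau = L^2 / D for the liquid and solid
    transport paths; the larger of the two limits the rate capability."""
    tau_liquid = electrode_thickness ** 2 / d_eff_electrolyte
    tau_solid = particle_radius ** 2 / d_solid
    return tau_liquid, tau_solid

def limiting_c_rate(tau_liquid, tau_solid):
    """Rough estimate: capacity stays accessible while the charge time exceeds
    the slowest diffusion time (3600 s corresponds to a 1C rate)."""
    return 3600.0 / max(tau_liquid, tau_solid)

# hypothetical electrode: 70 um thick, 5 um particles, SI units throughout
tau_l, tau_s = time_constants(70e-6, 2e-11, 5e-6, 1e-14)
c_max = limiting_c_rate(tau_l, tau_s)  # here solid diffusion limits the rate
```

Thinning the electrode shrinks tau_liquid quadratically, which is why electrode thickness is the dominant design lever on the liquid-limited branch of such a master curve.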
Influence of dispatching rules on average production lead time for multi-stage production systems.
Hübl, Alexander; Jodlbauer, Herbert; Altendorfer, Klaus
2013-08-01
In this paper the influence of different dispatching rules on the average production lead time is investigated. Two theorems based on the covariance between processing time and production lead time are formulated and proved theoretically. Theorem 1 analytically links the average production lead time to the "processing time weighted production lead time" for multi-stage production systems. The influence of different dispatching rules on average lead time, well known from simulation and empirical studies, is proved theoretically in Theorem 2 for a single-stage production system. A simulation study is conducted to gain more insight into the influence of dispatching rules on average production lead time in a multi-stage production system. We find that the "processing time weighted average production lead time" for a multi-stage production system is not invariant under the applied dispatching rule, whereas for single-stage production systems it can be used as a dispatching-rule-independent indicator.
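The single-stage effect addressed by Theorem 2 is easy to reproduce numerically. The sketch below compares FIFO with shortest-processing-time (SPT) dispatching on one machine with all jobs available at time zero; the job set is made up for illustration and this is not the paper's multi-stage model.

```python
def average_lead_time(processing_times, rule):
    """Single machine, all jobs released at time zero, so a job's lead time
    equals its completion time. FIFO keeps arrival order; SPT serves the
    shortest job first, which minimizes the mean flow time."""
    queue = sorted(processing_times) if rule == "SPT" else list(processing_times)
    t = 0.0
    total = 0.0
    for p in queue:
        t += p          # machine finishes this job at time t
        total += t      # accumulate its lead time
    return total / len(queue)

jobs = [8, 1, 5, 2, 9, 3]  # made-up processing times
fifo = average_lead_time(jobs, "FIFO")
spt = average_lead_time(jobs, "SPT")
```

SPT front-loads short jobs, so fewer jobs wait behind long ones, lowering the average lead time relative to FIFO.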
Simple methods of exploiting the underlying structure of rule-based systems
NASA Technical Reports Server (NTRS)
Hendler, James
1986-01-01
Much recent work in the field of expert systems research has aimed at exploiting the underlying structure of the rule base for purposes of analysis. Techniques such as Petri nets and DAGs have been proposed as representational structures that allow complete analysis. Much has been made of proving isomorphisms between the rule bases and these mechanisms, and of examining the theoretical power of such analysis. In this paper we describe some early work on a new system with much simpler (and thus, one hopes, more easily achieved) aims and less formality. The technique being examined is a very simple one: OPS5 programs are analyzed in a purely syntactic way and a finite-state automaton (FSA) description is generated. We describe the technique and some user interface tools that exploit this structure.
Hierarchy of Certain Types of DNA Splicing Systems
NASA Astrophysics Data System (ADS)
Yusof, Yuhani; Sarmin, Nor Haniza; Goode, T. Elizabeth; Mahmud, Mazri; Heng, Fong Wan
A Head splicing system (H-system) consists of a finite set of strings (words) written over a finite alphabet, along with a finite set of rules that act on the strings by iterated cutting and pasting to create a splicing language. Any interpretation aligned with Tom Head's original idea is one in which the strings represent double-stranded deoxyribonucleic acid (dsDNA) and the rules represent the cutting and pasting actions of restriction enzymes and ligase, respectively. A new way of writing the rule sets is adopted so as to make the biological interpretation transparent. This approach is used in a formal language-theoretic analysis of the hierarchy of certain classes of splicing systems, namely simple, semi-simple and semi-null splicing systems. The relations between such systems and their associated languages are given as theorems, corollaries and counterexamples.
A New Approach for Resolving Conflicts in Actionable Behavioral Rules
Zhu, Dan; Zeng, Daniel
2014-01-01
Knowledge is considered actionable if users can take direct actions based on it to their advantage. Among the most important and distinctive forms of actionable knowledge are actionable behavioral rules, which can directly and explicitly suggest specific actions to take to influence (restrain or encourage) behavior in the users' best interest. In mining such rules, however, it often occurs that different rules suggest the same actions with different expected utilities; we call these conflicting rules. To resolve the conflicts, a valid method was previously proposed, but inconsistency in the measure used for rule evaluation may hinder its performance. To overcome this problem, we develop a new method that uses a rule-ranking procedure as the basis for selecting the rule with the highest utility prediction accuracy. More specifically, we propose an integrative measure, combining the measures of support and antecedent length, to evaluate the utility prediction accuracies of conflicting rules. We also introduce a tunable weight parameter to allow flexibility in the integration. We conduct several experiments to test our proposed approach and to evaluate the sensitivity of the weight parameter. Empirical results indicate that our approach outperforms those from previous research. PMID:25162054
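The integrative measure is described only qualitatively above, so the sketch below shows one plausible combination, a weighted sum of normalized support and normalized antecedent length, applied to hypothetical rule data; it is not the authors' exact formula.

```python
def pick_rule(rules, weight=0.5):
    """Resolve conflicting rules (rules suggesting the same action) with an
    integrative score: a weighted sum of normalized support and normalized
    antecedent length. `weight` tunes the balance between the two measures."""
    max_support = max(r["support"] for r in rules)
    max_length = max(len(r["antecedent"]) for r in rules)

    def score(r):
        return (weight * r["support"] / max_support
                + (1 - weight) * len(r["antecedent"]) / max_length)

    return max(rules, key=score)

conflicting = [  # hypothetical conflicting rules for the same suggested action
    {"name": "r1", "support": 40, "antecedent": ("age>30",)},
    {"name": "r2", "support": 25, "antecedent": ("age>30", "usage=high")},
]
best = pick_rule(conflicting, weight=0.5)  # longer antecedent wins at this weight
```

Sliding `weight` toward 1.0 favors high-support rules, which is the kind of sensitivity the paper's experiments on the weight parameter examine.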
26 CFR 1.652(a)-1 - Simple trusts; inclusion of amounts in income of beneficiaries.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 26 Internal Revenue 8 2012-04-01 2012-04-01 false Simple trusts; inclusion of amounts in income of beneficiaries. 1.652(a)-1 Section 1.652(a)-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE... Only § 1.652(a)-1 Simple trusts; inclusion of amounts in income of beneficiaries. Subject to the rules...
The Aromaticity of Pericyclic Reaction Transition States
ERIC Educational Resources Information Center
Rzepa, Henry S.
2007-01-01
An approach is presented that starts from two fundamental concepts in organic chemistry, chirality and aromaticity, and combines them into a simple rule for stating selection rules for pericyclic reactions in terms of achiral Hückel-aromatic and chiral Möbius-aromatic transition states. This is illustrated using an example that leads to apparent…
Learning Non-Adjacent Regularities at Age 0;7
ERIC Educational Resources Information Center
Gervain, Judit; Werker, Janet F.
2013-01-01
One important mechanism suggested to underlie the acquisition of grammar is rule learning. Indeed, infants aged 0;7 are able to learn rules based on simple identity relations (adjacent repetitions, ABB: "wo fe fe", and non-adjacent repetitions, ABA: "wo fe wo", respectively; Marcus et al., 1999). One unexplored issue is…
Load Capacity Estimation of Foil Air Journal Bearings for Oil-Free Turbomachinery Applications
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher; Valco, Mark J.
2000-01-01
This paper introduces a simple "Rule of Thumb" (ROT) method to estimate the load capacity of foil air journal bearings, which are self-acting compliant-surface hydrodynamic bearings being considered for Oil-Free turbomachinery applications such as gas turbine engines. The ROT is based on first principles and data available in the literature, and it relates bearing load capacity to bearing size and speed through an empirically based load capacity coefficient, D. It is shown that load capacity is a linear function of bearing surface velocity and bearing projected area. Furthermore, it was found that the load capacity coefficient, D, is related to the design features of the bearing's compliant members and to operating conditions (speed and ambient temperature). Early bearing designs with basic or "first generation" compliant support elements have relatively low load capacity. More advanced bearings, in which the compliance of the support structure is tailored, have load capacities up to five times those of simpler designs. The ROT enables simplified load capacity estimation for foil air journal bearings and can guide the development of new Oil-Free turbomachinery systems.
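The "linear in surface velocity and projected area" relationship above can be exercised in a few lines. The sketch assumes the commonly quoted form W = D·(L·Dia)·(Dia·N); the coefficient values are illustrative only, using the abstract's statement that advanced designs reach about five times the capacity of first-generation ones.

```python
def load_capacity(d_coef, length_in, diameter_in, speed_krpm):
    """Rule-of-thumb load capacity of a foil air journal bearing:
    W = D * (L * Dia) * (Dia * N)  [lbf]
    with the load capacity coefficient D in lbf/(in^3 * krpm), bearing
    length L and diameter Dia in inches, and shaft speed N in krpm."""
    return d_coef * (length_in * diameter_in) * (diameter_in * speed_krpm)

# illustrative only: a first-generation coefficient and an advanced design
# at about five times that value, per the abstract's comparison
w_gen1 = load_capacity(0.27, 1.5, 1.5, 30)
w_advanced = load_capacity(5 * 0.27, 1.5, 1.5, 30)
```

Because both projected area (L·Dia) and surface velocity (proportional to Dia·N) enter linearly, doubling the speed or the projected area doubles the estimated capacity.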
Modelling Of Flotation Processes By Classical Mathematical Methods - A Review
NASA Astrophysics Data System (ADS)
Jovanović, Ivana; Miljanović, Igor
2015-12-01
Flotation process modelling is not a simple task, mostly because of the complexity of the process, i.e. the presence of a large number of variables that (to a lesser or greater extent) affect the final outcome of separating mineral particles based on differences in their surface properties. Attempts toward developing a quantitative predictive model that would fully describe the operation of an industrial flotation plant started in the middle of the past century and continue to this day. This paper reviews published research directed toward the development of flotation models based on classical mathematical rules. The description and systematization of classical flotation models were performed according to the available references, with emphasis given exclusively to flotation process modelling, regardless of a model's application in a particular control system. In accordance with contemporary considerations, the models were classified as empirical, probabilistic, kinetic and population-balance types. Each model type is presented through the aspects of flotation modelling at the macro and micro process levels.
Network-based model of the growth of termite nests
NASA Astrophysics Data System (ADS)
Eom, Young-Ho; Perna, Andrea; Fortunato, Santo; Darrouzet, Eric; Theraulaz, Guy; Jost, Christian
2015-12-01
We present a model for the growth of the transportation network inside nests of the social insect subfamily Termitinae (Isoptera, Termitidae). These nests consist of large chambers (nodes) connected by tunnels (edges). The model, based on an empirical analysis of real nest networks combined with pruning (edge removal, either at random or weighted by betweenness centrality) and a memory effect (preferential growth from the latest added chambers), successfully predicts emergent nest properties (degree distribution, size of the largest connected component, average path lengths, backbone link ratios, and local graph redundancy). The two pruning alternatives can be associated with different genera in the subfamily. A sensitivity analysis on the pruning and memory parameters indicates that Termitinae networks favor fast internal transportation over efficient defense strategies against ant predators. Our results provide an example of how complex network organization and efficient network properties can be generated from simple building rules based on local interactions, and they contribute to our understanding of the mechanisms at play in the formation of termite networks and of biological transportation networks in general.
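The two building rules can be caricatured in a few lines. This is a deliberately minimal sketch of memory-biased growth plus random pruning; the parameters, attachment scheme, and pruning scheme are illustrative assumptions, not the authors' fitted model:

```python
import random

def grow_nest(n_chambers, memory=3, prune_prob=0.15, seed=0):
    """Toy nest-network growth: each new chamber (node) tunnels (edge) to one
    of the `memory` most recently added chambers (the memory effect), and an
    existing tunnel is occasionally removed at random (the pruning step)."""
    rng = random.Random(seed)
    nodes = [0]
    edges = set()
    for new in range(1, n_chambers):
        target = rng.choice(nodes[-memory:])  # preferential growth from latest chambers
        nodes.append(new)
        edges.add((target, new))
        if len(edges) > 1 and rng.random() < prune_prob:
            edges.discard(rng.choice(sorted(edges)))  # random-pruning variant
    return nodes, edges

nodes, edges = grow_nest(50)
```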
Zhang, Ziyu; Yuan, Lang; Lee, Peter D; Jones, Eric; Jones, Julian R
2014-11-01
Bone augmentation implants are porous to allow cellular growth, bone formation and fixation. However, the design of the pores is currently based on simple empirical rules, such as minimum pore and interconnect sizes. We present a three-dimensional (3D) transient model of cellular growth based on the Navier-Stokes equations that simulates the body fluid flow and the stimulation of bone precursor cellular growth, attachment, and proliferation as a function of local flow shear stress. The model's effectiveness is demonstrated for two additive manufactured (AM) titanium scaffold architectures. The results demonstrate that there is a complex interaction of flow rate and strut architecture, with partially randomized structures having a preferential impact on stimulating cell migration in 3D porous structures at higher flow rates. This novel result demonstrates the potential new insights that can be gained via the modeling tool developed, and how the model can be used to perform what-if simulations to design AM structures to specific functional requirements. © 2014 Wiley Periodicals, Inc.
Structural and configurational properties of nanoconfined monolayer ice from first principles
Corsetti, Fabiano; Matthews, Paul; Artacho, Emilio
2016-01-01
Understanding the structural tendencies of nanoconfined water is of great interest for nanoscience and biology, where nano/micro-sized objects may be separated by very few layers of water. Here we investigate the properties of ice confined to a quasi-2D monolayer by a featureless, chemically neutral potential, in order to characterize its intrinsic behaviour. We use density-functional theory simulations with a non-local van der Waals density functional. An ab initio random structure search reveals all the energetically competitive monolayer configurations to belong to only two of the previously-identified families, characterized by a square or honeycomb hydrogen-bonding network, respectively. We discuss the modified ice rules needed for each network, and propose a simple point dipole 2D lattice model that successfully explains the energetics of the square configurations. All identified stable phases for both networks are found to be non-polar (but with a topologically non-trivial texture for the square) and, hence, non-ferroelectric, in contrast to previous predictions from a five-site empirical force-field model. Our results are in good agreement with very recently reported experimental observations. PMID:26728125
Dependence of two-proton radioactivity on nuclear pairing models
NASA Astrophysics Data System (ADS)
Oishi, Tomohiro; Kortelainen, Markus; Pastore, Alessandro
2017-10-01
The sensitivity of two-proton emitting decay to the nuclear pairing correlation is discussed within a time-dependent three-body model. We focus on the 6Be nucleus assuming an α + p + p configuration, and its decay process is described as the time evolution of a three-body resonance state. For the proton-proton subsystem, a schematic density-dependent contact (SDDC) pairing model is employed. From the time-dependent calculation, we observe the exponential decay rule of two-proton emission. It is shown that the density dependence does not play a major role in determining the decay width, which is controlled only by the asymptotic strength of the pairing interaction. This asymptotic pairing sensitivity can be understood in terms of the dynamics of the wave function driven by the three-body Hamiltonian, by monitoring the time-dependent density distribution. With this simple SDDC pairing model, there remains an impossible-trinity problem: the model cannot simultaneously reproduce the empirical Q value, the decay width, and the nucleon-nucleon scattering length. This problem suggests that further sophistication of the theoretical pairing model is necessary, utilizing the two-proton radioactivity data as reference quantities.
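The "exponential decay rule" referred to above ties the survival probability of the resonance to the decay width via N(t) = exp(−Γt/ħ). A minimal sketch of reading Γ off two samples of such a time-dependent run (the sampled numbers are illustrative, not values from the paper):

```python
import math

HBAR_MEV_S = 6.582119569e-22  # reduced Planck constant in MeV*s

def decay_width(t1, p1, t2, p2):
    """Decay width (MeV) from two samples of the survival probability,
    assuming the exponential decay rule N(t) = exp(-Gamma * t / hbar)."""
    return HBAR_MEV_S * (math.log(p1) - math.log(p2)) / (t2 - t1)

# a state with Gamma = 0.05 MeV, sampled at t = 0 and t = 1e-21 s:
p2 = math.exp(-0.05 * 1e-21 / HBAR_MEV_S)
```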
Empirical Observations on the Sensitivity of Hot Cathode Ionization Type Vacuum Gages
NASA Technical Reports Server (NTRS)
Summers, R. L.
1969-01-01
A study of empirical methods of predicting the relative sensitivities of hot cathode ionization gages is presented. Using previously published gage sensitivities, several rules for predicting relative sensitivity are tested. The relative sensitivity to different gases is shown to be invariant with gage type, in the linear range of gage operation. The total ionization cross section, molecular and molar polarizability, and refractive index are demonstrated to be useful parameters for predicting relative gage sensitivity. Using data from the literature, the probable error of predictions of relative gage sensitivity based on these molecular properties is found to be about 10 percent. A comprehensive table of predicted relative sensitivities, based on empirical methods, is presented.
Children's Criteria for Representational Adequacy in the Perception of Simple Sonic Stimuli
ERIC Educational Resources Information Center
Verschaffel, Lieven; Reybrouck, Mark; Jans, Christine; Van Dooren, Wim
2010-01-01
This study investigates children's metarepresentational competence with regard to listening to and making sense of simple sonic stimuli. Using diSessa's (2003) work on metarepresentational competence in mathematics and sciences as theoretical and empirical background, it aims to assess children's criteria for representational adequacy of graphical…
A Simple Estimation Method for Aggregate Government Outsourcing
ERIC Educational Resources Information Center
Minicucci, Stephen; Donahue, John D.
2004-01-01
The scholarly and popular debate on the delegation to the private sector of governmental tasks rests on an inadequate empirical foundation, as no systematic data are collected on direct versus indirect service delivery. We offer a simple method for approximating levels of service outsourcing, based on relatively straightforward combinations of and…
Using pattern enumeration to accelerate process development and ramp yield
NASA Astrophysics Data System (ADS)
Zhuang, Linda; Pang, Jenny; Xu, Jessy; Tsai, Mengfeng; Wang, Amy; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua
2016-03-01
During the setup phase of a new technology node, foundries do not initially have enough product chip designs to conduct exhaustive process development. Different operational teams use manually designed, simple test keys to set up their process flows and recipes. When the very first version of the design rule manual (DRM) is ready, foundries enter the process development phase, in which new experimental design data is created manually based on these design rules. However, these IP/test keys contain very uniform or simple design structures. Such designs normally do not contain the critical design structures or process-unfriendly design patterns that pass design rule checks but are found to be less manufacturable. A method is therefore desired for generating, at the development stage, exhaustive test patterns allowed by the design rules, in order to verify the gap between design rules and process capability. This paper presents a novel method for generating test key patterns that contain known problematic patterns as well as any constructs designers could possibly draw under the current design rules. The enumerated test key patterns contain the most critical design structures allowed by any particular design rule. A layout profiling method is used to analyze new incoming product chips in order to find potential weak points, so the fab can take preemptive action to avoid yield loss. This is achieved by comparing different products and leveraging knowledge learned from previously manufactured chips to find possible yield detractors.
Evaluation of rules to distinguish unique female grizzly bears with cubs in Yellowstone
Schwartz, C.C.; Haroldson, M.A.; Cherry, S.; Keating, K.A.
2008-01-01
The United States Fish and Wildlife Service uses counts of unduplicated female grizzly bears (Ursus arctos) with cubs-of-the-year to establish limits of sustainable mortality in the Greater Yellowstone Ecosystem, USA. Sightings are clustered into observations of unique bears based on an empirically derived rule set. The method has never been tested or verified. To evaluate the rule set, we used data from radiocollared females obtained during 1975-2004 to simulate populations under varying densities, distributions, and sighting frequencies. We tested individual rules and rule-set performance, using custom software to apply the rule set and cluster sightings. Results indicated most rules were violated to some degree, and rule-based clustering consistently underestimated the minimum number of females and total population size derived from a nonparametric estimator (Chao2). We conclude that the current rule set returns conservative estimates, but with minor improvements, counts of unduplicated females-with-cubs can serve as a reasonable index of population size useful for establishing annual mortality limits. For the Yellowstone population, the index is more practical and cost-effective than capture-mark-recapture using either DNA hair snagging or aerial surveys with radiomarked bears. The method has useful application in other ecosystems, but we recommend that rules used to distinguish unique females be adapted to local conditions and tested.
Some Myths You May Have Heard about First Language Acquisition.
ERIC Educational Resources Information Center
Gathercole, Virginia C.
1988-01-01
Reviews research and empirical evidence to refute three first language acquisition myths: (1) comprehension precedes production; (2) children acquire language in a systematic, rule-governed way; and (3) the impetus behind first language acquisition is communicative need. (Author/CB)
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher
2010-01-01
Foil gas bearings are a key technology in many commercial and emerging Oil-Free turbomachinery systems. These bearings are non-linear and have been difficult to analytically model in terms of performance characteristics such as load capacity, power loss, stiffness and damping. Previous investigations led to an empirically derived method, a rule-of-thumb, to estimate load capacity. This method has been a valuable tool in system development. The current paper extends this tool concept to include rules for stiffness and damping coefficient estimation. It is expected that these rules will further accelerate the development and deployment of advanced Oil-Free machines operating on foil gas bearings.
Mining Distance Based Outliers in Near Linear Time with Randomization and a Simple Pruning Rule
NASA Technical Reports Server (NTRS)
Bay, Stephen D.; Schwabacher, Mark
2003-01-01
Defining outliers by their distance to neighboring examples is a popular approach to finding unusual examples in a data set. Recently, much work has been conducted with the goal of finding fast algorithms for this task. We show that a simple nested loop algorithm that in the worst case is quadratic can give near linear time performance when the data is in random order and a simple pruning rule is used. We test our algorithm on real high-dimensional data sets with millions of examples and show that the near linear scaling holds over several orders of magnitude. Our average case analysis suggests that much of the efficiency is because the time to process non-outliers, which are the majority of examples, does not depend on the size of the data set.
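The algorithm is short enough to sketch. The following is a simplified rendering of the idea described above (scoring by distance to the k-th nearest neighbour, a randomized scan order, and early termination against the current cutoff); the heap bookkeeping and defaults are choices made here, not the paper's exact implementation:

```python
import heapq
import random

def top_outliers(points, n=1, k=2, seed=0):
    """Distance-based outlier search: score each point by the (squared)
    distance to its k-th nearest neighbour and keep the top n scores.
    The simple pruning rule: abandon a point as soon as its k-th-NN distance
    so far drops below the weakest score in the current top n. Randomizing
    the scan order is what yields the near-linear average-case behaviour."""
    data = list(points)
    random.Random(seed).shuffle(data)
    top = []                  # min-heap of (score, point); top[0] is the weakest kept outlier
    cutoff = 0.0
    for i, p in enumerate(data):
        knn = []              # max-heap (negated) holding the k smallest distances so far
        pruned = False
        for j, q in enumerate(data):
            if i == j:
                continue
            d = sum((a - b) ** 2 for a, b in zip(p, q))
            if len(knn) < k:
                heapq.heappush(knn, -d)
            elif d < -knn[0]:
                heapq.heapreplace(knn, -d)
            if len(knn) == k and -knn[0] < cutoff:
                pruned = True  # can no longer reach the top n: stop early
                break
        if not pruned:
            heapq.heappush(top, (-knn[0], p))
            if len(top) > n:
                heapq.heappop(top)
            if len(top) == n:
                cutoff = top[0][0]
    return sorted(top, reverse=True)

pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1), (10.0, 10.0)]
print(top_outliers(pts)[0][1])  # the isolated point is the top outlier
```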
26 CFR 1.652(a)-1 - Simple trusts; inclusion of amounts in income of beneficiaries.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 8 2010-04-01 2010-04-01 false Simple trusts; inclusion of amounts in income of beneficiaries. 1.652(a)-1 Section 1.652(a)-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE....652(a)-1 Simple trusts; inclusion of amounts in income of beneficiaries. Subject to the rules in §§ 1...
Zero-knowledge cooperation in dilemma games.
Huck, Steffen; Normann, Hans Theo; Oechssler, Jorg
2003-01-07
We consider a very simple adaptive rule that induces cooperative behavior in a large class of dilemma games. The rule has a Pavlovian flavor and can be described as win-continue, lose-reverse. It assumes no knowledge about the underlying structure of the environment (the "rules of the game") and requires very little cognitive effort. Both features make it an appealing candidate for explaining the emergence of cooperative behavior in non-human species. Copyright 2003 Elsevier Science Ltd.
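A sketch of the rule in a Cournot-style dilemma (the payoff function here is an illustrative choice, not taken from the paper): each player nudges its action by a fixed step, keeps the direction if its own payoff did not fall ("win-continue") and reverses it otherwise ("lose-reverse"). The symmetric dynamics settle near the joint-payoff-maximizing, cooperative output rather than the Nash equilibrium:

```python
def wclr_play(rounds=200, q=0.45, step=0.01):
    """Two win-continue/lose-reverse players in a duopoly with inverse demand
    1 - q1 - q2 (illustrative). Neither player knows the payoff function or
    the other's action; each only compares its own successive payoffs."""
    q1 = q2 = q
    d1 = d2 = step
    last1 = last2 = None
    for _ in range(rounds):
        q1 += d1
        q2 += d2
        price = max(0.0, 1.0 - q1 - q2)
        p1, p2 = q1 * price, q2 * price
        if last1 is not None and p1 < last1:
            d1 = -d1                      # lose -> reverse direction
        if last2 is not None and p2 < last2:
            d2 = -d2
        last1, last2 = p1, p2
    return q1, q2

q1, q2 = wclr_play()
```

In this setup the players end up oscillating around q = 0.25 each, the cooperative (joint-profit-maximizing) output, with total output well below the competitive Nash level of 2/3.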
Diagonalizing Tensor Covariants, Light-Cone Commutators, and Sum Rules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lo, C. Y.
We derive fixed-mass sum rules for virtual Compton scattering in the forward direction. We use the methods of both Dicus, Jackiw, and Teplitz (for the absorptive parts) and Heimann, Hey, and Mandula (for the real parts). We find a set of tensor covariants such that the corresponding scalar amplitudes are proportional to simple t-channel parity-conserving helicity amplitudes. We give a relatively complete discussion of the convergence of the sum rules in a Regge model.
Optimal Government Subsidies to Universities in the Face of Tuition and Enrollment Constraints
ERIC Educational Resources Information Center
Easton, Stephen T.; Rockerbie, Duane W.
2008-01-01
This paper develops a simple static model of an imperfectly competitive university operating under government-imposed constraints on the ability to raise tuition fees and increase enrollments. The model has particular applicability to Canadian universities. Assuming an average cost pricing rule, rules for adequate government subsidies (operating…
A Simple Derivation of Chemically Important Classical Observables and Superselection Rules.
ERIC Educational Resources Information Center
Muller-Herold, U.
1985-01-01
Explores the question "Why are so many stationary states allowed by traditional quantum mechanics not realized in nature?" through discussion of classical observables and superselection rules. Three examples are given that can be used in introductory courses (including the fermion/boson property and the mass of a "nonrelativistic" particle). (JN)
Children's Task-Switching Efficiency: Missing Our Cue?
ERIC Educational Resources Information Center
Holt, Anna E.; Deák, Gedeon
2015-01-01
In simple rule-switching tests, 3- and 4-year-olds can follow each of two sorting rules but sometimes make perseverative errors when switching. Older children make few errors but respond slowly when switching. These age-related changes might reflect the maturation of executive functions (e.g., inhibition). However, they might also reflect…
Eliciting Systematic Rule Use in Covariation Judgment [the Early Years].
ERIC Educational Resources Information Center
Shaklee, Harriet; Paszek, Donald
Related research suggests that children may show some simple understanding of event covariations by the early elementary school years. The present experiments use a rule analysis methodology to investigate covariation judgments of children in this age range. In Experiment 1, children in second, third and fourth grade judged covariations on 12…
When Simple Things Are Meaningful: Working Memory Strength Predicts Children's Cognitive Flexibility
ERIC Educational Resources Information Center
Blackwell, Katharine A.; Cepeda, Nicholas J.; Munakata, Yuko
2009-01-01
People often perseverate, repeating outdated behaviors despite correctly answering questions about rules they should be following. Children who perseverate are slower to respond to such questions than children who successfully switch to new rules, even after controlling for age and processing speed. Thus, switchers may have stronger working memory…
Context-Sensitive Rules and Word Naming in Italian Children
ERIC Educational Resources Information Center
Barca, Laura; Ellis, Andrew W.; Burani, Cristina
2007-01-01
The present study examines the role of orthographic complexity on Italian children's word reading. Two experiments are reported in which elementary school children (3rd and 5th graders) read aloud words containing simple or contextual letter-sound conversion rules. In Experiment 1, both groups of participants read words containing contextual rules…
An Evaluation of the Good Behavior Game in Kindergarten Classrooms
ERIC Educational Resources Information Center
Donaldson, Jeanne M.; Vollmer, Timothy R.; Krous, Tangala; Downs, Susan; Berard, Kerri P.
2011-01-01
The good behavior game (GBG) is a classwide group contingency that involves dividing the class into two teams, creating simple rules, and arranging contingencies for breaking or following those rules. Five kindergarten teachers and classrooms participated in this evaluation of the GBG. Disruptive behavior markedly decreased in all five classrooms…
Atomic clusters and atomic surfaces in icosahedral quasicrystals.
Quiquandon, Marianne; Portier, Richard; Gratias, Denis
2014-05-01
This paper presents the basic tools commonly used to describe the atomic structures of quasicrystals with a specific focus on the icosahedral phases. After a brief recall of the main properties of quasiperiodic objects, two simple physical rules are discussed that lead one to eventually obtain a surprisingly small number of atomic structures as ideal quasiperiodic models for real quasicrystals. This is due to the fact that the atomic surfaces (ASs) used to describe all known icosahedral phases are located on high-symmetry special points in six-dimensional space. The first rule is maximizing the density using simple polyhedral ASs that leads to two possible sets of ASs according to the value of the six-dimensional lattice parameter A between 0.63 and 0.79 nm. The second rule is maximizing the number of complete orbits of high symmetry to construct as large as possible atomic clusters similar to those observed in complex intermetallic structures and approximant phases. The practical use of these two rules together is demonstrated on two typical examples of icosahedral phases, i-AlMnSi and i-CdRE (RE = Gd, Ho, Tm).
Implementing a Commercial Rule Base as a Medication Order Safety Net
Reichley, Richard M.; Seaton, Terry L.; Resetar, Ervina; Micek, Scott T.; Scott, Karen L.; Fraser, Victoria J.; Dunagan, W. Claiborne; Bailey, Thomas C.
2005-01-01
A commercial rule base (Cerner Multum) was used to identify medication orders exceeding recommended dosage limits at five hospitals within BJC HealthCare, an integrated health care system. During initial testing, clinical pharmacists determined that there was an excessive number of nuisance and clinically insignificant alerts, with an overall alert rate of 9.2%. A method for customizing the commercial rule base was implemented to increase rule specificity for problematic rules. The system was subsequently deployed at two facilities and achieved alert rates of less than 1%. Pharmacists screened these alerts and contacted ordering physicians in 21% of cases. Physicians made therapeutic changes in response to 38% of alerts presented to them. By applying simple techniques to customize rules, commercial rule bases can be used to rapidly deploy a safety net to screen drug orders for excessive dosages, while preserving the rule architecture for later implementations of more finely tuned clinical decision support. PMID:15802481
Jia, Xiuqin; Liang, Peipeng; Shi, Lin; Wang, Defeng; Li, Kuncheng
2015-01-01
In neuroimaging studies, increased task complexity can lead to increased activation in task-specific regions or to activation of additional regions. How the brain adapts to increased rule complexity during inductive reasoning remains unclear. In the current study, three types of problems were created: simple rule induction (i.e., SI, with rule complexity of 1), complex rule induction (i.e., CI, with rule complexity of 2), and perceptual control. Our findings revealed that increased activations accompany increased rule complexity in the right dorsal lateral prefrontal cortex (DLPFC) and medial posterior parietal cortex (precuneus). A cognitive model predicted both the behavioral and brain imaging results. The current findings suggest that neural activity in frontal and parietal regions is modulated by rule complexity, which may shed light on the neural mechanisms of inductive reasoning. Copyright © 2014. Published by Elsevier Ltd.
Sea-level rise and shoreline retreat: time to abandon the Bruun Rule
NASA Astrophysics Data System (ADS)
Cooper, J. Andrew G.; Pilkey, Orrin H.
2004-11-01
In the face of a global rise in sea level, understanding the response of the shoreline to changes in sea level is a critical scientific goal to inform policy makers and managers. A body of scientific information exists that illustrates both the complexity of the linkages between sea-level rise and shoreline response, and the comparative lack of understanding of these linkages. In spite of this lack of understanding, many appraisals have been undertaken that employ a concept known as the "Bruun Rule", a simple two-dimensional model of shoreline response to rising sea level. The model has seen near-global application since its original formulation in 1954. The concept provided an advance in understanding of the coastal system at the time of its first publication, but it has since been superseded by numerous findings and is now invalid. Several assumptions behind the Bruun Rule are known to be false, and nowhere has the rule been adequately proven; on the contrary, several field studies disprove it. No universally applicable model of shoreline retreat under sea-level rise has yet been developed. Despite this, the Bruun Rule is in widespread contemporary use at a global scale, both as a management tool and as a scientific concept. The persistence of the concept beyond its original assumption base is attributed to the following factors: the appeal of a simple, easy-to-use analytical model; the difficulty of determining the relative validity of 'proofs' and 'disproofs'; ease of application; positive advocacy by some scientists and uncritical application by others; the model's simple numerical expression; and the lack of easy alternatives. The Bruun Rule has no power for predicting shoreline behaviour under rising sea level and should be abandoned: it is a concept whose time has passed. The belief by policy makers that it offers a prediction of future shoreline position may well have stifled much-needed research into the coastal response to sea-level rise.
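For concreteness, the model the authors criticise is usually stated as R = S·L / (B + h), with S the sea-level rise, L the cross-shore width of the active profile, h the closure depth, and B the berm height. A minimal sketch (the input numbers are illustrative), included only to make the criticised rule concrete:

```python
def bruun_retreat(sea_level_rise, active_width, closure_depth, berm_height):
    """Classical two-dimensional Bruun Rule: R = S * L / (B + h).
    The abstract above argues the rule's assumptions are invalid and that
    it should not be used for prediction; this merely states the formula."""
    return sea_level_rise * active_width / (berm_height + closure_depth)

# e.g. 0.5 m of rise over a 500 m active profile with h = 8 m, B = 2 m
print(bruun_retreat(0.5, 500.0, 8.0, 2.0))  # 25.0 m of predicted retreat
```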
ERIC Educational Resources Information Center
Fraser, Hugh W.; Anderson, Mary E.
1982-01-01
This exploratory study attempted to identify variables in need of further investigation. Those to emerge included heuristics or rules of thumb used by administrators in decision making, personality variables, and methods for evaluating alternatives. (Author/JM)
Generating Concise Rules for Human Motion Retrieval
NASA Astrophysics Data System (ADS)
Mukai, Tomohiko; Wakisaka, Ken-Ichi; Kuriyama, Shigeru
This paper proposes a method for retrieving human motion data using concise retrieval rules based on the spatio-temporal features of motion appearance. Our method first converts each motion clip into a clausal language that represents geometrical relations between body parts and their temporal relationships. A retrieval rule is then learned from a set of manually classified examples using inductive logic programming (ILP), which automatically discovers the essential rule in the same clausal form with a user-defined hypothesis-testing procedure. All motions are indexed using this clausal language, and the desired clips are retrieved by subsequence matching using the rule. Such rule-based retrieval offers reasonable performance, and the rule can be edited intuitively in the same language form. Consequently, our method enables efficient and flexible search over a large dataset with a simple query language.
26 CFR 1.1441-0 - Outline of regulation provisions for section 1441.
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Proof that tax liability has been satisfied. (iii) Liability for interest and penalties. (iv) Special...) General rule. (B) Foreign partnerships. (C) Foreign simple trusts and foreign grantor trusts. (D) Other... amounts. (23) Flow-through entity. (24) Foreign simple trust. (25) Foreign complex trust. (26) Foreign...
The Box-and-Dot Method: A Simple Strategy for Counting Significant Figures
NASA Astrophysics Data System (ADS)
Stephenson, W. Kirk
2009-08-01
A visual method for counting significant digits is presented. This easy-to-learn (and easy-to-teach) method, designated the box-and-dot method, uses the device of "boxing" significant figures based on two simple rules, then counting the number of digits in the boxes.
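The abstract does not reproduce the two rules themselves, but the counts the method yields match the standard significant-figure conventions, which can be sketched as follows (a stand-in for the boxing procedure, not the authors' exact formulation):

```python
def sig_figs(number_str):
    """Count significant figures in a decimal numeral, reproducing the result
    of 'boxing' digits: with a dot present, box from the first non-zero digit
    to the end (trailing zeros significant); without a dot, box from the first
    to the last non-zero digit (trailing zeros not significant)."""
    s = number_str.lstrip('+-')
    digits = s.replace('.', '')
    if '.' in s:
        boxed = digits.lstrip('0')
    else:
        boxed = digits.strip('0')
    return len(boxed)

for example in ('0.00340', '1200', '120.0'):
    print(example, sig_figs(example))
```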
Reward-Modulated Hebbian Plasticity as Leverage for Partially Embodied Control in Compliant Robotics
Burms, Jeroen; Caluwaerts, Ken; Dambre, Joni
2015-01-01
In embodied computation (or morphological computation), part of the complexity of motor control is offloaded to the body dynamics. We demonstrate that a simple Hebbian-like learning rule can be used to train systems with (partial) embodiment, and can be extended outside the scope of traditional neural networks. To this end, we apply the learning rule to optimize the connection weights of recurrent neural networks with different topologies and for various tasks. We then apply this learning rule to a simulated compliant tensegrity robot by optimizing static feedback controllers that directly exploit the dynamics of the robot body. This leads to partially embodied controllers, i.e., hybrid controllers that naturally integrate the computations performed by the robot body into a neural network architecture. Our results demonstrate the universal applicability of reward-modulated Hebbian learning, as well as the robustness of systems trained with the learning rule. This study strengthens our belief that compliant robots can, and perhaps should, be seen as computational units rather than as dumb hardware that needs a complex controller. This link between compliant robotics and neural networks is also the main reason for our search for simple universal learning rules for both neural networks and robotics. PMID:26347645
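A minimal scalar sketch of a reward-modulated Hebbian update, in a node-perturbation flavour (the task, learning rate, and noise level here are illustrative assumptions; the paper applies the same principle to recurrent networks and tensegrity robots):

```python
import random

def train(target_w=2.0, eta=0.05, sigma=0.5, steps=4000, seed=1):
    """Reward-modulated Hebbian learning on one weight: the update is
    eta * (reward - baseline) * noise * input, i.e. a Hebbian product of
    the input and the output perturbation, gated by how much the reward
    beat its running average. Learns y = target_w * x from examples."""
    rng = random.Random(seed)
    w, baseline = 0.0, 0.0
    tail = []
    for step in range(steps):
        x = rng.choice([-1.0, 1.0])
        noise = rng.gauss(0.0, sigma)          # exploration noise on the output
        y = w * x + noise                      # perturbed output
        reward = -(y - target_w * x) ** 2      # negative squared error
        w += eta * (reward - baseline) * noise * x
        baseline += 0.1 * (reward - baseline)  # running average of the reward
        if step >= steps - 1000:
            tail.append(w)
    return sum(tail) / len(tail)               # average weight over the tail
```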
Butz, Markus; van Ooyen, Arjen
2013-01-01
Lasting alterations in sensory input trigger massive structural and functional adaptations in cortical networks. The principles governing these experience-dependent changes are, however, poorly understood. Here, we examine whether a simple rule based on the neurons' need for homeostasis in electrical activity may serve as driving force for cortical reorganization. According to this rule, a neuron creates new spines and boutons when its level of electrical activity is below a homeostatic set-point and decreases the number of spines and boutons when its activity exceeds this set-point. In addition, neurons need a minimum level of activity to form spines and boutons. Spine and bouton formation depends solely on the neuron's own activity level, and synapses are formed by merging spines and boutons independently of activity. Using a novel computational model, we show that this simple growth rule produces neuron and network changes as observed in the visual cortex after focal retinal lesions. In the model, as in the cortex, the turnover of dendritic spines increased most strongly in the center of the lesion projection zone, while axonal boutons displayed a marked overshoot followed by pruning. Moreover, the decrease in external input was compensated for by the formation of new horizontal connections, which caused a retinotopic remapping. Homeostatic regulation may provide a unifying framework for understanding cortical reorganization, including network repair in degenerative diseases or following focal stroke. PMID:24130472
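The growth rule itself is simple enough to sketch for a single neuron; the numbers below (set-point, minimum growth level, gain per synapse) are illustrative assumptions, not the paper's parameters:

```python
def equilibrate(external, n_syn=0, set_point=1.0, min_level=0.1,
                gain=0.1, steps=200):
    """Homeostatic structural-plasticity sketch: a neuron adds a synaptic
    element when its activity is below the set-point (but above the minimum
    activity needed for growth) and removes one when its activity exceeds
    the set-point. Here activity = external input + gain * synapse count."""
    for _ in range(steps):
        activity = external + gain * n_syn
        if activity > set_point:
            n_syn = max(0, n_syn - 1)       # above set-point: prune
        elif activity >= min_level:
            n_syn += 1                      # below set-point, enough activity: grow
    return n_syn, external + gain * n_syn

# before a "lesion": input 0.6 -> few synapses restore the set-point;
# after the lesion cuts input to 0.2, new synapses compensate
n_before, a_before = equilibrate(0.6)
n_after, a_after = equilibrate(0.2, n_syn=n_before)
```

Note the minimum-activity clause: with zero external input the neuron cannot grow at all, mirroring the statement that neurons need a minimum level of activity to form spines and boutons.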
Assessing predation risk: optimal behaviour and rules of thumb.
Welton, Nicky J; McNamara, John M; Houston, Alasdair I
2003-12-01
We look at a simple model in which an animal makes behavioural decisions over time in an environment in which all parameters are known to the animal except predation risk. In the model there is a trade-off between gaining information about predation risk and anti-predator behaviour. All predator attacks lead to death for the prey, so that the prey learns about predation risk by virtue of the fact that it is still alive. We show that it is not usually optimal to behave as if the current unbiased estimate of the predation risk is its true value. We consider two different ways to model reproduction; in the first scenario the animal reproduces throughout its life until it dies, and in the second scenario expected reproductive success depends on the level of energy reserves the animal has gained by some point in time. For both of these scenarios we find results on the form of the optimal strategy and give numerical examples which compare optimal behaviour with behaviour under simple rules of thumb. The numerical examples suggest that the value of the optimal strategy over the rules of thumb is greatest when there is little current information about predation risk, learning is not too costly in terms of predation, and it is energetically advantageous to learn about predation. We find that for the model and parameters investigated, a very simple rule of thumb such as 'use the best constant control' performs well.
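The "learning by staying alive" ingredient can be sketched with a conjugate-prior toy (an illustration of the idea only, not the paper's model): with a Gamma prior on the attack rate, surviving longer shifts the posterior toward lower risk, and anti-predator behaviour, by reducing exposure, also slows that learning, which is the trade-off described above.

```python
def posterior_mean_risk(alpha, beta, time_alive, exposure=1.0):
    """Gamma-Poisson sketch of learning predation risk by surviving: with a
    Gamma(alpha, beta) prior on the attack rate and no attack experienced
    over `time_alive` units at the given exposure, the posterior is
    Gamma(alpha, beta + exposure * time_alive); its mean is the current
    unbiased estimate of the risk."""
    return alpha / (beta + exposure * time_alive)

# the risk estimate falls the longer the prey survives, and falls more
# slowly when cautious behaviour keeps exposure low
print(posterior_mean_risk(1.0, 1.0, 10.0))
print(posterior_mean_risk(1.0, 1.0, 10.0, exposure=0.1))
```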
Human anatomy nomenclature rules for the computer age.
Neumann, Paul E; Baud, Robert; Sprumont, Pierre
2017-04-01
Information systems are increasing in importance in biomedical sciences and medical practice. The nomenclature rules of human anatomy were reviewed for adequacy with respect to modern needs. New rules are proposed here to ensure that each Latin term is uniquely associated with an anatomical entity, as short and simple as possible, and machine-interpretable. Observance of these recommendations will also benefit students and translators of the Latin terms into other languages. Clin. Anat. 30:300-302, 2017. © 2016 Wiley Periodicals, Inc.
A Taxonomy of Network Centric Warfare Architectures
2008-01-01
mound structure emerges as a result of the termites following very simple rules, and exchanging very simple pheromone signals (Solé & Goodwin 2000...only fairly simple decisions.” For example, in far northern Australia, “magnetic termites” build large termite mounds which are oriented north-south...and contain a complex ventilation system which controls temperature, humidity, and oxygen levels. But termite brains are too small to store a plan
Mapping Network Centric Operational Architectures to C2 and Software Architectures
2007-06-01
Instead, the termite mound structure emerges as a result of the termites following very simple rules, and exchanging very simple pheromone signals...Each worker need make only fairly simple decisions.” For example, in far northern Australia, “magnetic termites” build large termite mounds which are...oriented north-south and contain a complex ventilation system which controls temperature, humidity, and oxygen levels. But termite brains are too
A Consistent Set of Oxidation Number Rules for Intelligent Computer Tutoring
NASA Astrophysics Data System (ADS)
Holder, Dale A.; Johnson, Benny G.; Karol, Paul J.
2002-04-01
We have developed a method for assigning oxidation numbers that eliminates the inconsistencies and ambiguities found in most conventional textbook rules, yet remains simple enough for beginning students to use. It involves imposition of a two-level hierarchy on a set of rules similar to those already being taught. We recommend emphasizing that the oxidation number method is an approximate model and cannot always be successfully applied. This proper perspective will lead students to apply the rules more carefully in all problems. Whenever failure does occur, it will indicate the limitations of the oxidation number concept itself, rather than merely the failure of a poorly constructed set of rules. We have used these improved rules as the basis for an intelligent tutoring program on oxidation numbers.
[Case finding in early prevention networks - a heuristic for ambulatory care settings].
Barth, Michael; Belzer, Florian
2016-06-01
One goal of early prevention is the support of families with small children up to three years of age who are exposed to psychosocial risks. The identification of these cases is often complex and not well directed, especially in the ambulatory care setting. The aim was to develop a feasible, empirically based strategy for case finding in ambulatory care. Based on the risk factors of postpartal depression, lack of maternal responsiveness, parental stress with regulation disorders, and poverty, a lexicographic, non-compensatory heuristic model with simple decision rules was constructed and empirically tested. For this purpose, the original data set from an evaluation of the pediatric documentary form on psychosocial issues of families with small children in well-child visits was used and reanalyzed. The first diagnostic step in the non-compensatory, hierarchical classification process is the assessment of postpartal depression, followed by maternal responsiveness, parental stress, and poverty. The classification model identifies 89.0 % of the cases from the original study. Compared to the original study, the decision process becomes clearer and more concise. The evidence-based, data-driven model exemplifies a strategy for the assessment of psychosocial risk factors in ambulatory care settings. It is based on four evidence-based risk factors and offers a quick and reliable classification. A further advantage of this model is that once a risk factor is identified, the diagnostic procedure stops and the counselling process can commence. For further validation of the model, studies in well-suited early prevention networks are needed.
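A lexicographic, non-compensatory decision process of the kind described here can be sketched as follows (a toy illustration, not the authors' implementation; the factor names abbreviate those in the abstract):

```python
# Factors are checked in the fixed order given in the abstract.
RISK_FACTORS = [
    "postpartal_depression",
    "lack_of_maternal_responsiveness",
    "parental_stress_with_regulation_disorders",
    "poverty",
]

def classify(case):
    """case: dict mapping factor name -> bool.

    Returns (is_at_risk, triggering_factor). Non-compensatory means the
    first positive factor classifies the family and stops the assessment,
    so counselling can commence immediately.
    """
    for factor in RISK_FACTORS:
        if case.get(factor, False):
            return True, factor
    return False, None
```

Because the search stops at the first hit, later factors can never outweigh or compensate for an earlier one, which is exactly what distinguishes this heuristic from a weighted (compensatory) score.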
Funk, Christopher S; Cohen, K Bretonnel; Hunter, Lawrence E; Verspoor, Karin M
2016-09-09
Gene Ontology (GO) terms represent the standard for annotation and representation of molecular functions, biological processes and cellular compartments, but a large gap exists between the way concepts are represented in the ontology and how they are expressed in natural language text. The construction of highly specific GO terms is formulaic, consisting of parts and pieces from more simple terms. We present two different types of manually generated rules to help capture the variation of how GO terms can appear in natural language text. The first set of rules takes into account the compositional nature of GO and recursively decomposes the terms into their smallest constituent parts. The second set of rules generates derivational variations of these smaller terms and compositionally combines all generated variants to form the original term. By applying both types of rules, new synonyms are generated for two-thirds of all GO terms and an increase in F-measure performance for recognition of GO on the CRAFT corpus from 0.498 to 0.636 is observed. Additionally, we evaluated the combination of both types of rules over one million full text documents from Elsevier; manual validation and error analysis show we are able to recognize GO concepts with reasonable accuracy (88 %) based on random sampling of annotations. In this work we present a set of simple synonym generation rules that utilize the highly compositional and formulaic nature of the Gene Ontology concepts. We illustrate how the generated synonyms aid in improving recognition of GO concepts on two different biomedical corpora. We discuss other applications of our rules for GO ontology quality assurance, explore the issue of overgeneration, and provide examples of how similar methodologies could be applied to other biomedical terminologies. Additionally, we provide all generated synonyms for use by the text-mining community.
Empirical Analysis and Refinement of Expert System Knowledge Bases
1990-03-31
the number of hidden units and the error rates is listed in Figure 6. 3.3. Cancer Data A data set for evaluating the prognosis of breast cancer ...Alternative Rule Induction Methods A data set for evaluating the prognosis of breast cancer recurrence was analyzed by Michalski’s AQ15 rule induction program...AQ15 7 2 32% PVM 2 1 23% Figure 6-3: Comparative Summary for AQ15 and PVM on Breast Cancer Data 6.2.2. Alternative Decision Tree Induction Methods
Government mandates and employer-sponsored health insurance: who is still not covered?
Vanness, David J; Wolfe, Barbara L
2002-06-01
We characterize employer-sponsored health insurance offering strategies in light of benefit non-discrimination and minimum wage regulation when workers have heterogeneous earnings and partially unobservable demand for (and cost of) insurance. We then empirically examine how earnings and expected medical expenses are associated with low wage workers' ability to obtain insurance before and after enactment of federal benefit non-discrimination rules. We find no evidence that the non-discrimination rules helped low wage workers (especially those with high own or children's expected medical expenses) to obtain insurance.
NASA Technical Reports Server (NTRS)
Barlow, Douglas A.; Baird, James K.; Su, Ching-Hua
2003-01-01
More than 75 years ago, von Weimarn summarized his observations of the dependence of the average crystal size on the initial relative concentration supersaturation prevailing in a solution from which crystals were growing. Since then, his empirically derived rules have become part of the lore of crystal growth. The first of these rules asserts that the average crystal size measured at the end of a crystallization increases as the initial value of the relative supersaturation decreases. The second rule states that for a given crystallization time, the average crystal size passes through a maximum as a function of the initial relative supersaturation. Using a theory of nucleation and growth due to Buyevich and Mansurov, we calculate the average crystal size as a function of the initial relative supersaturation. We confirm the von Weimarn rules for the case where the nucleation rate is proportional to the third power or higher of the relative supersaturation.
Molecular implementation of simple logic programs.
Ran, Tom; Kaplan, Shai; Shapiro, Ehud
2009-10-01
Autonomous programmable computing devices made of biomolecules could interact with a biological environment and be used in future biological and medical applications. Biomolecular implementations of finite automata and logic gates have already been developed. Here, we report an autonomous programmable molecular system based on the manipulation of DNA strands that is capable of performing simple logical deductions. Using molecular representations of facts such as Man(Socrates) and rules such as Mortal(X) <-- Man(X) (Every Man is Mortal), the system can answer molecular queries such as Mortal(Socrates)? (Is Socrates Mortal?) and Mortal(X)? (Who is Mortal?). This biomolecular computing system compares favourably with previous approaches in terms of expressive power, performance and precision. A compiler translates facts, rules and queries into their molecular representations and subsequently operates a robotic system that assembles the logical deductions and delivers the result. This prototype is the first simple programming language with a molecular-scale implementation.
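The deductions this molecular system performs correspond to simple backward chaining over unary facts and rules, which can be mimicked in ordinary software (a sketch of the logic only, assuming acyclic rules; nothing here reflects the DNA-strand implementation):

```python
facts = {("Man", "Socrates")}
rules = [("Mortal", "Man")]  # head <- body, i.e. Mortal(X) <- Man(X)

def query(pred, arg):
    """Answer Pred(arg)? for a constant, or Pred(X)? when arg == 'X'.

    Returns the set of constants satisfying the query.
    """
    answers = {a for (p, a) in facts if p == pred and arg in ("X", a)}
    for head, body in rules:
        if head == pred:
            answers |= query(body, arg)  # backward chaining through the rule
    return answers
```

Here `query("Mortal", "Socrates")` yields `{"Socrates"}` (Is Socrates Mortal? yes), and `query("Mortal", "X")` enumerates everyone who is mortal, mirroring the two query forms in the abstract.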
When push comes to shove: Exclusion processes with nonlocal consequences
NASA Astrophysics Data System (ADS)
Almet, Axel A.; Pan, Michael; Hughes, Barry D.; Landman, Kerry A.
2015-11-01
Stochastic agent-based models are useful for modelling collective movement of biological cells. Lattice-based random walk models of interacting agents where each site can be occupied by at most one agent are called simple exclusion processes. An alternative motility mechanism to simple exclusion is formulated, in which agents are granted more freedom to move under the compromise that interactions are no longer necessarily local. This mechanism is termed shoving. A nonlinear diffusion equation is derived for a single population of shoving agents using mean-field continuum approximations. A continuum model is also derived for a multispecies problem with interacting subpopulations, which either obey the shoving rules or the simple exclusion rules. Numerical solutions of the derived partial differential equations compare well with averaged simulation results for both the single species and multispecies processes in two dimensions, while some issues arise in one dimension for the multispecies case.
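For contrast with the shoving mechanism, a one-dimensional simple exclusion step looks like this (my toy version for illustration; the paper works on two-dimensional lattices):

```python
import random

def exclusion_step(occupied, size, rng=random):
    """Attempt one move in a 1D periodic simple exclusion process.

    occupied: set of occupied lattice sites. A randomly chosen agent tries
    to hop to a random neighbour; the move is aborted if the target site is
    occupied, enforcing at most one agent per site.
    """
    agent = rng.choice(sorted(occupied))
    target = (agent + rng.choice((-1, 1))) % size
    if target not in occupied:
        occupied.remove(agent)
        occupied.add(target)
    return occupied
```

Under the shoving rules described above, the agent at the target site would instead be displaced onward rather than blocking the move, which is what makes the interaction's consequences nonlocal.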
Self-Organized Dynamic Flocking Behavior from a Simple Deterministic Map
NASA Astrophysics Data System (ADS)
Krueger, Wesley
2007-10-01
Coherent motion exhibiting large-scale order, such as flocking, swarming, and schooling behavior in animals, can arise from simple rules applied to an initial random array of self-driven particles. We present a completely deterministic dynamic map that exhibits emergent, collective, complex motion for a group of particles. Each individual particle is driven with a constant speed in two dimensions adopting the average direction of a fixed set of non-spatially related partners. In addition, the particle changes direction by π as it reaches a circular boundary. The dynamical patterns arising from these rules range from simple circular-type convective motion to highly sophisticated, complex, collective behavior which can be easily interpreted as flocking, schooling, or swarming depending on the chosen parameters. We present the results as a series of short movies and we also explore possible order parameters and correlation functions capable of quantifying the resulting coherence.
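The update described here can be sketched as a deterministic map (the function signature and the boundary handling are my reading of the abstract, not the authors' code):

```python
import math

def step(pos, theta, partners, speed=0.05, radius=1.0):
    """One step of the map. pos: list of (x, y); theta: list of headings;
    partners[i]: the fixed, non-spatially-related set of indices that
    particle i averages its direction over.
    """
    # Each particle adopts the average direction of its fixed partners.
    new_theta = [
        math.atan2(sum(math.sin(theta[j]) for j in nbrs),
                   sum(math.cos(theta[j]) for j in nbrs))
        for nbrs in partners
    ]
    new_pos, out_theta = [], []
    for (x, y), t in zip(pos, new_theta):
        nx, ny = x + speed * math.cos(t), y + speed * math.sin(t)
        if nx * nx + ny * ny > radius ** 2:  # reached the circular boundary:
            t += math.pi                     # direction changes by pi
            nx, ny = x + speed * math.cos(t), y + speed * math.sin(t)
        new_pos.append((nx, ny))
        out_theta.append(t)
    return new_pos, out_theta
```

Because the partner sets are fixed rather than distance-based, the map is fully deterministic once the initial positions and headings are chosen, unlike Vicsek-style models with noise and spatial neighbourhoods.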
Technology: Digital Photography in an Inner-City Fifth Grade, Part 2
ERIC Educational Resources Information Center
Riner, Phil
2005-01-01
Last month, Phil Riner began discussing his project of teaching digital photography and prosocial behavior skills to inner-city fifth-graders. This work led him to generate some very specific procedures for camera care and use. Phil also taught the students some simple rules for taking better photos. These rules fell into four broad categories:…
An Uncommon Approach to a Common Algebraic Error
ERIC Educational Resources Information Center
Rossi, Paul S.
2008-01-01
The basic rules of elementary algebra can often appear beyond the grasp of many students. Even though most subjects, including calculus, prove to be more difficult, it is the simple rules of algebra that continue to be the "thorn in the side" of many mathematics students. In this paper we present a result intended to help students achieve a…
ERIC Educational Resources Information Center
Kundey, Shannon M. A.; Strandell, Brittany; Mathis, Heather; Rowan, James D.
2010-01-01
Hulse and Dorsky (1977, 1979) found that rats, like humans, learn sequences following a simple rule-based structure more quickly than those lacking a rule-based structure. Through two experiments, we explored whether two additional species--domesticated horses ("Equus caballus") and chickens ("Gallus domesticus")--would…
The Uyghur Insurgency in Xinjiang: The Success Potential
2015-06-12
Kazakhstan. The valley is broad, fertile, well watered, and has historically been the grassland for pastoral nomads. The...confederation of pastoral nomads, lived in modern day Xinjiang for centuries. The Han Empire of the Tang Dynasty ruled over parts of Xinjiang from
A retrospective study of two populations to test a simple rule for spirometry.
Ohar, Jill A; Yawn, Barbara P; Ruppel, Gregg L; Donohue, James F
2016-06-04
Chronic lung disease is common and often under-diagnosed. To test a simple rule for conducting spirometry we reviewed spirograms from two populations: occupational medicine evaluations (OME) conducted by Saint Louis and Wake Forest Universities at 3 sites (n = 3260, mean age 64.14 years, 95 % CI 58.94-69.34, 97 % men) and evaluations conducted by the Wake Forest University preoperative clinic (POC) at one site (n = 845, mean age 62.10 years, 95 % CI 50.46-73.74, 57 % men). This retrospective review of a database prospectively collected by the first author identified the rates, types, sensitivity, specificity, and positive and negative predictive values for lung function abnormalities, and the associated mortality rates, when spirometry was conducted in the OME population based on the 20/40 rule (≥20 years of smoking in those aged ≥40 years). To determine the reproducibility of the 20/40 rule for conducting spirometry, the rule was then applied to the POC population. A lung function abnormality was found in 74 % of the OME population and 67 % of the POC population. Sensitivity of the rule was 85 % for an obstructive pattern and 77 % for any abnormality on spirometry. Positive and negative predictive values of the rule for a spirometric abnormality were 74 % and 55 %, respectively. Patients with an obstructive pattern were at greater risk of coronary heart disease (odds ratio (OR) 1.39 [confidence interval (CI) 1.00-1.93] vs. normal) and death (hazard ratio (HR) 1.53, 95 % CI 1.20-1.84) than subjects with normal spirometry. Restrictive spirometry patterns were also associated with greater risk of coronary disease (OR 1.7 [CI 1.23-2.35]) and death (HR 1.40, 95 % CI 1.08-1.72). Smokers (≥20 pack-years) aged ≥40 years are at increased risk for lung function abnormalities, and those abnormalities are associated with a greater presence of coronary heart disease and increased all-cause mortality.
Use of the 20/40 rule could provide a simple method to enhance selection of candidates for spirometry evaluation in the primary care setting.
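The 20/40 rule itself reduces to a one-line check (the rule is stated as ≥20 years of smoking, while the conclusion speaks of ≥20 pack-years; the parameter below follows the rule's stated definition, so treat the exact unit as an assumption):

```python
def meets_20_40_rule(age, years_smoked):
    """True if the patient qualifies for spirometry under the 20/40 rule."""
    return age >= 40 and years_smoked >= 20
```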
Evolving fuzzy rules for relaxed-criteria negotiation.
Sim, Kwang Mong
2008-12-01
In the literature on automated negotiation, very few negotiation agents are designed with the flexibility to slightly relax their negotiation criteria to reach a consensus more rapidly and with more certainty. Furthermore, these relaxed-criteria negotiation agents were not equipped with the ability to enhance their performance by learning and evolving their relaxed-criteria negotiation rules. The impetus of this work is designing market-driven negotiation agents (MDAs) that not only have the flexibility of relaxing bargaining criteria using fuzzy rules, but can also evolve their structures by learning new relaxed-criteria fuzzy rules to improve their negotiation outcomes as they participate in negotiations in more e-markets. To this end, an evolutionary algorithm for adapting and evolving relaxed-criteria fuzzy rules was developed. Implementing the idea in a testbed, two kinds of experiments for evaluating and comparing EvEMDAs (MDAs with relaxed-criteria rules that are evolved using the evolutionary algorithm) and EMDAs (MDAs with relaxed-criteria rules that are manually constructed) were carried out through stochastic simulations. Empirical results show that: 1) EvEMDAs generally outperformed EMDAs in different types of e-markets and 2) the negotiation outcomes of EvEMDAs generally improved as they negotiated in more e-markets.
A theoretical model of the relationship between the h-index and other simple citation indicators.
Bertoli-Barsotti, Lucio; Lando, Tommaso
2017-01-01
Of the existing theoretical formulas for the h-index, those recently suggested by Burrell (J Informetr 7:774-783, 2013b) and by Bertoli-Barsotti and Lando (J Informetr 9(4):762-776, 2015) have proved very effective in estimating the actual value of the h-index of Hirsch (Proc Natl Acad Sci USA 102:16569-16572, 2005), at least at the level of the individual scientist. These approaches lead (or may lead) to two slightly different formulas, being based, respectively, on a "standard" and a "shifted" version of the geometric distribution. In this paper, we review the genesis of these two formulas-which we shall call the "basic" and "improved" Lambert-W formula for the h-index-and compare their effectiveness with that of a number of instances taken from the well-known Glänzel-Schubert class of models for the h-index (based, instead, on a Paretian model) by means of an empirical study. All the formulas considered in the comparison are "ready-to-use", i.e., functions of simple citation indicators such as: the total number of publications; the total number of citations; the total number of cited papers; the number of citations of the most cited paper. The empirical study is based on citation data obtained from two different sets of journals belonging to two different scientific fields: more specifically, 231 journals from the area of "Statistics and Mathematical Methods" and 100 journals from the area of "Economics, Econometrics and Finance", totaling almost 100,000 and 20,000 publications, respectively. The citation data refer to different publication/citation time windows, different types of "citable" documents, and alternative approaches to the analysis of the citation process ("prospective" and "retrospective"). 
We conclude that, especially in its improved version, the Lambert-W formula for the h-index provides a quite robust and effective ready-to-use rule that should be preferred to other known formulas if one's goal is (simply) to derive a reliable estimate of the h-index.
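For reference, the quantity all of these ready-to-use formulas estimate is Hirsch's h-index, computed exactly from per-paper citation counts:

```python
def h_index(citations):
    """Largest h such that at least h papers have at least h citations each."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank  # the rank-th most cited paper still has >= rank citations
        else:
            break
    return h
```

The formulas compared in the paper approximate this value from summary indicators (total publications, total citations, and so on) without requiring the full citation list.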
Scaling in Ecosystems and the Linkage of Macroecological Laws
NASA Astrophysics Data System (ADS)
Rinaldo, A.
2007-12-01
Are there predictable linkages among macroecological laws regulating size and abundance of organisms that are ubiquitously supported by empirical observations and that ecologists treat traditionally as independent? Do fragmentation of habitats, or reduced supply of energy and matter, result in predictable changes on whole ecosystems as a function of their size? Using a coherent theoretical framework based on scaling theory, it is argued that the answer to both these questions is affirmative. The concern of the talk is with the comparatively simple situation of the steady state behavior of a fully developed ecosystem in which, over evolutionary time, resources are exploited in full, individual and collective metabolic needs are met and enough time has elapsed to produce a rough balance between speciation and extinction and ecological fluxes. While ecological patterns and processes often show great variation when viewed at different scales of space, time, organismic size and organizational complexity, there is also widespread evidence for the existence of scaling regularities as embedded in macroecological "laws" or rules. These laws have commanded considerable attention from the ecological community. Indeed they are central to ecological theory as they describe the features of complex adaptive systems shown by a number of biological systems, and perhaps for the investigation of the dynamic origin of scale invariance of natural forms in general. The species-area and relative species-abundance relations, the scaling of community and species' size spectra, the scaling of population densities with their mean body mass and the scaling of the largest organism with ecosystem size are examples of such laws. 
Borrowing heavily from earlier successes in physics, it will be shown how simple mathematical scaling arguments, following from dimensional and finite-size scaling analyses, provide theoretical predictions of the interrelationships among the species abundance relationship, the species-area relationship and community size spectra, in excellent accord with empirical data. The main conclusion is that the proposed scaling framework, along with the questions and predictions it provides, serves as a starting point for a novel approach to macroecological analysis.
A simple approximation for the current-voltage characteristics of high-power, relativistic diodes
Ekdahl, Carl
2016-06-10
A simple approximation for the current-voltage characteristics of a relativistic electron diode is presented. The approximation is accurate from non-relativistic through relativistic electron energies. Although empirically developed, it has many of the fundamental properties of the exact diode solutions. Lastly, the approximation is simple enough to be remembered and worked out on almost any pocket calculator, so it has proven quite useful on the laboratory floor.
Nested subcritical flows within supercritical systems
NASA Technical Reports Server (NTRS)
Hendricks, R. C.; Braun, M. J.; Wheeler, R. L., III; Mullen, R. L.
1985-01-01
In supercritical systems the design inlet and outlet pressures are maintained above the thermodynamic critical pressure P_c. Designers rely on this simple rule of thumb to circumvent problems associated with a subcritical pressure regime nested within the supercritical pressure system, along with the uncertainties in heat transfer, fluid mechanics, and thermophysical property variations. The simple rule of thumb is adequate in many low-power designs but is inadequate for high-performance turbomachines and linear systems, where nested two-phase regions can exist. Examples for a free-jet expansion with backpressure greater than P_c and a rotor (bearing) with ambient pressure greater than P_c illustrate the existence of subcritical pressure regimes nested within supercritical systems.
Generating self-organizing collective behavior using separation dynamics from experimental data
NASA Astrophysics Data System (ADS)
Dieck Kattas, Graciano; Xu, Xiao-Ke; Small, Michael
2012-09-01
Mathematical models for systems of interacting agents using simple local rules have been proposed and shown to exhibit emergent swarming behavior. Most of these models are constructed by intuition or manual observations of real phenomena, and later tuned or verified to simulate desired dynamics. In contrast to this approach, we propose using a model that attempts to follow an averaged rule of the essential distance-dependent collective behavior of real pigeon flocks, which was abstracted from experimental data. By using a simple model to follow the behavioral tendencies of real data, we show that our model can exhibit a wide range of emergent self-organizing dynamics such as flocking, pattern formation, and counter-rotating vortices.
Clinical decision support alert malfunctions: analysis and empirically derived taxonomy.
Wright, Adam; Ai, Angela; Ash, Joan; Wiesen, Jane F; Hickman, Thu-Trang T; Aaron, Skye; McEvoy, Dustin; Borkowsky, Shane; Dissanayake, Pavithra I; Embi, Peter; Galanter, William; Harper, Jeremy; Kassakian, Steve Z; Ramoni, Rachel; Schreiber, Richard; Sirajuddin, Anwar; Bates, David W; Sittig, Dean F
2018-05-01
To develop an empirically derived taxonomy of clinical decision support (CDS) alert malfunctions. We identified CDS alert malfunctions using a mix of qualitative and quantitative methods: (1) site visits with interviews of chief medical informatics officers, CDS developers, clinical leaders, and CDS end users; (2) surveys of chief medical informatics officers; (3) analysis of CDS firing rates; and (4) analysis of CDS overrides. We used a multi-round, manual, iterative card sort to develop a multi-axial, empirically derived taxonomy of CDS malfunctions. We analyzed 68 CDS alert malfunction cases from 14 sites across the United States with diverse electronic health record systems. Four primary axes emerged: the cause of the malfunction, its mode of discovery, when it began, and how it affected rule firing. Build errors, conceptualization errors, and the introduction of new concepts or terms were the most frequent causes. User reports were the predominant mode of discovery. Many malfunctions within our database caused rules to fire for patients for whom they should not have (false positives), but the reverse (false negatives) was also common. Across organizations and electronic health record systems, similar malfunction patterns recurred. Challenges included updates to code sets and values, software issues at the time of system upgrades, difficulties with migration of CDS content between computing environments, and the challenge of correctly conceptualizing and building CDS. CDS alert malfunctions are frequent. The empirically derived taxonomy formalizes the common recurring issues that cause these malfunctions, helping CDS developers anticipate and prevent CDS malfunctions before they occur or detect and resolve them expediently.
Using a Simple Contest to Illustrate Mechanism Design
ERIC Educational Resources Information Center
Blackwell, Calvin
2011-01-01
This article describes a simple classroom activity that illustrates how economic theory can be used for mechanism design. The rules for a set of contests are presented; the results typically obtained from these contests illustrate how the prize structure can be manipulated in order to produce a particular outcome. Specifically, this activity is…
The Box-and-Dot Method: A Simple Strategy for Counting Significant Figures
ERIC Educational Resources Information Center
Stephenson, W. Kirk
2009-01-01
A visual method for counting significant digits is presented. This easy-to-learn (and easy-to-teach) method, designated the box-and-dot method, uses the device of "boxing" significant figures based on two simple rules, then counting the number of digits in the boxes. (Contains 4 notes.)
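The counting the box-and-dot device encodes follows standard significant-figure conventions; a sketch for plain decimal strings (whether the two branches below match the article's exact pair of rules is an assumption, and scientific notation is not handled):

```python
def count_sig_figs(number_string):
    """Count significant figures in a plain decimal string such as '0.00520'."""
    digits = number_string.lstrip("+-")
    if "." in digits:
        # Dot present: "box" from the first nonzero digit to the very end.
        return len(digits.replace(".", "").lstrip("0"))
    # No dot: trailing zeros are placeholders, not significant.
    return len(digits.lstrip("0").rstrip("0"))
```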
Simple Identification of Complex ADHD Subtypes Using Current Symptom Counts
ERIC Educational Resources Information Center
Volk, Heather E.; Todorov, Alexandre A.; Hay, David A.; Todd, Richard D.
2009-01-01
An assessment of the accuracy of simple symptom-count rules for assigning youths to attention deficit hyperactivity disorder subtypes shows that having six or more total symptoms and fewer than three hyperactive-impulsive symptoms is an accurate predictor of the latent class severe inattentive subtype.
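The reported rule reduces to two comparisons (function and argument names are mine, chosen for illustration):

```python
def predicts_severe_inattentive(total_symptoms, hyperactive_impulsive_symptoms):
    """Symptom-count rule for the latent-class severe inattentive subtype."""
    return total_symptoms >= 6 and hyperactive_impulsive_symptoms < 3
```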
A Simple and Effective Program to Increase Faculty Knowledge of and Referrals to Counseling Centers
ERIC Educational Resources Information Center
Nolan, Susan A.; Pace, Kristi A.; Iannelli, Richard J.; Palma, Thomas V.; Pakalns, Gail P.
2006-01-01
The authors describe a simple, cost-effective, and empirically supported program to increase faculty referrals of students to counseling centers (CCs). Incoming faculty members at 3 universities received a mailing and personal telephone call from a CC staff member. Faculty assigned to the outreach program had greater knowledge of and rates of…
Medical semiotics in the 18th century: a theory of practice?
Hess, V
1998-06-01
Medical semiotics in the 18th century was more than a premodern form of diagnosis. Its structure allowed for the combination of empirically proven rules of instruction with the theoretical knowledge of the new sciences, employing the relation between the sign and the signified.
DOT National Transportation Integrated Search
2003-07-01
Drilled shaft foundations embedded in weak rock formations (e.g., Denver blue claystone and sandstone) support a significant portion of bridges in Colorado. Since the 1960s, empirical methods and rules of thumb have been used to design drilled shafts...
Religious Conviction, Morality and Social Convention among Finnish Adolescents
ERIC Educational Resources Information Center
Vainio, Annukka
2011-01-01
The assumptions of Kohlberg, Turiel and Shweder regarding the features of moral reasoning were compared empirically. The moral reasoning of Finnish Evangelical Lutheran, Conservative Laestadian and non-religious adolescents was studied using Kohlberg's Moral Judgment Interview and Turiel Rule Transgression Interview methods. Religiosity and choice…
SOURCE PULSE ENHANCEMENT BY DECONVOLUTION OF AN EMPIRICAL GREEN'S FUNCTION.
Mueller, Charles S.
1985-01-01
Observations of the earthquake source-time function are enhanced if path, recording-site, and instrument complexities can be removed from seismograms. Assuming that a small earthquake has a simple source, its seismogram can be treated as an empirical Green's function and deconvolved from the seismogram of a larger and/or more complex earthquake by spectral division. When the deconvolution is well posed, the quotient spectrum represents the apparent source-time function of the larger event. This study shows that with high-quality locally recorded earthquake data it is feasible to Fourier transform the quotient and obtain a useful result in the time domain. In practice, the deconvolution can be stabilized by one of several simple techniques. An application of the method is given.
Timmerman, Dirk; Van Calster, Ben; Testa, Antonia; Savelli, Luca; Fischerova, Daniela; Froyman, Wouter; Wynants, Laure; Van Holsbeke, Caroline; Epstein, Elisabeth; Franchi, Dorella; Kaijser, Jeroen; Czekierdowski, Artur; Guerriero, Stefano; Fruscio, Robert; Leone, Francesco P G; Rossi, Alberto; Landolfo, Chiara; Vergote, Ignace; Bourne, Tom; Valentin, Lil
2016-04-01
Accurate methods to preoperatively characterize adnexal tumors are pivotal for optimal patient management. A recent meta-analysis concluded that the International Ovarian Tumor Analysis algorithms such as the Simple Rules are the best approaches to preoperatively classify adnexal masses as benign or malignant. We sought to develop and validate a model to predict the risk of malignancy in adnexal masses using the ultrasound features in the Simple Rules. This was an international cross-sectional cohort study involving 22 oncology centers, referral centers for ultrasonography, and general hospitals. We included consecutive patients with an adnexal tumor who underwent a standardized transvaginal ultrasound examination and were selected for surgery. Data on 5020 patients were recorded in 3 phases from 2002 through 2012. The 5 Simple Rules features indicative of a benign tumor (B-features) and the 5 features indicative of malignancy (M-features) are based on the presence of ascites, tumor morphology, and degree of vascularity at ultrasonography. The gold standard was the histopathologic diagnosis of the adnexal mass (pathologist blinded to ultrasound findings). Logistic regression analysis was used to estimate the risk of malignancy based on the 10 ultrasound features and type of center. The diagnostic performance was evaluated by area under the receiver operating characteristic curve, sensitivity, specificity, positive likelihood ratio (LR+), negative likelihood ratio (LR-), positive predictive value (PPV), negative predictive value (NPV), and calibration curves. Data on 4848 patients were analyzed. The malignancy rate was 43% (1402/3263) in oncology centers and 17% (263/1585) in other centers. The area under the receiver operating characteristic curve on validation data was very similar in oncology centers (0.917; 95% confidence interval, 0.901-0.931) and other centers (0.916; 95% confidence interval, 0.873-0.945). Risk estimates showed good calibration.
In all, 23% of patients in the validation data set had a very low estimated risk (<1%) and 48% had a high estimated risk (≥30%). For the 1% risk cutoff, sensitivity was 99.7%, specificity 33.7%, LR+ 1.5, LR- 0.010, PPV 44.8%, and NPV 98.9%. For the 30% risk cutoff, sensitivity was 89.0%, specificity 84.7%, LR+ 5.8, LR- 0.13, PPV 75.4%, and NPV 93.9%. Quantification of the risk of malignancy based on the Simple Rules has good diagnostic performance both in oncology centers and other centers. A simple classification based on these risk estimates may form the basis of a clinical management system. Patients with a high risk may benefit from surgery by a gynecological oncologist, while patients with a lower risk may be managed locally. Copyright © 2016 Elsevier Inc. All rights reserved.
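The reported likelihood ratios follow directly from sensitivity and specificity; as a quick arithmetic check on the figures quoted above (illustration only):

```python
def likelihood_ratios(sens, spec):
    """LR+ = sensitivity / (1 - specificity); LR- = (1 - sensitivity) / specificity."""
    return sens / (1 - spec), (1 - sens) / spec

# 30% risk cutoff reported above: sensitivity 89.0%, specificity 84.7%.
lr_pos, lr_neg = likelihood_ratios(0.890, 0.847)
```

Rounding `lr_pos` to one decimal and `lr_neg` to two reproduces the reported LR+ of 5.8 and LR- of 0.13.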
The Rules of the Game in an Introductory Literature Class
ERIC Educational Resources Information Center
Jones, Ed
2008-01-01
While focusing on Andrew Marvell's "To His Coy Mistress," the author came up with the Interpretation Game, a game that had a simple set of rules designed to promote engaged academic discussion and, at the same time, to overcome problems that students have in class discussion about literature. In this article, the author narrates a few instances of…
The Game of Life Rules on Penrose Tilings: Still Life and Oscillators
NASA Astrophysics Data System (ADS)
Owens, Nick; Stepney, Susan
John Horton Conway's Game of Life is a simple two-dimensional, two state cellular automaton (CA), remarkable for its complex behaviour. That behaviour is known to be very sensitive to a change in the CA rules. Here we continue our investigations into its sensitivity to changes in the lattice, by the use of an aperiodic Penrose tiling lattice.
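For reference, the standard square-lattice Game of Life update (B3/S23) that the study transplants onto a Penrose tiling can be sketched as below; this is the ordinary grid version with dead boundaries, not the aperiodic-lattice code:

```python
import numpy as np

def life_step(grid):
    """One synchronous update of Conway's Game of Life (B3/S23)
    on a square lattice with fixed dead boundaries."""
    padded = np.pad(grid, 1)
    # Count the eight neighbours of every interior cell.
    neigh = sum(np.roll(np.roll(padded, i, 0), j, 1)
                for i in (-1, 0, 1) for j in (-1, 0, 1)
                if (i, j) != (0, 0))[1:-1, 1:-1]
    birth = (grid == 0) & (neigh == 3)       # dead cell with exactly 3 neighbours
    survive = (grid == 1) & ((neigh == 2) | (neigh == 3))
    return (birth | survive).astype(int)
```

A 2x2 block is a still life under this update, and a blinker returns to itself after two steps, which exercises both the survival and birth clauses.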
Working dogs cooperate among one another by generalised reciprocity.
Gfrerer, Nastassja; Taborsky, Michael
2017-03-06
Cooperation by generalised reciprocity implies that individuals apply the decision rule "help anyone if helped by someone". This mechanism has been shown to generate evolutionarily stable levels of cooperation, but as yet it is unclear how widely this cooperation mechanism is applied among animals. Dogs (Canis familiaris) are highly social animals with considerable cognitive potential and the ability to differentiate between individual social partners. But although dogs can solve complex problems, they may use simple rules for behavioural decisions. Here we show that dogs trained in an instrumental cooperative task to provide food to a social partner help conspecifics more often after having previously received help from another dog. Remarkably, in so doing they show no distinction between partners that had helped them before and completely unfamiliar conspecifics. Apparently, dogs use the simple decision rule characterizing generalised reciprocity, although they are probably capable of using the more complex decision rule of direct reciprocity: "help someone who has helped you". However, generalized reciprocity involves lower information processing costs and is therefore a cheaper cooperation strategy. Our results imply that generalised reciprocity might be applied more commonly than direct reciprocity also in other mutually cooperating animals.
He, ZeFang; Zhao, Long
2014-01-01
An attitude control strategy based on Ziegler-Nichols rules for tuning the PD (proportional-derivative) parameters of quadrotor helicopters is presented to address the tendency of quadrotors to become unstable, a problem caused by the narrow definition domain of their attitude angles. The proposed controller is nonlinear and consists of a linear part and a nonlinear part. The linear part is a PD controller, with its parameters tuned by Ziegler-Nichols rules, acting on the decoupled linear system obtained after feedback linearization; the nonlinear part is the feedback-linearization term that converts the nonlinear system into a linear one. The simulation results show that the proposed attitude controller is highly robust and performs better than two other nonlinear controllers whose nonlinear parts are identical to its own; their linear parts are, respectively, a PID (proportional-integral-derivative) controller with parameters tuned by Ziegler-Nichols rules and a PD controller with parameters tuned by GA (genetic algorithms). Moreover, the proposed attitude controller is simple and easy to implement.
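The classic Ziegler-Nichols closed-loop table gives PD gains from the ultimate gain Ku and ultimate oscillation period Tu. A minimal sketch of that standard textbook rule (not the paper's full controller):

```python
def ziegler_nichols_pd(Ku, Tu):
    """Classic Ziegler-Nichols closed-loop tuning for a PD controller.
    Ku: ultimate gain at which the closed loop sustains oscillation;
    Tu: period of that oscillation."""
    Kp = 0.8 * Ku          # proportional gain per the Z-N PD row
    Td = Tu / 8.0          # derivative time
    Kd = Kp * Td           # derivative gain in parallel form
    return Kp, Kd
```

For example, an ultimate gain of 10 and an ultimate period of 2 s yield Kp = 8.0 and Kd = 2.0.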
Applications of rule-induction in the derivation of quantitative structure-activity relationships.
A-Razzak, M; Glen, R C
1992-08-01
Recently, methods have been developed in the field of Artificial Intelligence (AI), specifically in the expert systems area using rule-induction, designed to extract rules from data. We have applied these methods to the analysis of molecular series with the objective of generating rules which are predictive and reliable. The input to rule-induction consists of a number of examples with known outcomes (a training set) and the output is a tree-structured series of rules. Unlike most other analysis methods, the results of the analysis are in the form of simple statements which can be easily interpreted. These are readily applied to new data giving both a classification and a probability of correctness. Rule-induction has been applied to in-house generated and published QSAR datasets and the methodology, application and results of these analyses are discussed. The results imply that in some cases it would be advantageous to use rule-induction as a complementary technique in addition to conventional statistical and pattern-recognition methods.
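A minimal flavor of threshold-based rule induction, reduced to a single numeric attribute and a one-level tree, can be sketched as follows. This is a toy illustration under our own simplifications, not the authors' system; it does show the two outputs the abstract emphasizes, a readable rule statement and a probability of correctness:

```python
def induce_rule(examples):
    """Induce a single threshold rule 'IF x > t THEN active' from
    (x, label) training examples by maximizing training accuracy.
    Assumes at least two distinct x values. Returns the rule text
    and its estimated probability of correctness."""
    xs = sorted(x for x, _ in examples)
    # Candidate thresholds lie halfway between consecutive observed values.
    cands = [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    def acc(t):
        return sum((x > t) == lab for x, lab in examples) / len(examples)
    best = max(cands, key=acc)
    return f"IF x > {best:g} THEN active", acc(best)
```

On a cleanly separable training set the induced rule sits between the two classes and its probability of correctness is 1.0.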
Perry, Jeffrey J; Stiell, Ian G
2006-12-01
Traumatic injuries to the ankle/foot, knee, cervical spine, and head are very commonly seen in emergency and accident departments around the world. There has been much interest in the development of clinical decision rules to help guide the investigations of these patients in a standardised and cost-effective manner. In this article we reviewed the impact of the Ottawa ankle rules, Ottawa knee rules, Canadian C-spine rule and the Canadian CT head rule. The studies conducted have confirmed that the use of well developed clinical decision rules results in less radiography, less time spent in the emergency department and does not decrease patient satisfaction or result in misdiagnosis. Emergency physicians around the world should adopt the use of clinical decision rules for ankle/foot, knee, cervical spine and minor head injuries. With relatively simple implementation strategies, care can be standardized and costs reduced while providing excellent clinical care.
NASA Astrophysics Data System (ADS)
Baird, M. E.; Walker, S. J.; Wallace, B. B.; Webster, I. T.; Parslow, J. S.
2003-03-01
A simple model of estuarine eutrophication is built on biomechanical (or mechanistic) descriptions of a number of the key ecological processes in estuaries. Mechanistically described processes include the nutrient uptake and light capture of planktonic and benthic autotrophs, and the encounter rates of planktonic predators and prey. Other more complex processes, such as sediment biogeochemistry, detrital processes and phosphate dynamics, are modelled using empirical descriptions from the Port Phillip Bay Environmental Study (PPBES) ecological model. A comparison is made between the mechanistically determined rates of ecological processes and the analogous empirically determined rates in the PPBES ecological model. The rates generally agree, with a few significant exceptions. Model simulations were run at a range of estuarine depths and nutrient loads, with outputs presented as the annually averaged biomass of autotrophs. The simulations followed a simple conceptual model of eutrophication, suggesting a simple biomechanical understanding of estuarine processes can provide a predictive tool for ecological processes in a wide range of estuarine ecosystems.
Symmetry rules for the indirect nuclear spin-spin coupling tensor revisited
NASA Astrophysics Data System (ADS)
Buckingham, A. D.; Pyykkö, P.; Robert, J. B.; Wiesenfeld, L.
The symmetry rules of Buckingham and Love (1970), relating the number of independent components of the indirect spin-spin coupling tensor J to the symmetry of the nuclear sites, are shown to require modification if the two nuclei are exchanged by a symmetry operation. In that case, the anti-symmetric part of J does not transform as a second-rank polar tensor under symmetry operations that interchange the coupled nuclei and may be called an anti-tensor. New rules are derived and illustrated by simple molecular models.
A new simple ∞OH neuron model as a biologically plausible principal component analyzer.
Jankovic, M V
2003-01-01
A new approach to unsupervised learning in a single-layer neural network is discussed. An algorithm for unsupervised learning based upon the Hebbian learning rule is presented. A simple neuron model is analyzed. A dynamic neural model, which contains both feed-forward and feedback connections between the input and the output, has been adopted. The proposed learning algorithm would more accurately be called self-supervised rather than unsupervised. The solution proposed here is a modified Hebbian rule, in which the modification of the synaptic strength is proportional not to pre- and postsynaptic activity, but instead to the presynaptic and averaged value of postsynaptic activity. It is shown that the model neuron tends to extract the principal component from a stationary input vector sequence. Usually accepted additional decaying terms for the stabilization of the original Hebbian rule are avoided. Implementation of the basic Hebbian scheme would not lead to unrealistic growth of the synaptic strengths, thanks to the adopted network structure.
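For contrast, the conventional stabilized Hebbian update with the decaying term that this model avoids is Oja's rule. A sketch of that baseline, which also converges to the first principal component (our illustration, not the paper's model):

```python
import numpy as np

def oja_pca(data, eta=0.01, epochs=50, seed=0):
    """Oja's rule: dw = eta * y * (x - y * w). The -y^2 * w decay term
    stabilizes the plain Hebbian update, and w converges to the first
    principal component of the input distribution."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=data.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in data:
            y = w @ x
            w += eta * y * (x - y * w)
    return w / np.linalg.norm(w)
```

On data with one dominant variance direction, the learned weight vector aligns (up to sign) with that direction.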
Wittenberg, Philipp; Gan, Fah Fatt; Knoth, Sven
2018-04-17
The variable life-adjusted display (VLAD) is the first risk-adjusted graphical procedure proposed in the literature for monitoring the performance of a surgeon. It displays the cumulative sum of expected minus observed deaths. It has since become highly popular because the statistic plotted is easy to understand. But it is also easy to misinterpret a surgeon's performance by utilizing the VLAD, potentially leading to grave consequences. The problem of misinterpretation is essentially caused by the variance of the VLAD's statistic that increases with sample size. In order for the VLAD to be truly useful, a simple signaling rule is desperately needed. Various forms of signaling rules have been developed, but they are usually quite complicated. Without signaling rules, making inferences using the VLAD alone is difficult if not misleading. In this paper, we establish an equivalence between a VLAD with V-mask and a risk-adjusted cumulative sum (RA-CUSUM) chart based on the difference between the estimated probability of death and surgical outcome. Average run length analysis based on simulation shows that this particular RA-CUSUM chart has similar performance as compared to the established RA-CUSUM chart based on the log-likelihood ratio statistic obtained by testing the odds ratio of death. We provide a simple design procedure for determining the V-mask parameters based on a resampling approach. Resampling from a real data set ensures that these parameters can be estimated appropriately. Finally, we illustrate the monitoring of a real surgeon's performance using VLAD with V-mask. Copyright © 2018 John Wiley & Sons, Ltd.
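The VLAD statistic itself, the cumulative sum of expected minus observed deaths, is simple to compute; a minimal sketch (illustrative only, not the paper's V-mask procedure):

```python
def vlad(expected_risks, outcomes):
    """Variable life-adjusted display: cumulative sum of expected deaths
    (preoperative risk estimates) minus observed deaths (0/1 outcomes).
    The curve drifts up when a surgeon does better than predicted."""
    total, curve = 0.0, []
    for p, died in zip(expected_risks, outcomes):
        total += p - died
        curve.append(total)
    return curve
```

Three cases with risks 0.1, 0.5, 0.2 and one death on the second case give a curve of roughly 0.1, -0.4, -0.2: the death pulls the display down by more than either survival lifts it.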
Explicit Grammar Rules and L2 Acquisition
ERIC Educational Resources Information Center
Scheffler, Pawel; Cinciala, Marcin
2011-01-01
This article reports an empirical study that examines to what extent learners can identify and understand the grammatical structures they produce when they speak spontaneously. In the study, 20 upper-intermediate Polish learners of English were interviewed in English by the researchers. The structures used accurately by each learner were isolated…
Studies in Philippine Linguistics. Vol. 1, No. 2.
ERIC Educational Resources Information Center
Edrial-Luzares, Casilda, Ed.; Hale, Austin, Ed.
This volume is devoted to papers on an empirical or theoretical nature contributing to the study of language and communicative behavior in the Philippines. Articles included are: (1) "The Phonemic Consequences of Two Morphophonemic Rules in Molbog," by H. Arnold Thiessen; (2) "A Look at a Northern Kankanay Text (a syntactic…
Remember-Know: A Matter of Confidence
ERIC Educational Resources Information Center
Dunn, John C.
2004-01-01
This article critically examines the view that the signal detection theory (SDT) interpretation of the remember-know (RK) paradigm has been ruled out by the evidence. The author evaluates 5 empirical arguments against a database of 72 studies reporting RK data under 400 different conditions. These arguments concern (a) the functional independence…
Rule-violations sensitise towards negative and authority-related stimuli.
Wirth, Robert; Foerster, Anna; Rendel, Hannah; Kunde, Wilfried; Pfister, Roland
2018-05-01
Rule violations have usually been studied from a third-person perspective, identifying situational factors that render violations more or less likely. A first-person perspective of the agent that actively violates the rules, on the other hand, is only just beginning to emerge. Here we show that committing a rule violation sensitises towards subsequent negative stimuli as well as subsequent authority-related stimuli. In a Prime-Probe design, we used an instructed rule-violation task as the Prime and a word categorisation task as the Probe. Also, we employed a control condition that used a rule inversion task as the Prime (instead of rule violations). Probe targets were categorised faster after a violation relative to after a rule-based response if they related to either negative valence or authority. Inversions, however, primed only negative stimuli and did not accelerate the categorisation of authority-related stimuli. A heightened sensitivity towards authority-related targets thus seems to be specific to rule violations. A control experiment showed that these effects cannot be explained in terms of semantic priming. Therefore, we propose that rule violations necessarily activate authority-related representations that make rule violations qualitatively different from simple rule inversions.
Reinforcement Learning in a Nonstationary Environment: The El Farol Problem
NASA Technical Reports Server (NTRS)
Bell, Ann Maria
1999-01-01
This paper examines the performance of simple learning rules in a complex adaptive system based on a coordination problem modeled on the El Farol problem. The key features of the El Farol problem are that it typically involves a medium number of agents and that agents' pay-off functions have a discontinuous response to increased congestion. First we consider a single adaptive agent facing a stationary environment. We demonstrate that the simple learning rules proposed by Roth and Erev can be extremely sensitive to small changes in the initial conditions and that events early in a simulation can affect the performance of the rule over a relatively long time horizon. In contrast, a reinforcement learning rule based on standard practice in the computer science literature converges rapidly and robustly. The situation is reversed when multiple adaptive agents interact: the RE algorithms often converge rapidly to a stable average aggregate attendance despite the slow and erratic behavior of individual learners, while the CS-based learners frequently over-attend in the early and intermediate terms. The symmetric mixed-strategy equilibrium is unstable: all three learning rules ultimately tend towards pure strategies or stabilize in the medium term at non-equilibrium probabilities of attendance. The brittleness of the algorithms in different contexts emphasizes the importance of thorough and thoughtful examination of simulation-based results.
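The basic Roth-Erev update can be sketched as follows, a minimal textbook version under our own simplifications (extended variants add forgetting and experimentation parameters not shown here):

```python
import random

class RothErev:
    """Basic Roth-Erev reinforcement learner: each action keeps a
    propensity; the realized payoff is added to the chosen action's
    propensity, and choice probabilities are propensity-proportional."""
    def __init__(self, n_actions, initial=1.0, seed=0):
        self.q = [float(initial)] * n_actions
        self.rng = random.Random(seed)

    def choose(self):
        # Sample an action with probability proportional to its propensity.
        return self.rng.choices(range(len(self.q)), weights=self.q)[0]

    def update(self, action, payoff):
        self.q[action] += payoff
```

After one rewarded play of action 0 with payoff 5, the propensities become [6, 1], so action 0 is chosen about 6/7 of the time thereafter.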
Phonological Concept Learning.
Moreton, Elliott; Pater, Joe; Pertsova, Katya
2017-01-01
Linguistic and non-linguistic pattern learning have been studied separately, but we argue for a comparative approach. Analogous inductive problems arise in phonological and visual pattern learning. Evidence from three experiments shows that human learners can solve them in analogous ways, and that human performance in both cases can be captured by the same models. We test GMECCS (Gradual Maximum Entropy with a Conjunctive Constraint Schema), an implementation of the Configural Cue Model (Gluck & Bower, ) in a Maximum Entropy phonotactic-learning framework (Goldwater & Johnson, ; Hayes & Wilson, ) with a single free parameter, against the alternative hypothesis that learners seek featurally simple algebraic rules ("rule-seeking"). We study the full typology of patterns introduced by Shepard, Hovland, and Jenkins () ("SHJ"), instantiated as both phonotactic patterns and visual analogs, using unsupervised training. Unlike SHJ, Experiments 1 and 2 found that both phonotactic and visual patterns that depended on fewer features could be more difficult than those that depended on more features, as predicted by GMECCS but not by rule-seeking. GMECCS also correctly predicted performance differences between stimulus subclasses within each pattern. A third experiment tried supervised training (which can facilitate rule-seeking in visual learning) to elicit simple rule-seeking phonotactic learning, but cue-based behavior persisted. We conclude that similar cue-based cognitive processes are available for phonological and visual concept learning, and hence that studying either kind of learning can lead to significant insights about the other. Copyright © 2015 Cognitive Science Society, Inc.
Statistical mechanics of neocortical interactions. Derivation of short-term-memory capacity
NASA Astrophysics Data System (ADS)
Ingber, Lester
1984-06-01
A theory developed by the author to describe macroscopic neocortical interactions demonstrates that empirical values of chemical and electrical parameters of synaptic interactions establish several minima of the path-integral Lagrangian as a function of excitatory and inhibitory columnar firings. The number of possible minima, their time scales of hysteresis and probable reverberations, and their nearest-neighbor columnar interactions are all consistent with well-established empirical rules of human short-term memory. Thus, aspects of conscious experience are derived from neuronal firing patterns, using modern methods of nonlinear nonequilibrium statistical mechanics to develop realistic explicit synaptic interactions.
A delta-rule model of numerical and non-numerical order processing.
Verguts, Tom; Van Opstal, Filip
2014-06-01
Numerical and non-numerical order processing share empirical characteristics (distance effect and semantic congruity), but there are also important differences (in size effect and end effect). At the same time, models and theories of numerical and non-numerical order processing have developed largely separately. Here we combine insights from 2 earlier models to integrate them in a common framework. We argue that the same learning principle underlies numerical and non-numerical orders, but that environmental features determine the empirical differences. Implications for current theories of order processing are pointed out. PsycINFO Database Record (c) 2014 APA, all rights reserved.
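The learning principle invoked here is the delta rule; its generic Widrow-Hoff form for a single linear unit can be sketched as below (a generic illustration, not the paper's specific order-processing model):

```python
def delta_rule(samples, eta=0.1, epochs=200):
    """Widrow-Hoff delta rule for one linear unit:
    w <- w + eta * (target - w.x) * x, cycling over the training set."""
    dim = len(samples[0][0])
    w = [0.0] * dim
    for _ in range(epochs):
        for x, target in samples:
            y = sum(wi * xi for wi, xi in zip(w, x))
            err = target - y            # prediction error drives the update
            w = [wi + eta * err * xi for wi, xi in zip(w, x)]
    return w
```

Trained on points from the line y = 2*x0 + 1 (with a constant 1 appended as a bias input), the weights converge to approximately [2, 1].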
Using High Speed Smartphone Cameras and Video Analysis Techniques to Teach Mechanical Wave Physics
ERIC Educational Resources Information Center
Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano
2017-01-01
We propose the use of smartphone-based slow-motion video analysis techniques as a valuable tool for investigating physics concepts ruling mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allows one to measure, in a simple yet rigorous way, the speed of pulses…
When Practice Doesn't Lead to Retrieval: An Analysis of Children's Errors with Simple Addition
ERIC Educational Resources Information Center
de Villiers, Celéste; Hopkins, Sarah
2013-01-01
Counting strategies initially used by young children to perform simple addition are often replaced by more efficient counting strategies, decomposition strategies and rule-based strategies until most answers are encoded in memory and can be directly retrieved. Practice is thought to be the key to developing fluent retrieval of addition facts. This…
NASA Astrophysics Data System (ADS)
Ehrentreich, F.; Dietze, U.; Meyer, U.; Abbas, S.; Schulz, H.
1995-04-01
A main task within the SpecInfo project is to develop interpretation tools that can handle many more of the complicated, more specific spectrum-structure correlations. In the first step, the empirical knowledge about the assignment of structural groups and their characteristic IR bands was collected from the literature and represented in a computer-readable, well-structured form. Vague verbal rules are managed by introducing linguistic variables. The next step was the development of automatic rule-generating procedures. We combined and extended the IDIOTS algorithm with Blaffert's set-theory-based algorithm. The procedures were successfully applied to the SpecInfo database. Completing the preceding items is a prerequisite for improving the computerized structure-elucidation procedure.
Time series regression-based pairs trading in the Korean equities market
NASA Astrophysics Data System (ADS)
Kim, Saejoon; Heo, Jun
2017-07-01
Pairs trading is an instance of statistical arbitrage that relies on heavy quantitative data analysis to profit by capitalising on low-risk trading opportunities provided by anomalies of related assets. A key element in pairs trading is the rule by which open and close trading triggers are defined. This paper investigates the use of time series regression to define the rule, which has previously been identified with fixed threshold-based approaches. Empirical results indicate that our approach may yield significantly increased excess returns compared to ones obtained by previous approaches on large capitalisation stocks in the Korean equities market.
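The fixed-threshold baseline that the regression-based rule is compared against can be sketched as z-score triggers on a standardized spread. This is an illustrative sketch with assumed threshold values, not the paper's method:

```python
def threshold_signals(spread, open_z=2.0, close_z=0.5):
    """Fixed-threshold pairs-trading triggers on a standardized spread:
    open a position when |z| exceeds open_z, close it when the spread
    reverts inside close_z. Returns (index, 'open'/'close') events."""
    mean = sum(spread) / len(spread)
    sd = (sum((s - mean) ** 2 for s in spread) / len(spread)) ** 0.5
    events, in_trade = [], False
    for i, s in enumerate(spread):
        z = (s - mean) / sd
        if not in_trade and abs(z) > open_z:
            events.append((i, "open"))
            in_trade = True
        elif in_trade and abs(z) < close_z:
            events.append((i, "close"))
            in_trade = False
    return events
```

A single large excursion in an otherwise flat spread produces one open at the spike and one close as soon as the spread reverts.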
Barik, Sailen
2017-12-01
A significant number of proteins in all living species contain amino acid repeats (AARs) of various lengths and compositions, many of which play important roles in protein structure and function. Here, I have surveyed select homopolymeric single [(A)n] and double [(AB)n] AARs in the human proteome. A close examination of their codon pattern and analysis of RNA structure propensity led to the following set of empirical rules: (1) One class of amino acid repeats (Class I) uses a mixture of synonymous codons, some of which approximate the codon bias ratio in the overall human proteome; (2) The second class (Class II) disregards the codon bias ratio, and appears to have originated by simple repetition of the same codon (or just a few codons); and finally, (3) In all AARs (including Class I, Class II, and the in-betweens), the codons are chosen in a manner that precludes the formation of RNA secondary structure. It appears that the AAR genes have evolved by orchestrating a balance between codon usage and mRNA secondary structure. The insights gained here should provide a better understanding of AAR evolution and may assist in designing synthetic genes.
NASA Astrophysics Data System (ADS)
Batac, Rene C.; Paguirigan, Antonino A., Jr.; Tarun, Anjali B.; Longjas, Anthony G.
2017-04-01
We propose a cellular automata model for earthquake occurrences patterned after the sandpile model of self-organized criticality (SOC). By incorporating a single parameter describing the probability to target the most susceptible site, the model successfully reproduces the statistical signatures of seismicity. The energy distributions closely follow power-law probability density functions (PDFs) with a scaling exponent of around -1.6, consistent with the expectations of the Gutenberg-Richter (GR) law, for a wide range of the targeted triggering probability values. Additionally, for targeted triggering probabilities within the range 0.004-0.007, we observe spatiotemporal distributions that show bimodal behavior, which was not observed previously for the original sandpile. For this critical range of values for the probability, model statistics show remarkable agreement with long-period empirical data from earthquakes from different seismogenic regions. The proposed model has key advantages, the foremost of which is the fact that it simultaneously captures the energy, space, and time statistics of earthquakes by introducing just a single parameter, while keeping the simple rules of the sandpile almost unchanged. We believe that the critical targeting probability parameterizes the memory that is inherently present in earthquake-generating regions.
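The driving rule described, dropping a grain on the most susceptible site with some probability and on a random site otherwise, followed by standard toppling, can be sketched as below. This is an illustrative reconstruction: the lattice size, the toppling threshold, and the reading of "most susceptible" as the currently highest site are our assumptions.

```python
import random

def drive_and_relax(grid, p_target, zc=4, rng=random):
    """One drive-and-relax step of a sandpile with targeted triggering.
    With probability p_target the grain lands on the currently highest
    ('most susceptible') site, otherwise on a random site; the lattice
    then relaxes by standard toppling, grains falling off the open edges."""
    n = len(grid)
    if rng.random() < p_target:
        i, j = max(((a, b) for a in range(n) for b in range(n)),
                   key=lambda ab: grid[ab[0]][ab[1]])
    else:
        i, j = rng.randrange(n), rng.randrange(n)
    grid[i][j] += 1
    stack = [(i, j)]
    while stack:
        a, b = stack.pop()
        while grid[a][b] >= zc:             # topple until the site is stable
            grid[a][b] -= zc
            for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                na, nb = a + da, b + db
                if 0 <= na < n and 0 <= nb < n:
                    grid[na][nb] += 1       # grains leaving the edge are lost
                    stack.append((na, nb))
    return grid
```

After every drive-and-relax step the whole lattice is below the toppling threshold, which is the defining invariant of the relaxation.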
Deviation of Long-Period Tides from Equilibrium: Kinematics and Geostrophy
NASA Technical Reports Server (NTRS)
Egbert, Gary D.; Ray, Richard D.
2003-01-01
New empirical estimates of the long-period fortnightly (Mf) tide obtained from TOPEX/Poseidon (T/P) altimeter data confirm significant basin-scale deviations from equilibrium. Elevations in the low-latitude Pacific have reduced amplitude and lag those in the Atlantic by 30 deg or more. These interbasin amplitude and phase variations are robust features that are reproduced by numerical solutions of the shallow-water equations, even for a constant-depth ocean with schematic interconnected rectangular basins. A simplified analytical model for cooscillating connected basins also reproduces the principal features observed in the empirical solutions. This simple model is largely kinematic. Zonally averaged elevations within a simple closed basin would be nearly in equilibrium with the gravitational potential, except for a constant offset required to conserve mass. With connected basins these offsets are mostly eliminated by interbasin mass flux. Because of rotation, this flux occurs mostly in a narrow boundary layer across the mouth and at the western edge of each basin, and geostrophic balance in this zone supports small residual offsets (and phase shifts) between basins. The simple model predicts that this effect should decrease roughly linearly with frequency, a result that is confirmed by numerical modeling and empirical T/P estimates of the monthly (Mm) tidal constituent. This model also explains some aspects of the anomalous nonisostatic response of the ocean to atmospheric pressure forcing at periods of around 5 days.
Empirical State Error Covariance Matrix for Batch Estimation
NASA Technical Reports Server (NTRS)
Frisbee, Joe
2015-01-01
State estimation techniques effectively provide mean state estimates. However, the theoretical state error covariance matrices provided as part of these techniques often suffer from a lack of confidence in their ability to describe the uncertainty in the estimated states. By a reinterpretation of the equations involved in the weighted batch least squares algorithm, it is possible to directly arrive at an empirical state error covariance matrix. The proposed empirical state error covariance matrix will contain the effect of all error sources, known or not. This empirical error covariance matrix may be calculated as a side computation for each unique batch solution. Results based on the proposed technique will be presented for a simple two-observer problem with measurement errors only.
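One way to realize the idea, sketched under the assumption that the empirical covariance is formed by mapping post-fit residuals through the batch least-squares gain (the paper's exact construction may differ):

```python
import numpy as np

def empirical_batch_covariance(H, W, residuals):
    """Weighted batch least squares: alongside the formal covariance
    (H^T W H)^-1, form an empirical covariance by mapping the post-fit
    residuals through the least-squares gain K = (H^T W H)^-1 H^T W.
    The outer product (K r)(K r)^T then reflects whatever errors are
    actually present in the data, modeled or not."""
    H = np.asarray(H, float)
    W = np.asarray(W, float)
    r = np.asarray(residuals, float)
    formal = np.linalg.inv(H.T @ W @ H)   # theoretical covariance
    K = formal @ H.T @ W                  # batch least-squares gain
    e = K @ r                             # state error implied by residuals
    empirical = np.outer(e, e)            # empirical covariance contribution
    return formal, empirical
```

Averaging the empirical contribution over many batch solutions would give the side computation described in the abstract.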
Effect of homophily on network formation
NASA Astrophysics Data System (ADS)
Kim, Kibae; Altmann, Jörn
2017-03-01
Although there is much research on network formation based on the preferential attachment rule, prior work has not produced a formula that can both reproduce the shapes of cumulative degree distributions of empirical complex networks and intuitively represent theories of individual behavior. In this paper, we propose a formula that closes this gap by integrating into the preferential attachment rule (i.e., a node with higher degree is more likely to gain a new link) a representation of the theory that nodes prefer to connect to other nodes with similar attributes (i.e., homophily). Based on this formula, we simulate the shapes of cumulative degree distributions for different levels of homophily and five different seed networks. Our simulation results suggest that homophily and the preferential attachment rule interact for all five types of seed networks. Surprisingly, the resulting cumulative degree distribution in log-log scale always shifts from a concave shape to a convex shape as the level of homophily increases. Our formula can therefore explain intuitively why some empirical complex networks show a linear cumulative degree distribution in log-log scale while others show either a concave or convex shape. A further major finding is that homophily makes members of a group richer than people outside that group.
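A minimal sketch of degree-times-similarity attachment follows; the exponential similarity kernel is an assumed illustrative choice, not necessarily the paper's formula:

```python
import math
import random

def attach(degrees, attrs, new_attr, homophily, rng):
    """Pick an existing node for a new link: weight each candidate by its
    degree (preferential attachment) times a similarity factor that decays
    with attribute distance. homophily = 0 recovers pure preferential
    attachment; larger values increasingly favor similar nodes."""
    weights = [d * math.exp(-homophily * abs(a - new_attr))
               for d, a in zip(degrees, attrs)]
    total = sum(weights)
    x = rng.random() * total
    for j, w in enumerate(weights):
        x -= w
        if x <= 0:
            return j
    return len(weights) - 1
```

Growing a network by repeatedly calling attach and updating degrees would produce the degree distributions the paper simulates.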
Estimating enthalpy of vaporization from vapor pressure using Trouton's rule.
MacLeod, Matthew; Scheringer, Martin; Hungerbühler, Konrad
2007-04-15
The enthalpy of vaporization of liquids and subcooled liquids at 298 K (delta H(VAP)) is an important parameter in environmental fate assessments that consider spatial and temporal variability in environmental conditions. It has been shown that delta H(VAP) for non-hydrogen-bonding substances can be estimated from the vapor pressure at 298 K (P(L)) using an empirically derived linear relationship. Here, we demonstrate that the relationship between delta H(VAP) and P(L) is consistent with Trouton's rule and the Clausius-Clapeyron equation under the assumption that delta H(VAP) is linearly dependent on temperature between 298 K and the boiling point temperature. Our interpretation based on Trouton's rule substantiates the empirical relationship between delta H(VAP) and P(L) for non-hydrogen-bonding chemicals with subcooled liquid vapor pressures ranging over 15 orders of magnitude. We apply the relationship between delta H(VAP) and P(L) to evaluate data reported in literature reviews for several important classes of semivolatile environmental contaminants, including polycyclic aromatic hydrocarbons, chlorobenzenes, polychlorinated biphenyls, and polychlorinated dibenzo-dioxins and -furans, and illustrate the temperature dependence of results from a multimedia model presented as a partitioning map. The uncertainty associated with estimating delta H(VAP) from P(L) using this relationship is acceptable for most environmental fate modeling of non-hydrogen-bonding semivolatile organic chemicals.
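The estimation chain (Trouton's rule plus Clausius-Clapeyron) can be sketched as follows, under a constant-enthalpy simplification rather than the paper's linear temperature dependence; the bisection bounds are illustrative assumptions:

```python
import math

def deltaH_vap_from_pL(p298, trouton_S=87.0, R=8.314, p_atm=101325.0):
    """Estimate the enthalpy of vaporization (J/mol) from the liquid vapor
    pressure at 298 K (Pa) by combining Trouton's rule (delta H = deltaS_T * T_b,
    with delta S_T taken as roughly 87 J mol^-1 K^-1) with the Clausius-Clapeyron
    equation, assuming delta H constant with temperature (a simplification;
    the paper lets delta H vary linearly with T). Solves for the boiling
    point T_b by bisection."""
    def f(tb):
        dH = trouton_S * tb
        # Clausius-Clapeyron: ln(p298/p_atm) = -(dH/R)(1/298.15 - 1/tb)
        return math.log(p298 / p_atm) + (dH / R) * (1.0 / 298.15 - 1.0 / tb)
    lo, hi = 200.0, 2000.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    tb = 0.5 * (lo + hi)
    return trouton_S * tb
```

As expected, lower vapor pressure at 298 K implies a higher boiling point and hence a larger estimated enthalpy of vaporization.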
Design and Analysis of Complex D-Regions in Reinforced Concrete Structures
ERIC Educational Resources Information Center
Yindeesuk, Sukit
2009-01-01
STM design provisions, such as those in Appendix A of ACI318-08, consist of rules for evaluating the capacity of the load-resisting truss that is idealized to carry the forces through the D-Region. These code rules were primarily derived from test data on simple D-Regions such as deep beams and corbels. However, these STM provisions are taken as…
Biomarkers of Fatigue: Ranking Mental Fatigue Susceptibility
2010-12-10
expected declines in performance during the 36-hour, 15-minute period of sleep deprivation without caffeine. The simple change from baseline results...rankings for fatigue resistance were then determined via a percent-change rule similar to that used in Chaiken, Harville, Harrison, Fischer, Fisher...and Whitmore (2008). This rule ranks subjects on percent change of cognitive performance from a baseline performance (before fatigue) to a fatigue
Finding the Density of a Liquid Using a Metre Rule
ERIC Educational Resources Information Center
Chattopadhyay, K. N.
2008-01-01
A simple method, which is based on the principle of moment of forces only, is described for the determination of the density of liquids without measuring the mass and volume. At first, an empty test tube and a solid substance, which are hung on each side of a metre rule, are balanced and the moment arm of the test tube is measured. Keeping the…
Numerical calculation of the Fresnel transform.
Kelly, Damien P
2014-04-01
In this paper, we address the problem of calculating Fresnel diffraction integrals using a finite number of uniformly spaced samples. General and simple sampling rules of thumb are derived that allow the user to calculate the distribution for any propagation distance. It is shown how these rules can be extended to fast-Fourier-transform-based algorithms to increase calculation efficiency. A comparison with other theoretical approaches is made.
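One widely used sampling rule of thumb for FFT-based Fresnel calculations can be sketched as follows; it is a generic heuristic, not necessarily the specific rules derived in the paper:

```python
def fresnel_sampling_regime(N, dx, wavelength):
    """Simple sampling rule of thumb for FFT-based Fresnel propagation:
    with N uniform samples of pitch dx, the quadratic chirp in the direct
    (single-FFT) Fresnel integral is adequately sampled only for distances
    beyond z_c = N * dx**2 / wavelength; closer in, the convolution
    (angular-spectrum) form is the safer choice."""
    z_critical = N * dx ** 2 / wavelength
    def choose(z):
        return "single-FFT (far regime)" if z >= z_critical else "convolution (near regime)"
    return z_critical, choose
```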
An automated approach to the design of decision tree classifiers
NASA Technical Reports Server (NTRS)
Argentiero, P.; Chin, R.; Beaudet, P.
1982-01-01
An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
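The by-product rule for the global probability of correct classification is, under the stated independence assumption, just a product along the decision path; a minimal sketch (a hypothetical helper, not the authors' exact algorithm):

```python
def global_correct_probability(node_accuracies):
    """Assuming the decision rules at the tree nodes are statistically
    independent, the probability that a sample is routed correctly down a
    path is the product of the per-node correct-decision probabilities
    along that path."""
    p = 1.0
    for acc in node_accuracies:
        p *= acc
    return p
```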
New scheduling rules for a dynamic flexible flow line problem with sequence-dependent setup times
NASA Astrophysics Data System (ADS)
Kia, Hamidreza; Ghodsypour, Seyed Hassan; Davoudpour, Hamid
2017-09-01
In the literature, multi-objective dynamic scheduling problems and simple priority rules are widely studied. While simple priority rules are not efficient enough, owing to their simplicity and lack of general insight, composite dispatching rules perform very well because they are derived from experiments. In this paper, a dynamic flexible flow line problem with sequence-dependent setup times is studied. The objective of the problem is the minimization of mean flow time and mean tardiness. A 0-1 mixed integer model of the problem is formulated. Since the problem is NP-hard, four new composite dispatching rules are proposed to solve it by applying a genetic programming framework and choosing proper operators. Furthermore, a discrete-event simulation model is built to examine the performance of the scheduling rules, considering the four new heuristic rules and six heuristic rules adapted from the literature. The experimental results clearly show that the composite dispatching rules formed by genetic programming outperform the others in minimizing mean flow time and mean tardiness.
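For context, a classic composite dispatching rule (ATCS-style, combining processing time, slack, and sequence-dependent setup time) can be sketched as below; the paper's rules are instead evolved by genetic programming, so this is only an illustrative baseline:

```python
import math

def composite_priority(jobs, now, k1=2.0, k2=1.0):
    """ATCS-style composite dispatching index. Each job is a dict with
    processing time 'p', due date 'd', and sequence-dependent setup time 's';
    the job with the highest index is scheduled next. Urgent jobs (small
    slack) and jobs with short processing/setup times are favored."""
    avg_p = sum(j['p'] for j in jobs) / len(jobs)
    avg_s = sum(j['s'] for j in jobs) / len(jobs)
    def index(j):
        slack = max(j['d'] - j['p'] - now, 0.0)
        return (1.0 / j['p']) * math.exp(-slack / (k1 * avg_p)) \
                              * math.exp(-j['s'] / (k2 * avg_s))
    return max(jobs, key=index)
```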
Hamilton's rule and the causes of social evolution
Bourke, Andrew F. G.
2014-01-01
Hamilton's rule is a central theorem of inclusive fitness (kin selection) theory and predicts that social behaviour evolves under specific combinations of relatedness, benefit and cost. This review provides evidence for Hamilton's rule by presenting novel syntheses of results from two kinds of study in diverse taxa, including cooperatively breeding birds and mammals and eusocial insects. These are, first, studies that empirically parametrize Hamilton's rule in natural populations and, second, comparative phylogenetic analyses of the genetic, life-history and ecological correlates of sociality. Studies parametrizing Hamilton's rule are not rare and demonstrate quantitatively that (i) altruism (net loss of direct fitness) occurs even when sociality is facultative, (ii) in most cases, altruism is under positive selection via indirect fitness benefits that exceed direct fitness costs and (iii) social behaviour commonly generates indirect benefits by enhancing the productivity or survivorship of kin. Comparative phylogenetic analyses show that cooperative breeding and eusociality are promoted by (i) high relatedness and monogamy and, potentially, by (ii) life-history factors facilitating family structure and high benefits of helping and (iii) ecological factors generating low costs of social behaviour. Overall, the focal studies strongly confirm the predictions of Hamilton's rule regarding conditions for social evolution and their causes. PMID:24686934
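The rule itself is a one-line inequality; a minimal sketch:

```python
def hamiltons_rule(relatedness, benefit, cost):
    """Hamilton's rule: an allele for social behaviour is favoured when
    r*b - c > 0, i.e. when the indirect fitness benefit (benefit to the
    recipient weighted by relatedness) exceeds the direct fitness cost
    to the actor. Empirical parametrizations of r, b and c in natural
    populations are what the reviewed studies provide."""
    return relatedness * benefit - cost > 0
```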
A neural network architecture for implementation of expert systems for real time monitoring
NASA Technical Reports Server (NTRS)
Ramamoorthy, P. A.
1991-01-01
Since neural networks have the advantages of massive parallelism and simple architecture, they are good tools for implementing real time expert systems. In a rule based expert system, the antecedents of rules are in the conjunctive or disjunctive form. We constructed a multilayer feedforward type network in which neurons represent AND or OR operations of rules. Further, we developed a translator which can automatically map a given rule base into the network. Also, we proposed a new and powerful yet flexible architecture that combines the advantages of both fuzzy expert systems and neural networks. This architecture uses the fuzzy logic concepts to separate input data domains into several smaller and overlapped regions. Rule-based expert systems for time critical applications using neural networks, the automated implementation of rule-based expert systems with neural nets, and fuzzy expert systems vs. neural nets are covered.
Continuous punishment and the potential of gentle rule enforcement.
Erev, Ido; Ingram, Paul; Raz, Ornit; Shany, Dror
2010-05-01
The paper explores the conditions that determine the effect of rule enforcement policies that imply an attempt to punish all the visible violations of the rule. We start with a simple game-theoretic analysis that highlights the value of gentle COntinuous Punishment (gentle COP) policies. If the subjects of the rule are rational, gentle COP can eliminate violations even when the rule enforcer has limited resources. The second part of the paper uses simulations to examine the robustness of gentle COP policies to likely deviations from rationality. The results suggest that when the probability of detecting violations is sufficiently high, gentle COP policies can be effective even when the subjects of the rule are boundedly rational adaptive learners. The paper concludes with experimental studies that clarify the value of gentle COP policies in the lab and in an attempt to eliminate cheating in exams. Copyright (c) 2009 Elsevier B.V. All rights reserved.
INDEXABILITY AND OPTIMAL INDEX POLICIES FOR A CLASS OF REINITIALISING RESTLESS BANDITS.
Villar, Sofía S
2016-01-01
Motivated by a class of Partially Observable Markov Decision Processes with application in surveillance systems in which a set of imperfectly observed state processes is to be inferred from a subset of available observations through a Bayesian approach, we formulate and analyze a special family of multi-armed restless bandit problems. We consider the problem of finding an optimal policy for observing the processes that maximizes the total expected net rewards over an infinite time horizon subject to the resource availability. From the Lagrangian relaxation of the original problem, an index policy can be derived, as long as the existence of the Whittle index is ensured. We demonstrate that such a class of reinitializing bandits, in which a project's state deteriorates while active and resets to its initial state when passive until its completion, possesses the structural property of indexability, and we further show how to compute the index in closed form. In general, the Whittle index rule for restless bandit problems does not achieve optimality. However, we show that the proposed Whittle index rule is optimal for the problem under study in the case of stochastically heterogeneous arms under the expected total criterion, and it is further recovered by a simple tractable rule referred to as the 1-limited Round Robin rule. Moreover, we illustrate the significant suboptimality of another widely used heuristic, the Myopic index rule, by computing its suboptimality gap in closed form. We present numerical studies which illustrate, for the more general instances, the performance advantages of the Whittle index rule over other simple heuristics.
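The 1-limited Round Robin rule mentioned in the abstract is simple to state: observe exactly one arm per period, visiting the arms in a fixed cyclic order. A minimal sketch:

```python
from itertools import cycle

def one_limited_round_robin(n_arms, horizon):
    """1-limited Round Robin: activate exactly one process per period in a
    fixed cyclic order. The abstract notes this simple tractable rule
    recovers the (optimal) Whittle index policy in the setting studied."""
    order = cycle(range(n_arms))
    return [next(order) for _ in range(horizon)]
```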
Schädler, Marc R; Warzybok, Anna; Kollmeier, Birger
2018-01-01
The simulation framework for auditory discrimination experiments (FADE) was adopted and validated to predict the individual speech-in-noise recognition performance of listeners with normal and impaired hearing with and without a given hearing-aid algorithm. FADE uses a simple automatic speech recognizer (ASR) to estimate the lowest achievable speech reception thresholds (SRTs) from simulated speech recognition experiments in an objective way, independent from any empirical reference data. Empirical data from the literature were used to evaluate the model in terms of predicted SRTs and benefits in SRT with the German matrix sentence recognition test when using eight single- and multichannel binaural noise-reduction algorithms. To allow individual predictions of SRTs in binaural conditions, the model was extended with a simple better ear approach and individualized by taking audiograms into account. In a realistic binaural cafeteria condition, FADE explained about 90% of the variance of the empirical SRTs for a group of normal-hearing listeners and predicted the corresponding benefits with a root-mean-square prediction error of 0.6 dB. This highlights the potential of the approach for the objective assessment of benefits in SRT without prior knowledge about the empirical data. The predictions for the group of listeners with impaired hearing explained 75% of the empirical variance, while the individual predictions explained less than 25%. Possibly, additional individual factors should be considered for more accurate predictions with impaired hearing. A competing talker condition clearly showed one limitation of current ASR technology, as the empirical performance with SRTs lower than -20 dB could not be predicted.
On the analysis of photo-electron spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, C.-Z., E-mail: gao@irsamc.ups-tlse.fr; CNRS, LPT; Dinh, P.M.
2015-09-15
We analyze Photo-Electron Spectra (PES) for a variety of excitation mechanisms, from a simple mono-frequency laser pulse to involved combinations of pulses as used, e.g., in attosecond experiments. In the case of simple pulses, the peaks in PES reflect the occupied single-particle levels in combination with the given laser frequency. This usual, simple rule may fail badly in the case of excitation pulses with mixed frequencies and if resonant modes of the system are significantly excited. We thus develop an extension of the usual rule to cover all possible excitation scenarios, including mixed frequencies in the attosecond regime. We find that the spectral distributions of dipole, monopole, and quadrupole power for the given excitation, taken together and properly shifted by the single-particle energies, provide a pertinent picture of the PES in all situations. This leads to the derivation of a generalized relation allowing one to understand photo-electron yields even in complex experimental setups.
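The "usual, simple rule" for mono-frequency pulses places PES peaks at integer multiples of the photon energy minus each single-particle binding energy; a minimal sketch (units assumed consistent, e.g. atomic units):

```python
def pes_peak_energies(binding_energies, photon_energy, n_max=3):
    """Usual simple rule for photo-electron spectra with a mono-frequency
    pulse: peaks appear at E_kin = n * photon_energy - eps for each occupied
    single-particle level with binding energy eps (> 0) and photon number n,
    keeping only positive kinetic energies. The paper's point is that this
    rule breaks down for mixed-frequency or resonant excitation."""
    peaks = []
    for eps in binding_energies:
        for n in range(1, n_max + 1):
            e_kin = n * photon_energy - eps
            if e_kin > 0:
                peaks.append(e_kin)
    return sorted(peaks)
```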
Yaraghi, Niam; Gopal, Ram D
2018-03-01
Policy Points: Frequent data breaches in the US health care system undermine the privacy of millions of patients every year; a large number of them happen among business associates of the health care providers, which continue to gain unprecedented access to patients' data as the US health care system becomes digitally integrated. Implementation of the HIPAA Omnibus Rules in 2013 has led to a significant decrease in the number of privacy breach incidents among business associates. Frequent data breaches in the US health care system undermine the privacy of millions of patients every year. A large number of such breaches happen among business associates of the health care providers that continue to gain unprecedented access to patients' data as the US health care system becomes digitally integrated. The Omnibus Rules of the Health Insurance Portability and Accountability Act (HIPAA), which were enacted in 2013, significantly increased the regulatory oversight and privacy protection requirements of business associates. The objective of this study is to empirically examine the effects of this shift in policy on the frequency of medical privacy breaches among business associates in the US health care system. The findings of this research shed light on how regulatory efforts can protect patients' privacy. Using publicly available data on breach incidents between October 2009 and August 2017 as reported by the Office for Civil Rights (OCR), we conducted an interrupted time-series analysis and a difference-in-differences analysis to examine the immediate and long-term effects of implementation of the HIPAA Omnibus Rules on the frequency of medical privacy breaches. We show that implementation of the Omnibus Rules led to a significant reduction in the number of breaches among business associates and prevented 180 privacy breaches from happening, which could have affected nearly 18 million Americans.
Implementation of the HIPAA Omnibus Rules may have been a successful federal policy in enhancing privacy protection efforts and reducing the number of breach incidents in the US health care system. © 2018 Milbank Memorial Fund.
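The difference-in-differences logic described above can be sketched with a simple two-period estimator; this is a generic illustration, not the study's full specification:

```python
def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Two-period difference-in-differences estimate: the change in mean
    outcome for the treated group (here, business associates after the 2013
    HIPAA Omnibus Rules) minus the change for the control group, which nets
    out time trends common to both groups."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treated_post) - mean(treated_pre)) \
         - (mean(control_post) - mean(control_pre))
```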
Explanation-based learning in infancy.
Baillargeon, Renée; DeJong, Gerald F
2017-10-01
In explanation-based learning (EBL), domain knowledge is leveraged in order to learn general rules from few examples. An explanation is constructed for initial exemplars and is then generalized into a candidate rule that uses only the relevant features specified in the explanation; if the rule proves accurate for a few additional exemplars, it is adopted. EBL is thus highly efficient because it combines both analytic and empirical evidence. EBL has been proposed as one of the mechanisms that help infants acquire and revise their physical rules. To evaluate this proposal, 11- and 12-month-olds (n = 260) were taught to replace their current support rule (that an object is stable when half or more of its bottom surface is supported) with a more sophisticated rule (that an object is stable when half or more of the entire object is supported). Infants saw teaching events in which asymmetrical objects were placed on a base, followed by static test displays involving a novel asymmetrical object and a novel base. When the teaching events were designed to facilitate EBL, infants learned the new rule with as few as two (12-month-olds) or three (11-month-olds) exemplars. When the teaching events were designed to impede EBL, however, infants failed to learn the rule. Together, these results demonstrate that even infants, with their limited knowledge about the world, benefit from the knowledge-based approach of EBL.
Zhao, Xin; Kushnir, Tamar
2018-01-01
Young children demonstrate awareness of normativity in various domains of social learning. It is unclear, however, whether children recognize that rules can be changed in certain contexts and by certain people or groups. Across three studies, we provided empirical evidence that children consider individual authority and collective agreement when reasoning about who can change rules. In Study 1, children aged 4-7 years watched videos of children playing simple sorting and stacking games in groups or alone. Across conditions, the group game was initiated (a) by one child, (b) by collaborative agreement, or (c) by an adult authority figure. In the group games with a rule initiated by one child, children attributed ability to change rules only to that individual and not his or her friends, and they mentioned ownership and authority in their explanations. When the rule was initiated collaboratively, older children said that no individual could change the rule, whereas younger children said that either individual could do so. When an adult initiated the rule, children stated that only the adult could change it. In contrast, children always endorsed a child's decision to change his or her own solitary rule and never endorsed any child's ability to change moral and conventional rules in daily life. Age differences corresponded to beliefs about friendship and agreement in peer play (Study 2) and disappeared when the decision process behind and normative force of collaboratively initiated rules were clarified (Study 3). These results show important connections between normativity and considerations of authority and collaboration during early childhood. Copyright © 2017 Elsevier Inc. All rights reserved.
Pushing the rules: effects and aftereffects of deliberate rule violations.
Wirth, Robert; Pfister, Roland; Foerster, Anna; Huestegge, Lynn; Kunde, Wilfried
2016-09-01
Most of our daily life is organized around rules and social norms. But what makes rules so special? And what if one were to break a rule intentionally? Can we simply free ourselves from the present set of rules, or do we automatically adhere to them? How do rule violations influence subsequent behavior? To investigate the effects and aftereffects of violating a simple S-R rule, we conducted three experiments that investigated continuous finger-tracking responses on an iPad. Our experiments show that rule violations are distinct from rule-based actions in both response times and movement trajectories: they take longer to initiate and execute, and their movement trajectory is heavily contorted. The data not only show differences between the two types of response (rule-based vs. violation) but also yield a characteristic pattern of aftereffects for rule violations: rule violations do not trigger adaptation effects that render further rule violations less difficult; instead, every rule violation demands renewed effort from the agent. The study represents a first step towards understanding the signature and underlying mechanisms of deliberate rule violations: they cannot be acted out by themselves but require the activation of the original rule first. Consequently, they are best understood as reformulations of existing rules that are not accessible on their own but need to be constantly derived from the original rule, with an add-on that might entail an active tendency to steer away from mental representations that reflect (socially) unwanted behavior.
NASA Astrophysics Data System (ADS)
Chen, Jing-Chao; Zhou, Yu; Wang, Xi
2018-02-01
Technical trading rules have been widely used by practitioners in financial markets for a long time. Their profitability remains controversial, and few studies consider the stationarity of the technical indicators used in trading rules. We convert MA, KDJ, and Bollinger bands into stationary processes and investigate the profitability of these trading rules using three high-frequency datasets (15 s, 30 s, and 60 s) of CSI300 Stock Index Futures from January 4th, 2012 to December 31st, 2016. Several performance and risk measures are adopted to assess the practical value of all trading rules directly, while the ADF test is used to verify stationarity and the SPA test to check whether trading rules perform well due to intrinsic superiority or pure luck. The results show that there are several significant combinations of parameters for each indicator when transaction costs are not taken into consideration. Once transaction costs are included, trading profits are eliminated completely. We also propose a method to reduce the risk of technical trading rules.
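As a generic illustration of how transaction costs erode rule profits (not the paper's stationarized indicators), a moving-average crossover rule with a proportional cost might look like:

```python
def ma_rule_pnl(prices, short=5, long=20, cost=0.0):
    """Simple moving-average crossover rule on a price series: go long when
    the short MA is above the long MA, stay flat otherwise. `cost` is a
    proportional transaction cost charged on every position change, which
    is what eliminates profits once included."""
    pnl, pos = 0.0, 0
    for t in range(long, len(prices)):
        sma = sum(prices[t - short:t]) / short   # MAs use data up to t-1
        lma = sum(prices[t - long:t]) / long
        new_pos = 1 if sma > lma else 0
        if new_pos != pos:
            pnl -= cost * prices[t]              # pay cost on entry/exit
            pos = new_pos
        pnl += pos * (prices[t] - prices[t - 1])
    return pnl
```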
Knowledge base rule partitioning design for CLIPS
NASA Technical Reports Server (NTRS)
Mainardi, Joseph D.; Szatkowski, G. P.
1990-01-01
This paper describes a knowledge base (KB) partitioning approach to solve the problem of real-time performance when using the CLIPS AI shell with large numbers of rules and facts. This work is funded under the joint USAF/NASA Advanced Launch System (ALS) Program as applied research in expert systems to perform vehicle checkout for real-time controller and diagnostic monitoring tasks. The main objective of the Expert System advanced development project (ADP-2302) is to provide robust systems responding to new data frames at 0.1 to 1.0 second intervals. The intelligent system control must be performed within the specified real-time window in order to meet the demands of the given application. Partitioning the KB reduces the complexity of the inferencing Rete net at any given time. This reduced complexity improves performance without undue impact during load and unload cycles. The second objective is to produce highly reliable intelligent systems. This requires simple and automated approaches to the KB verification and validation (V&V) task. Partitioning the KB reduces rule interaction complexity overall. Reduced interaction simplifies the V&V testing necessary by focusing attention only on individual areas of interest. Many systems require a robustness that involves a large number of rules, most of which are mutually exclusive under different phases or conditions. The ideal solution is to control the knowledge base by loading the rules that directly apply to the current condition, while stripping out all rules and facts that are not used during that cycle. The practical approach is to cluster rules and facts into associated 'blocks'. A simple approach has been designed to control the addition and deletion of 'blocks' of rules and facts while allowing real-time operations to run freely. Timing tests of real-time performance for specific machines under real-time operating systems have not been completed but are planned as part of the analysis process to validate the design.
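The block-based load/unload idea can be sketched with a plain-Python stand-in for the CLIPS mechanism; names and structure here are assumptions for illustration:

```python
class PartitionedKB:
    """Minimal sketch of block-based knowledge-base partitioning: rules and
    facts are clustered into named blocks that are loaded for the current
    phase and stripped out when the phase ends, keeping the active rule
    network small."""
    def __init__(self):
        self.blocks = {}      # block name -> list of rules/facts
        self.active = set()   # blocks currently loaded

    def define_block(self, name, items):
        self.blocks[name] = list(items)

    def load(self, name):
        self.active.add(name)

    def unload(self, name):
        self.active.discard(name)

    def working_memory(self):
        # only rules/facts in loaded blocks are visible to the inference engine
        return [item for name in self.active for item in self.blocks[name]]
```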
Bösner, Stefan; Haasenritter, Jörg; Becker, Annette; Karatolios, Konstantinos; Vaucher, Paul; Gencer, Baris; Herzig, Lilli; Heinzel-Gutenbrunner, Monika; Schaefer, Juergen R; Abu Hani, Maren; Keller, Heidi; Sönnichsen, Andreas C; Baum, Erika; Donner-Banzhoff, Norbert
2010-09-07
Chest pain can be caused by various conditions, with life-threatening cardiac disease being of greatest concern. Prediction scores to rule out coronary artery disease have been developed for use in emergency settings. We developed and validated a simple prediction rule for use in primary care. We conducted a cross-sectional diagnostic study in 74 primary care practices in Germany. Primary care physicians recruited all consecutive patients who presented with chest pain (n = 1249) and recorded symptoms and findings for each patient (derivation cohort). An independent expert panel reviewed follow-up data obtained at six weeks and six months on symptoms, investigations, hospital admissions and medications to determine the presence or absence of coronary artery disease. Adjusted odds ratios of relevant variables were used to develop a prediction rule. We calculated measures of diagnostic accuracy for different cut-off values for the prediction scores using data derived from another prospective primary care study (validation cohort). The prediction rule contained five determinants (age/sex, known vascular disease, patient assumes pain is of cardiac origin, pain is worse during exercise, and pain is not reproducible by palpation), with the score ranging from 0 to 5 points. The area under the curve (receiver operating characteristic curve) was 0.87 (95% confidence interval [CI] 0.83-0.91) for the derivation cohort and 0.90 (95% CI 0.87-0.93) for the validation cohort. The best overall discrimination was with a cut-off value of 3 (positive result 3-5 points; negative result 0-2 points).
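The five-determinant rule described in this abstract can be sketched as a small scoring function. This is a hypothetical illustration of the abstract's description only: each determinant is taken as a precomputed boolean (the exact age/sex criterion is defined in the paper itself).

```python
def chest_pain_score(age_sex_criterion, known_vascular_disease,
                     assumes_cardiac_origin, worse_on_exercise,
                     not_reproducible_by_palpation):
    """One point per determinant present (score range 0-5).

    A cut-off of 3 gave the best overall discrimination in the study:
    3-5 points is a positive result, 0-2 points a negative result.
    """
    score = sum([age_sex_criterion, known_vascular_disease,
                 assumes_cardiac_origin, worse_on_exercise,
                 not_reproducible_by_palpation])
    return score, ("positive" if score >= 3 else "negative")
```

As a usage example, a patient meeting the age/sex criterion with known vascular disease who assumes a cardiac origin would score 3 and screen positive.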
Investigating the Impacts of Design Heuristics on Idea Initiation and Development
ERIC Educational Resources Information Center
Kramer, Julia; Daly, Shanna R.; Yilmaz, Seda; Seifert, Colleen M.; Gonzalez, Richard
2015-01-01
This paper presents an analysis of engineering students' use of Design Heuristics as part of a team project in an undergraduate engineering design course. Design Heuristics are an empirically derived set of cognitive "rules of thumb" for use in concept generation. We investigated heuristic use in the initial concept generation phase,…
Application of Fuzzy Reasoning for Filtering and Enhancement of Ultrasonic Images
NASA Technical Reports Server (NTRS)
Sacha, J. P.; Cios, K. J.; Roth, D. J.; Berke, L.; Vary, A.
1994-01-01
This paper presents a new type of adaptive fuzzy operator for the detection of isolated abnormalities and the enhancement of raw ultrasonic images. Fuzzy sets used in the decision rules are defined for each image based on empirical statistics of the color intensities. Examples of the method are also presented in the paper.
Quality, Conformity, and Conflict: Questioning the Assumptions of Osborn's Brainstorming Technique
ERIC Educational Resources Information Center
Goldenberg, Olga; Wiley, Jennifer
2011-01-01
Divergent thinking tasks are a popular basis for research on group creative problem solving, or brainstorming. The brainstorming literature has been dominated by research that investigates group performance by measuring the total number of generated ideas using the original rules put forth by Osborn (1953). This review of empirical literature on…
Signatures and Popular Literacy in Early Seventeenth-Century Japan
ERIC Educational Resources Information Center
Rubinger, Richard
2006-01-01
My paper looks at "signatures" in the form of "ciphers" (kao) and other personal marks made on population registers, town rules, and apostasy oaths in the early seventeenth century to provide some empirical evidence of very high literacy among village leaders. The essay also argues, using the same data, that literacy had…
ERIC Educational Resources Information Center
Glade, Matthias; Prediger, Susanne
2017-01-01
According to the design principle of progressive schematization, learning trajectories towards procedural rules can be organized as independent discoveries when the learning arrangement invites the students first to develop models for mathematical concepts and model-based informal strategies; then to explore the strategies and to discover pattern…
On Empirical Evidence for the Existence of Rules Governing Speech-Using Behavior.
ERIC Educational Resources Information Center
Sanders, Robert E.; Schneider, Michael
Departing from Baconian science which focuses on explanation of the occurrence of events, Chomsky's linguistics involves a different orientation--namely the explanation of form to account for linguistic behavior. The "knowledge" upon which linguistic judgements are based involves the premise of innate mechanisms. The assumption that speakers and…
Wheelchair batteries. II: Capacity, sizing, and life.
Kauzlarich, J J
1990-01-01
The characteristics of lead-acid batteries for wheelchairs are presented in terms of a new empirical equation for capacity, the application of the Palmgren-Miner Rule for sizing the battery, and the effect of depth of discharge on cycle life. A brief section on selecting an economical battery for an electric wheelchair is included.
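The Palmgren-Miner rule treats wear as linearly cumulative damage: the fractions of cycle life consumed at each depth of discharge are summed, and the component is deemed worn out when the total reaches 1. A minimal sketch with hypothetical duty-profile numbers (not values from the paper):

```python
def miner_damage(cycles_by_dod):
    """Palmgren-Miner cumulative damage: sum over depth-of-discharge
    levels of (cycles accumulated / rated cycle life at that DoD).
    The battery is considered worn out when the total reaches 1.0."""
    return sum(n / life for n, life in cycles_by_dod)

# Hypothetical duty profile: (cycles accumulated, rated life at that DoD)
profile = [(100, 2000),   # shallow discharges
           (50, 500),     # deep discharges
           (10, 200)]     # very deep discharges
damage = miner_damage(profile)  # 0.05 + 0.10 + 0.05 = 0.20
```

A damage fraction of 0.20 would mean roughly a fifth of the battery's fatigue life is consumed under this profile.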
Using ITS to Create an Insurance Industry Application: A Joint Case Study.
ERIC Educational Resources Information Center
Boies, Stephen J.; And Others
1993-01-01
Presents an empirical case study of the use of ITS, a software development environment designed by IBM, by Continental Insurance for underwriting applications. Use of a rule-based user interface style that made electronic forms look like standard insurance industry paper forms and worked according to Continental's guidelines is described.…
Development of Turkish Education Policy and the Modernization of Primary Education Revisited
ERIC Educational Resources Information Center
Sevinc, Muzeyyen
2006-01-01
The Turkish Republic was born out of the ruins of the Ottoman Empire (1270-1920), which extended into three continents and ruled people of various ethnic denominations having different languages, religions, and culture. Elementary education for the masses was left to the people themselves, with little input from the imperial administration. This…
Does Teacher Certification Program Lead to Better Quality Teachers? Evidence from Indonesia
ERIC Educational Resources Information Center
Kusumawardhani, Prita Nurmalia
2017-01-01
This paper examines the impact of the teacher certification program in Indonesia in 2007 and 2008 on student and teacher outcomes. I create a rule-based instrumental variable from discontinuities arising from the assignment mechanism of teachers into the certification program. The thresholds are determined empirically. The study applies a two-sample…
Technetium: The First Radioelement on the Periodic Table
ERIC Educational Resources Information Center
Johnstone, Erik V.; Yates, Mary Anne; Poineau, Frederic; Sattelberger, Alfred P.; Czerwinski, Kenneth R.
2017-01-01
The radioactive nature of technetium is discussed using a combination of introductory nuclear physics concepts and empirical trends observed in the chart of the nuclides and the periodic table of the elements. Trends such as the enhanced stability of nucleon pairs, magic numbers, and Mattauch's rule are described. The concepts of nuclear binding…
Evidence for the Early Emergence of the Simple View of Reading in a Transparent Orthography
ERIC Educational Resources Information Center
Kendeou, Panayiota; Papadopoulos, Timothy C.; Kotzapoulou, Marianna
2013-01-01
The main aim of the present study was to empirically test the emergence of the Simple View of Reading (SVR) in a transparent orthography, and specifically in Greek. To do so, we examined whether the constituent components of the SVR could be identified in young, Greek-speaking children even before the beginning of formal reading instruction. Our…
ERIC Educational Resources Information Center
Lee, Wan-Fung; Bulcock, Jeffrey Wilson
The purposes of this study are: (1) to demonstrate the superiority of simple ridge regression over ordinary least squares regression through theoretical argument and empirical example; (2) to modify ridge regression through use of the variance normalization criterion; and (3) to demonstrate the superiority of simple ridge regression based on the…
A Simple Method to Determine the "R" or "S" Configuration of Molecules with an Axis of Chirality
ERIC Educational Resources Information Center
Wang, Cunde; Wu, Weiming
2011-01-01
A simple method for the "R" or "S" designation of molecules with an axis of chirality is described. The method involves projection of the substituents along the chiral axis, utilizes the Cahn-Ingold-Prelog sequence rules in assigning priority to the substituents, is easy to use, and has broad applicability. (Contains 5 figures.)
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-17
... solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR... PHLX. Also, in order to continue to maintain a relatively simple routing table and fee schedule, the... routed to PHLX while also maintaining a simple pricing structure. As it has done before, despite...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-01
... Order in a Simple or Complex Order that executes against non-Initiating Order interest and will also pay... to solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2... paid to members executing electronically- delivered Customer Simple Orders in Penny Pilot Options and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-08
... notice to solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1... exchanges in the listed options marketplace. The Exchange proposes to adopt a set of fees for simple, non... Public Customer simple, non-complex Maker orders in all multiply-listed index and ETF options classes...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-18
... notice to solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1...) The Customer \\3\\ Rebate Program in Section B; (ii) Simple Order pricing in Section I entitled Rebates... Exchange proposes to amend the Simple Order Fees for Removing Liquidity in Section I which are applicable...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-20
... solicit comments on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR... Public Customer complex orders, including those that trade against simple (non-complex) orders (excluding... rebate for all Maker simple orders (excluding trades on the open, for which no fees are assessed or...
Tony Blair and the Politics of Race in Education: Whiteness, "Doublethink" and New Labour
ERIC Educational Resources Information Center
Gillborn, David
2008-01-01
It is tempting to view the Blairite legacy as a simple story of political hypocrisy: a government, swept to power after almost two decades of Conservative rule, promising much but reneging on those commitments and falling back on Thatcherite authoritarian popularism when the going got tough. But that would be too simple a story. The Blairite…
NASA Astrophysics Data System (ADS)
Vanini, Seyed Ali Sadough; Abolghasemzadeh, Mohammad; Assadi, Abbas
2013-07-01
Functionally graded steels with graded ferritic and austenitic regions including bainite and martensite intermediate layers produced by electroslag remelting have attracted much attention in recent years. In this article, an empirical model based on the Zener-Hollomon (Z-H) constitutive equation with generalized material constants is presented to investigate the effects of temperature and strain rate on the hot working behavior of functionally graded steels. Next, a theoretical model, generalized by strain compensation, is developed for the flow stress estimation of functionally graded steels under hot compression based on the phase mixture rule and boundary layer characteristics. The model is used for different strains and grading configurations. Specifically, the results for αβγMγ steels from empirical and theoretical models showed excellent agreement with those of experiments of other references within acceptable error.
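The Zener-Hollomon parameter named in this abstract combines strain rate and temperature into a single temperature-compensated deformation rate, Z = ε̇·exp(Q/RT). A minimal sketch with illustrative constants (not the paper's fitted material constants):

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def zener_hollomon(strain_rate, temperature_K, Q_activation):
    """Z = strain_rate * exp(Q / (R*T)).

    Higher Z corresponds to faster deformation and/or lower temperature;
    flow stress models (e.g. the sinh law) are typically fit against Z."""
    return strain_rate * math.exp(Q_activation / (R * temperature_K))

# Illustrative hot-working conditions only:
Z = zener_hollomon(strain_rate=0.1, temperature_K=1273.0, Q_activation=300e3)
```

Z grows as temperature drops at fixed strain rate, and scales linearly with strain rate at fixed temperature.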
Organizational Knowledge Transfer Using Ontologies and a Rule-Based System
NASA Astrophysics Data System (ADS)
Okabe, Masao; Yoshioka, Akiko; Kobayashi, Keido; Yamaguchi, Takahira
In recent automated and integrated manufacturing, so-called intelligence skill is becoming more and more important, and its efficient transfer to next-generation engineers is one of the most urgent issues. In this paper, we propose a new approach that avoids costly OJT (on-the-job training): the combined use of a domain ontology, a rule ontology, and a rule-based system. Intelligence skill can be decomposed into pieces of simple engineering rules. A rule ontology consists of these engineering rules as primitives and the semantic relations among them; a domain ontology consists of the technical terms in the engineering rules and the semantic relations among them. The rule ontology helps novices get the total picture of the intelligence skill, and the domain ontology helps them understand the exact meanings of the engineering rules. A rule-based system helps domain experts externalize their tacit intelligence skill to ontologies and also helps novices internalize it. As a case study, we applied our proposal to an actual job at a remote control and maintenance office of hydroelectric power stations at Tokyo Electric Power Co., Inc. We also conducted an evaluation experiment for this case study, and the result supports our proposal.
Empirical and theoretical analysis of complex systems
NASA Astrophysics Data System (ADS)
Zhao, Guannan
This thesis is an interdisciplinary work under the heading of complexity science which focuses on an arguably common "hard" problem across physics, finance and biology [1], to quantify and mimic the macroscopic "emergent phenomenon" in large-scale systems consisting of many interacting "particles" governed by microscopic rules. In contrast to traditional statistical physics, we are interested in systems whose dynamics are subject to feedback, evolution, adaption, openness, etc. Global financial markets, like the stock market and currency market, are ideal candidate systems for such a complexity study: there exists a vast amount of accurate data, which is the aggregate output of many autonomous agents continuously competing with each other. We started by examining the ultrafast "mini flash crash (MFC)" events in the US stock market. An abrupt system-wide composition transition from a mixed human-machine phase to a new all-machine phase is uncovered, and a novel theory developed to explain this observation. Then, in the study of the FX market, we found an unexpected variation in the synchronicity of price changes in different market subsections as a function of the overall trading activity. Several survival models have been tested in analyzing the distribution of waiting times to the next price change. In the region of long waiting times, the distribution for each currency pair exhibits a power law with exponent in the vicinity of 3.5. By contrast, for short waiting times only, the market activity can be mimicked by the fluctuations emerging from a finite resource competition model containing multiple agents with limited rationality (the so-called El Farol Model). Switching to the biomedical domain, we present a minimal mathematical model built around a co-evolving resource network and cell population, yielding good agreement with primary tumors in mouse experiments and with clinical metastasis data.
In the quest to understand contagion phenomena in systems where social group structures evolve on a timescale similar to that of individual-level transmission, we investigated the process of transmission through a model population comprising social groups that follow simple dynamical rules for growth and break-up; the profiles produced bear a striking resemblance to empirical data obtained from social, financial and biological systems. Finally, for better implementation of a widely accepted power-law test algorithm, we developed a fast testing procedure using parallel computation.
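The power-law waiting-time exponent near 3.5 mentioned above is usually estimated by maximum likelihood over the tail. A minimal sketch of the standard continuous MLE (the Clauset-Shalizi-Newman form), checked here against synthetic data with a known exponent; this is an illustration, not the thesis's testing procedure:

```python
import math
import random

def powerlaw_mle_alpha(data, xmin):
    """Continuous maximum-likelihood estimate of the power-law exponent
    alpha for the tail x >= xmin:  alpha_hat = 1 + n / sum(ln(x_i/xmin))."""
    tail = [x for x in data if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic samples from p(x) ~ x**(-3.5), x >= 1, via the inverse CDF:
# x = xmin * (1 - u)**(-1/(alpha - 1))
random.seed(1)
samples = [(1.0 - random.random()) ** (-1.0 / 2.5) for _ in range(20000)]
alpha_hat = powerlaw_mle_alpha(samples, 1.0)  # should land near 3.5
```

The estimator's standard error scales as (alpha - 1)/sqrt(n), so 20,000 tail samples pin the exponent to within a few hundredths.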
Evidence flow graph methods for validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant
1989-01-01
The results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems are given. A translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis were developed. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.
Khadilkar, Mihir R; Escobedo, Fernando A
2014-10-17
Sought-after ordered structures of mixtures of hard anisotropic nanoparticles can often be thermodynamically unfavorable due to the components' geometric incompatibility to densely pack into regular lattices. A simple compatibilization rule is identified wherein the particle sizes are chosen such that the order-disorder transition pressures of the pure components match (and the entropies of the ordered phases are similar). Using this rule with representative polyhedra from the truncated-cube family that form pure-component plastic crystals, Monte Carlo simulations show the formation of plastic-solid solutions for all compositions and for a wide range of volume fractions.
A programmable rules engine to provide clinical decision support using HTML forms.
Heusinkveld, J; Geissbuhler, A; Sheshelidze, D; Miller, R
1999-01-01
The authors have developed a simple method for specifying rules to be applied to information on HTML forms. This approach allows clinical experts, who lack the programming expertise needed to write CGI scripts, to construct and maintain domain-specific knowledge and ordering capabilities within WizOrder, the order-entry and decision support system used at Vanderbilt Hospital. The clinical knowledge base maintainers use HTML editors to create forms and spreadsheet programs for rule entry. A test environment has been developed which uses Netscape to display forms; the production environment displays forms using an embedded browser.
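The general pattern described in this abstract — domain experts maintaining rules outside of program code, applied to name/value pairs submitted from a form — can be sketched language-agnostically. The rule content below is hypothetical and is not drawn from the WizOrder knowledge base:

```python
# Hypothetical sketch: rules as (condition, message) pairs evaluated
# against the name/value pairs submitted from an HTML form.
def apply_rules(form_values, rules):
    """Return the advisory message for every rule whose condition holds."""
    return [message for condition, message in rules if condition(form_values)]

rules = [
    (lambda f: float(f.get("potassium", 0)) > 5.0,
     "Warn: elevated potassium - review potassium-sparing orders"),
    (lambda f: f.get("allergy") == "penicillin" and f.get("order") == "amoxicillin",
     "Block: ordered drug conflicts with recorded allergy"),
]

msgs = apply_rules({"potassium": "5.4", "order": "amoxicillin",
                    "allergy": "penicillin"}, rules)
```

Keeping conditions declarative like this is what lets non-programmers maintain the rule set, e.g. by generating such pairs from spreadsheet rows as the abstract describes.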
Why Rules Matter in Complex Event Processing...and Vice Versa
NASA Astrophysics Data System (ADS)
Vincent, Paul
Many commercial and research CEP solutions are moving beyond simple stream query languages to more complete definitions of "process" and thence to "decisions" and "actions". As event processing capabilities increase, there is a growing realization that the humble "rule" is as relevant to the event cloud as it is to specific services. Less obvious is how much event processing has to offer the process and rule execution and management technologies. Does event processing change the way we should manage businesses, processes and services, together with their embedded (and hopefully managed) rulesets?
Can the oscillator strength of the quantum dot bandgap transition exceed unity?
NASA Astrophysics Data System (ADS)
Hens, Z.
2008-10-01
We discuss the apparent contradiction between the Thomas-Reiche-Kuhn sum rule for oscillator strengths and recent experimental data on the oscillator strength of the band gap transition of quantum dots. Starting from two simple single electron model systems, we show that the sum rule does not limit this oscillator strength to values below unity, or below the number of electrons in the highest occupied single electron state. The only upper limit the sum rule imposes on the oscillator strength of the quantum dot band gap transition is the total number of electrons in the quantum dot.
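The sum rule at issue can be stated compactly: for an N-electron system, the oscillator strengths of all transitions out of the ground state sum to N, so nothing restricts a single transition to f ≤ 1 when N > 1.

```latex
% Thomas-Reiche-Kuhn sum rule for an N-electron system:
\sum_n f_{0n} = N,
\qquad
f_{0n} = \frac{2 m_e \omega_{0n}}{\hbar}\,
         \bigl|\langle n \,|\, \hat{x} \,|\, 0 \rangle\bigr|^{2}
```

This is the sense in which, as the abstract argues, the only upper limit the sum rule imposes on the band-gap transition's oscillator strength is the total number of electrons in the quantum dot.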
Glance, Laurent G; Lustik, Stewart J; Hannan, Edward L; Osler, Turner M; Mukamel, Dana B; Qian, Feng; Dick, Andrew W
2012-04-01
To develop a 30-day mortality risk index for noncardiac surgery that can be used to communicate risk information to patients and guide clinical management at the "point-of-care," and that can be used by surgeons and hospitals to internally audit their quality of care. Clinicians rely on the Revised Cardiac Risk Index to quantify the risk of cardiac complications in patients undergoing noncardiac surgery. Because mortality from noncardiac causes accounts for many perioperative deaths, there is also a need for a simple bedside risk index to predict 30-day all-cause mortality after noncardiac surgery. Retrospective cohort study of 298,772 patients undergoing noncardiac surgery during 2005 to 2007 using the American College of Surgeons National Surgical Quality Improvement Program database. The 9-point S-MPM (Surgical Mortality Probability Model) 30-day mortality risk index was derived empirically and includes three risk factors: ASA (American Society of Anesthesiologists) physical status, emergency status, and surgery risk class. Patients with ASA physical status I, II, III, IV or V were assigned either 0, 2, 4, 5, or 6 points, respectively; intermediate- or high-risk procedures were assigned 1 or 2 points, respectively; and emergency procedures were assigned 1 point. Patients with risk scores less than 5 had a predicted risk of mortality less than 0.50%, whereas patients with a risk score of 5 to 6 had a risk of mortality between 1.5% and 4.0%. Patients with a risk score greater than 6 had risk of mortality more than 10%. S-MPM exhibited excellent discrimination (C statistic, 0.897) and acceptable calibration (Hosmer-Lemeshow statistic 13.0, P = 0.023) in the validation data set. Thirty-day mortality after noncardiac surgery can be accurately predicted using a simple and accurate risk score based on information readily available at the bedside. 
This risk index may play a useful role in facilitating shared decision making, developing and implementing risk-reduction strategies, and guiding quality improvement efforts.
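The 9-point S-MPM scoring described above is simple enough to sketch directly. One assumption is made beyond the abstract: low-risk procedures score 0 points (the abstract lists points only for intermediate- and high-risk procedures).

```python
def smpm_score(asa_class, surgery_risk, emergency):
    """S-MPM 30-day mortality score as described in the abstract.

    asa_class: ASA physical status 1-5; surgery_risk: 'low' (assumed 0
    points), 'intermediate' (1) or 'high' (2); emergency: bool (1 point).
    """
    asa_points = {1: 0, 2: 2, 3: 4, 4: 5, 5: 6}[asa_class]
    risk_points = {"low": 0, "intermediate": 1, "high": 2}[surgery_risk]
    score = asa_points + risk_points + (1 if emergency else 0)
    if score < 5:
        band = "<0.50% predicted 30-day mortality"
    elif score <= 6:
        band = "1.5%-4.0% predicted 30-day mortality"
    else:
        band = ">10% predicted 30-day mortality"
    return score, band
```

For example, an ASA III patient undergoing an elective intermediate-risk procedure scores 5 points and falls in the 1.5%-4.0% band.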
Graphical Tools for Linear Structural Equation Modeling
2014-06-01
others. Kenny and Milan (2011) write, "Identification is perhaps the most difficult concept for SEM researchers to understand. We have seen SEM..." ...model to using typical SEM software to determine model identifiability. Kenny and Milan (2011) list the following drawbacks: (i) If poor starting... the well-known recursive and null rules (Bollen, 1989) and the regression rule (Kenny and Milan, 2011). A Simple Criterion for Identifying Individual
He, ZeFang
2014-01-01
An attitude control strategy based on Ziegler-Nichols rules for tuning the PD (proportional-derivative) parameters of quadrotor helicopters is presented to address the tendency of quadrotors to become unstable, a problem caused by the narrow definition domain of their attitude angles. The proposed controller is nonlinear and consists of a linear part and a nonlinear part. The linear part is a PD controller, with parameters tuned by Ziegler-Nichols rules, acting on the quadrotor's decoupled linear system after feedback linearization; the nonlinear part is a feedback linearization term that converts the nonlinear system into a linear one. The simulation results show that the proposed attitude controller is highly robust and that its control effect is better than that of two other nonlinear controllers whose nonlinear parts are the same as the proposed controller's. The linear parts of those two controllers are, respectively, a PID (proportional-integral-derivative) controller with parameters tuned by Ziegler-Nichols rules and a PD controller with parameters tuned by GA (genetic algorithms). Moreover, the proposed attitude controller is simple and easy to implement. PMID:25614879
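For reference, the classic closed-loop Ziegler-Nichols table gives PD gains from the ultimate gain Ku and the ultimate oscillation period Tu. A minimal sketch (the standard textbook rule, not the paper's specific tuning procedure):

```python
def ziegler_nichols_pd(Ku, Tu):
    """Classic Ziegler-Nichols closed-loop tuning for a PD controller:
        Kp = 0.8 * Ku,   Td = Tu / 8,   Kd = Kp * Td.
    Ku is the ultimate (critical) gain at which the closed loop
    sustains oscillation; Tu is the period of that oscillation."""
    Kp = 0.8 * Ku
    Kd = Kp * Tu / 8.0
    return Kp, Kd

Kp, Kd = ziegler_nichols_pd(Ku=2.0, Tu=0.5)  # Kp = 1.6, Kd = 0.1
```

In the paper's scheme these gains would act on the feedback-linearized (decoupled, linear) attitude dynamics rather than on the raw nonlinear plant.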
Navigating a Mobile Robot Across Terrain Using Fuzzy Logic
NASA Technical Reports Server (NTRS)
Seraji, Homayoun; Howard, Ayanna; Bon, Bruce
2003-01-01
A strategy for autonomous navigation of a robotic vehicle across hazardous terrain involves the use of a measure of traversability of terrain within a fuzzy-logic conceptual framework. This navigation strategy requires no a priori information about the environment. Fuzzy logic was selected as a basic element of this strategy because it provides a formal methodology for representing and implementing a human driver's heuristic knowledge and operational experience. Within a fuzzy-logic framework, the attributes of human reasoning and decision-making can be formulated by simple IF (antecedent), THEN (consequent) rules coupled with easily understandable and natural linguistic representations. The linguistic values in the rule antecedents convey the imprecision associated with measurements taken by sensors onboard a mobile robot, while the linguistic values in the rule consequents represent the vagueness inherent in the reasoning processes to generate the control actions. The operational strategies of the human expert driver can be transferred, via fuzzy logic, to a robot-navigation strategy in the form of a set of simple conditional statements composed of linguistic variables. These linguistic variables are defined by fuzzy sets in accordance with user-defined membership functions. The main advantages of a fuzzy navigation strategy lie in the ability to extract heuristic rules from human experience and to obviate the need for an analytical model of the robot navigation process.
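The IF-THEN pattern above can be made concrete with a tiny two-rule example. The membership functions, rule base and crisp outputs below are illustrative assumptions, not the rule base of the cited navigation strategy:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def traversability_speed(roughness):
    """Two illustrative rules, defuzzified by a weighted average:
         IF terrain is SMOOTH THEN speed is FAST
         IF terrain is ROUGH  THEN speed is SLOW
    roughness is assumed normalized to [0, 1]."""
    mu_smooth = tri(roughness, -1.0, 0.0, 1.0)
    mu_rough = tri(roughness, 0.0, 1.0, 2.0)
    fast, slow = 1.0, 0.2  # representative crisp speed outputs
    total = mu_smooth + mu_rough
    return (mu_smooth * fast + mu_rough * slow) / total
```

Intermediate roughness values blend the two consequents smoothly, which is the practical appeal of fuzzy rules over hard thresholds for terrain assessment.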
Sainz de Murieta, Iñaki; Rodríguez-Patón, Alfonso
2012-08-01
Despite the many designs of devices operating via DNA strand displacement, surprisingly none is explicitly devoted to the implementation of logical deductions. The present article introduces a new model of biosensor device that uses nucleic acid strands to encode simple rules such as "IF DNA_strand(1) is present THEN disease(A)" or "IF DNA_strand(1) AND DNA_strand(2) are present THEN disease(B)". Taking advantage of the strand displacement operation, our model makes these simple rules interact with input signals (either DNA or any type of RNA) to generate an output signal (in the form of nucleotide strands). This output signal represents a diagnosis, which can be measured using FRET techniques, cascaded as the input of another logical deduction with different rules, or even be a drug that is administered in response to a set of symptoms. The encoding introduces an implicit error cancellation mechanism, which increases the system's scalability, enabling longer inference cascades with a bounded and controllable signal-noise relation. It also allows the same rule to be used in forward inference or backward inference, providing the option of validly outputting negated propositions (e.g. "diagnosis A excluded"). The models presented in this paper can be used to implement smart logical DNA devices that perform genetic diagnosis in vitro. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
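The logical semantics of the quoted rules can be sketched in software (the paper implements these deductions chemically, via strand displacement; the rule set below only mirrors the two example rules from the abstract):

```python
# Hypothetical software mirror of the abstract's two example rules:
#   IF strand_1 is present                THEN disease_A
#   IF strand_1 AND strand_2 are present  THEN disease_B
rules = {
    "disease_A": lambda present: "strand_1" in present,
    "disease_B": lambda present: {"strand_1", "strand_2"} <= present,
}

def diagnose(present_strands):
    """Forward inference: return every diagnosis whose rule fires."""
    present = set(present_strands)
    return {disease for disease, rule in rules.items() if rule(present)}
```

In the chemical realization, each fired rule would instead release an output strand that can be read out by FRET or fed to the next rule in a cascade.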
Nonlocal rheological properties of granular flows near a jamming limit.
Aranson, Igor S; Tsimring, Lev S; Malloggi, Florent; Clément, Eric
2008-09-01
We study the rheology of sheared granular flows close to a jamming transition. We use the approach of partially fluidized theory (PFT) with a full set of equations extending the thin layer approximation derived previously for the description of the granular avalanches phenomenology. This theory provides a picture compatible with a local rheology at large shear rates [G. D. R. Midi, Eur. Phys. J. E 14, 341 (2004)] and it works in the vicinity of the jamming transition, where a description in terms of a simple local rheology comes short. We investigate two situations displaying important deviations from local rheology. The first one is based on a set of numerical simulations of sheared soft two-dimensional circular grains. The next case describes previous experimental results obtained on avalanches of sandy material flowing down an incline. Both cases display, close to jamming, significant deviations from the now standard Pouliquen's flow rule [O. Pouliquen, Phys. Fluids 11, 542 (1999); 11, 1956 (1999)]. This discrepancy is the hallmark of a strongly nonlocal rheology and in both cases, we relate the empirical results and the outcomes of PFT. The numerical simulations show a characteristic constitutive structure for the fluid part of the stress involving the confining pressure and the material stiffness that appear in the form of an additional dimensionless parameter. This constitutive relation is then used to describe the case of sandy flows. We show a quantitative agreement as far as the effective flow rules are concerned. A fundamental feature is identified in PFT as the existence of a jammed layer developing in the vicinity of the flow arrest that corroborates the experimental findings. Finally, we study the case of solitary erosive granular avalanches and relate the outcome with the PFT analysis.
A corpus for plant-chemical relationships in the biomedical domain.
Choi, Wonjun; Kim, Baeksoo; Cho, Hyejin; Lee, Doheon; Lee, Hyunju
2016-09-20
Plants are natural products that humans consume in various ways including food and medicine. They have a long empirical history of treating diseases with relatively few side effects. Based on these strengths, many studies have been performed to verify the effectiveness of plants in treating diseases. It is crucial to understand the chemicals contained in plants because these chemicals can regulate activities of proteins that are key factors in causing diseases. With the accumulation of a large volume of biomedical literature in various databases such as PubMed, it is possible to automatically extract relationships between plants and chemicals in a large-scale way if we apply a text mining approach. A cornerstone of achieving this task is a corpus of relationships between plants and chemicals. In this study, we first constructed a corpus for plant and chemical entities and for the relationships between them. The corpus contains 267 plant entities, 475 chemical entities, and 1,007 plant-chemical relationships (550 and 457 positive and negative relationships, respectively), which are drawn from 377 sentences in 245 PubMed abstracts. Inter-annotator agreement scores for the corpus among three annotators were measured. The simple percent agreement scores for entities and trigger words for the relationships were 99.6 and 94.8 %, respectively, and the overall kappa score for the classification of positive and negative relationships was 79.8 %. We also developed a rule-based model to automatically extract such plant-chemical relationships. When we evaluated the rule-based model using the corpus and randomly selected biomedical articles, overall F-scores of 68.0 and 61.8 % were achieved, respectively. We expect that the corpus for plant-chemical relationships will be a useful resource for enhancing plant research. The corpus is available at http://combio.gist.ac.kr/plantchemicalcorpus .
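The agreement measures reported above (percent agreement, kappa) are standard. A minimal sketch of Cohen's kappa for two annotators is shown below; note the corpus used three annotators, for which pairwise kappas or a multi-rater statistic such as Fleiss' kappa would be computed instead.

```python
def cohen_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators: (p_o - p_e) / (1 - p_e), where
    p_o is the observed agreement rate and p_e the agreement expected by
    chance from each annotator's label frequencies."""
    n = len(labels_a)
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    categories = set(labels_a) | set(labels_b)
    p_e = sum((labels_a.count(c) / n) * (labels_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)
```

A kappa of 0.798, as reported for the positive/negative relationship labels, indicates substantial agreement well above chance.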
Non-local rheological properties of granular flows near a jamming limit.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aranson, I. S.; Tsimring, L. S.; Malloggi, F.
2008-01-01
We study the rheology of sheared granular flows close to a jamming transition. We use the approach of partially fluidized theory (PFT) with a full set of equations extending the thin layer approximation derived previously for the description of the granular avalanches phenomenology. This theory provides a picture compatible with a local rheology at large shear rates [G. D. R. Midi, Eur. Phys. J. E 14, 341 (2004)] and it works in the vicinity of the jamming transition, where a description in terms of a simple local rheology comes short. We investigate two situations displaying important deviations from local rheology. The first one is based on a set of numerical simulations of sheared soft two-dimensional circular grains. The next case describes previous experimental results obtained on avalanches of sandy material flowing down an incline. Both cases display, close to jamming, significant deviations from the now standard Pouliquen's flow rule [O. Pouliquen, Phys. Fluids 11, 542 (1999); 11, 1956 (1999)]. This discrepancy is the hallmark of a strongly nonlocal rheology and in both cases, we relate the empirical results and the outcomes of PFT. The numerical simulations show a characteristic constitutive structure for the fluid part of the stress involving the confining pressure and the material stiffness that appear in the form of an additional dimensionless parameter. This constitutive relation is then used to describe the case of sandy flows. We show a quantitative agreement as far as the effective flow rules are concerned. A fundamental feature is identified in PFT as the existence of a jammed layer developing in the vicinity of the flow arrest that corroborates the experimental findings. Finally, we study the case of solitary erosive granular avalanches and relate the outcome with the PFT analysis.
An Empirical State Error Covariance Matrix for the Weighted Least Squares Estimation Method
NASA Technical Reports Server (NTRS)
Frisbee, Joseph H., Jr.
2011-01-01
State estimation techniques effectively provide mean state estimates. However, the theoretical state error covariance matrices provided as part of these techniques often suffer from a lack of confidence in their ability to describe the uncertainty in the estimated states. By a reinterpretation of the equations involved in the weighted least squares algorithm, it is possible to directly arrive at an empirical state error covariance matrix. This proposed empirical state error covariance matrix will contain the effect of all error sources, known or not. Results based on the proposed technique will be presented for a simple two-observer, measurement-error-only problem.
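The contrast can be illustrated with a minimal weighted least squares sketch. The textbook covariance (HᵀWH)⁻¹ reflects only the assumed noise model; a common residual-driven alternative scales it by the a posteriori variance factor so that actual measurement errors feed into the reported uncertainty. This is a generic construction for illustration, not the specific reinterpretation proposed in the paper; the measurement model and all values below are made up.

```python
import numpy as np

# Hypothetical linear measurement model y = H x + noise (illustrative only).
rng = np.random.default_rng(0)
x_true = np.array([2.0, -1.0])
H = rng.normal(size=(50, 2))
sigma = 0.1
y = H @ x_true + rng.normal(scale=sigma, size=50)

W = np.eye(50) / sigma**2                 # weights = inverse noise covariance
P_formal = np.linalg.inv(H.T @ W @ H)     # textbook WLS covariance
x_hat = P_formal @ (H.T @ W @ y)          # WLS state estimate

# Residual-driven adjustment: scale the formal covariance by the
# a posteriori variance factor, so real errors inform the uncertainty.
r = y - H @ x_hat                         # post-fit residuals
m, n = H.shape
s2 = (r @ W @ r) / (m - n)                # a posteriori variance factor
P_empirical = s2 * P_formal
```

When the assumed noise model matches reality, s2 is near 1 and the two matrices agree; mismodeled errors inflate (or deflate) the empirical version.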
The effects of cumulative practice on mathematics problem solving.
Mayfield, Kristin H; Chase, Philip N
2002-01-01
This study compared three different methods of teaching five basic algebra rules to college students. All methods used the same procedures to teach the rules and included four 50-question review sessions interspersed among the training of the individual rules. The differences among methods involved the kinds of practice provided during the four review sessions. Participants who received cumulative practice answered 50 questions covering a mix of the rules learned prior to each review session. Participants who received a simple review answered 50 questions on one previously trained rule. Participants who received extra practice answered 50 extra questions on the rule they had just learned. Tests administered after each review included new questions for applying each rule (application items) and problems that required novel combinations of the rules (problem-solving items). On the final test, the cumulative group outscored the other groups on application and problem-solving items. In addition, the cumulative group solved the problem-solving items significantly faster than the other groups. These results suggest that cumulative practice of component skills is an effective method of training problem solving.
Even conservation rules are made to be broken: implications for biodiversity.
Robbins, Paul; McSweeney, Kendra; Waite, Thomas; Rice, Jennifer
2006-02-01
Despite efforts to enclose and control conservation zones around the world, direct human impacts in conservation areas continue, often resulting from clandestine violations of conservation rules through outright poaching, strategic agricultural encroachment, or noncompliance. Nevertheless, next to nothing is actually known about the spatially and temporally explicit patterns of anthropogenic disturbance resulting from such noncompliance. This article reviews current understandings of ecological disturbance and conservation noncompliance, concluding that differing forms of noncompliance hold differing implications for diversity. The authors suggest that forms of anthropogenic patchy disturbance resulting from violation may maintain, if not enhance, floral diversity. They therefore argue for extended empirical investigation of such activities and call for conservation biologists to work with social scientists to assess this conservation reality by analyzing how and when incomplete enforcement and rule-breaking drive ecological change.
Privacy Preserving Association Rule Mining Revisited: Privacy Enhancement and Resources Efficiency
NASA Astrophysics Data System (ADS)
Mohaisen, Abedelaziz; Jho, Nam-Su; Hong, Dowon; Nyang, Daehun
Privacy preserving association rule mining algorithms have been designed for discovering the relations between variables in data while maintaining the data privacy. In this article we revisit one of the recently introduced schemes for association rule mining using fake transactions (FS). In particular, our analysis shows that the FS scheme has excessive storage and high computation requirements for guaranteeing a reasonable level of privacy. We introduce a realistic definition of privacy that benefits from the average-case privacy and motivates the study of a weakness in the structure of FS by fake-transaction filtering. In order to overcome this problem, we improve the FS scheme by presenting a hybrid scheme that considers both privacy and resources as two concurrent guidelines. Analytical and empirical results show the efficiency and applicability of our proposed scheme.
NASA Technical Reports Server (NTRS)
Christodoulou, Dimitris M.; Kazanas, Demosthenes
2017-01-01
We consider the geometric Titius-Bode rule for the semimajor axes of planetary orbits. We derive an equivalent rule for the midpoints of the segments between consecutive orbits along the radial direction and we interpret it physically in terms of the work done in the gravitational field of the Sun by particles whose orbits are perturbed around each planetary orbit. On such energetic grounds, it is not surprising that some exoplanets in multiple-planet extrasolar systems obey the same relation. However, it is surprising that this simple interpretation of the Titius-Bode rule also reveals new properties of the bound closed orbits predicted by Bertrand's theorem, which has been known since 1873.
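The geometric form of the rule discussed above can be sketched in a few lines. The parameters a0 and r below are illustrative placeholders, not values from the paper; the point is that the midpoints of the radial segments between consecutive orbits obey the same geometric progression as the orbits themselves.

```python
# Geometric Titius-Bode form: a_n = a0 * r**n for the n-th semimajor axis.
# a0 and r are illustrative placeholders, not fitted values from the paper.
a0, r = 0.4, 1.7

def semimajor(n):
    return a0 * r**n

def midpoint(n):
    # Midpoint of the radial segment between orbits n and n + 1,
    # the quantity for which the paper derives an equivalent rule.
    return 0.5 * (semimajor(n) + semimajor(n + 1))

# The midpoints follow the same geometric progression with ratio r.
ratios = [midpoint(n + 1) / midpoint(n) for n in range(5)]
```

Since midpoint(n) = a0(1 + r)/2 · rⁿ, the ratio of consecutive midpoints is exactly r, which is the "equivalent rule" in miniature.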
Vehicular headways on signalized intersections: theory, models, and reality
NASA Astrophysics Data System (ADS)
Krbálek, Milan; Šleis, Jiří
2015-01-01
We discuss statistical properties of vehicular headways measured on signalized crossroads. On the basis of mathematical approaches, we formulate theoretical and empirically inspired criteria for the acceptability of theoretical headway distributions. Sequentially, the multifarious families of statistical distributions (commonly used to fit real-road headway statistics) are confronted with these criteria, and with original empirical time clearances gauged among neighboring vehicles leaving signal-controlled crossroads after a green signal appears. Using three different numerical schemes, we demonstrate that an arrangement of vehicles on an intersection is a consequence of the general stochastic nature of queueing systems, rather than a consequence of traffic rules, driver estimation processes, or decision-making procedures.
Pillars of Power: Silver and Steel of the Ottoman Empire.
NASA Astrophysics Data System (ADS)
Nerantzis, N.
The Ottoman Empire was forged over disintegrating Byzantium, stretched across Anatolia and the Balkans, and ruled for almost five centuries. One crucial parameter that allowed for its quick expansion was a combination of economic wealth and superiority of armed forces. The Ottomans succeeded in both sectors by promoting innovative technology in the fields of silver and steel production, supplying their monetary system and weapons industry. Rich mines and smelting workshops provided increased output of metals, allowing for quick expansion and economic growth. Some of the major centres of silver and steel production are discussed in this paper, in conjunction with analytical data from smelting residues.
Sum rules for quasifree scattering of hadrons
NASA Astrophysics Data System (ADS)
Peterson, R. J.
2018-02-01
The areas dσ/dΩ of fitted quasifree scattering peaks from bound nucleons for continuum hadron-nucleus spectra measuring d²σ/dΩdω are converted to sum rules akin to the Coulomb sums familiar from continuum electron scattering spectra from nuclear charge. Hadronic spectra with or without charge exchange of the beam are considered. These sums are compared to the simple expectations of a nonrelativistic Fermi gas, including a Pauli blocking factor. For scattering without charge exchange, the hadronic sums are below this expectation, as also observed with Coulomb sums. For charge exchange spectra, the sums are near or above the simple expectation, with larger uncertainties. The strong role of hadron-nucleon in-medium total cross sections is noted from use of the Glauber model.
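For reference, the textbook nonrelativistic Fermi-gas Pauli blocking factor is a plausible stand-in for the "simple expectation" mentioned above (the paper's exact convention may differ): the free sum is suppressed below momentum transfer q = 2kF.

```python
def pauli_blocking(q, k_fermi):
    """Nonrelativistic Fermi-gas suppression factor at momentum
    transfer q for Fermi momentum k_fermi (same units). Textbook
    form; the paper's exact convention may differ."""
    x = q / k_fermi
    if x >= 2.0:
        return 1.0                       # no blocking above q = 2 kF
    return 0.75 * x * (1.0 - x * x / 12.0)
```

The factor rises continuously from 0 at q = 0 to 1 at q = 2kF, beyond which the full single-nucleon strength is recovered.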
NASA Technical Reports Server (NTRS)
Della-Corte, Christopher
2012-01-01
Foil gas bearings are a key technology in many commercial and emerging oil-free turbomachinery systems. These bearings are nonlinear and have been difficult to model analytically in terms of performance characteristics such as load capacity, power loss, stiffness, and damping. Previous investigations led to an empirically derived method to estimate load capacity. This method has been a valuable tool in system development. The current work extends this tool concept to include rules for stiffness and damping coefficient estimation. It is expected that these rules will further accelerate the development and deployment of advanced oil-free machines operating on foil gas bearings.
Optical activities of steroid ketones - Elucidation of the octant rule
NASA Astrophysics Data System (ADS)
Hatanaka, Masashi; Sayama, Daisuke; Miyasaka, Makoto
2018-07-01
Theoretical calculations of optical activities in steroid ketones are presented using modern semi-empirical PM7 wavefunctions. Both circular dichroism (CD) and specific rotation, which is proportional to optical rotatory dispersion (ORD), are well simulated, and the signs of the Cotton effect in the longest-wavelength region are fully in accordance with the experimental results. This good accordance is related to the octant rule, which is deduced within the framework of perturbation theory. Our treatment is promising for predicting the signs of the Cotton effect of large molecules, so that absolute configurations can be grasped without demanding procedures.
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches for the analysis and specification of business vocabularies and rules are very relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common practice of Information Systems Development, business modeling activities are still mostly empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business-modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
NASA Technical Reports Server (NTRS)
Lautenschlager, L.; Perry, C. R., Jr. (Principal Investigator)
1981-01-01
The development of formulae for the reduction of multispectral scanner measurements to a single value (vegetation index) for predicting and assessing vegetative characteristics is addressed. The origin, motivation, and derivation of some four dozen vegetation indices are summarized. Empirical, graphical, and analytical techniques are used to investigate the relationships among the various indices. It is concluded that many vegetative indices are very similar, some being simple algebraic transforms of others.
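The observation that many indices are simple algebraic transforms of others is easy to make concrete. NDVI and the simple ratio (RVI) are two classic indices (standard definitions, not necessarily the exact formulae surveyed in the report), and each is a monotone transform of the other.

```python
def rvi(nir, red):
    """Simple ratio vegetation index."""
    return nir / red

def ndvi(nir, red):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red)

def ndvi_from_rvi(ratio):
    """NDVI as an algebraic transform of RVI: (RVI - 1) / (RVI + 1)."""
    return (ratio - 1.0) / (ratio + 1.0)
```

For any reflectance pair, ndvi(nir, red) equals ndvi_from_rvi(rvi(nir, red)), so the two indices carry identical information about the scene.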
Read Code quality assurance: from simple syntax to semantic stability.
Schulz, E B; Barrett, J W; Price, C
1998-01-01
As controlled clinical vocabularies assume an increasing role in modern clinical information systems, so the issue of their quality demands greater attention. In order to meet the resulting stringent criteria for completeness and correctness, a quality assurance system comprising a database of more than 500 rules is being developed and applied to the Read Thesaurus. The authors discuss the requirement to apply quality assurance processes to their dynamic editing database in order to ensure the quality of exported products. Sources of errors include human, hardware, and software factors as well as new rules and transactions. The overall quality strategy includes prevention, detection, and correction of errors. The quality assurance process encompasses simple data specification, internal consistency, inspection procedures and, eventually, field testing. The quality assurance system is driven by a small number of tables and UNIX scripts, with "business rules" declared explicitly as Structured Query Language (SQL) statements. Concurrent authorship, client-server technology, and an initial failure to implement robust transaction control have all provided valuable lessons. The feedback loop for error management needs to be short.
Hou, Chen; Amunugama, Kaushalya
2015-07-01
The relationship between energy expenditure and longevity has been a central theme in aging studies. Empirical studies have yielded controversial results, which cannot be reconciled by existing theories. In this paper, we present a simple theoretical model based on first principles of energy conservation and allometric scaling laws. The model takes into consideration the energy tradeoffs between life history traits and the efficiency of energy utilization, and offers quantitative and qualitative explanations for a set of seemingly contradictory empirical results. We show that oxidative metabolism can affect cellular damage and longevity in different ways in animals with different life histories and under different experimental conditions. Qualitative data and the linearity between energy expenditure, cellular damage, and lifespan assumed in previous studies are not sufficient to understand the complexity of the relationships. Our model provides a theoretical framework for quantitative analyses and predictions. The model is supported by a variety of empirical studies, including studies on the cellular damage profile during ontogeny; the intra- and inter-specific correlations between body mass, metabolic rate, and lifespan; and the effects on lifespan of (1) diet restriction and genetic modification of growth hormone, (2) cold and exercise stresses, and (3) manipulations of antioxidants. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Occupational exposure decisions: can limited data interpretation training help improve accuracy?
Logan, Perry; Ramachandran, Gurumurthy; Mulhausen, John; Hewett, Paul
2009-06-01
Accurate exposure assessments are critical for ensuring that potentially hazardous exposures are properly identified and controlled. The availability and accuracy of exposure assessments can determine whether resources are appropriately allocated to engineering and administrative controls, medical surveillance, personal protective equipment and other programs designed to protect workers. A desktop study was performed using videos, task information and sampling data to evaluate the accuracy and potential bias of participants' exposure judgments. Desktop exposure judgments were obtained from occupational hygienists for material handling jobs with small air sampling data sets (0-8 samples) and without the aid of computers. In addition, data interpretation tests (DITs) were administered to participants where they were asked to estimate the 95th percentile of an underlying log-normal exposure distribution from small data sets. Participants were presented with an exposure data interpretation or rule of thumb training which included a simple set of rules for estimating 95th percentiles for small data sets from a log-normal population. DIT was given to each participant before and after the rule of thumb training. Results of each DIT and qualitative and quantitative exposure judgments were compared with a reference judgment obtained through a Bayesian probabilistic analysis of the sampling data to investigate overall judgment accuracy and bias. There were a total of 4386 participant-task-chemical judgments for all data collections: 552 qualitative judgments made without sampling data and 3834 quantitative judgments with sampling data. The DITs and quantitative judgments were significantly better than random chance and much improved by the rule of thumb training. In addition, the rule of thumb training reduced the amount of bias in the DITs and quantitative judgments. The mean DIT % correct scores increased from 47 to 64% after the rule of thumb training (P < 0.001). 
The accuracy for quantitative desktop judgments increased from 43 to 63% correct after the rule of thumb training (P < 0.001). The rule of thumb training did not significantly impact accuracy for qualitative desktop judgments. The finding that even some simple statistical rules of thumb improve judgment accuracy significantly suggests that hygienists need to routinely use statistical tools while making exposure judgments using monitoring data.
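The abstract does not reproduce the rule of thumb itself, but a standard parametric shortcut for the 95th percentile of a small sample assumed lognormal is exp(mean(ln x) + 1.645·sd(ln x)). The sketch below illustrates that kind of rule, not necessarily the one taught in the study.

```python
import math
import statistics

def lognormal_p95(samples):
    """Parametric 95th-percentile estimate for a small exposure sample
    assumed lognormal: exp(mean(log x) + 1.645 * sd(log x)).
    Illustrative; not necessarily the exact rule used in the training."""
    logs = [math.log(x) for x in samples]
    mu = statistics.fmean(logs)
    sd = statistics.stdev(logs)          # sample standard deviation
    return math.exp(mu + 1.645 * sd)
```

In practice an estimated 95th percentile above the occupational exposure limit would flag the job for further sampling or controls.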
Model of Pressure Distribution in Vortex Flow Controls
NASA Astrophysics Data System (ADS)
Mielczarek, Szymon; Sawicki, Jerzy M.
2015-06-01
Vortex valves belong to the category of hydrodynamic flow controls. They are important and theoretically interesting devices, so complex from a hydraulic point of view that, probably for this reason, no rational concept of their operation has been proposed so far. In consequence, the functioning of vortex valves is described by CFD methods (computer-aided simulation of technical objects) or by means of simple empirical relations (using a discharge coefficient or a hydraulic loss coefficient). Such a rational model of the considered device is proposed in this paper. It has a simple algebraic form but is well grounded physically. The basic quantitative relationship describing the valve's operation, i.e. the dependence between the flow discharge and the circumferential pressure head caused by the rotation, has been verified empirically. The agreement between calculated and measured parameters of the device supports acceptance of the proposed concept.
Black swans or dragon-kings? A simple test for deviations from the power law
NASA Astrophysics Data System (ADS)
Janczura, J.; Weron, R.
2012-05-01
We develop a simple test for deviations from power-law tails (in fact, from the tails of any distribution). We use this test, which is based on the asymptotic properties of the empirical distribution function, to answer the question of whether great natural disasters, financial crashes, or electricity price spikes should be classified as dragon-kings or 'only' as black swans.
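A generic sketch of an EDF-based tail check (not the authors' exact statistic): fit a Pareto tail above a threshold by maximum likelihood (the Hill estimator) and measure the Kolmogorov-Smirnov distance between the fitted tail and the empirical distribution function of the exceedances; a large distance flags a deviation from the power law.

```python
import math

def ks_pareto_tail(data, threshold):
    """Fit a Pareto tail above `threshold` by maximum likelihood (Hill
    estimator) and return the KS distance between the fitted tail CDF
    and the empirical distribution function of the exceedances.
    A generic sketch, not the paper's exact test statistic."""
    tail = sorted(x for x in data if x > threshold)
    n = len(tail)
    alpha = n / sum(math.log(x / threshold) for x in tail)
    d = 0.0
    for i, x in enumerate(tail, start=1):
        fitted = 1.0 - (threshold / x) ** alpha      # Pareto tail CDF
        d = max(d, abs(i / n - fitted), abs((i - 1) / n - fitted))
    return d, alpha
```

For data actually drawn from a Pareto law the distance stays small; dragon-king-like outliers sitting far above the fitted tail inflate it.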
ERIC Educational Resources Information Center
Cadime, Irene; Rodrigues, Bruna; Santos, Sandra; Viana, Fernanda Leopoldina; Chaves-Sousa, Séli; do Céu Cosme, Maria; Ribeiro, Iolanda
2017-01-01
Empirical research has provided evidence for the simple view of reading across a variety of orthographies, but the role of oral reading fluency in the model is unclear. Moreover, the relative weight of listening comprehension, oral reading fluency and word recognition in reading comprehension seems to vary across orthographies and schooling years.…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-23
... on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR 240.19b-4. I....'' Specifically, the Exchange proposes to amend the Customer Rebate Program, Select Symbols,\\5\\ Simple and Complex... Category D to the Customer Rebate Program relating to Customer Simple Orders in Select Symbols. The...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-28
... Account. These funds are held ``in trust'' for the obligor and currently earn simple interest at the rate..., the Government has paid simple interest at the rate of 3 percent per year on cash deposited by bond... notices is to give interested persons an opportunity to participate in the rule making prior to...
On the fusion of tuning parameters of fuzzy rules and neural network
NASA Astrophysics Data System (ADS)
Mamuda, Mamman; Sathasivam, Saratha
2017-08-01
Learning a fuzzy rule-based system with a neural network can lead to a precise, valuable understanding of several problems. Fuzzy logic offers a simple way to reach a definite conclusion based upon vague, ambiguous, imprecise, noisy, or missing input information. Conventional learning algorithms for tuning the parameters of fuzzy rules using training input-output data usually end in a weak firing state; this weakens the fuzzy rule and makes it unreliable for a multiple-input fuzzy system. In this paper, we introduce a new learning algorithm for tuning the parameters of the fuzzy rules alongside a radial basis function neural network (RBFNN) on training input-output data, based on the gradient descent method. The new learning algorithm addresses the problem of weak firing that arises with the conventional method. We illustrate the efficiency of the new learning algorithm by means of numerical examples; MATLAB R2014(a) was used to simulate our results. The results show that the new learning method has the advantage of training the fuzzy rules without tampering with the fuzzy rule table, which allows a membership function of a rule to be used more than once in the fuzzy rule base.
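The "weak firing" problem has a simple numerical face: with a product t-norm, a rule's firing strength is the product of its membership grades, so with many inputs even a modest per-input mismatch drives the strength toward zero. A minimal sketch with Gaussian membership functions (all values made up):

```python
import math

def gauss_mf(x, center, width):
    """Gaussian membership function."""
    return math.exp(-((x - center) ** 2) / (2 * width ** 2))

def firing_strength(inputs, centers, widths):
    """Product t-norm firing strength of one fuzzy rule."""
    strength = 1.0
    for x, c, w in zip(inputs, centers, widths):
        strength *= gauss_mf(x, c, w)
    return strength
```

With eight inputs each one width off-center, every factor is exp(-1/2) ≈ 0.61, yet the product collapses to exp(-4) ≈ 0.018: the weak firing state that conventional tuning can get stuck in.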
Robots In War: Issues Of Risk And Ethics
2009-01-01
unexpected, untested ways. (And even straightforward, simple rules such as Asimov's Laws of Robotics (Asimov, 1950) can create unexpected dilemmas...stories (e.g., Asimov, 1950). Likewise, we may understand each rule of engagement and believe them to be sensible, but are they truly consistent...Netherlands: IOS Press. Asimov, I. (1950). I, Robot (2004 edition), New York, NY: Bantam Dell. BBC (2005). SLA Confirm Spy Plane Crash. BBC.com. Retrieved
Extension of the firefly algorithm and preference rules for solving MINLP problems
NASA Astrophysics Data System (ADS)
Costa, M. Fernanda P.; Francisco, Rogério B.; Rocha, Ana Maria A. C.; Fernandes, Edite M. G. P.
2017-07-01
An extension of the firefly algorithm (FA) for solving mixed-integer nonlinear programming (MINLP) problems is presented. Although penalty functions are nowadays frequently used to handle integrality conditions and inequality and equality constraints, this paper proposes implementing within the FA a simple rounding-based heuristic and four preference rules to find and converge to MINLP feasible solutions. Preliminary numerical experiments are carried out to validate the proposed methodology.
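The abstract does not spell the heuristic out; a generic rounding repair of the kind often paired with continuous metaheuristics looks like the sketch below. The function names and the accept-or-fall-back policy are my assumptions, not the paper's method.

```python
def round_integers(x, integer_idx):
    """Round the integer-constrained components of a continuous
    candidate solution (hypothetical form of a rounding heuristic)."""
    y = list(x)
    for i in integer_idx:
        y[i] = float(round(y[i]))
    return y

def repair(x, integer_idx, is_feasible):
    """Keep the rounded point only if it satisfies the constraints;
    otherwise fall back to the original continuous candidate."""
    y = round_integers(x, integer_idx)
    return y if is_feasible(y) else x
```

Inside a firefly loop, such a repair would be applied to each candidate before fitness evaluation, so the swarm moves through integer-feasible points.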
A Simple Model of Circuit Design.
1980-05-01
mathematicians who discover mathematical ideas (i.cnat>, programmers who write code <Manna> <Barstow>, physicists who solve mechanics problems <de Kiecr-l...rules and shows how - they result in the design of circuits. ’l’he design rules must not only capture the purely mathematical constralints given by VICs...K VI.. *? and KCI, but also how those constraints can implement mechanism. Mathematical constraints tell us an amplifier’s input and output voltages
Mallik, Saurav; Zhao, Zhongming
2017-12-28
For transcriptomic analysis, there are numerous microarray-based genomic data, especially those generated for cancer research. The typical analysis measures the difference between a cancer sample group and a matched control group for each transcript or gene. Association rule mining is used to discover interesting item sets through a rule-based methodology. Thus, it has advantages for finding causal relationships between the transcripts. In this work, we introduce two new rule-based similarity measures, the weighted rank-based Jaccard and Cosine measures, and then propose a novel computational framework to detect condensed gene co-expression modules (ConGEMs) through the association rule-based learning system and the weighted similarity scores. In practice, the list of evolved condensed markers, which consists of both singular and complex markers in nature, depends on the corresponding condensed gene sets in either the antecedent or the consequent of the rules of the resultant modules. In our evaluation, these markers could be supported by literature evidence, KEGG (Kyoto Encyclopedia of Genes and Genomes) pathways and Gene Ontology annotations. Specifically, we preliminarily identified differentially expressed genes using an empirical Bayes test. A recently developed algorithm, RANWAR, was then utilized to determine the association rules from these genes. Based on that, we computed the integrated similarity scores of these rule-based similarity measures between each rule pair, and the resultant scores were used for clustering to identify the co-expressed rule modules. We applied our method to a gene expression dataset for lung squamous cell carcinoma and a genome methylation dataset for uterine cervical carcinogenesis. Our proposed module discovery method produced better results than the traditional gene-module discovery measures. In summary, our proposed rule-based method is useful for exploring biomarker modules from transcriptomic data.
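The abstract does not define the weighted rank-based Jaccard measure; one plausible form (my assumption, not the paper's definition) assigns each gene a weight that decays with its rank and takes the ratio of summed minimum to summed maximum weights over the union of two ranked gene lists.

```python
def rank_weights(ranked_genes):
    """Map each gene to a weight decaying with rank (rank 1 strongest).
    One plausible choice, not the paper's definition."""
    n = len(ranked_genes)
    return {g: (n - i) / n for i, g in enumerate(ranked_genes)}

def weighted_jaccard(ranked_a, ranked_b):
    """Weighted Jaccard: sum of min weights over sum of max weights,
    taken over the union of both gene lists."""
    wa, wb = rank_weights(ranked_a), rank_weights(ranked_b)
    genes = set(wa) | set(wb)
    num = sum(min(wa.get(g, 0.0), wb.get(g, 0.0)) for g in genes)
    den = sum(max(wa.get(g, 0.0), wb.get(g, 0.0)) for g in genes)
    return num / den
```

Identical lists score 1, disjoint lists score 0, and shared genes at discordant ranks fall in between, which is the behavior needed for clustering rule pairs into modules.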
Newton's Path to Universal Gravitation: The Role of the Pendulum
ERIC Educational Resources Information Center
Boulos, Pierre J.
2006-01-01
Much attention has been given to Newton's argument for Universal Gravitation in Book III of the "Principia". Newton brings an impressive array of phenomena, along with the three laws of motion, and his rules for reasoning to deduce Universal Gravitation. At the centre of this argument is the famous "moon test". Here it is the empirical evidence…
Early Enrollees and Peer Age Effect: First Evidence from INVALSI Data
ERIC Educational Resources Information Center
Ordine, Patrizia; Rose, Giuseppe; Sposato, Daniela
2015-01-01
This paper estimates peer age effect on educational outcomes of Italian pupils attending primary school by exploiting changes in enrollment rules over the last few years. The empirical procedure allows to understand if there is selection in classroom formation, arguing that in the absence of pupils sorting by early age at school entry, it is…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-27
..., Financings which Constitute Conflicts of Interest of the Small Business Administration (``SBA'') Rules and Regulations (13 CFR 107.730). Plexus II, L.P., proposes to provide debt security financing to Project Empire... the Small Business Investment Act of 1958, as amended (``the Act''), in connection with the financing...
ERIC Educational Resources Information Center
Gross, Betheny; DeArmond, Michael; Goldhaber, Dan
2010-01-01
Education reformers routinely call on school districts to stop hiring teachers based on seniority, which they argue interferes with effective staffing, especially in disadvantaged schools. The few researchers who have empirically studied the issue, however, disagree about whether seniority-based hiring is systematically associated with staffing…
Corrective Feedback in SLA: Theoretical Relevance and Empirical Research
ERIC Educational Resources Information Center
Chen, Jin; Lin, Jianghao; Jiang, Lin
2016-01-01
Corrective feedback (CF) refers to the responses or treatments from teachers to a learner's nontargetlike second language (L2) production. CF has been a crucial and controversial topic in the discipline of second language acquisition (SLA). Some SLA theorists believe that CF is harmful to L2 acquisition and should be ruled out completely while…
"A City of Brick": Visual Rhetoric in the Roman Principate
ERIC Educational Resources Information Center
Lamp, Kathleen S.
2009-01-01
This study explores the impact of non-traditional rhetorical media such as art, architecture, coins, and city planning in order to examine how these media promoted dynastic rule and influenced practices of citizenship during Augustus' reign, the period between the Roman Republic and Empire (31 BCE-14CE). My findings challenge the long-standing…
In the Service of Empire: Imperialism and the British Spy Thriller, 1901-1914
2010-06-01
against British rule, perhaps as a prelude to a more serious attack through Afghanistan. Although the British had managed to crush the Indian... waiters, and barbers. In detailing the German hidden hand, le Queux was adamant that his novel was based on "serious facts," unearthed over a 12
College Students' Conceptualizations of Deficits Involved in Mild Intellectual Disability
ERIC Educational Resources Information Center
Musso, Mandi W.; Barker, Alyse A.; Proto, Daniel A.; Gouvier, Wm. Drew
2012-01-01
Precedential rulings in recent capital murder trials may, in some cases, leave it up to a jury to determine whether or not an individual meets criteria for an intellectual disability (ID) and should be spared from the death penalty. Despite the potential for misconceptions about ID to bias decisions, few empirical studies have examined the…
2013-09-20
October 1928 – 14 January 2013. His numerous papers on the subject, along with those of Donald L. Blount, David Fox, Stephen Denny, E. Nadine Hubble ...Classing High-Speed Craft”, American Bureau of Shipping, Publication 61, Part 1, Rules for Condition of Classification, March 2013. 3. Hubble , E.N
Morphological Awareness and Learning to Read: A Cross-Language Perspective
ERIC Educational Resources Information Center
Kuo, Li-jen; Anderson, Richard C.
2006-01-01
In the past decade, there has been a surge of interest in morphological awareness, which refers to the ability to reflect on and manipulate morphemes and word formation rules in a language. This review provides a critical synthesis of empirical studies on this topic from a broad cross-linguistic perspective. Research with children speaking several…
Integrating Graphing Assignments into a Money and Banking Course Using FRED
ERIC Educational Resources Information Center
Staveley-O'Carroll, James
2018-01-01
Over the course of one semester, six empirical assignments that utilize FRED are used to introduce students of money and banking courses to the economic analysis required for the conduct of monetary policy. The first five assignments cover the following topics: inflation, bonds and stocks, monetary aggregates, the Taylor rule, and employment.…
Training, Sharing or Cheating? Gamer Strategies to Get a Digital Upper Hand
ERIC Educational Resources Information Center
Mortensen, Torill Elvira
2010-01-01
Digital game-players devote a large amount of their time to discovering rules hidden in the code and discoverable through empirical study, experiments, and developing or rediscovering the mathematical formulae governing the code. They do this through their own independent play as they test areas, gear and abilities, through data mining using…
ERIC Educational Resources Information Center
Keane, Michael P.; Wolpin, Kenneth I.
2002-01-01
Part I uses simulations of a model of welfare participation and women's fertility decisions, showing that increases in per-child payments have substantial impact on fertility. Part II uses estimations of decision rules of forward-looking women regarding welfare participation, fertility, marriage, work, and schooling. (SK)
ERIC Educational Resources Information Center
Constantinou, Vaso; Ioannou, Andri
2016-01-01
The article addresses ICT in Education by describing an empirical investigation of technology-enhanced sports education. The study examines the use of clickers by 162 Judo athletes during seminars on the rules and regulations of the sport. Results are based on quantitative data collected on athletes' performances and attitudes and qualitative data…
Experiments in Knowledge Refinement for a Large Rule-Based System
1993-08-01
empirical analysis to refine expert system knowledge bases. Artificial Intelligence, 22:23-48, 1984. ...The Addison-Wesley series in artificial intelligence. Addison-Wesley, Reading, Massachusetts, 1981. Cooke, 1991: Roger M. Cooke. Experts in... ment for classification systems. Artificial Intelligence, 35:197-226, 1988. Overall, we believe that it will be possible to build a heuristic system
Modeling wildland fire propagation with level set methods
V. Mallet; D. E. Keyes; F. E. Fendell
2009-01-01
Level set methods are versatile and extensible techniques for general front tracking problems, including the practically important problem of predicting the advance of a fire front across expanses of surface vegetation. Given a rule, empirical or otherwise, to specify the rate of advance of an infinitesimal segment of fire front arc normal to itself (i.e., given the...
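The front-tracking idea in this abstract can be sketched numerically: the fire front is the zero contour of a level-set function φ that evolves by φ_t + F·|∇φ| = 0, where F is the given normal rate of advance. A minimal 1-D sketch follows; the grid size, spread rate, and time step are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal 1-D level-set sketch: the fire front is the zero crossing of phi.
# phi_t + F * |d phi/dx| = 0 advances the front at constant normal speed F.
def propagate_front(x0=0.2, F=1.0, nx=401, dt=0.001, steps=300):
    x = np.linspace(0.0, 1.0, nx)
    dx = x[1] - x[0]
    phi = x - x0                      # signed distance: front starts at x0
    for _ in range(steps):
        # upwind (backward) difference is stable here since F > 0 and phi increases
        dphi = np.empty_like(phi)
        dphi[1:] = (phi[1:] - phi[:-1]) / dx
        dphi[0] = dphi[1]
        phi = phi - dt * F * np.abs(dphi)
    # the current front position is the zero crossing of phi
    return x[np.argmin(np.abs(phi))]
```

With these numbers the front should travel F·steps·dt = 0.3 from its start at 0.2.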
High-resolution genome-wide dissection of the two rules of speciation in Drosophila.
Masly, John P; Presgraves, Daven C
2007-09-01
Postzygotic reproductive isolation is characterized by two striking empirical patterns. The first is Haldane's rule--the preferential inviability or sterility of species hybrids of the heterogametic (XY) sex. The second is the so-called large X effect--substitution of one species's X chromosome for another's has a disproportionately large effect on hybrid fitness compared to similar substitution of an autosome. Although the first rule has been well-established, the second rule remains controversial. Here, we dissect the genetic causes of these two rules using a genome-wide introgression analysis of Drosophila mauritiana chromosome segments in an otherwise D. sechellia genetic background. We find that recessive hybrid incompatibilities outnumber dominant ones and that hybrid male steriles outnumber all other types of incompatibility, consistent with the dominance and faster-male theories of Haldane's rule, respectively. We also find that, although X-linked and autosomal introgressions are of similar size, most X-linked introgressions cause hybrid male sterility (60%) whereas few autosomal introgressions do (18%). Our results thus confirm the large X effect and identify its proximate cause: incompatibilities causing hybrid male sterility have a higher density on the X chromosome than on the autosomes. We evaluate several hypotheses for the evolutionary cause of this excess of X-linked hybrid male sterility.
Information from multiple modalities helps 5-month-olds learn abstract rules.
Frank, Michael C; Slemmer, Jonathan A; Marcus, Gary F; Johnson, Scott P
2009-07-01
By 7 months of age, infants are able to learn rules based on the abstract relationships between stimuli (Marcus et al., 1999), but they are better able to do so when exposed to speech than to some other classes of stimuli. In the current experiments we ask whether multimodal stimulus information will aid younger infants in identifying abstract rules. We habituated 5-month-olds to simple abstract patterns (ABA or ABB) instantiated in coordinated looming visual shapes and speech sounds (Experiment 1), shapes alone (Experiment 2), and speech sounds accompanied by uninformative but coordinated shapes (Experiment 3). Infants showed evidence of rule learning only in the presence of the informative multimodal cues. We hypothesize that the additional evidence present in these multimodal displays was responsible for the success of younger infants in learning rules, congruent with both a Bayesian account and with the Intersensory Redundancy Hypothesis.
Evaluating the Effectiveness of Auditing Rules for Electronic Health Record Systems
Hedda, Monica; Malin, Bradley A.; Yan, Chao; Fabbri, Daniel
2017-01-01
Healthcare organizations (HCOs) often deploy rule-based auditing systems to detect insider threats to sensitive patient health information in electronic health record (EHR) systems. These rule-based systems define behavior deemed to be high-risk a priori (e.g., family member, co-worker access). While such rules seem logical, there has been little scientific investigation into the effectiveness of these auditing rules in identifying inappropriate behavior. Thus, in this paper, we introduce an approach to evaluate the effectiveness of individual high-risk rules and rank them according to their potential risk. We investigate the rate of high-risk access patterns and minimum rate of high-risk accesses that can be explained with appropriate clinical reasons in a large EHR system. An analysis of 8M accesses from one-week of data shows that specific high-risk flags occur more frequently than theoretically expected and the rate at which accesses can be explained away with five simple reasons is 16 - 43%. PMID:29854153
Empirical study on voting power in participatory forest planning.
Vainikainen, N; Kangas, A; Kangas, J
2008-07-01
Multicriteria decision support systems are applied in natural resource management in order to clarify the planning process for the stakeholders, to make all available information usable and all objectives manageable. Especially when the public is involved in planning, the decision support system should be easy to comprehend, transparent and fair. Social choice theory has recently been applied to group decision-making in natural resources management to accomplish these objectives. Although voting forms the basis of democracy, and is usually regarded as a fair method, the influence of voters over the outcome may vary. It is also possible to vote strategically to improve the results from each stakeholder's point of view. This study examines the use of social choice theory in revealing stakeholders' preferences in participatory forest planning, and the influence of different voters on the outcome. The positional voting rules examined were approval voting and the Borda count, but both rules were slightly modified for the purposes of this study. The third rule examined, the cumulative rule, resembles utilitarian voting rules. The voting rules were tested in a real participatory forest planning situation in eastern Lapland, Finland. All voting rules resulted in a different joint order of importance of the criteria. Yet, the preference orders produced also had a lot in common, and the criteria could be divided into three quite distinct groups according to their importance. The influence of individual voters varied between the voting rules, and in each case a different voter was the most influential.
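The Borda count mentioned in this abstract has a simple textbook form (the study uses a slightly modified variant): each voter ranks all m candidates, and a candidate in position i on a ballot scores m − 1 − i points. A minimal sketch:

```python
from collections import defaultdict

# Textbook Borda count: each ballot is a full ranking, best candidate first.
# A candidate at position i on a ballot of m candidates scores m - 1 - i.
def borda(ballots):
    scores = defaultdict(int)
    for ranking in ballots:
        m = len(ranking)
        for pos, cand in enumerate(ranking):
            scores[cand] += m - 1 - pos
    return dict(scores)
```

For example, with ballots ["a", "b", "c"] and ["b", "a", "c"], candidates a and b tie with 3 points each and c scores 0.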
Behavioural social choice: a status report.
Regenwetter, Michel; Grofman, Bernard; Popova, Anna; Messner, William; Davis-Stober, Clintin P; Cavagnaro, Daniel R
2009-03-27
Behavioural social choice has been proposed as a social choice parallel to seminal developments in other decision sciences, such as behavioural decision theory, behavioural economics, behavioural finance and behavioural game theory. Behavioural paradigms compare how rational actors should make certain types of decisions with how real decision makers behave empirically. We highlight that important theoretical predictions in social choice theory change dramatically under even minute violations of standard assumptions. Empirical data violate those critical assumptions. We argue that the nature of preference distributions in electorates is ultimately an empirical question, which social choice theory has often neglected. We also emphasize important insights for research on decision making by individuals. When researchers aggregate individual choice behaviour in laboratory experiments to report summary statistics, they are implicitly applying social choice rules. Thus, they should be aware of the potential for aggregation paradoxes. We hypothesize that such problems may substantially mar the conclusions of a number of (sometimes seminal) papers in behavioural decision research.
Improved Design of Tunnel Supports : Executive Summary
DOT National Transportation Integrated Search
1979-12-01
This report focuses on improvement of design methodologies related to the ground-structure interaction in tunneling. The design methods range from simple analytical and empirical methods to sophisticated finite element techniques as well as an evalua...
Evidence flow graph methods for validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant
1988-01-01
This final report describes the results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems. This was approached by developing a translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.
Development of Watch Schedule Using Rules Approach
NASA Astrophysics Data System (ADS)
Jurkevicius, Darius; Vasilecas, Olegas
The software for schedule creation and optimization solves a difficult, important and practical problem. The proposed solution is an online employee portal where administrator users can create and manage watch schedules and employee requests. Each employee can log in with his/her own account and see his/her assignments, manage requests, etc. Employees set as administrators can perform the employee scheduling online, manage requests, etc. This scheduling software allows users not only to see the initial and optimized watch schedule in a simple and understandable form, but also to create special rules and criteria and input their business rules. Using these rules, the system will automatically generate the watch schedule.
A programmable rules engine to provide clinical decision support using HTML forms.
Heusinkveld, J.; Geissbuhler, A.; Sheshelidze, D.; Miller, R.
1999-01-01
The authors have developed a simple method for specifying rules to be applied to information on HTML forms. This approach allows clinical experts, who lack the programming expertise needed to write CGI scripts, to construct and maintain domain-specific knowledge and ordering capabilities within WizOrder, the order-entry and decision support system used at Vanderbilt Hospital. The clinical knowledge base maintainers use HTML editors to create forms and spreadsheet programs for rule entry. A test environment has been developed which uses Netscape to display forms; the production environment displays forms using an embedded browser. PMID:10566470
Bilinearity, Rules, and Prefrontal Cortex
Dayan, Peter
2007-01-01
Humans can be instructed verbally to perform computationally complex cognitive tasks; their performance then improves relatively slowly over the course of practice. Many skills underlie these abilities; in this paper, we focus on the particular question of a uniform architecture for the instantiation of habitual performance and the storage, recall, and execution of simple rules. Our account builds on models of gated working memory, and involves a bilinear architecture for representing conditional input-output maps and for matching rules to the state of the input and working memory. We demonstrate the performance of our model on two paradigmatic tasks used to investigate prefrontal and basal ganglia function. PMID:18946523
Building a common pipeline for rule-based document classification.
Patterson, Olga V; Ginter, Thomas; DuVall, Scott L
2013-01-01
Instance-based classification of clinical text is a widely used natural language processing task employed as a step for patient classification, document retrieval, or information extraction. Rule-based approaches rely on concept identification and context analysis in order to determine the appropriate class. We propose a five-step process that enables even small research teams to develop simple but powerful rule-based NLP systems by taking advantage of a common UIMA AS based pipeline for classification. Our proposed methodology coupled with the general-purpose solution provides researchers with access to the data locked in clinical text in cases of limited human resources and compact timelines.
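The concept-identification-plus-context-analysis pattern this abstract describes can be illustrated with a toy rule-based classifier. The concept lexicon, negation cues, and window size below are illustrative assumptions, not part of the authors' pipeline:

```python
import re

# Toy rule-based document classifier: find concept mentions, then check a
# short preceding context window for negation cues before assigning a class.
CONCEPTS = {"pneumonia": ["pneumonia", "lung infiltrate"]}   # hypothetical lexicon
NEGATIONS = ["no evidence of", "denies", "without"]          # hypothetical cues

def classify(text):
    t = text.lower()
    for concept, terms in CONCEPTS.items():
        for term in terms:
            for m in re.finditer(re.escape(term), t):
                window = t[max(0, m.start() - 30):m.start()]
                if any(neg in window for neg in NEGATIONS):
                    continue            # negated mention: ignore it
                return concept          # affirmed mention found
    return "negative"
```

So "Chest X-ray shows lung infiltrate." classifies as "pneumonia", while "No evidence of pneumonia." classifies as "negative".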
Evaluating scale-up rules of a high-shear wet granulation process.
Tao, Jing; Pandey, Preetanshu; Bindra, Dilbir S; Gao, Julia Z; Narang, Ajit S
2015-07-01
This work aimed to evaluate the commonly used scale-up rules for high-shear wet granulation process using a microcrystalline cellulose-lactose-based low drug loading formulation. Granule properties such as particle size, porosity, flow, and tabletability, and tablet dissolution were compared across scales using scale-up rules based on different impeller speed calculations or extended wet massing time. Constant tip speed rule was observed to produce slightly less granulated material at the larger scales. Longer wet massing time can be used to compensate for the lower shear experienced by the granules at the larger scales. Constant Froude number and constant empirical stress rules yielded granules that were more comparable across different scales in terms of compaction performance and tablet dissolution. Granule porosity was shown to correlate well with blend tabletability and tablet dissolution, indicating the importance of monitoring granule densification (porosity) during scale-up. It was shown that different routes can be chosen during scale-up to achieve comparable granule growth and densification by altering one of the three parameters: water amount, impeller speed, and wet massing time. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
Pool Safety: A Few Simple Rules.
ERIC Educational Resources Information Center
PTA Today, 1993
1993-01-01
Presents suggestions by the National Swimming Pool Safety Committee on how to keep children safe while swimming. Ideas include maintaining strict adult supervision, pool and spa barriers, and knowledge of cardiopulmonary resuscitation. (SM)
An empirical relationship for homogenization in single-phase binary alloy systems
NASA Technical Reports Server (NTRS)
Unnam, J.; Tenney, D. R.; Stein, B. A.
1979-01-01
A semiempirical formula is developed for describing the extent of interaction between constituents in single-phase binary alloy systems with planar, cylindrical, or spherical interfaces. The formula contains two parameters that are functions of mean concentration and interface geometry of the couple. The empirical solution is simple, easy to use, and does not involve sequential calculations, thereby allowing quick estimation of the extent of interactions without lengthy calculations. Results obtained with this formula are in good agreement with those from a finite-difference analysis.
Malthusian dynamics in a diverging Europe: Northern Italy, 1650-1881.
Fernihough, Alan
2013-02-01
Recent empirical research questions the validity of using Malthusian theory in preindustrial England. Using real wage and vital rate data for the years 1650-1881, I provide empirical estimates for a different region: Northern Italy. The empirical methodology is theoretically underpinned by a simple Malthusian model, in which population, real wages, and vital rates are determined endogenously. My findings strongly support the existence of a Malthusian economy wherein population growth decreased living standards, which in turn influenced vital rates. However, these results also demonstrate how the system is best characterized as one of weak homeostasis. Furthermore, there is no evidence of Boserupian effects given that increases in population failed to spur any sustained technological progress.
Review of Thawing Time Prediction Models Depending on Process Conditions and Product Characteristics
Kluza, Franciszek; Spiess, Walter E. L.; Kozłowicz, Katarzyna
2016-01-01
Determining thawing times of frozen foods is a challenging problem, as the thermophysical properties of the product change during thawing. A number of calculation models and solutions have been developed, ranging from relatively simple analytical equations based on a number of assumptions to a group of empirical approaches that sometimes require complex calculations. In this paper analytical, empirical and graphical models are presented and critically reviewed. The conditions of solution, limitations and possible applications of the models are discussed. The graphical and semi-graphical models are derived from numerical methods. Using numerical methods is not always possible, as running the calculations takes time and the specialized software and equipment are not always cheap. For these reasons, the application of analytical-empirical models is more useful for engineering. It is demonstrated that there is no simple, accurate and feasible analytical method for thawing time prediction. Consequently, simplified methods are needed for thawing time estimation of agricultural and food products. The review reveals the need for further improvement of the existing solutions or development of new ones that will enable accurate determination of thawing time within a wide range of practical conditions of heat transfer during processing. PMID:27904387
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manturov, Vassily O
2010-06-29
In this work we study knot theories with a parity property for crossings: every crossing is declared to be even or odd according to a certain preassigned rule. If this rule satisfies a set of simple axioms related to the Reidemeister moves, then certain simple invariants solving the minimality problem can be defined, and invariant maps on the set of knots can be constructed. The most important example of a knot theory with parity is the theory of virtual knots. Using the parity property arising from Gauss diagrams we show that even a gross simplification of the theory of virtual knots, namely, the theory of free knots, admits simple and highly nontrivial invariants. This gives a solution to a problem of Turaev, who conjectured that all free knots are trivial. In this work we show that free knots are generally not invertible, and provide invariants which detect the invertibility of free knots. The passage to ordinary virtual knots allows us to strengthen known invariants (such as the Kauffman bracket) using parity considerations. We also discuss other examples of knot theories with parity. Bibliography: 27 items.
A method for validating Rent's rule for technological and biological networks.
Alcalde Cuesta, Fernando; González Sequeiros, Pablo; Lozano Rojo, Álvaro
2017-07-14
Rent's rule is an empirical power law introduced in an effort to describe and optimize the wiring complexity of computer logic graphs. It is known that brain and neuronal networks also obey Rent's rule, which is consistent with the idea that wiring costs play a fundamental role in brain evolution and development. Here we propose a method to validate this power law for a certain range of network partitions. This method is based on the bifurcation phenomenon that appears when the network is subjected to random alterations preserving its degree distribution. It has been tested on a set of VLSI circuits and real networks, including biological and technological ones. We also analyzed the effect of different types of random alterations on the Rentian scaling in order to test the influence of the degree distribution. There are network architectures quite sensitive to these randomization procedures with significant increases in the values of the Rent exponents.
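Rent's rule relates the number of terminals T of a block to its number of gates g by the power law T = t·g^p, so the Rent exponent p can be estimated by a least-squares fit in log-log space. A minimal sketch on synthetic partition data (the data below are generated to follow the law exactly, purely for illustration):

```python
import numpy as np

# Fit Rent's rule T = t * g**p by linear regression on log T vs log g.
def fit_rent(gates, terminals):
    logg, logT = np.log(gates), np.log(terminals)
    p, logt = np.polyfit(logg, logT, 1)   # slope = Rent exponent p
    return np.exp(logt), p

# Synthetic Rentian data with t = 2.5 and p = 0.6, for illustration only.
gates = np.array([4.0, 16.0, 64.0, 256.0, 1024.0])
terminals = 2.5 * gates ** 0.6
t, p = fit_rent(gates, terminals)
```

On real partition data the points scatter around the line, and the fitted slope is the estimated Rent exponent.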
ERIC Educational Resources Information Center
Beem, Kate
2004-01-01
It is such a simple mandate: Prepare healthy, nutritious meals for the schoolchildren so they can go about the business of learning. But operating a school district food service department is anything but simple. Even in the smallest districts, food service operations are businesses that must comply with many more rules than those in the private…
ERIC Educational Resources Information Center
Beal, Christine
1992-01-01
Describes typical differences in conversational routines in French and Australian English and kinds of tensions arising when speakers with two different sets of rules come into contact. Even simple questions contain a variety of assumptions ranging from whom it is suitable to ask to the kind of answer or the amount of detail that is expected. (13…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-13
... Commission is publishing this notice to solicit comments on the proposed rule change from interested persons... that the credit amounts in the Exchange's VIP for simple orders will not change as a result of the new... [truncated fee-tier table: monthly volume-tier percentages with per-contract credit amounts for simple and complex order classes]
Yago, Martín
2017-05-01
QC planning based on risk management concepts can reduce the probability of harming patients due to an undetected out-of-control error condition. It does this by selecting appropriate QC procedures to decrease the number of erroneous results reported. The selection can be easily made by using published nomograms for simple QC rules when the out-of-control condition results in increased systematic error. However, increases in random error also occur frequently and are difficult to detect, which can result in erroneously reported patient results. A statistical model was used to construct charts for the 1ks and X̄/χ² rules. The charts relate the increase in the number of unacceptable patient results reported due to an increase in random error with the capability of the measurement procedure. They thus allow for QC planning based on the risk of patient harm due to the reporting of erroneous results. 1ks rules are simple, all-around rules. Their ability to deal with increases in within-run imprecision is minimally affected by the possible presence of significant, stable, between-run imprecision. X̄/χ² rules perform better when the number of controls analyzed during each QC event is increased to improve QC performance. Using nomograms simplifies the selection of statistical QC procedures to limit the number of erroneous patient results reported due to an increase in analytical random error. The selection largely depends on the presence or absence of stable between-run imprecision. © 2017 American Association for Clinical Chemistry.
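The single-value QC rule family discussed in this abstract rejects an analytical run when any one control result falls more than k standard deviations from the target mean. A minimal sketch of that check (target, SD, and k are illustrative assumptions):

```python
# Single-value "1:k-s"-style QC check: reject the run if any control
# result deviates from the target mean by more than k standard deviations.
def qc_reject(results, mean, sd, k=3.0):
    return any(abs(x - mean) > k * sd for x in results)
```

For a control with target 100 and SD 1 under a 3s limit, results of 100, 101, and 99 pass, while a result of 104 triggers rejection.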
Competitive STDP Learning of Overlapping Spatial Patterns.
Krunglevicius, Dalius
2015-08-01
Spike-timing-dependent plasticity (STDP) is a set of Hebbian learning rules firmly based on biological evidence. It has been demonstrated that one of the STDP learning rules is suited for learning spatiotemporal patterns. When multiple neurons are organized in a simple competitive spiking neural network, this network is capable of learning multiple distinct patterns. If patterns overlap significantly (i.e., patterns are mutually inclusive), however, competition would not preclude trained neuron's responding to a new pattern and adjusting synaptic weights accordingly. This letter presents a simple neural network that combines vertical inhibition and Euclidean distance-dependent synaptic strength factor. This approach helps to solve the problem of pattern size-dependent parameter optimality and significantly reduces the probability of a neuron's forgetting an already learned pattern. For demonstration purposes, the network was trained for the first ten letters of the Braille alphabet.
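The pair-based STDP learning rule underlying this work has a standard textbook form: a synapse is potentiated when the presynaptic spike precedes the postsynaptic spike and depressed otherwise, with an exponentially decaying dependence on the spike-time difference. A minimal sketch (the amplitudes and time constant are illustrative, not from the letter):

```python
import math

# Pair-based STDP weight update for one pre/post spike pair.
# dt = t_post - t_pre; positive dt (pre before post) gives LTP, else LTD.
def stdp_dw(t_pre, t_post, A_plus=0.01, A_minus=0.012, tau=20.0):
    dt = t_post - t_pre
    if dt > 0:
        return A_plus * math.exp(-dt / tau)    # potentiation
    return -A_minus * math.exp(dt / tau)       # depression
```

A pre spike at t = 10 ms followed by a post spike at t = 15 ms yields a positive weight change; reversing the order yields a negative one.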
Big Bang Day: The Great Big Particle Adventure - 3. Origins
None
2017-12-09
In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe.
SYSTEMATIZATION OF MASS LEVELS OF PARTICLES AND RESONANCES ON HEURISTIC BASIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takabayasi, T.
1963-12-16
Once more a scheme of simple mass rules and formulas for particles and resonant levels is investigated and organized, based on some general hypotheses. The essential ingredients in the scheme are, on one hand, the equal-interval rule governing the isosinglet meson series, associated with a particularly simple mass ratio between the 2⁺⁺ level f and the 0⁺⁺ level ABC, and on the other a new basic mass formula that unifies some of the meson and baryon levels. The whole set of baryon levels is arranged in a table analogous to the periodic table, and then correspondences between different series and equivalence between spin and hypercharge, when properly applied, just fix the whole baryon mass spectrum in good agreement with observations. Connections with the scheme of mass formulas formerly given are also shown. (auth)
Can simple rules control development of a pioneer vertebrate neuronal network generating behavior?
Roberts, Alan; Conte, Deborah; Hull, Mike; Merrison-Hort, Robert; al Azad, Abul Kalam; Buhl, Edgar; Borisyuk, Roman; Soffe, Stephen R
2014-01-08
How do the pioneer networks in the axial core of the vertebrate nervous system first develop? Fundamental to understanding any full-scale neuronal network is knowledge of the constituent neurons, their properties, synaptic interconnections, and normal activity. Our novel strategy uses basic developmental rules to generate model networks that retain individual neuron and synapse resolution and are capable of reproducing correct, whole animal responses. We apply our developmental strategy to young Xenopus tadpoles, whose brainstem and spinal cord share a core vertebrate plan, but at a tractable complexity. Following detailed anatomical and physiological measurements to complete a descriptive library of each type of spinal neuron, we build models of their axon growth controlled by simple chemical gradients and physical barriers. By adding dendrites and allowing probabilistic formation of synaptic connections, we reconstruct network connectivity among up to 2000 neurons. When the resulting "network" is populated by model neurons and synapses, with properties based on physiology, it can respond to sensory stimulation by mimicking tadpole swimming behavior. This functioning model represents the most complete reconstruction of a vertebrate neuronal network that can reproduce the complex, rhythmic behavior of a whole animal. The findings validate our novel developmental strategy for generating realistic networks with individual neuron- and synapse-level resolution. We use it to demonstrate how early functional neuronal connectivity and behavior may in life result from simple developmental "rules," which lay out a scaffold for the vertebrate CNS without specific neuron-to-neuron recognition.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lincoln, Don
2015-01-20
Albert Einstein said that what he wanted to know was “God’s thoughts,” which is a metaphor for the ultimate and most basic rules of the universe. Once known, all other phenomena would then be a consequence of these simple rules. While modern science is far from that goal, we have some thoughts on how this inquiry might unfold. In this video, Fermilab’s Dr. Don Lincoln tells what we know about GUTs (grand unified theories) and TOEs (theories of everything).
Objective estimates based on experimental data and initial and final knowledge
NASA Technical Reports Server (NTRS)
Rosenbaum, B. M.
1972-01-01
An extension of the method of Jaynes, whereby least biased probability estimates are obtained, permits such estimates to be made which account for experimental data on hand as well as prior and posterior knowledge. These estimates can be made for both discrete and continuous sample spaces. The method allows a simple interpretation of Laplace's two rules: the principle of insufficient reason and the rule of succession. Several examples are analyzed by way of illustration.
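Laplace's rule of succession, one of the two rules mentioned in this abstract, has a closed form: after observing s successes in n trials, the least biased estimate of the probability of success on the next trial is (s + 1)/(n + 2), the posterior mean under a uniform prior. A minimal sketch:

```python
from fractions import Fraction

# Laplace's rule of succession: probability of success on the next trial
# after s successes in n trials, under a uniform prior on the success rate.
def rule_of_succession(s, n):
    return Fraction(s + 1, n + 2)
```

With no data at all (s = 0, n = 0) the estimate is 1/2, matching the principle of insufficient reason; after three successes in three trials it is 4/5.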
NASA Astrophysics Data System (ADS)
Ahmadianfar, Iman; Adib, Arash; Taghian, Mehrdad
2017-10-01
The reservoir hedging rule curves are used to avoid severe water shortage during drought periods. In this method reservoir storage is divided into several zones, wherein the rationing factors change immediately when the water storage level moves from one zone to another. In the present study, a hedging rule with fuzzy rationing factors was applied to create a transition zone above and below each rule curve, so that the rationing factor changes gradually within this zone. For this purpose, a monthly simulation model was developed and linked to the non-dominated sorting genetic algorithm to calculate the modified shortage index of two objective functions involving water supply of minimum flow and agricultural demands over a long-term simulation period. The Zohre multi-reservoir system in south Iran was considered as a case study. The proposed hedging rule improved long-term system performance by 10 to 27 percent in comparison with the simple hedging rule; these results demonstrate that the fuzzification of hedging factors increases the applicability and efficiency of the new hedging rule in comparison to the conventional rule curve for mitigating the water shortage problem.
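The transition-zone idea in this abstract can be sketched as a rationing factor that, instead of jumping at a storage threshold, ramps linearly across a band around it. All numbers below (threshold, band width, factors) are illustrative assumptions, not values from the study:

```python
# Hedging with a fuzzified transition zone: the rationing factor ramps
# linearly across a band around the rule-curve threshold instead of
# switching abruptly when storage crosses it.
def rationing_factor(storage, threshold=100.0, band=20.0,
                     low_factor=0.5, high_factor=1.0):
    lo, hi = threshold - band / 2, threshold + band / 2
    if storage <= lo:
        return low_factor
    if storage >= hi:
        return high_factor
    # linear interpolation inside the transition zone
    w = (storage - lo) / (hi - lo)
    return low_factor + w * (high_factor - low_factor)
```

With these numbers, storage of 80 gives the full rationing factor 0.5, storage of 120 gives 1.0 (no rationing), and storage exactly at the threshold gives the midpoint 0.75.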
Simple, empirical approach to predict neutron capture cross sections from nuclear masses
NASA Astrophysics Data System (ADS)
Couture, A.; Casten, R. F.; Cakirli, R. B.
2017-12-01
Background: Neutron capture cross sections are essential to understanding the astrophysical s and r processes, to modeling nuclear reactor design and performance, and to a wide variety of nuclear forensics applications. Often, cross sections are needed for nuclei where experimental measurements are difficult. Enormous effort, over many decades, has gone into attempting to develop sophisticated statistical reaction models to predict these cross sections. Such work has met with some success but is often unable to reproduce measured cross sections to better than 40%, and has limited predictive power, with predictions from different models rapidly diverging by an order of magnitude a few nucleons from the last measurement. Purpose: To develop a new approach to predicting neutron capture cross sections over broad ranges of nuclei that accounts for their values where known and that has reliable predictive power, with small uncertainties, for many nuclei where they are unknown. Methods: Experimental neutron capture cross sections were compared to empirical mass observables in regions of similar structure. Results: We present an extremely simple method, based solely on empirical mass observables, that correlates neutron capture cross sections in the critical energy range from a few keV to a couple hundred keV. We show that regional cross sections in medium and heavy mass nuclei are compactly correlated with the two-neutron separation energy. These correlations are easily amenable to predicting unknown cross sections, often converting the usual extrapolations into more reliable interpolations. The method almost always reproduces existing data to within 25%, and estimated uncertainties are below about 40% up to 10 nucleons beyond known data. Conclusions: Neutron capture cross sections display a surprisingly strong connection to the two-neutron separation energy, a nuclear structure property.
The simple, empirical correlations uncovered provide model-independent predictions of neutron capture cross sections, extending far from stability, including for nuclei of the highest sensitivity to r -process nucleosynthesis.
NASA Astrophysics Data System (ADS)
Miyake, Yasufumi; Boned, Christian; Baylaucq, Antoine; Bessières, David; Zéberg-Mikkelsen, Claus K.; Galliéro, Guillaume; Ushiki, Hideharu
2007-07-01
In order to study the influence of stereoisomeric effects on the dynamic viscosity, an extensive experimental study of the viscosity of the binary system composed of the two stereoisomeric molecular forms of decalin - cis and trans - has been carried out for five different mixtures at three temperatures (303.15, 323.15 and 343.15) K and six isobars up to 100 MPa with a falling-body viscometer (a total of 90 points). The experimental relative uncertainty is estimated to be 2%. The variations of dynamic viscosity versus composition are discussed with respect to their behavior due to stereoisomerism. Four different models with a physical and theoretical background are studied in order to investigate how they take the stereoisomeric effect into account through their required model parameters. The evaluated models are based on the hard-sphere scheme, the concepts of the free-volume and the friction theory, and a model derived from molecular dynamics. Overall, a satisfactory representation of the viscosity of this binary system is found for the different models within the considered (T, p) range, taking into account their simplicity. All the models are able to distinguish between the two stereoisomeric decalin compounds. Further, based on the analysis of the model parameters for the pure compounds, it has been found that the use of simple mixing rules, without introducing any binary interaction parameters, is sufficient to predict the viscosity of cis + trans-decalin mixtures with the same accuracy relative to the experimental values as obtained for the pure compounds. In addition to these models, a semi-empirical self-referencing model and the simple mixing laws of Grunberg-Nissan and Katti-Chaudhri are also applied to represent the viscosity behavior of these systems.
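The Grunberg-Nissan mixing law named above has a compact closed form; a minimal sketch, with the binary interaction parameter set to zero as the abstract reports to be sufficient for these mixtures:

```python
import math

def grunberg_nissan(x1, eta1, eta2, g12=0.0):
    """Grunberg-Nissan mixing law for the dynamic viscosity of a binary
    mixture: ln(eta_mix) = x1*ln(eta1) + x2*ln(eta2) + x1*x2*G12.
    With g12 = 0 no binary interaction parameter is introduced, the
    case the abstract reports as adequate for cis + trans-decalin."""
    x2 = 1.0 - x1
    return math.exp(x1 * math.log(eta1) + x2 * math.log(eta2) + x1 * x2 * g12)

# Equimolar mixture of two hypothetical pure-component viscosities (mPa s).
eta_mix = grunberg_nissan(0.5, 3.0, 2.0)   # geometric mean when g12 = 0
```

With g12 = 0 the law reduces to a mole-fraction-weighted geometric mean, so the mixture viscosity always lies between the pure-component values.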
Newgreen, Donald F; Dufour, Sylvie; Howard, Marthe J; Landman, Kerry A
2013-10-01
We review morphogenesis of the enteric nervous system from migratory neural crest cells, and defects of this process such as Hirschsprung disease, centering on cell motility and assembly, and cell adhesion and extracellular matrix molecules, along with cell proliferation and growth factors. We then review continuum and agent-based (cellular automata) models with rules of cell movement and logistical proliferation. Both movement and proliferation at the individual cell level are modeled with stochastic components from which stereotyped outcomes emerge at the population level. These models reproduced the wave-like colonization of the intestine by enteric neural crest cells, and several new properties emerged, such as colonization by frontal expansion, which were later confirmed biologically. These models predict a surprising level of clonal heterogeneity both in terms of number and distribution of daughter cells. Biologically, migrating cells form stable chains made up of unstable cells, but this is not seen in the initial model. We outline additional rules for cell differentiation into neurons, axon extension, cell-axon and cell-cell adhesions, chemotaxis and repulsion which can reproduce chain migration. After the migration stage, the cells re-arrange as a network of ganglia. Changes in cell adhesion molecules parallel this, and we describe additional rules based on Steinberg's Differential Adhesion Hypothesis, reflecting changing levels of adhesion in neural crest cells and neurons. This was able to reproduce enteric ganglionation in a model. Mouse mutants with disturbances of enteric nervous system morphogenesis are discussed, and these suggest future refinement of the models. The modeling suggests a relatively simple set of cell behavioral rules could account for complex patterns of morphogenesis. The model has allowed the proposal that Hirschsprung disease is mostly an enteric neural crest cell proliferation defect, not a defect of cell migration. 
In addition, the model suggests explanations for the zonal and skip-segment variants of Hirschsprung disease, and also gives a novel stochastic explanation for the observed discordance of Hirschsprung disease in identical twins. © 2013 Elsevier Inc. All rights reserved.
Kanematsu, Nobuyuki
2009-03-07
Dose calculation for radiotherapy with protons and heavier ions deals with a large volume of path integrals involving the scattering power of body tissue. This work provides a simple model for such demanding applications. There is an approximate linearity between the RMS end-point displacement and the range of incident particles in water, found empirically in measurements and detailed calculations. This fact was translated into a simple linear formula, from which a scattering power that is simply inversely proportional to the residual range was derived. The simplicity enabled an analytical formulation for ions stopping in water, which was designed to be equivalent to the extended Highland model and agreed with measurements within 2% or 0.02 cm in RMS displacement. The simplicity will also improve the efficiency of numerical path integrals in the presence of heterogeneity.
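The linearity claim can be checked directly from the inverse-residual-range form. A sketch under the assumption T(s) = c/(R − s), with c an illustrative constant rather than the paper's fitted value: the end-point displacement variance ∫₀ᴿ (R − s)² T(s) ds collapses to cR²/2, so the RMS displacement is proportional to R.

```python
import math

def rms_endpoint_displacement(range_cm, c=1e-3, n=20000):
    """RMS lateral end-point displacement for a particle stopping at
    depth R when the scattering power is inversely proportional to the
    residual range, T(s) = c / (R - s) (c is an illustrative constant).
    The variance integral Int_0^R (R - s)**2 * T(s) ds simplifies to
    c * R**2 / 2, so sigma_y grows linearly with R -- the empirical
    linearity the model is built on.  Midpoint-rule quadrature below
    confirms the analytic result numerically."""
    R = range_cm
    ds = R / n
    var = sum((R - (i + 0.5) * ds) ** 2 * c / (R - (i + 0.5) * ds) * ds
              for i in range(n))
    return math.sqrt(var)

sigma_10 = rms_endpoint_displacement(10.0)
sigma_20 = rms_endpoint_displacement(20.0)  # doubling the range doubles sigma
```

The numerical quadrature and the closed form agree, illustrating why a scattering power of this shape reproduces the observed displacement-range linearity.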
Antibiotic stewardship and empirical antibiotic treatment: How can they get along?
Zuccaro, Valentina; Columpsi, Paola; Sacchi, Paolo; Lucà, Maria Grazia; Fagiuoli, Stefano; Bruno, Raffaele
2017-06-01
The aim of this review is to focus on recent knowledge about antibiotic stewardship and empirical antibiotic treatment in cirrhotic patients. The application of antimicrobial stewardship (AMS) rules appears to be the most appropriate strategy for globally managing cirrhotic patients with infectious complications: indeed, such rules provide a unique way to achieve both early diagnosis and appropriate therapy, avoiding not only antibiotic over-prescription but, more importantly, the selection and spread of antimicrobial resistance. Moreover, cirrhotic patients must be considered "frail" and susceptible to healthcare-associated infections: applying AMS policies would assure a cost reduction and thus contribute to the improvement of public health strategies. Copyright © 2017. Published by Elsevier Ltd.
Robust Strategy for Rocket Engine Health Monitoring
NASA Technical Reports Server (NTRS)
Santi, L. Michael
2001-01-01
Monitoring the health of rocket engine systems is essentially a two-phase process. The acquisition phase involves sensing physical conditions at selected locations, converting physical inputs to electrical signals, conditioning the signals as appropriate to establish scale or filter interference, and recording results in a form that is easy to interpret. The inference phase involves analysis of results from the acquisition phase, comparison of analysis results to established health measures, and assessment of health indications. A variety of analytical tools may be employed in the inference phase of health monitoring. These tools can be separated into three broad categories: statistical, rule based, and model based. Statistical methods can provide excellent comparative measures of engine operating health. They require well-characterized data from an ensemble of "typical" engines, or "golden" data from a specific test assumed to define the operating norm in order to establish reliable comparative measures. Statistical methods are generally suitable for real-time health monitoring because they do not deal with the physical complexities of engine operation. The utility of statistical methods in rocket engine health monitoring is hindered by practical limits on the quantity and quality of available data. This is due to the difficulty and high cost of data acquisition, the limited number of available test engines, and the problem of simulating flight conditions in ground test facilities. In addition, statistical methods incur a penalty for disregarding flow complexity and are therefore limited in their ability to define performance shift causality. Rule based methods infer the health state of the engine system based on comparison of individual measurements or combinations of measurements with defined health norms or rules. This does not mean that rule based methods are necessarily simple. 
Although binary yes-no health assessment can sometimes be established by relatively simple rules, the causality assignment needed for refined health monitoring often requires an exceptionally complex rule base involving complicated logical maps. Structuring the rule system to be clear and unambiguous can be difficult, and the expert input required to maintain a large logic network and associated rule base can be prohibitive.
The P600 in Implicit Artificial Grammar Learning
ERIC Educational Resources Information Center
Silva, Susana; Folia, Vasiliki; Hagoort, Peter; Petersson, Karl Magnus
2017-01-01
The suitability of the artificial grammar learning (AGL) paradigm to capture relevant aspects of the acquisition of linguistic structures has been empirically tested in a number of EEG studies. Some have shown a syntax-related P600 component, but it has not been ruled out that the AGL P600 effect is a response to surface features (e.g.,…
Speculative behavior and asset price dynamics.
Westerhoff, Frank
2003-07-01
This paper deals with speculative trading. Guided by empirical observations, a nonlinear deterministic asset pricing model is developed in which traders repeatedly choose between technical and fundamental analysis to determine their orders. The interaction between the trading rules produces complex dynamics. The model endogenously replicates the stylized facts of excess volatility, high trading volumes, shifts in the level of asset prices, and volatility clustering.
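A minimal deterministic model of this class can make the interaction concrete; the coefficients below are illustrative, not the paper's exact specification:

```python
def simulate_prices(p0=1.2, fundamental=1.0, a=0.3, b=0.9, steps=200):
    """Sketch of a chartist/fundamentalist asset pricing model of the
    class studied in the paper (illustrative coefficients only).
    Fundamentalists bet on reversion toward value; chartists
    extrapolate the last price change; the price then moves in the
    direction of aggregate excess demand."""
    prices = [p0, p0]
    for _ in range(steps):
        p, p_prev = prices[-1], prices[-2]
        excess = a * (fundamental - p) + b * (p - p_prev)
        prices.append(p + excess)
    return prices

# With chartists switched off the price relaxes to the fundamental value.
fundamentalists_only = simulate_prices(b=0.0)
```

Reintroducing trend extrapolation (b > 0) produces the over- and undershooting around the fundamental value that, in richer versions with rule switching, generates the stylized facts the paper replicates.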
Analysis and Synthesis of Adaptive Neural Elements and Assembles
1992-02-17
effects of neuromodulators on electrical activity. Based on the simulations it appears that there are potentially novel mechanisms with which modulatory...and Byrne, J.H. A learning rule based on empirically-derived activity-dependent neuromodulation supports operant conditioning in a small network...dependent neuromodulation can support operant conditioning in a small oscillatory network". 2. Society for Neuroscience Short Course on Neural
The Concept of British Education Policy in the Colonies 1850-1960
ERIC Educational Resources Information Center
Whitehead, Clive
2007-01-01
It is common in the literature to refer to British colonial education policy as if it were "a settled course adopted and purposefully carried into action", but in reality it was never like that. Contrary to popular belief, the size and diversity of the empire meant that no one really ruled it in any direct sense. Clearly some kind of…
First-order fire effects models for land Management: Overview and issues
Elizabeth D. Reinhardt; Matthew B. Dickinson
2010-01-01
We give an overview of the science application process at work in supporting fire management. First-order fire effects models, such as those discussed in accompanying papers, are the building blocks of software systems designed for application to landscapes over time scales from days to centuries. Fire effects may be modeled using empirical, rule based, or process...
ERIC Educational Resources Information Center
Lim, Ik Soo; Leek, E. Charles
2012-01-01
Previous empirical studies have shown that information along visual contours is concentrated in regions of high curvature magnitude and that, for closed contours, segments of negative curvature (i.e., concave segments) carry greater perceptual relevance than corresponding regions of positive curvature (i.e., convex segments). Lately,…
How Long Should a Training Program Be? A Field Study of "Rules-of-Thumb"
ERIC Educational Resources Information Center
Cole, Nina
2008-01-01
Purpose: This study aims to examine the question of how long a behavioral skills training program should be in order to result in measurable behavioral change. Design/methodology/approach: An empirical field study was conducted to compare two different lengths of time for a managerial skills training program aimed at achieving behavioral change.…
An Empirical Test of the Modified C Index and SII, O*NET, and DHOC Occupational Code Classifications
ERIC Educational Resources Information Center
Dik, Bryan J.; Hu, Ryan S. C.; Hansen, Jo-Ida C.
2007-01-01
The present study investigated new approaches for assessing Holland's congruence hypothesis by (a) developing and applying four sets of decision rules for assigning Holland codes of varying lengths for purposes of computing Eggerth and Andrew's modified C index; (b) testing the modified C index computed using these four approaches against Brown…
ERIC Educational Resources Information Center
van der Spek, Erik D.; Wouters, Pieter; van Oostendorp, Herre
2011-01-01
Serious games have a great potential for training and educating people in novel and engaging ways. However, little empirical research has been done on the effectiveness of serious games, and although early findings do point to a moderately positive direction, even less is known about why some games succeed in effectively educating while others do…
Artificial grammar learning meets formal language theory: an overview
Fitch, W. Tecumseh; Friederici, Angela D.
2012-01-01
Formal language theory (FLT), part of the broader mathematical theory of computation, provides a systematic terminology and set of conventions for describing rules and the structures they generate, along with a rich body of discoveries and theorems concerning generative rule systems. Despite its name, FLT is not limited to human language, but is equally applicable to computer programs, music, visual patterns, animal vocalizations, RNA structure and even dance. In the last decade, this theory has been profitably used to frame hypotheses and to design brain imaging and animal-learning experiments, mostly using the ‘artificial grammar-learning’ paradigm. We offer a brief, non-technical introduction to FLT and then a more detailed analysis of empirical research based on this theory. We suggest that progress has been hampered by a pervasive conflation of distinct issues, including hierarchy, dependency, complexity and recursion. We offer clarifications of several relevant hypotheses and the experimental designs necessary to test them. We finally review the recent brain imaging literature using formal languages, identifying areas of convergence and outstanding debates. We conclude that FLT has much to offer scientists who are interested in rigorous empirical investigations of human cognition from a neuroscientific and comparative perspective. PMID:22688631
The logic of counterfactual analysis in case-study explanation.
Mahoney, James; Barrenechea, Rodrigo
2017-12-19
In this paper, we develop a set-theoretic and possible worlds approach to counterfactual analysis in case-study explanation. Using this approach, we first consider four kinds of counterfactuals: necessary condition counterfactuals, SUIN condition counterfactuals, sufficient condition counterfactuals, and INUS condition counterfactuals. We explore the distinctive causal claims entailed in each, and conclude that necessary condition and SUIN condition counterfactuals are the most useful types for hypothesis assessment in case-study research. We then turn attention to the development of a rigorous understanding of the 'minimal-rewrite' rule, linking this rule to insights from set theory about the relative importance of necessary conditions. We show why, logically speaking, a comparative analysis of two necessary condition counterfactuals will tend to favour small events and contingent happenings. A third section then presents new tools for specifying the level of generality of the events in a counterfactual. We show why and how the goals of formulating empirically important versus empirically plausible counterfactuals stand in tension with one another. Finally, we use our framework to link counterfactual analysis to causal sequences, which in turn provides advantages for conducting counterfactual projections. © London School of Economics and Political Science 2017.
The role of U.S. states in facilitating effective water governance under stress and change
NASA Astrophysics Data System (ADS)
Kirchhoff, Christine J.; Dilling, Lisa
2016-04-01
Worldwide water governance failures undermine effective water management under uncertainty and change. Overcoming these failures requires employing more adaptive, resilient water management approaches; yet, while scholars have advanced theories of what adaptive, resilient approaches should be, there is little empirical evidence to support those normative propositions. To fill this gap, we reviewed the literature to derive theorized characteristics of adaptive, resilient water governance, including knowledge generation and use, participation, clear rules for water use, and incorporation of nonstationarity. Then, using interviews and documentary analysis focused on five U.S. states' allocation and planning approaches, we examined empirically whether embodying these characteristics made states more (or less) adaptive and resilient in practice. We found that adaptive, resilient water governance requires not just possessing these characteristics but combining and building on them. That is, adaptive, resilient water governance requires well-funded, transparent knowledge systems combined with broad, multilevel participatory processes that support learning, strong institutional arrangements that establish authorities and rules while allowing flexibility as conditions change, and resources for integrated planning and allocation. We also found that difficulty incorporating climate change or altering existing water governance paradigms, together with inadequate funding of water programs, undermines adaptive, resilient governance.
Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong
2014-07-01
Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic-peptide-based quantitative proteomics strategy because of differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" for quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions still needed to be conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS is highly reproducible and transferable within appropriate loading amounts, and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and to quantitative methods from label-free to 18O/16O-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic-peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
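The core idea, selecting peptides whose responses are linear in the loaded amount, might be sketched as follows; the correlation threshold, the selection statistic, and the peptide names are illustrative assumptions, not the published procedure:

```python
def select_linear_peptides(intensities, amounts, r_min=0.99):
    """Sketch of the idea behind ERLPS: keep only peptides whose
    measured intensities respond linearly to the loaded sample amount
    (high Pearson correlation) and quantify from those, discarding
    peptides with saturating or otherwise nonlinear responses."""
    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)
    return [pep for pep, vals in intensities.items()
            if pearson(amounts, vals) >= r_min]

# Hypothetical peptides: one responds linearly, one saturates at high load.
observed = {
    "LINEARPEP": [10.0, 20.0, 30.0, 40.0],
    "SATURPEP": [10.0, 18.0, 22.0, 23.0],
}
selected = select_linear_peptides(observed, amounts=[1.0, 2.0, 3.0, 4.0])
```

Only the linearly responding peptide survives the filter, which is what keeps the downstream ratio estimates within the linear dynamic range.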
NASA Technical Reports Server (NTRS)
Ellis, Stephen R.; Adelstein, Bernard D.; Yeom, Kiwon
2013-01-01
The Misalignment Effect Function (MEF) describes the decrement in manual performance associated with a rotation between operators' visual display frame of reference and that of their manual control. It has now been empirically determined for rotation axes oblique to canonical body axes and is compared with the MEF previously measured for rotations about canonical axes. A targeting rule, called the Secant Rule, based on these earlier measurements is derived from a hypothetical process and shown to describe some of the data from three previous experiments. It explains the motion trajectories determined for rotations less than 65 deg in purely kinematic terms, without the need to appeal to a mental rotation process. Further analysis of this rule in three dimensions applied to oblique rotation axes leads to the somewhat surprising expectation that the difficulty posed by rotational misalignment should increase as the required movement gets shorter. This prediction is confirmed. The geometry underlying this rule also suggests analytic extensions for predicting more generally the difficulty of making movements in arbitrary directions subject to arbitrary misalignments.
Controlling sludge settleability in the oxidation ditch process.
Hartley, K J
2008-03-01
This paper describes an investigation aimed at developing an operating technique for controlling sludge settleability in the oxidation ditch form of the nitrification-denitrification activated sludge process. It was hypothesized that the specific sludge volume index (SSVI) is lowest at an optimum process anoxic fraction and increases at higher and lower fractions. Using the effluent ammonia:nitrate ratio as a surrogate for anoxic fraction, it was found that a simple empirical model, based on the nitrogen ratio averaged over a moving window of three solids retention times, was able to replicate the long-term SSVI variations in two independent oxidation ditches at a full-scale plant. Operating data from a second oxidation ditch plant during periods when a prefermenter was on- or off-line showed that SSVI also varies with RBCOD, greater RBCOD giving lower SSVI. It was concluded that best settleability occurs at about the same anoxic fraction as the lowest effluent total nitrogen concentration, with an ammonia:nitrate ratio of about 1. An operating rule of thumb is to use dissolved oxygen control to maintain effluent ammonia and nitrate nitrogen concentrations about equal. A third oxidation ditch plant deliberately operated in this manner achieved 15-month median operating values of 60 mL/g for SSVI and of 0.2, 0.3 and 2.0 mg N/L for effluent ammonia, nitrate and total N, respectively.
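The closing rule of thumb could be sketched as a simple setpoint adjustment; the step size and dissolved-oxygen bounds below are illustrative assumptions, not values from the paper:

```python
def do_setpoint_adjust(nh4, no3, setpoint, step=0.1, lo=0.5, hi=3.0):
    """Rule-of-thumb controller sketched from the paper's conclusion:
    use dissolved oxygen (DO) control to keep effluent ammonia and
    nitrate roughly equal (ratio ~ 1).  More aeration nitrifies excess
    ammonia; less aeration leaves more anoxic time for denitrification.
    Step size and DO bounds (mg/L) are illustrative only."""
    if nh4 > no3:          # too little nitrification -> raise DO
        setpoint += step
    elif no3 > nh4:        # too little denitrification -> lower DO
        setpoint -= step
    return max(lo, min(hi, setpoint))

# Ammonia above nitrate nudges the DO setpoint upward, and vice versa.
raised = do_setpoint_adjust(nh4=0.5, no3=0.2, setpoint=1.5)
lowered = do_setpoint_adjust(nh4=0.2, no3=0.5, setpoint=1.5)
```

Iterating such an adjustment each control interval drives the effluent ammonia:nitrate ratio toward the target of about 1 identified in the study.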
Measures and Interpretations of Vigilance Performance: Evidence Against the Detection Criterion
NASA Technical Reports Server (NTRS)
Balakrishnan, J. D.
1998-01-01
Operators' performance in a vigilance task is often assumed to depend on their choice of a detection criterion. When the signal rate is low this criterion is set high, causing the hit and false alarm rates to be low. With increasing time on task the criterion presumably tends to increase even further, thereby further decreasing the hit and false alarm rates. Virtually all of the empirical evidence for this simple interpretation is based on estimates of the bias measure Beta from signal detection theory. In this article, I describe a new approach to studying decision making that does not require the technical assumptions of signal detection theory. The results of this new analysis suggest that the detection criterion is never biased toward either response, even when the signal rate is low and the time on task is long. Two modifications of the signal detection theory framework are considered to account for this seemingly paradoxical result. The first assumes that the signal rate affects the relative sizes of the variances of the information distributions; the second assumes that the signal rate affects the logic of the operator's stopping rule. Actual or potential applications of this research include the improved training and performance assessment of operators in areas such as product quality control, air traffic control, and medical and clinical diagnosis.
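The bias measure Beta that the article scrutinizes has a standard closed form under the equal-variance signal detection model; a reference sketch:

```python
import math
from statistics import NormalDist

def dprime_and_beta(hit_rate, fa_rate):
    """Standard equal-variance signal detection estimators: sensitivity
    d' = z(H) - z(F) and the likelihood-ratio criterion
    Beta = exp((z(F)**2 - z(H)**2) / 2), where z is the inverse normal
    CDF.  Shown for reference; the article argues that Beta-based
    inferences about criterion shifts can be misleading."""
    z = NormalDist().inv_cdf
    zh, zf = z(hit_rate), z(fa_rate)
    return zh - zf, math.exp((zf ** 2 - zh ** 2) / 2.0)

# Symmetric hit and false alarm rates imply an unbiased criterion (Beta = 1).
d_prime, beta = dprime_and_beta(0.8, 0.2)
```

Because Beta is derived under the model's distributional assumptions, any violation of those assumptions (e.g., unequal variances driven by signal rate, as the article proposes) contaminates the estimate, which is exactly the motivation for the distribution-free analysis described above.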
Simple rules describe bottom-up and top-down control in food webs with alternative energy pathways.
Wollrab, Sabine; Diehl, Sebastian; De Roos, André M
2012-09-01
Many human influences on the world's ecosystems have their largest direct impacts at either the top or the bottom of the food web. To predict their ecosystem-wide consequences we must understand how these impacts propagate. A long-standing, but so far elusive, problem in this endeavour is how to reduce food web complexity to a mathematically tractable, but empirically relevant system. Simplification to main energy channels linking primary producers to top consumers has been recently advocated. Following this approach, we propose a general framework for the analysis of bottom-up and top-down forcing of ecosystems by reducing food webs to two energy pathways originating from a limiting resource shared by competing guilds of primary producers (e.g. edible vs. defended plants). Exploring dynamical models of such webs we find that their equilibrium responses to nutrient enrichment and top consumer harvesting are determined by only two easily measurable topological properties: the lengths of the component food chains (odd-odd, odd-even, or even-even) and presence vs. absence of a generalist top consumer reconnecting the two pathways (yielding looped vs. branched webs). Many results generalise to other looped or branched web structures and the model can be easily adapted to include a detrital pathway. © 2012 Blackwell Publishing Ltd/CNRS.
The mathematical relationship between Zipf’s law and the hierarchical scaling law
NASA Astrophysics Data System (ADS)
Chen, Yanguang
2012-06-01
The empirical studies of city-size distribution show that Zipf's law and the hierarchical scaling law are linked in many ways. The rank-size scaling and hierarchical scaling seem to be two different sides of the same coin, but their relationship has never been revealed by strict mathematical proof. In this paper, the Zipf's distribution of cities is abstracted as a q-sequence. Based on this sequence, a self-similar hierarchy consisting of many levels is defined and the numbers of cities in different levels form a geometric sequence. An exponential distribution of the average size of cities is derived from the hierarchy. Thus we have two exponential functions, from which follows a hierarchical scaling equation. The results can be statistically verified by simple mathematical experiments and observational data of cities. A theoretical foundation is then laid for the conversion from Zipf's law to the hierarchical scaling law, and the latter can show more information about city development than the former. Moreover, the self-similar hierarchy provides a new perspective for studying networks of cities as complex systems. A series of mathematical rules applied to cities such as the allometric growth law, the 2n principle and Pareto's law can be associated with one another by the hierarchical organization.
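The conversion from a rank-size sequence to a self-similar hierarchy can be checked numerically. A minimal sketch, assuming a geometric branching ratio of 2 and Zipf exponent q = 1 (both illustrative choices):

```python
def zipf_hierarchy(p1=10000.0, q=1.0, levels=12):
    """Group a rank-size (Zipf) sequence P_k = p1 * k**(-q) into a
    self-similar hierarchy: level m holds ranks 2**(m-1) .. 2**m - 1,
    so the city counts 1, 2, 4, ... form a geometric sequence, and the
    mean city size per level decays geometrically -- the hierarchical
    scaling law.  Returns the mean size at each level."""
    means = []
    for m in range(1, levels + 1):
        sizes = [p1 * k ** (-q) for k in range(2 ** (m - 1), 2 ** m)]
        means.append(sum(sizes) / len(sizes))
    return means

level_means = zipf_hierarchy()
# Deep in the hierarchy the ratio of consecutive level means approaches 2,
# matching the branching ratio: two exponential sequences, one scaling law.
ratios = [a / b for a, b in zip(level_means, level_means[1:])]
```

The city numbers per level double while the mean sizes halve, so both follow exponential laws whose combination recovers the rank-size scaling, which is the mathematical equivalence the paper proves.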
Tarozzi, Alessandro; Pfaff, Alexander; Balasubramanya, Soumya; Ahmed, Kazi Matin; van Geen, Alexander
2013-01-01
We conducted a randomized controlled trial in rural Bangladesh to examine how household drinking-water choices were affected by two different messages about risk from naturally occurring groundwater arsenic. Households in both randomized treatment arms were informed about the arsenic level in their well and whether that level was above or below the Bangladesh standard for arsenic. Households in one group of villages were encouraged to seek water from wells below the national standard. Households in the second group of villages received additional information explaining that lower-arsenic well water is always safer and these households were encouraged to seek water from wells with lower levels of arsenic, irrespective of the national standard. A simple model of household drinking-water choice indicates that the effect of the emphasis message is theoretically ambiguous. Empirically, we find that the richer message had a negative, but insignificant, effect on well-switching rates, but the estimates are sufficiently precise that we can rule out large positive effects. The main policy implication of this finding is that a one-time oral message conveying richer information on arsenic risks, while inexpensive and easily scalable, is unlikely to be successful in reducing exposure relative to the status-quo policy. PMID:23997355
NASA Technical Reports Server (NTRS)
Conel, James E.; Vandenbosch, Jeannette; Grove, Cindy I.
1993-01-01
We used the Kubelka-Munk theory of diffuse spectral reflectance in layers to analyze influences of multiple chemical components in leaves. As opposed to empirical approaches to estimation of plant chemistry, the full spectral resolution of laboratory reflectance data was retained in an attempt to estimate lignin or other constituent concentrations from spectral band positions. A leaf water reflectance spectrum was derived from theoretical mixing rules, reflectance observations, and calculations from theory of intrinsic k- and s-functions. Residual reflectance bands were then isolated from spectra of fresh green leaves. These proved hard to interpret for composition in terms of simple two component mixtures such as lignin and cellulose. We next investigated spectral and dilution influences of other possible components (starch, protein). These components, among others, added to cellulose in hypothetical mixtures, produce band displacements similar to lignin, but will disguise by dilution the actual abundance of lignin present in a multicomponent system. This renders interpretation of band positions problematical. Knowledge of end-members and their spectra, and a more elaborate mixture analysis procedure may be called for. Good observational atmospheric and instrumental conditions and knowledge thereof are required for retrieval of expected subtle reflectance variations present in spectra of green vegetation.
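The quantity the mixture analysis operates on is the Kubelka-Munk remission function, which links measured diffuse reflectance to the k/s ratio. A sketch of the standard optically-thick-layer formula, not the paper's full layered treatment:

```python
def km_remission(r_inf):
    """Kubelka-Munk remission function for an optically thick layer:
    F(R) = (1 - R)**2 / (2 * R) = k / s, relating diffuse reflectance
    R (0 < R <= 1) to the ratio of the intrinsic absorption (k) and
    scattering (s) functions.  In a mixture analysis, k and s are
    modeled as concentration-weighted sums over the constituents,
    which is what lets one component dilute another's spectral bands."""
    return (1.0 - r_inf) ** 2 / (2.0 * r_inf)

# A perfectly reflecting layer absorbs nothing: F(1) = 0.
f_half = km_remission(0.5)   # 0.25
```

Because F(R) is additive in k/s, adding a spectrally bland but scattering component (e.g., starch) lowers the apparent band depth of lignin without shifting its position much, which is the dilution effect the abstract warns about.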
A neuronal model of predictive coding accounting for the mismatch negativity.
Wacongne, Catherine; Changeux, Jean-Pierre; Dehaene, Stanislas
2012-03-14
The mismatch negativity (MMN) is thought to index the activation of specialized neural networks for active prediction and deviance detection. However, a detailed neuronal model of the neurobiological mechanisms underlying the MMN is still lacking, and its computational foundations remain debated. We propose here a detailed neuronal model of auditory cortex, based on predictive coding, that accounts for the critical features of the MMN. The model is entirely composed of spiking excitatory and inhibitory neurons interconnected in a layered cortical architecture with distinct input, predictive, and prediction error units. A spike-timing dependent learning rule, relying upon NMDA receptor synaptic transmission, allows the network to adjust its internal predictions and use a memory of recent past inputs to anticipate future stimuli based on transition statistics. We demonstrate that this simple architecture can account for the major empirical properties of the MMN. These include a frequency-dependent response to rare deviants, a response to unexpected repeats in alternating sequences (ABABAA…), a lack of consideration of the global sequence context, a response to sound omission, and a sensitivity of the MMN to NMDA receptor antagonists. Novel predictions are derived, and a new magnetoencephalography experiment in healthy human subjects is presented that validates our key hypothesis: the MMN results from active cortical prediction rather than passive synaptic habituation.
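The core computational idea — predicting the next stimulus from learned transition statistics and signaling the error when the prediction fails — can be illustrated with a drastically simplified, non-spiking sketch (all names hypothetical; the paper's model uses spiking neurons with NMDA-dependent plasticity):

```python
def update_transitions(counts, prev, cur):
    """Accumulate first-order transition statistics of a tone sequence
    (a non-spiking stand-in for the paper's plastic synapses)."""
    counts.setdefault(prev, {}).setdefault(cur, 0)
    counts[prev][cur] += 1

def surprise(counts, prev, cur):
    """Prediction error ~ 1 - P(cur | prev): large for rare deviants."""
    row = counts.get(prev, {})
    total = sum(row.values())
    return 1.0 - (row.get(cur, 0) / total if total else 0.0)

counts = {}
seq = "ABABABABAB"
for a, b in zip(seq, seq[1:]):
    update_transitions(counts, a, b)

# An unexpected repeat ("A" after "A") is maximally surprising; the
# learned alternation ("B" after "A") is fully predicted.
assert surprise(counts, "A", "A") == 1.0
assert surprise(counts, "A", "B") == 0.0
```

This captures why an unexpected repeat in an alternating sequence evokes a response even though the repeated tone itself is common.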
NASA Astrophysics Data System (ADS)
Zhang, Rui; Newhauser, Wayne D.
2009-03-01
In proton therapy, the radiological thickness of a material is commonly expressed in terms of water equivalent thickness (WET) or water equivalent ratio (WER). However, WET calculations have previously required either iterative numerical methods or approximate methods of unknown accuracy. The objective of this study was to develop a simple deterministic formula to calculate WET values with an accuracy of 1 mm for materials commonly used in proton radiation therapy. Several alternative formulas were derived in which the energy loss was calculated based on the Bragg-Kleeman rule (BK), the Bethe-Bloch equation (BB) or an empirical version of the Bethe-Bloch equation (EBB). Alternative approaches were developed for targets that were 'radiologically thin' or 'thick'. The accuracy of these methods was assessed by comparison to values from an iterative numerical method that utilized evaluated stopping power tables. In addition, we tested the approximate formula given in the International Atomic Energy Agency's dosimetry code of practice (Technical Report Series No 398, 2000, IAEA, Vienna) and the stopping-power-ratio approximation. The results of these comparisons revealed that most methods were accurate for cases involving thin or low-Z targets. However, only the thick-target formulas provided accurate WET values for targets that were radiologically thick and contained high-Z material.
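For the thin-target case, the water equivalent thickness reduces to a product of the physical thickness, the density ratio, and the mass-stopping-power ratio. The snippet below sketches that approximation with illustrative, hypothetical numbers; it is not the paper's calibrated formula:

```python
def wet_thin_target(thickness_cm, density_material, density_water, s_ratio):
    """Thin-target approximation: WET ~ t * (rho_m / rho_w) * (S_m / S_w),
    where S_m/S_w is the mass-stopping-power ratio of the material to
    water at the beam energy (taken as a known input here)."""
    return thickness_cm * (density_material / density_water) * s_ratio

# Hypothetical slab: 1 cm of a plastic with density 1.19 g/cm^3 and a
# mass-stopping-power ratio of 0.97 relative to water.
assert abs(wet_thin_target(1.0, 1.19, 1.0, 0.97) - 1.1543) < 1e-4
```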
Highly scalable and robust rule learner: performance evaluation and comparison.
Kurgan, Lukasz A; Cios, Krzysztof J; Dick, Scott
2006-02-01
Business intelligence and bioinformatics applications increasingly require mining datasets consisting of millions of data points, or building real-time enterprise-level decision support systems for large corporations and drug companies. In all such cases the underlying data mining system must be highly scalable. To this end, we describe a new rule learner called DataSqueezer. The learner belongs to the family of inductive supervised rule extraction algorithms. DataSqueezer is a simple, greedy rule builder that generates a set of production rules from labeled input data. In spite of its relative simplicity, DataSqueezer is a very effective learner. The rules generated by the algorithm are compact and comprehensible, and have accuracy comparable to rules generated by other state-of-the-art rule extraction algorithms. The main advantages of DataSqueezer are its very high efficiency and its resistance to missing data. DataSqueezer exhibits log-linear asymptotic complexity in the number of training examples, and it is faster than other state-of-the-art rule learners. The learner is also robust to large quantities of missing data, as verified by extensive experimental comparison with the other learners. DataSqueezer is thus well suited to modern data mining and business intelligence tasks, which commonly involve huge datasets with a large fraction of missing data.
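A minimal sequential-covering sketch conveys the flavor of a greedy production-rule builder of this kind. The code below is an illustrative toy, not DataSqueezer itself, and assumes classes are separable by attribute=value conjunctions:

```python
def matches(features, conds):
    """True if an example's features satisfy every attribute=value test."""
    return all(features.get(a) == v for a, v in conds.items())

def learn_rules(data, target):
    """Greedy sequential covering: repeatedly grow one conjunctive rule
    until it excludes all negatives, then remove the positives it covers.
    `data` is a list of (feature_dict, label) pairs."""
    positives = [f for f, y in data if y == target]
    negatives = [f for f, y in data if y != target]
    rules = []
    while positives:
        conds, covered = {}, positives
        # Grow the rule until no negative example satisfies it.
        while any(matches(f, conds) for f in negatives):
            best = None
            for f in covered:
                for a, v in f.items():
                    if a in conds:
                        continue
                    trial = dict(conds, **{a: v})
                    pos = sum(matches(g, trial) for g in covered)
                    neg = sum(matches(g, trial) for g in negatives)
                    if best is None or pos - neg > best[0]:
                        best = (pos - neg, trial)
            conds = best[1]
            covered = [g for g in covered if matches(g, conds)]
        rules.append(conds)
        positives = [f for f in positives if not matches(f, conds)]
    return rules

data = [({"color": "red", "size": "big"}, "yes"),
        ({"color": "red", "size": "small"}, "yes"),
        ({"color": "blue", "size": "big"}, "no")]
assert learn_rules(data, "yes") == [{"color": "red"}]
```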
Clinical decision rules for termination of resuscitation in out-of-hospital cardiac arrest.
Sherbino, Jonathan; Keim, Samuel M; Davis, Daniel P
2010-01-01
Out-of-hospital cardiac arrest (OHCA) has a low probability of survival to hospital discharge. Four clinical decision rules (CDRs) have been validated to identify patients with no probability of survival. Two of these rules address exclusively prehospital basic life support care for OHCA, and two address prehospital advanced life support care. Can a CDR for the termination of resuscitation identify a patient with no probability of survival in the setting of OHCA? Six validation studies were selected from a PubMed search. A structured review of each of the studies is presented. In OHCA receiving basic life support care, the BLS-TOR (basic life support termination of resuscitation) rule has a positive predictive value for death of 99.5% (95% confidence interval 98.9-99.8%) and decreases the rate of patient transport by 62.6%. This rule has been appropriately validated for widespread use. In OHCA receiving advanced life support care, no current rule has been appropriately validated for widespread use. The BLS-TOR rule is a simple rule that identifies patients who will not survive OHCA. Further research is required to identify similarly robust CDRs for patients receiving advanced life support care in the setting of OHCA. Copyright 2010 Elsevier Inc. All rights reserved.
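The BLS-TOR rule is commonly summarized as three criteria; the function below encodes that common summary as an illustration only, not a clinical tool:

```python
def bls_tor_recommends_termination(ems_witnessed, shock_delivered, rosc):
    """BLS-TOR rule as commonly summarized in the literature: continue
    resuscitation and transport if the arrest was witnessed by EMS
    personnel, a defibrillator shock was delivered, or there was return
    of spontaneous circulation (ROSC); otherwise the rule supports
    termination in the field. Illustrative sketch, not clinical advice."""
    return not (ems_witnessed or shock_delivered or rosc)

assert bls_tor_recommends_termination(False, False, False) is True
assert bls_tor_recommends_termination(False, True, False) is False
```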
Pesesky, Mitchell W; Hussain, Tahir; Wallace, Meghan; Patel, Sanket; Andleeb, Saadia; Burnham, Carey-Ann D; Dantas, Gautam
2016-01-01
The time-to-result for culture-based microorganism recovery and phenotypic antimicrobial susceptibility testing necessitates initial use of empiric (frequently broad-spectrum) antimicrobial therapy. If the empiric therapy is not optimal, this can lead to adverse patient outcomes and contribute to increasing antibiotic resistance in pathogens. New, more rapid technologies are emerging to meet this need. Many of these are based on identifying resistance genes, rather than directly assaying resistance phenotypes, and thus require interpretation to translate the genotype into treatment recommendations. These interpretations, like other parts of clinical diagnostic workflows, are likely to be increasingly automated in the future. We set out to evaluate the two major approaches that could be amenable to automation pipelines: rules-based methods and machine learning methods. The rules-based algorithm makes predictions based upon current, curated knowledge of Enterobacteriaceae resistance genes. The machine-learning algorithm predicts resistance and susceptibility based on a model built from a training set of variably resistant isolates. As our test set, we used whole genome sequence data from 78 clinical Enterobacteriaceae isolates, previously identified to represent a variety of phenotypes, from fully-susceptible to pan-resistant strains for the antibiotics tested. We tested three antibiotic resistance determinant databases for their utility in identifying the complete resistome for each isolate. The predictions of the rules-based and machine learning algorithms for these isolates were compared to results of phenotype-based diagnostics. The rules-based and machine-learning predictions achieved agreement with standard-of-care phenotypic diagnostics of 89.0% and 90.3%, respectively, across twelve antibiotic agents from six major antibiotic classes. Several sources of disagreement between the algorithms were identified.
Novel variants of known resistance factors and incomplete genome assembly confounded the rules-based algorithm, resulting in predictions based on gene family rather than on knowledge of the specific variant found. Low-frequency resistance caused errors in the machine-learning algorithm because the responsible genes were seen rarely, or not at all, during training. We also identified an example of variability in the phenotype-based results that led to disagreement with both genotype-based methods. Genotype-based antimicrobial susceptibility testing shows great promise as a diagnostic tool, and we outline specific research goals to further refine this methodology.
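A rules-based genotype-to-phenotype call can be sketched as a lookup from detected genes to drug resistance. The toy rule base below is illustrative only; real curated databases also encode gene variants, thresholds, and exceptions:

```python
def predict_resistance(genes_found, rule_base):
    """Rules-based genotype call: predict resistance to a drug if any
    known resistance gene for that drug is present in the genome.
    Toy rule base for illustration, not a curated clinical database."""
    return {drug: any(g in genes_found for g in genes)
            for drug, genes in rule_base.items()}

rule_base = {"ampicillin": {"blaTEM-1", "blaCTX-M-15"},
             "ciprofloxacin": {"qnrS1"}}
calls = predict_resistance({"blaTEM-1"}, rule_base)
assert calls == {"ampicillin": True, "ciprofloxacin": False}
```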
78 FR 16268 - Submission for OMB Review; Service Contracts Reporting Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-14
... the final rule. DATES: Interested parties should submit written comments to the Regulatory Secretariat... between the hours that a simple disclosure by a very small business might require and the much higher...
Understanding Singular Vectors
ERIC Educational Resources Information Center
James, David; Botteron, Cynthia
2013-01-01
matrix yields a surprisingly simple, heuristical approximation to its singular vectors. There are correspondingly good approximations to the singular values. Such rules of thumb provide an intuitive interpretation of the singular vectors that helps explain why the SVD is so…
A simple model for calculating air pollution within street canyons
NASA Astrophysics Data System (ADS)
Venegas, Laura E.; Mazzeo, Nicolás A.; Dezzutti, Mariana C.
2014-04-01
This paper introduces the Semi-Empirical Urban Street (SEUS) model. SEUS is a simple mathematical model based on the scaling of air pollution concentration inside street canyons employing the emission rate, the width of the canyon, the dispersive velocity scale and the background concentration. The dispersive velocity scale depends on turbulent motions related to wind and traffic. The parameterisations of these turbulent motions include two dimensionless empirical parameters. Functional forms of these parameters have been obtained from full scale data measured in street canyons in four European cities. The sensitivity of the SEUS model is studied analytically. Results show that relative errors in the evaluation of the two dimensionless empirical parameters have less influence on model uncertainties than uncertainties in other input variables. The model estimates NO2 concentrations using a simple photochemistry scheme. SEUS is applied to estimate NOx and NO2 hourly concentrations in an irregular and busy street canyon in the city of Buenos Aires. The statistical evaluation of results shows that there is a good agreement between estimated and observed hourly concentrations (e.g. fractional biases are -10.3% for NOx and +7.8% for NO2). The agreement between the estimated and observed values has also been analysed in terms of its dependence on wind speed and direction. The model performs better for wind speeds >2 m s-1 than for lower wind speeds, and better in leeward situations than in others. No significant discrepancies have been found between the results of the proposed model and those of a widely used operational dispersion model (OSPM), both using the same input information.
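The scaling behind semi-empirical street-canyon models of this kind, and the fractional-bias statistic quoted above, can be sketched as follows. The forms and numbers are illustrative; the published SEUS parameterisation of the dispersive velocity scale from wind- and traffic-induced turbulence is more detailed:

```python
def street_canyon_concentration(emission_rate, canyon_width,
                                dispersive_velocity, background):
    """Scaling of the form C = C_b + Q / (W * u_d), the shape used by
    semi-empirical street-canyon models (illustrative form and units)."""
    return background + emission_rate / (canyon_width * dispersive_velocity)

def fractional_bias(observed_mean, predicted_mean):
    """Fractional bias, FB = 2*(P - O) / (P + O); note that sign
    conventions vary between studies."""
    return 2.0 * (predicted_mean - observed_mean) / (predicted_mean + observed_mean)

c = street_canyon_concentration(10.0, 20.0, 0.5, 30.0)
assert abs(c - 31.0) < 1e-12          # 30 background + 10/(20*0.5)
assert abs(fractional_bias(100.0, 100.0)) < 1e-12
```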
A Mathematical Model of a Simple Amplifier Using a Ferroelectric Transistor
NASA Technical Reports Server (NTRS)
Sayyah, Rana; Hunt, Mitchell; MacLeod, Todd C.; Ho, Fat D.
2009-01-01
This paper presents a mathematical model characterizing the behavior of a simple amplifier using a FeFET. The model is based on empirical data and incorporates several variables that affect the output, including frequency, load resistance, and gate-to-source voltage. Since the amplifier is the basis of many circuit configurations, a mathematical model that describes the behavior of a FeFET-based amplifier will help in the integration of FeFETs into many other circuits.
Development of apparent viscosity test for hot-poured crack sealants.
DOT National Transportation Integrated Search
2008-11-01
Current crack sealant specifications focus on simple empirical tests such as penetration, resilience, flow, and bonding to cement concrete briquettes (ASTM D3405) to measure the ability of the material to resist cohesive and adhesion ...
A hybrid learning method for constructing compact rule-based fuzzy models.
Zhao, Wanqing; Niu, Qun; Li, Kang; Irwin, George W
2013-12-01
The Takagi–Sugeno–Kang-type rule-based fuzzy model has found many applications in different fields; a major challenge, however, is to build a compact model with optimized parameters that delivers satisfactory performance. To produce a compact model, most existing approaches mainly focus on selecting an appropriate number of fuzzy rules. In contrast, this paper considers not only the selection of fuzzy rules but also the structure of each rule premise and consequent, leading to the development of a novel compact rule-based fuzzy model. Here, each fuzzy rule is associated with two sets of input attributes, in which the first is used for constructing the rule premise and the other is employed in the rule consequent. A new hybrid learning method combining the modified harmony search method with a fast recursive algorithm is hereby proposed to determine the structure and the parameters for the rule premises and consequents. This is a hard mixed-integer nonlinear optimization problem, and the proposed hybrid method solves the problem by employing an embedded framework, leading to a significantly reduced number of model parameters and a small number of fuzzy rules with each being as simple as possible. Results from three examples are presented to demonstrate the compactness (in terms of the number of model parameters and the number of rules) and the performance of the fuzzy models obtained by the proposed hybrid learning method, in comparison with other techniques from the literature.
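First-order TSK inference itself — Gaussian rule premises with linear consequents, combined by a firing-strength-weighted average — can be sketched as below. This is a generic illustration of the model class, not the paper's learned model:

```python
import math

def tsk_predict(x, rules):
    """First-order Takagi-Sugeno-Kang inference. Each rule is a tuple
    (centers, widths, coeffs, bias): Gaussian memberships over the
    premise inputs, and a linear consequent y_i = coeffs . x + bias.
    The output is the firing-strength-weighted mean of the consequents."""
    num = den = 0.0
    for centers, widths, coeffs, bias in rules:
        w = 1.0
        for xj, c, s in zip(x, centers, widths):
            w *= math.exp(-((xj - c) ** 2) / (2.0 * s ** 2))
        y = sum(a * xj for a, xj in zip(coeffs, x)) + bias
        num += w * y
        den += w
    return num / den

# Two symmetric rules; at x = 0 both fire equally, so the output is the
# average of the two constant consequents, (1 + 3) / 2 = 2.
rules = [((-1.0,), (1.0,), (0.0,), 1.0),
         ((1.0,), (1.0,), (0.0,), 3.0)]
assert abs(tsk_predict((0.0,), rules) - 2.0) < 1e-12
```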
Selecting informative subsets of sparse supermatrices increases the chance to find correct trees.
Misof, Bernhard; Meyer, Benjamin; von Reumont, Björn Marcus; Kück, Patrick; Misof, Katharina; Meusemann, Karen
2013-12-03
Character matrices with extensive missing data are frequently used in phylogenomics, with potentially detrimental effects on the accuracy and robustness of tree inference. Therefore, many investigators select taxa and genes with high data coverage. The drawback of such selections is their exclusive reliance on data coverage, without consideration of actual signal in the data; they may therefore not deliver optimal data matrices in terms of potential phylogenetic signal. To circumvent this problem, we have developed a heuristic, implemented in the software mare, which (1) assesses the information content of genes in supermatrices using a measure of potential signal combined with data coverage and (2) reduces supermatrices, with a simple hill-climbing procedure, to submatrices with high total information content. We conducted simulation studies using matrices of 50 taxa × 50 genes with heterogeneous phylogenetic signal among genes and data coverage between 10-30%; with these matrices, Maximum Likelihood (ML) tree reconstructions failed to recover correct trees. Selecting a data subset with the proposed approach increased the chance of recovering correct partial trees more than 10-fold. The selection of data subsets with the proposed simple hill-climbing procedure performed well whether it considered information content or just simple presence/absence information for genes. We also applied our approach to an empirical data set addressing questions of vertebrate systematics. With this empirical dataset, selecting a data subset of high information content that supported a tree with high average bootstrap support was most successful when the information content of genes was considered.
Our analyses of simulated and empirical data demonstrate that sparse supermatrices can be reduced on a formal basis, outperforming the commonly used simple selections of taxa and genes with high data coverage.
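The hill-climbing idea can be sketched generically: toggle one gene in or out of the submatrix and keep the change whenever the score improves. The score function here is a hypothetical stand-in for the paper's combination of information content and data coverage:

```python
import random

def hill_climb_subset(genes, score, iters=1000, seed=0):
    """Generic hill climbing over gene subsets (a sketch of the idea
    behind matrix-reduction heuristics such as the one in mare; the
    published heuristic's scoring is more elaborate). Flips one gene
    in or out per step, keeping any strict improvement."""
    rng = random.Random(seed)
    current = set(genes)
    best = score(current)
    for _ in range(iters):
        g = rng.choice(genes)
        trial = current ^ {g}          # toggle one gene in/out
        s = score(trial)
        if trial and s > best:
            current, best = trial, s
    return current, best

# Toy score: total information of the chosen genes minus a per-gene cost.
info = {"g1": 5.0, "g2": 0.1, "g3": 4.0}
subset, value = hill_climb_subset(list(info),
                                  lambda s: sum(info[g] for g in s) - len(s))
assert subset == {"g1", "g3"}
```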
NASA Astrophysics Data System (ADS)
Knipp, D.; Kilcommons, L. M.; Damas, M. C.
2015-12-01
We have created a simple and user-friendly web application to visualize output from empirical atmospheric models that describe the lower atmosphere and the Space-Atmosphere Interface Region (SAIR). The Atmospheric Model Web Explorer (AtModWeb) is a lightweight, multi-user, Python-driven application which uses standard web technology (jQuery, HTML5, CSS3) to give an in-browser interface that can produce plots of modeled quantities such as temperature and individual species and total densities of neutral and ionized upper-atmosphere. Output may be displayed as: 1) a contour plot over a map projection, 2) a pseudo-color plot (heatmap) which allows visualization of a variable as a function of two spatial coordinates, or 3) a simple line plot of one spatial coordinate versus any number of desired model output variables. The application is designed around an abstraction of an empirical atmospheric model, essentially treating the model code as a black box, which makes it simple to add additional models without modifying the main body of the application. Currently implemented are the Naval Research Laboratory NRLMSISE00 model for neutral atmosphere and the International Reference Ionosphere (IRI). These models are relevant to the Low Earth Orbit environment and the SAIR. The interface is simple and usable, allowing users (students and experts) to specify time and location, and choose between historical (i.e. the values for the given date) or manual specification of whichever solar or geomagnetic activity drivers are required by the model. We present a number of use-case examples from research and education: 1) How does atmospheric density between the surface and 1000 km vary with time of day, season and solar cycle?; 2) How do ionospheric layers change with the solar cycle?; 3) How does the composition of the SAIR vary between day and night at a fixed altitude?
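The black-box model abstraction the application is designed around can be sketched as a minimal interface; the class and method names below are hypothetical, not AtModWeb's actual API:

```python
from abc import ABC, abstractmethod

class AtmosphericModel(ABC):
    """Black-box abstraction of an empirical atmospheric model: the
    application only depends on `run`, so additional models plug in
    without modifying the main body. (Hypothetical names.)"""
    @abstractmethod
    def run(self, lat, lon, alt_km, when, drivers):
        """Return a dict of output variables at one location and time."""

class ToyModel(AtmosphericModel):
    """Trivial stand-in model used to exercise the interface."""
    def run(self, lat, lon, alt_km, when, drivers):
        return {"temperature_K": 180.0 + 0.5 * alt_km}

out = ToyModel().run(40.0, -105.0, 400.0, "2015-03-17T12:00", {"f10.7": 120})
assert out["temperature_K"] == 380.0
```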
Self-assembly of Archimedean tilings with enthalpically and entropically patchy polygons.
Millan, Jaime A; Ortiz, Daniel; van Anders, Greg; Glotzer, Sharon C
2014-03-25
Considerable progress in the synthesis of anisotropic patchy nanoplates (nanoplatelets) promises a rich variety of highly ordered two-dimensional superlattices. Recent experiments on superlattices assembled from nanoplates confirm the accessibility of exotic phases and motivate the need for a better understanding of the underlying self-assembly mechanisms. Here, we present experimentally accessible, rational design rules for the self-assembly of the Archimedean tilings from polygonal nanoplates. The Archimedean tilings represent a model set of target patterns that (i) contain both simple and complex patterns, (ii) are comprised of simple regular shapes, and (iii) contain patterns with potentially interesting materials properties. Via Monte Carlo simulations, we propose a set of design rules with general applicability to one- and two-component systems of polygons. These design rules, specified by increasing levels of patchiness, correspond to a reduced set of anisotropy dimensions for robust self-assembly of the Archimedean tilings. We show for which tilings entropic patches alone are sufficient for assembly and when short-range enthalpic interactions are required. For the latter, we show how patchy these interactions should be for optimal yield. This study provides a minimal set of guidelines for the design of anisotropic patchy particles that can self-assemble all 11 Archimedean tilings.
Phase transitions in the q-voter model with noise on a duplex clique
NASA Astrophysics Data System (ADS)
Chmiel, Anna; Sznajd-Weron, Katarzyna
2015-11-01
We study a nonlinear q-voter model with stochastic noise, interpreted in the social context as independence, on a duplex network. To study the role of the multilevelness in this model we propose three methods of transferring the model from a mono- to a multiplex network. They take into account two criteria: one related to the status of independence (LOCAL vs GLOBAL) and one related to peer pressure (AND vs OR). In order to examine the influence of the presence of more than one level in the social network, we perform simulations on a particularly simple multiplex: a duplex clique, which consists of two fully overlapped complete graphs (cliques). Solving numerically the rate equation and simultaneously conducting Monte Carlo simulations, we provide evidence that even a simple rearrangement into a duplex topology may lead to significant changes in the observed behavior. However, qualitative changes in the phase transitions can be observed for only one of the considered rules: LOCAL&AND. For this rule the phase transition becomes discontinuous for q=5, whereas for a monoplex such behavior is observed for q=6. Interestingly, only this rule admits construction of realistic variants of the model, in line with recent social experiments.
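A single-level (monoplex) update step of the nonlinear q-voter model with independence can be sketched as below; the duplex LOCAL/GLOBAL and AND/OR variants studied in the paper layer this rule across two cliques (illustrative sketch only):

```python
import random

def qvoter_step(state, q, p, rng):
    """One asynchronous update of the nonlinear q-voter model with
    independence (noise) on a complete graph. `state` holds +/-1
    opinions; p is the probability of acting independently."""
    i = rng.randrange(len(state))
    if rng.random() < p:
        state[i] = rng.choice((-1, 1))    # independence (noise)
    else:
        # Conformity: adopt the panel's opinion only if q randomly
        # chosen agents are unanimous.
        panel = [state[rng.randrange(len(state))] for _ in range(q)]
        if all(s == panel[0] for s in panel):
            state[i] = panel[0]

# Without noise (p = 0), full consensus is an absorbing state.
rng = random.Random(1)
state = [1] * 100
for _ in range(5000):
    qvoter_step(state, q=5, p=0.0, rng=rng)
assert all(s == 1 for s in state)
```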
Knowledge acquisition for case-based reasoning systems
NASA Technical Reports Server (NTRS)
Riesbeck, Christopher K.
1988-01-01
Case-based reasoning (CBR) is a simple idea: solve new problems by adapting old solutions to similar problems. The CBR approach offers several potential advantages over rule-based reasoning: rules are not combined blindly in a search for solutions, solutions can be explained in terms of concrete examples, and performance can improve automatically as new problems are solved and added to the case library. Moving CBR from the university research environment to the real world requires smooth interfaces for getting knowledge from experts. Described are the basic elements of an interface for acquiring three basic bodies of knowledge that any case-based reasoner requires: the case library of problems and their solutions, the analysis rules that flesh out input problem specifications so that relevant cases can be retrieved, and the adaptation rules that adjust old solutions to fit new problems.
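The retrieve-and-adapt cycle can be sketched in a few lines; the similarity measure, cases, and adaptation rule below are toy examples, not a real CBR system:

```python
def retrieve(case_library, problem):
    """Return the stored case most similar to the new problem, using a
    toy similarity: the count of matching feature values."""
    def sim(case):
        return sum(case["problem"].get(k) == v for k, v in problem.items())
    return max(case_library, key=sim)

def adapt(case, problem, adaptation_rules):
    """Adjust the retrieved solution to the new problem by applying
    simple adaptation rules in sequence."""
    solution = dict(case["solution"])
    for rule in adaptation_rules:
        solution = rule(solution, case["problem"], problem)
    return solution

library = [{"problem": {"cuisine": "italian", "guests": 2},
            "solution": {"dish": "pasta", "portions": 2}},
           {"problem": {"cuisine": "japanese", "guests": 4},
            "solution": {"dish": "sushi", "portions": 4}}]

def scale_portions(solution, old_problem, new_problem):
    # Toy adaptation rule: resize the old solution to the new party size.
    solution["portions"] = new_problem.get("guests", solution["portions"])
    return solution

case = retrieve(library, {"cuisine": "italian", "guests": 6})
assert adapt(case, {"cuisine": "italian", "guests": 6},
             [scale_portions]) == {"dish": "pasta", "portions": 6}
```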