Outlines of a multiple trace theory of temporal preparation.
Los, Sander A; Kruijne, Wouter; Meeter, Martijn
2014-01-01
We outline a new multiple trace theory of temporal preparation (MTP), which accounts for behavior in reaction time (RT) tasks in which the participant is presented with a warning stimulus (S1) followed by a target stimulus (S2) that requires a speeded response. The theory assumes that during the foreperiod (FP; the S1-S2 interval) inhibition is applied to prevent premature response, while a wave of activation occurs upon the presentation of S2. On each trial, these actions are stored in a separate memory trace, which, jointly with earlier formed memory traces, starts contributing to preparation on subsequent trials. We show that MTP accounts for classic effects in temporal preparation, including mean RT-FP functions observed under a variety of FP distributions and asymmetric sequential effects. We discuss the advantages of MTP over other accounts of these effects (trace-conditioning and hazard-based explanations) and suggest a critical experiment to empirically distinguish among them.
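The mechanism sketched above lends itself to a small simulation. The following is a minimal sketch under assumptions of my own (the function name, decay constant, and linear RT mapping are illustrative, not Los et al.'s implementation): each trial stores an inhibition-then-activation trace time-locked to S1, and preparation on the current trial is read out from the average of earlier traces at the moment S2 arrives, so FPs that were frequently followed by S2 on earlier trials yield shorter RTs.

```python
import numpy as np

def simulate_mtp(fp_sequence, timeline=np.arange(0, 3000, 10),
                 inhibition=-1.0, activation=2.0, decay=300.0):
    """Toy multiple-trace preparation model (illustrative, not Los et al.'s code).

    Each trial lays down a trace: inhibition during the foreperiod, then a
    decaying wave of activation after S2 onset. Preparation on trial n is the
    mean of all traces from trials 1..n-1, read out at the current FP; higher
    preparation maps onto a faster RT.
    """
    traces, rts = [], []
    for fp in fp_sequence:
        idx = np.searchsorted(timeline, fp)
        preparation = np.mean([tr[idx] for tr in traces]) if traces else 0.0
        rts.append(400.0 - 50.0 * preparation)            # arbitrary RT mapping
        trace = np.where(timeline < fp, inhibition, 0.0)  # inhibition up to S2
        trace += np.where(timeline >= fp,
                          activation * np.exp(-(timeline - fp) / decay), 0.0)
        traces.append(trace)
    return np.array(rts)

# Example: a uniform FP distribution yields the classic downward mean RT-FP function.
rng = np.random.default_rng(0)
fps = rng.choice([500, 1000, 1500, 2000], size=500)
rts = simulate_mtp(fps)
for fp in (500, 1000, 1500, 2000):
    print(fp, round(rts[fps == fp].mean(), 1))
```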
Facilitating Multiple Intelligences through Multimodal Learning Analytics
ERIC Educational Resources Information Center
Perveen, Ayesha
2018-01-01
This paper develops a theoretical framework for employing learning analytics in online education to trace multiple learning variations of online students, treating them as possessing multiple intelligences in the sense of Howard Gardner's (1983) theory. The study first emphasizes the need to facilitate students as…

ERIC Educational Resources Information Center
Roberts, Kim P.
2002-01-01
Outlines five perspectives addressing alternate aspects of the development of children's source monitoring: source-monitoring theory, fuzzy-trace theory, schema theory, person-based perspective, and mental-state reasoning model. Discusses research areas with relation to forensic developmental psychology: agent identity, prospective processing,…
Reading Educational Reform with Actor Network Theory: Fluid Spaces, Otherings, and Ambivalences
ERIC Educational Resources Information Center
Fenwick, Tara
2011-01-01
In considering two extended examples of educational reform efforts, this discussion traces relations that become visible through analytic approaches associated with actor-network theory (ANT). The strategy here is to present multiple readings of the two examples. The first reading adopts an ANT approach to follow ways that all actors--human and…
Greve, Andrea; Donaldson, David I; van Rossum, Mark C W
2010-02-01
Dual-process theories of episodic memory state that retrieval is contingent on two independent processes: familiarity (providing a sense of oldness) and recollection (recovering events and their context). A variety of studies have reported distinct neural signatures for familiarity and recollection, supporting dual-process theory. One outstanding question is whether these signatures reflect the activation of distinct memory traces or the operation of different retrieval mechanisms on a single memory trace. We present a computational model that uses a single neuronal network to store memory traces, but two distinct and independent retrieval processes access the memory. The model is capable of performing familiarity and recollection-based discrimination between old and new patterns, demonstrating that dual-process models need not rely on multiple independent memory traces, but can use a single trace. Importantly, our putative familiarity and recollection processes exhibit distinct characteristics analogous to those found in empirical data; they diverge in capacity and sensitivity to sparse and correlated patterns, exhibit distinct ROC curves, and account for performance on both item and associative recognition tests. The demonstration that a single-trace, dual-process model can account for a range of empirical findings highlights the importance of distinguishing between neuronal processes and the neuronal representations on which they operate.
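The single-store, dual-readout idea can be illustrated with a toy network (a Hopfield-style sketch under my own assumptions; this is not Greve et al.'s model, and the parameters are arbitrary): one Hebbian weight matrix holds every studied pattern, a fast energy-like global match score plays the role of familiarity, and iterative pattern completion on the same weights plays the role of recollection.

```python
import numpy as np

rng = np.random.default_rng(1)
n_units, n_patterns = 200, 20
patterns = rng.choice([-1, 1], size=(n_patterns, n_units))

# Single memory store: one Hebbian weight matrix holds every trace.
W = (patterns.T @ patterns) / n_units
np.fill_diagonal(W, 0)

def familiarity(probe):
    """Fast, scalar 'oldness' signal: energy-like global match to the store."""
    return float(probe @ W @ probe) / n_units

def recollect(probe, steps=10):
    """Slower readout: iterative pattern completion on the very same weights."""
    state = probe.astype(float).copy()
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

old = patterns[0] * np.where(rng.random(n_units) < 0.2, -1, 1)  # noisy studied probe
new = rng.choice([-1, 1], size=n_units)                          # unstudied probe

print("familiarity old vs new:", round(familiarity(old), 2), round(familiarity(new), 2))
print("recollection overlap with studied pattern:",
      round(float(recollect(old) @ patterns[0]) / n_units, 2))
```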
NASA Astrophysics Data System (ADS)
Yuan, Cadmus C. A.
2015-12-01
Optical ray-tracing models have applied the Beer-Lambert method to single-luminescence-material systems to model the white-light pattern produced by a blue LED source. This paper extends that algorithm to a mixed, multiple-luminescence-material system by introducing equivalent excitation and emission spectra for the individual luminescence materials. The quantum efficiencies of the individual materials and the self-absorption of the multiple-luminescence-material system are considered as well. With this combination, researchers can model the luminescence characteristics of LED chip-scale packaging (CSP), which offers simple process steps and freedom in the geometrical dimensions of the luminescence material. The method is first validated against experimental results, and a further parametric investigation is then conducted.
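The Beer-Lambert step that such a ray tracer applies along each ray segment can be sketched as follows (a minimal illustration; the material names, coefficients, and path lengths are invented, and the full model additionally tracks re-emission spectra, quantum efficiency, and self-absorption):

```python
import numpy as np

def beer_lambert_through_segments(i0, segments):
    """Attenuate ray power i0 through (absorption_coeff_per_mm, length_mm) segments.

    Returns the transmitted power and the power absorbed in each segment;
    the absorbed part is what a phosphor layer would convert and re-emit.
    """
    transmitted, absorbed = i0, []
    for alpha, length in segments:
        out = transmitted * np.exp(-alpha * length)
        absorbed.append(transmitted - out)
        transmitted = out
    return transmitted, absorbed

# Illustrative blue ray crossing two mixed phosphor layers (values invented).
segments = [(0.8, 0.5),   # yellow phosphor layer: alpha = 0.8 / mm over 0.5 mm
            (0.3, 0.4)]   # red phosphor layer:    alpha = 0.3 / mm over 0.4 mm
t, a = beer_lambert_through_segments(1.0, segments)
print("transmitted blue fraction:", round(t, 3),
      "absorbed per layer:", [round(x, 3) for x in a])
```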
Competitive Trace Theory: A Role for the Hippocampus in Contextual Interference during Retrieval.
Yassa, Michael A; Reagh, Zachariah M
2013-01-01
Much controversy exists regarding the role of the hippocampus in retrieval. The two dominant and competing accounts have been the Standard Model of Systems Consolidation (SMSC) and Multiple Trace Theory (MTT), which specifically make opposing predictions as to the necessity of the hippocampus for retrieval of remote memories. Under SMSC, memories eventually become independent of the hippocampus as they become more reliant on cortical connectivity, and thus the hippocampus is not required for retrieval of remote memories, only recent ones. MTT on the other hand claims that the hippocampus is always required no matter the age of the memory. We argue that this dissociation may be too simplistic, and a continuum model may be better suited to address the role of the hippocampus in retrieval of remote memories. Such a model is presented here with the main function of the hippocampus during retrieval being "recontextualization," or the reconstruction of memory using overlapping traces. As memories get older, they are decontextualized due to competition among partially overlapping traces and become more semantic and reliant on neocortical storage. In this framework dubbed the Competitive Trace Theory (CTT), consolidation events that lead to the strengthening of memories enhance conceptual knowledge (semantic memory) at the expense of contextual details (episodic memory). As a result, remote memories are more likely to have a stronger semantic representation. At the same time, remote memories are also more likely to include illusory details. The CTT is a novel candidate model that may provide some resolution to the memory consolidation debate.
Fuzzy-Trace Theory and False Memory: New Frontiers.
ERIC Educational Resources Information Center
Reyna, Valerie F.; Brainerd, C. J.
1998-01-01
Describes the origins of fuzzy-trace theory, including Piagetian, interference, information-processing, and judgment and decision-making influences. Discusses similarities and differences between fuzzy-trace theory and other approaches to memory falsification. Considers the theory's predictions regarding age differences in memory falsification and…
N-string vertices in string field theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bordes, J.; Abdurrahman, A.; Anton, F.
1994-03-15
We give the general form of the vertex corresponding to the interaction of an arbitrary number of strings. The technique employed relies on the "comma" representation of string field theory, where string fields and interactions are represented as matrices and operations between them, such as multiplication and trace. The general formulation presented here shows that the interaction vertex of N strings, for any arbitrary N, is given as a function of particular combinations of matrices corresponding to the change of representation between the full-string and the half-string degrees of freedom.
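Schematically, the "comma" (half-string) representation referred to above maps string-field operations onto matrix algebra; the display below is a standard summary of that dictionary (not the paper's explicit N-string vertex):

```latex
% String fields A, B become matrices indexed by the two half-strings;
% the star product becomes matrix multiplication and integration becomes a trace:
\begin{align}
  A \star B \;&\longleftrightarrow\; AB, &
  \int A \;&\longleftrightarrow\; \operatorname{Tr} A,
\end{align}
% so an N-string interaction vertex takes the schematic form
\begin{equation}
  V_N(A_1,\dots,A_N) \;\sim\; \operatorname{Tr}\bigl(A_1 A_2 \cdots A_N\bigr).
\end{equation}
```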
Examining single- and multiple-process theories of trust in automation.
Rice, Stephen
2009-07-01
The author examined the effects of human responses to automation alerts and nonalerts. Previous research has shown that automation false alarms and misses have differential effects on human trust (i.e., automation false alarms tend to affect operator compliance, whereas automation misses tend to affect operator reliance). Participants performed a simulated combat task, whereby they examined aerial photographs for the presence of enemy targets. A diagnostic aid provided a recommendation during each trial. The author manipulated the reliability and response bias of the aid to provide appropriate data for state-trace analyses. The analyses provided strong evidence that only a multiple-process theory of operator trust can explain the effects of automation errors on human dependence behaviors. The author discusses the theoretical and practical implications of this finding.
Consolidation of long-term memory: evidence and alternatives.
Meeter, Martijn; Murre, Jaap M J
2004-11-01
Memory loss in retrograde amnesia has long been held to be larger for recent periods than for remote periods, a pattern usually referred to as the Ribot gradient. One explanation for this gradient is consolidation of long-term memories. Several computational models of such a process have shown how consolidation can explain characteristics of amnesia, but they have not elucidated how consolidation must be envisaged. Here findings are reviewed that shed light on how consolidation may be implemented in the brain. Moreover, consolidation is contrasted with alternative theories of the Ribot gradient. Consolidation theory, multiple trace theory, and semantization can all handle some findings well but not others. Conclusive evidence for or against consolidation thus remains to be found.
ERIC Educational Resources Information Center
Pollock, J. Y.
1976-01-01
Taking as an example the "trace theory" of movement rules developed at MIT, the article shows the conditions to which a theoretical innovation must conform in order to be considered legitimate in the context of transformational grammar's "Extended Standard Theory." (Text is in French.) (CDSH/AM)
Sutherland, R J; Lehmann, H
2011-06-01
We discuss very recent experiments with rodents addressing the idea that long-term memories initially depending on the hippocampus, over a prolonged period, become independent of it. No unambiguous recent evidence exists to substantiate that this occurs. Most experiments find that recent and remote memories are equally affected by hippocampus damage. Nearly all experiments that report spared remote memories suffer from two problems: retrieval could be based upon substantial regions of spared hippocampus and recent memory is tested at intervals that are of the same order of magnitude as cellular consolidation. Accordingly, we point the way beyond systems consolidation theories, both the Standard Model of Consolidation and the Multiple Trace Theory, and propose a simpler multiple storage site hypothesis. On this view, with event reiterations, different memory representations are independently established in multiple networks. Many detailed memories always depend on the hippocampus; the others may be established and maintained independently. Copyright © 2011 Elsevier Ltd. All rights reserved.
Neutral Theory and Rapidly Evolving Viral Pathogens.
Frost, Simon D W; Magalis, Brittany Rife; Kosakovsky Pond, Sergei L
2018-06-01
The evolution of viral pathogens is shaped by strong selective forces that are exerted during jumps to new hosts, confrontations with host immune responses and antiviral drugs, and numerous other processes. However, while undeniably strong and frequent, adaptive evolution is largely confined to small parts of information-packed viral genomes, and the majority of observed variation is effectively neutral. The predictions and implications of the neutral theory have proven immensely useful in this context, with applications spanning understanding within-host population structure, tracing the origins and spread of viral pathogens, predicting evolutionary dynamics, and modeling the emergence of drug resistance. We highlight the multiple ways in which the neutral theory has had an impact, which has been accelerated in the age of high-throughput, high-resolution genomics.
An Overview of Judgment and Decision Making Research Through the Lens of Fuzzy Trace Theory.
Setton, Roni; Wilhelms, Evan; Weldon, Becky; Chick, Christina; Reyna, Valerie
2014-12-01
We present the basic tenets of fuzzy trace theory, a comprehensive theory of memory, judgment, and decision making that is grounded in research on how information is stored as knowledge, mentally represented, retrieved from storage, and processed. In doing so, we highlight how it is distinguished from traditional models of decision making in that gist reasoning plays a central role. The theory also distinguishes advanced intuition from primitive impulsivity. It predicts that different sorts of errors occur with respect to each component of judgment and decision making: background knowledge, representation, retrieval, and processing. Classic errors in the judgment and decision making literature, such as risky-choice framing and the conjunction fallacy, are accounted for by fuzzy trace theory and new results generated by the theory contradict traditional approaches. We also describe how developmental changes in brain and behavior offer crucial insight into adult cognitive processing. Research investigating brain and behavior in developing and special populations supports fuzzy trace theory's predictions about reliance on gist processing.
ERIC Educational Resources Information Center
Reagh, Zachariah M.; Yassa, Michael A.
2014-01-01
Most theories of memory assume that representations are strengthened with repetition. We recently proposed Competitive Trace Theory, building on the hippocampus' powerful capacity to orthogonalize inputs into distinct outputs. We hypothesized that repetition elicits a similar but nonidentical memory trace, and that contextual details of…
Blalock, Susan J; Reyna, Valerie F
2016-08-01
Fuzzy-trace theory is a dual-process model of memory, reasoning, judgment, and decision making that contrasts with traditional expectancy-value approaches. We review the literature applying fuzzy-trace theory to health with 3 aims: evaluating whether the theory's basic distinctions have been validated empirically in the domain of health; determining whether these distinctions are useful in assessing, explaining, and predicting health-related psychological processes; and determining whether the theory can be used to improve health judgments, decisions, or behaviors, especially compared to other approaches. We conducted a literature review using PubMed, PsycINFO, and Web of Science to identify empirical peer-reviewed papers that applied fuzzy-trace theory, or central constructs of the theory, to investigate health judgments, decisions, or behaviors. Seventy nine studies (updated total is 94 studies; see Supplemental materials) were identified, over half published since 2012, spanning a wide variety of conditions and populations. Study findings supported the prediction that verbatim and gist representations are distinct constructs that can be retrieved independently using different cues. Although gist-based reasoning was usually associated with improved judgment and decision making, 4 sources of bias that can impair gist reasoning were identified. Finally, promising findings were reported from intervention studies that used fuzzy-trace theory to improve decision making and decrease unhealthy risk taking. Despite large gaps in the literature, most studies supported all 3 aims. By focusing on basic psychological processes that underlie judgment and decision making, fuzzy-trace theory provides insights into how individuals make decisions involving health risks and suggests innovative intervention approaches to improve health outcomes. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Not Alone: Tracing the Origins of Very-Low-Mass Stars and Brown Dwarfs Through Multiplicity Studies
NASA Astrophysics Data System (ADS)
Burgasser, A. J.; Reid, I. N.; Siegler, N.; Close, L.; Allen, P.; Lowrance, P.; Gizis, J.
The properties of multiple stellar systems have long provided important empirical constraints for star-formation theories, enabling (along with several other lines of evidence) a concrete, qualitative picture of the birth and early evolution of normal stars. At very low masses (VLM; M ≲ 0.1 solar mass), down to and below the hydrogen-burning minimum mass, our understanding of formation processes is not as clear, with several competing theories now under consideration. One means of testing these theories is through the empirical characterization of VLM multiple systems. Here, we review the results of various VLM multiplicity studies to date. These systems can be generally characterized as closely separated (93% have projected separations < 20 AU), near equal-mass (77% have M2/M1 ≥ 0.8) and occurring infrequently (perhaps 10-30% of systems are binary). Both the frequency and maximum separation of stellar and brown dwarf binaries steadily decrease for lower system masses, suggesting that VLM binary formation and/or evolution may be a mass-dependent process. There is evidence for a fairly rapid decline in the number of loosely bound systems below ~0.3 solar mass, corresponding to a factor of 10-20 increase in the minimum binding energy of VLM binaries as compared to more massive stellar binaries. This wide-separation "desert" is present among both field (~1-5 G.y.) and older (>100 m.y.) cluster systems, while the youngest (<10 m.y.) VLM binaries, particularly those in nearby, low-density star-forming regions, appear to have somewhat different systemic properties. We compare these empirical trends to predictions laid out by current formation theories, and outline future observational studies needed to probe the full parameter space of the lowest-mass multiple systems.
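The "factor of 10-20 increase in the minimum binding energy" follows from the scaling of binding energy with component masses and widest observed separation; the worked comparison below uses illustrative values (assumed for this sketch, not taken from the surveys reviewed):

```latex
% Binding energy of a binary with component masses M_1, M_2 and semimajor axis a:
%   E_b = \frac{G M_1 M_2}{2a} \;\propto\; \frac{M_1 M_2}{a}.
% Illustrative comparison (assumed values): a VLM pair with M_1 = M_2 = 0.08 M_sun
% bound out to a ~ 20 AU, versus a low-mass stellar pair with M_1 = M_2 = 0.3 M_sun
% bound out to a ~ 5000 AU:
\begin{equation}
  \frac{E_{b,\min}^{\mathrm{VLM}}}{E_{b,\min}^{\mathrm{stellar}}}
  \;\approx\;
  \frac{0.08^{2}/20}{0.3^{2}/5000}
  \;\approx\; 18 ,
\end{equation}
% i.e. of the order of the factor of 10-20 quoted above.
```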
WHEN AND WHY DO HEDGEHOGS AND FOXES DIFFER?
Keil, Frank C
2010-01-01
Philip E. Tetlock's finding that "hedgehog" experts (those with one big theory) are worse predictors than "foxes" (those with multiple, less comprehensive theories) offers fertile ground for future research. Are experts as likely to exhibit hedgehog- or fox-like tendencies in areas that call for explanatory, diagnostic, and skill-based expertise-as they did when Tetlock called on experts to make predictions? Do particular domains of expertise curtail or encourage different styles of expertise? Can we trace these different styles to childhood? Finally, can we nudge hedgehogs to be more like foxes? Current research can only grope at the answers to these questions, but they are essential to gauging the health of expert political judgment.
Trace identities and their semiclassical implications
NASA Astrophysics Data System (ADS)
Smilansky, Uzy
2000-03-01
The compatibility of the semiclassical quantization of area-preserving maps with some exact identities which follow from the unitarity of the quantum evolution operator is discussed. The quantum identities involve relations between traces of powers of the evolution operator. For classically integrable maps, the semiclassical approximation is shown to be compatible with the trace identities. This is done by the identification of stationary phase manifolds which give the main contributions to the result. The compatibility of the semiclassical quantization with the trace identities demonstrates the crucial importance of non-diagonal contributions. The same technique is not applicable for chaotic maps, and the compatibility of the semiclassical theory in this case remains unsettled. However, the trace identities are applied to maps which appear naturally in the theory of quantum graphs, revealing some features of the periodic orbit theory for these paradigms of quantum chaos.
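As background (these are generic consequences of unitarity, stated in my notation rather than the paper's specific relations), one family of such trace identities follows from the fact that the secular coefficients of a unitary operator are self-inversive, while Newton's identities tie those coefficients to traces of its powers:

```latex
% For an N x N unitary U, write t_n = \operatorname{Tr} U^{\,n} and
%   \det(\mathbb{1} - z\,U) = \sum_{n=0}^{N} a_n z^{\,n}, \qquad a_0 = 1 .
% Newton's identities express the coefficients through the traces,
\begin{equation}
  n\, a_n \;=\; -\sum_{m=1}^{n} a_{n-m}\, t_m ,
\end{equation}
% while unitarity (all eigenvalues on the unit circle) forces the self-inversive relation
\begin{equation}
  a_{N-n} \;=\; a_N\, \overline{a_n} ,
\end{equation}
% so the low-order traces constrain the remaining ones.
```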
Modeling a 400 Hz Signal Transmission Through the South China Sea Basin
2009-03-01
[Fragmentary record; only section headings and partial sentences are recoverable] Contents: Ray Tracing: 1. General Ray Theory and the Eikonal Approximation; 2. Hamiltonian Ray Tracing. From the body: in general, modeling acoustic propagation through the ocean necessitates ... The eikonal represents the phase component of the solution; since solutions of constant phase represent wave fronts, rays travel in a direction ...
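The eikonal approximation named in these fragments is standard; for reference (notation mine, not taken from the report), writing the acoustic field as an amplitude times a rapidly varying phase and keeping the leading order in frequency gives the eikonal equation, whose constant-phase surfaces are the wave fronts and whose gradient directions are the rays:

```latex
% Ansatz p(\mathbf{x}) \approx A(\mathbf{x})\, e^{\,i\omega \tau(\mathbf{x})} in the Helmholtz
% equation; at leading order in \omega the travel time \tau satisfies
\begin{equation}
  \bigl|\nabla \tau(\mathbf{x})\bigr|^{2} \;=\; \frac{1}{c^{2}(\mathbf{x})},
\end{equation}
% the eikonal equation: surfaces \tau = \text{const} are wave fronts, and rays follow
% \nabla\tau, i.e. they travel normal to the wave fronts.
```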
Risk Taking under the Influence: A Fuzzy-Trace Theory of Emotion in Adolescence
ERIC Educational Resources Information Center
Rivers, Susan E.; Reyna, Valerie F.; Mills, Britain
2008-01-01
Fuzzy-trace theory explains risky decision making in children, adolescents, and adults, incorporating social and cultural factors as well as differences in impulsivity. Here, we provide an overview of the theory, including support for counterintuitive predictions (e.g., when adolescents "rationally" weigh costs and benefits, risk taking increases,…
The Distinctions of False and Fuzzy Memories.
ERIC Educational Resources Information Center
Schooler, Jonathan W.
1998-01-01
Notes that fuzzy-trace theory has been used to understand false memories of children. Demonstrates the irony imbedded in the theory, maintaining that a central implication of fuzzy-trace theory is that some errors characterized as false memories are not really false at all. These errors, when applied to false alarms to related lures, are best…
Modeling phoneme perception. II: A model of stop consonant discrimination.
van Hessen, A J; Schouten, M E
1992-10-01
Combining elements from two existing theories of speech sound discrimination, dual process theory (DPT) and trace context theory (TCT), a new theory, called phoneme perception theory, is proposed, consisting of a long-term phoneme memory, a context-coding memory, and a trace memory, each with its own time constants. This theory is tested by means of stop-consonant discrimination data in which interstimulus interval (ISI; values of 100, 300, and 2000 ms) is an important variable. It is shown that discrimination in which labeling plays an important part (2IFC and AX between category) benefits from increased ISI, whereas discrimination in which only sensory traces are compared (AX within category), decreases with increasing ISI. The theory is also tested on speech discrimination data from the literature in which ISI is a variable [Pisoni, J. Acoust. Soc. Am. 36, 277-282 (1964); Cowan and Morse, J. Acoust. Soc. Am. 79, 500-507 (1986)]. It is concluded that the number of parameters in trace context theory is not sufficient to account for most speech-sound discrimination data and that a few additional assumptions are needed, such as a form of sublabeling, in which subjects encode the quality of a stimulus as a member of a category, and which requires processing time.
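The qualitative pattern described above can be illustrated with a toy dual-code calculation (the functional forms and parameters below are my assumptions, not the authors' phoneme perception theory): within-category pairs can only be distinguished from a decaying sensory trace, whereas between-category pairs can also be distinguished via phoneme labels, which take some time to be encoded but then persist, so the two conditions move in opposite directions as ISI grows.

```python
import numpy as np

def dprime(isi_ms, mode, trace_d=2.0, trace_tau=400.0, label_d=1.5, label_tau=500.0):
    """Toy illustration (assumed forms, not van Hessen & Schouten's model).

    The sensory trace decays with ISI; the label (category) code builds up with
    ISI and then persists, but only helps across a category boundary.
    """
    trace = trace_d * np.exp(-isi_ms / trace_tau)          # sensory trace decays
    label = label_d * (1.0 - np.exp(-isi_ms / label_tau))  # label code builds up
    return label if mode == "between" else trace

for isi in (100, 300, 2000):
    print(f"ISI={isi:4d} ms  between={dprime(isi, 'between'):.2f}"
          f"  within={dprime(isi, 'within'):.2f}")
```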
Ghezzi, Pietro; Davies, Kevin; Delaney, Aidan; Floridi, Luciano
2018-03-06
Biomarkers are widely used not only as prognostic or diagnostic indicators, or as surrogate markers of disease in clinical trials, but also to formulate theories of pathogenesis. We identify two problems in the use of biomarkers in mechanistic studies. The first problem arises in the case of multifactorial diseases, where different combinations of multiple causes result in patient heterogeneity. The second problem arises when a pathogenic mediator is difficult to measure. This is the case of the oxidative stress (OS) theory of disease, where the causal components are reactive oxygen species (ROS) that have very short half-lives. In this case, it is usual to measure the traces left by the reaction of ROS with biological molecules, rather than the ROS themselves. Borrowing from the philosophical theories of signs, we look at the different facets of biomarkers and discuss their different value and meaning in multifactorial diseases and system medicine to inform their use in patient stratification in personalized medicine.
Research on Contextual Memorizing of Meaning in Foreign Language Vocabulary
ERIC Educational Resources Information Center
Xu, Linjing; Xiong, Qingxia; Qin, Yufang
2018-01-01
The purpose of this study was to examine the role of contexts in the memory of meaning in foreign vocabularies. The study was based on the cognitive processing hierarchy theory of Craik and Lockhart (1972), the memory trace theory of McClelland and Rumelhart (1986) and the memory trace theory of cognitive psychology. The subjects were non-English…
Temporally Graded Activation of Neocortical Regions in Response to Memories of Different Ages
Woodard, John L.; Seidenberg, Michael; Nielson, Kristy A.; Miller, Sarah K.; Franczak, Malgorzata; Antuono, Piero; Douville, Kelli L.; Rao, Stephen M.
2007-01-01
The temporally graded memory impairment seen in many neurobehavioral disorders implies different neuroanatomical pathways and/or cognitive mechanisms involved in storage and retrieval of memories of different ages. A dynamic interaction between medial-temporal and neocortical brain regions has been proposed to account for memory’s greater permanence with time. Despite considerable debate concerning its time-dependent role in memory retrieval, medial-temporal lobe activity has been well studied. However, the relative participation of neocortical regions in recent and remote memory retrieval has received much less attention. Using functional magnetic resonance imaging, we demonstrate robust, temporally graded signal differences in posterior cingulate, right middle frontal, right fusiform, and left middle temporal regions in healthy older adults during famous name identification from two disparate time epochs. Importantly, no neocortical regions demonstrated greater response to older than to recent stimuli. Our results suggest a possible role of these neocortical regions in temporally dating items in memory and in establishing and maintaining memory traces throughout the lifespan. Theoretical implications of these findings for the two dominant models of remote memory functioning (Consolidation Theory and Multiple Trace Theory) are discussed. PMID:17583988
Obidziński, Michał; Nieznański, Marek
2017-10-01
The presented research was conducted in order to investigate the connections between developmental dyslexia and the functioning of verbatim and gist memory traces-assumed in the fuzzy-trace theory. The participants were 71 high school students (33 with dyslexia and 38 without learning difficulties). The modified procedure and multinomial model of Stahl and Klauer (simplified conjoint recognition model) was used to collect and analyze data. Results showed statistically significant differences in four of the model parameters: (a) the probability of verbatim trace recollection upon presentation of orthographically similar stimulus was higher in the control than dyslexia group, (b) the probability of verbatim trace recollection upon presentation of semantically similar stimulus was higher in the control than dyslexia group, (c) the probability of gist trace retrieval upon presentation of semantically similar stimulus was higher in the dyslexia than control group, and (d) the probability of gist trace retrieval upon target stimulus presentation (in the semantic condition) was higher in the control than dyslexia group. The obtained results suggest differences of memory functioning in terms of verbatim and gist trace retrieval between people with and without dyslexia on specific, elementary cognitive processes postulated by the fuzzy-trace theory. These can indicate new approaches in the education of persons with developmental dyslexia, focused on specific impairments and the strengths of their memory functioning.
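For readers unfamiliar with this model class, the computation behind such parameters can be illustrated with a generic verbatim/gist/guessing multinomial-processing-tree sketch (illustrative only; this is not the exact simplified conjoint recognition model of Stahl and Klauer used in the study, and the parameter values are invented):

```python
def acceptance_probs(V_t, G_t, V_r, G_r, b):
    """Generic verbatim/gist/guessing tree (illustrative simplification only).

    V_* : probability of recollecting a verbatim trace (targets / related probes)
    G_* : probability of retrieving a gist trace when verbatim retrieval fails
    b   : probability of guessing "old" when neither trace is retrieved
    """
    p_target = V_t + (1 - V_t) * G_t + (1 - V_t) * (1 - G_t) * b    # verbatim, gist, or guess
    p_related = (1 - V_r) * G_r + (1 - V_r) * (1 - G_r) * b          # verbatim here supports rejection
    p_unrelated = b                                                  # only guessing applies
    return p_target, p_related, p_unrelated

# Hypothetical profiles: weaker verbatim retrieval lowers target hits and raises
# acceptance of semantically similar probes (values invented for illustration).
print(acceptance_probs(V_t=0.6, G_t=0.5, V_r=0.4, G_r=0.5, b=0.2))
print(acceptance_probs(V_t=0.4, G_t=0.5, V_r=0.2, G_r=0.6, b=0.2))
```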
Boon, Bronwyn; Greatbanks, Richard; Munro, Jenny; Gaffney, Michael
2017-03-01
This paper addresses the challenge reported in the research literature of providing adequate accounts of service quality and value to multiple stakeholders. Rather than starting with a particular accountability practice, we examine the accounts of complex service delivery and results from the perspective of five key stakeholder groups. The case study at the empirical centre of this research is a small New Zealand non-profit organisation that provides community-based wraparound casework to young people, and their families, with multiple and complex needs. This paper reports on data collected during 2009-2012 through interviews with five key stakeholders of this service: the young people, the caseworkers, the co-providers, the managers and the funders. Drawing on translation theory, the different points of reference and the consequential shifts in focus, content and meaning within the multiple stakeholder accounts are traced. The findings show that while each stakeholder group brings a unique point of reference to the service delivery, there are degrees of overlap in the focus and content of the accounts. This is particularly evident in the 'relationship' dimension. While overlaps may exist, points of invisibility are also revealed. Accountability tensions can be traced directly to these points of invisibility. As a result of this analysis, it is argued that more explicit attention to the impact of multiple stakeholders at the level of epistemology provides a mechanism for addressing some of the tensions routinely raised. © 2015 John Wiley & Sons Ltd.
How Fuzzy-Trace Theory Predicts True and False Memories for Words, Sentences, and Narratives
Reyna, Valerie F.; Corbin, Jonathan C.; Weldon, Rebecca B.; Brainerd, Charles J.
2016-01-01
Fuzzy-trace theory posits independent verbatim and gist memory processes, a distinction that has implications for such applied topics as eyewitness testimony. This distinction between precise, literal verbatim memory and meaning-based, intuitive gist accounts for memory paradoxes including dissociations between true and false memory, false memories outlasting true memories, and developmental increases in false memory. We provide an overview of fuzzy-trace theory, and, using mathematical modeling, also present results demonstrating verbatim and gist memory in true and false recognition of narrative sentences and inferences. Results supported fuzzy-trace theory's dual-process view of memory: verbatim memory was relied on to reject meaning-consistent, but unpresented, sentences (via recollection rejection). However, verbatim memory was often not retrieved, and gist memory supported acceptance of these sentences (via similarity judgment and phantom recollection). Thus, mathematical models of words can be extended to explain memory for complex stimuli, such as narratives, the kind of memory interrogated in law. PMID:27042402
NASA Astrophysics Data System (ADS)
Tan, Jun; Song, Peng; Li, Jinshan; Wang, Lei; Zhong, Mengxuan; Zhang, Xiaobo
2017-06-01
The surface-related multiple elimination (SRME) method is based on the feedback formulation and has become one of the most widely used multiple suppression methods. However, some differences are apparent between the predicted multiples and those in the source seismic records, which may leave conventional adaptive multiple subtraction methods barely able to suppress multiples effectively in actual production. This paper introduces a combined adaptive multiple attenuation method based on an optimized event tracing technique and extended Wiener filtering. The method first uses the multiple records predicted by SRME to generate a multiple velocity spectrum, then separates the original record into an approximate primary record and an approximate multiple record by applying the optimized event tracing method and short-time-window FK filtering. After extended Wiener filtering is applied, residual multiples in the approximate primary record can be eliminated and the damaged primary can be restored from the approximate multiple record. This method combines the advantages of multiple elimination based on the optimized event tracing method with those of the extended Wiener filtering technique. It is an ideal method for suppressing typical hyperbolic and other types of multiples, with the advantage of minimizing damage to the primary. Synthetic and field data tests show that this method produces better multiple elimination results than the traditional multi-channel Wiener filter method and is more suitable for multiple elimination in complicated geological areas.
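The Wiener-filtering step builds on standard least-squares adaptive subtraction; a minimal single-trace sketch (my own simplification, not the authors' extended, multi-stage implementation) estimates a short matching filter that shapes the predicted multiple to the data and subtracts it:

```python
import numpy as np

def conv_matrix(x, filter_len, n_out):
    """Convolution matrix M with M[i, j] = x[i - j], so M @ f = (x * f)[:n_out]."""
    M = np.zeros((n_out, filter_len))
    for j in range(filter_len):
        M[j:, j] = x[: n_out - j]
    return M

def adaptive_subtract(data, predicted_multiple, filter_len=11):
    """Least-squares (Wiener) matching filter: find f minimizing
    ||data - conv(predicted_multiple, f)||^2, shape the predicted multiple,
    and subtract it to estimate the primary. Single trace, illustrative only."""
    M = conv_matrix(predicted_multiple, filter_len, len(data))
    f, *_ = np.linalg.lstsq(M, data, rcond=None)
    return data - M @ f, f

# Synthetic check: data = primary + a multiple that the prediction gets slightly wrong.
rng = np.random.default_rng(0)
t = np.arange(500)
primary = np.exp(-0.5 * ((t - 120) / 5.0) ** 2)
true_multiple = 0.7 * np.exp(-0.5 * ((t - 300) / 5.0) ** 2)
predicted = np.roll(true_multiple, -3) / 0.7          # wrong time shift and amplitude
data = primary + true_multiple + 0.01 * rng.standard_normal(t.size)
est_primary, f = adaptive_subtract(data, predicted)
print("residual energy after subtraction:",
      round(float(np.sum((est_primary - primary) ** 2)), 4))
```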
Comparison of Methods to Trace Multiple Subskills: Is LR-DBN Best?
ERIC Educational Resources Information Center
Xu, Yanbo; Mostow, Jack
2012-01-01
A long-standing challenge for knowledge tracing is how to update estimates of multiple subskills that underlie a single observable step. We characterize approaches to this problem by how they model knowledge tracing, fit its parameters, predict performance, and update subskill estimates. Previous methods allocated blame or credit among subskills…
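For context, the baseline such comparisons start from can be sketched as standard Bayesian knowledge tracing plus a naive rule that updates every underlying subskill as if it alone produced the observed step (one simple credit/blame scheme; this is not LR-DBN itself, and the parameter values are illustrative):

```python
def bkt_update(p_known, correct, p_guess=0.2, p_slip=0.1, p_learn=0.15):
    """One Bayesian Knowledge Tracing step for a single skill."""
    if correct:
        posterior = p_known * (1 - p_slip) / (p_known * (1 - p_slip) + (1 - p_known) * p_guess)
    else:
        posterior = p_known * p_slip / (p_known * p_slip + (1 - p_known) * (1 - p_guess))
    return posterior + (1 - posterior) * p_learn   # posterior, then learning transition

def update_subskills(p_known, correct):
    """Naive multi-subskill rule: give each subskill full credit or blame for the step."""
    return {skill: bkt_update(p, correct) for skill, p in p_known.items()}

state = {"carry": 0.3, "column-align": 0.5}       # hypothetical subskills and priors
for obs in (True, False, True):
    state = update_subskills(state, obs)
    print(obs, {k: round(v, 3) for k, v in state.items()})
```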
Is quantum theory a form of statistical mechanics?
NASA Astrophysics Data System (ADS)
Adler, S. L.
2007-05-01
We give a review of the basic themes of my recent book: Adler S L 2004 Quantum Theory as an Emergent Phenomenon (Cambridge: Cambridge University Press). We first give motivations for considering the possibility that quantum mechanics is not exact, but is instead an accurate asymptotic approximation to a deeper level theory. For this deeper level, we propose a non-commutative generalization of classical mechanics, that we call "trace dynamics", and we give a brief survey of how it works, considering for simplicity only the bosonic case. We then discuss the statistical mechanics of trace dynamics and give our argument that with suitable approximations, the Ward identities for trace dynamics imply that ensemble averages in the canonical ensemble correspond to Wightman functions in quantum field theory. Thus, quantum theory emerges as the statistical thermodynamics of trace dynamics. Finally, we argue that Brownian motion corrections to this thermodynamics lead to stochastic corrections to the Schrödinger equation, of the type that have been much studied in the "continuous spontaneous localization" model of objective state vector reduction. In appendices to the talk, we give details of the existence of a conserved operator in trace dynamics that encodes the structure of the canonical algebra, of the derivation of the Ward identities, and of the proof that the stochastically-modified Schrödinger equation leads to state vector reduction with Born rule probabilities.
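For orientation, the basic objects of trace dynamics can be written schematically as follows (a standard summary of the construction as described in the book; details and the fermionic case are omitted):

```latex
% Dynamical variables are N x N matrices q_r(t); the Lagrangian is the trace of an
% ordinary polynomial in them,  \mathbf{L} = \operatorname{Tr} L(\{q_r\}, \{\dot q_r\}).
% The trace derivative is defined through
%   \delta\mathbf{A} = \operatorname{Tr}\sum_r \frac{\delta\mathbf{A}}{\delta q_r}\,\delta q_r ,
% and varying the trace action gives operator Euler-Lagrange equations
\begin{equation}
  \frac{d}{dt}\,\frac{\delta \mathbf{L}}{\delta \dot q_r}
  \;-\; \frac{\delta \mathbf{L}}{\delta q_r} \;=\; 0 .
\end{equation}
% The conserved operator mentioned in the appendices is, in the bosonic case,
% \tilde{C} = \sum_r [\,q_r, p_r\,]  with  p_r = \delta\mathbf{L}/\delta \dot q_r .
```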
Moscovitch, Morris; Rosenbaum, R Shayna; Gilboa, Asaf; Addis, Donna Rose; Westmacott, Robyn; Grady, Cheryl; McAndrews, Mary Pat; Levine, Brian; Black, Sandra; Winocur, Gordon; Nadel, Lynn
2005-01-01
We review lesion and neuroimaging evidence on the role of the hippocampus, and other structures, in retention and retrieval of recent and remote memories. We examine episodic, semantic and spatial memory, and show that important distinctions exist among different types of these memories and the structures that mediate them. We argue that retention and retrieval of detailed, vivid autobiographical memories depend on the hippocampal system no matter how long ago they were acquired. Semantic memories, on the other hand, benefit from hippocampal contribution for some time before they can be retrieved independently of the hippocampus. Even semantic memories, however, can have episodic elements associated with them that continue to depend on the hippocampus. Likewise, we distinguish between experientially detailed spatial memories (akin to episodic memory) and more schematic memories (akin to semantic memory) that are sufficient for navigation but not for re-experiencing the environment in which they were acquired. Like their episodic and semantic counterparts, the former type of spatial memory is dependent on the hippocampus no matter how long ago it was acquired, whereas the latter can survive independently of the hippocampus and is represented in extra-hippocampal structures. In short, the evidence reviewed suggests strongly that the function of the hippocampus (and possibly that of related limbic structures) is to help encode, retain, and retrieve experiences, no matter how long ago the events comprising the experience occurred, and no matter whether the memories are episodic or spatial. We conclude that the evidence favours a multiple trace theory (MTT) of memory over two other models: (1) traditional consolidation models which posit that the hippocampus is a time-limited memory structure for all forms of memory; and (2) versions of cognitive map theory which posit that the hippocampus is needed for representing all forms of allocentric space in memory. PMID:16011544
Dual Processes in Decision Making and Developmental Neuroscience: A Fuzzy-Trace Model
ERIC Educational Resources Information Center
Reyna, Valerie F.; Brainerd, Charles J.
2011-01-01
From Piaget to the present, traditional and dual-process theories have predicted improvement in reasoning from childhood to adulthood, and improvement has been observed. However, developmental reversals--that reasoning biases emerge with development--have also been observed in a growing list of paradigms. We explain how fuzzy-trace theory predicts…
Fuzzy-trace theory: dual processes in memory, reasoning, and cognitive neuroscience.
Brainerd, C J; Reyna, V F
2001-01-01
Fuzzy-trace theory has evolved in response to counterintuitive data on how memory development influences the development of reasoning. The two traditional perspectives on memory-reasoning relations--the necessity and constructivist hypotheses--stipulate that the accuracy of children's memory for problem information and the accuracy of their reasoning are closely intertwined, albeit for different reasons. However, contrary to necessity, correlational and experimental dissociations have been found between children's memory for problem information that is determinative in solving certain problems and their solutions of those problems. In these same tasks, age changes in memory for problem information appear to be dissociated from age changes in reasoning. Contrary to constructivism, correlational and experimental dissociations also have been found between children's performance on memory tests for actual experience and memory tests for the meaning of experience. As in memory-reasoning studies, age changes in one type of memory performance do not seem to be closely connected to age changes in the other type of performance. Subsequent experiments have led to dual-process accounts in both the memory and reasoning spheres. The account of memory development features four other principles: parallel verbatim-gist storage, dissociated verbatim-gist retrieval, memorial bases of conscious recollection, and identity/similarity processes. The account of the development of reasoning features three principles: gist extraction, fuzzy-to-verbatim continua, and fuzzy-processing preferences. The fuzzy-processing preference is a particularly important notion because it implies that gist-based intuitive reasoning often suffices to deliver "logical" solutions and that such reasoning confers multiple cognitive advantages that enhance accuracy. The explanation of memory-reasoning dissociations in cognitive development then falls out of fuzzy-trace theory's dual-process models of memory and reasoning. More explicitly, in childhood reasoning tasks, it is assumed that both verbatim and gist traces of problem information are stored. Responding accurately to memory tests for presented problem information depends primarily on verbatim memory abilities (preserving traces of that information and accessing them when the appropriate memory probes are administered). However, accurate solutions to reasoning problems depend primarily on gist-memory abilities (extracting the correct gist from problem information, focusing on that gist during reasoning, and accessing reasoning operations that process that gist). Because verbatim and gist memories exhibit considerable dissociation, both during storage and when they are subsequently accessed on memory tests, dissociations of verbatim-based memory performance from gist-based reasoning are predictable. Conversely, associations are predicted in situations in which memory and reasoning are based on the same verbatim traces (Brainerd & Reyna, 1988) and in situations in which memory and reasoning are based on the same gist traces (Reyna & Kiernan, 1994). Fuzzy-trace theory's memory and reasoning principles have been applied in other research domains. Four such domains are developmental cognitive neuroscience studies of false memory, studies of false memory in brain-damaged patients, studies of reasoning errors in judgment and decision making, and studies of retrieval mechanisms in recall. 
In the first domain, the principles of parallel verbatim-gist storage, dissociated verbatim-gist retrieval, and identity/similarity processes have been used to explain both spontaneous and implanted false reports in children and in the elderly. These explanations have produced some surprising predictions that have been verified: false reports do not merely decline with age during childhood but increase under theoretically specified conditions; reports of events that were not experienced can nevertheless be highly persistent over time; and false reports can be suppressed by retrieving verbatim traces of corresponding true events. In the second domain, the same principles have been invoked to explain why some forms of brain damage lead to elevated levels of false memory and other forms lead to reduced levels of false memory. In the third domain, the principles of gist extraction, fuzzy-to-verbatim continua, and fuzzy-processing preferences have been exploited to formulate a general theory of loci of processing failures in judgment and decision making, cluminating in a developmental account of degrees of rationality that distinguishes more and less advanced reasoning. This theory has in turn been used to formulate local models, such as the inclusion illusions model, that explain the characteristic reasoning errors that are observed on specific judgment and decision-making tasks. Finally, in the fourth domain, a dual-process conception of recall has been derived from the principles of parallel verbatim-gist storage and dissociated verbatim-gist retrieval. In this conception, which has been used to explain cognitive triage effects in recall and robust false recall, targets are recalled either by directly accessing their verbatim traces and reading the retrieved information out of consciousness or by reconstructively processing their gist traces.
Double-trace flows and the swampland
NASA Astrophysics Data System (ADS)
Giombi, Simone; Perlmutter, Eric
2018-03-01
We explore the idea that large N, non-supersymmetric conformal field theories with a parametrically large gap to higher spin single-trace operators may be obtained as infrared fixed points of relevant double-trace deformations of superconformal field theories. After recalling the AdS interpretation and some potential pathologies of such flows, we introduce a concrete example that appears to avoid them: the ABJM theory at finite k, deformed by ∫O^2, where O is the superconformal primary in the stress-tensor multiplet. We address its relation to recent conjectures based on weak gravity bounds, and discuss the prospects for a wider class of similarly viable flows. Next, we proceed to analyze the spectrum and correlation functions of the putative IR CFT, to leading non-trivial order in 1/N. This includes analytic computations of the change under double-trace flow of connected four-point functions of ABJM superconformal primaries; and of the IR anomalous dimensions of infinite classes of double-trace composite operators. These would be the first analytic results for anomalous dimensions of finite-spin composite operators in any large N CFT3 with an Einstein gravity dual.
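For orientation, the leading large-N behavior of such double-trace flows is standard (quoted here from the general double-trace and AdS/CFT literature, not from this paper's specific computations):

```latex
% Deforming a large-N CFT_d by (\lambda/2)\int d^dx\, O^2 with \Delta_{\rm UV} \equiv \Delta(O) < d/2
% drives the theory to an IR fixed point where, at leading order in 1/N,
\begin{equation}
  \Delta_{\rm IR}(O) \;=\; d - \Delta_{\rm UV}(O) \;+\; O(1/N),
\end{equation}
% while double-trace composites [OO]_{n,\ell} acquire anomalous dimensions only at order 1/N.
% In AdS language the flow interpolates between the "alternative" and "standard"
% boundary conditions for the bulk field dual to O.
```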
A new intuitionism: Meaning, memory, and development in Fuzzy-Trace Theory
Reyna, Valerie F.
2014-01-01
Combining meaning, memory, and development, the perennially popular topic of intuition can be approached in a new way. Fuzzy-trace theory integrates these topics by distinguishing between meaning-based gist representations, which support fuzzy (yet advanced) intuition, and superficial verbatim representations of information, which support precise analysis. Here, I review the counterintuitive findings that led to the development of the theory and its most recent extensions to the neuroscience of risky decision making. These findings include memory interference (worse verbatim memory is associated with better reasoning); nonnumerical framing (framing effects increase when numbers are deleted from decision problems); developmental decreases in gray matter and increases in brain connectivity; developmental reversals in memory, judgment, and decision making (heuristics and biases based on gist increase from childhood to adulthood, challenging conceptions of rationality); and selective attention effects that provide critical tests comparing fuzzy-trace theory, expected utility theory, and its variants (e.g., prospect theory). Surprising implications for judgment and decision making in real life are also discussed, notably, that adaptive decision making relies mainly on gist-based intuition in law, medicine, and public health. PMID:25530822
Corbin, Jonathan C.; Reyna, Valerie F.; Weldon, Rebecca B.; Brainerd, Charles J.
2015-01-01
Fuzzy-trace theory distinguishes verbatim (literal, exact) from gist (meaningful) representations, predicting that reliance on gist increases with experience and expertise. Thus, many judgment-and-decision-making biases increase with development, such that cognition is colored by context in ways that violate logical coherence and probability theories. Nevertheless, this increase in gist-based intuition is adaptive: Gist is stable, less sensitive to interference, and easier to manipulate. Moreover, gist captures the functionally significant essence of information, supporting healthier and more robust decision processes. We describe how fuzzy-trace theory accounts for judgment-and-decision making phenomena, predicting the paradoxical arc of these processes with the development of experience and expertise. We present data linking gist memory processes to gist processing in decision making and provide illustrations of gist reliance in medicine, public health, and intelligence analysis. PMID:26664820
Anomalies, conformal manifolds, and spheres
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gomis, Jaume; Hsin, Po-Shen; Komargodski, Zohar
The two-point function of exactly marginal operators leads to a universal contribution to the trace anomaly in even dimensions. We study aspects of this trace anomaly, emphasizing its interpretation as a sigma model, whose target space $M$ is the space of conformal field theories (a.k.a. the conformal manifold). When the underlying quantum field theory is supersymmetric, this sigma model has to be appropriately supersymmetrized. As examples, we consider in some detail $N$ = (2,2) and $N$ = (0,2) supersymmetric theories in d = 2 and $N$ = 2 supersymmetric theories in d = 4. This reasoning leads to new information about the conformal manifolds of these theories; for example, we show that the manifold is Kähler-Hodge and we further argue that it has vanishing Kähler class. For $N$ = (2,2) theories in d = 2 and $N$ = 2 theories in d = 4 we also show that the relation between the sphere partition function and the Kähler potential of $M$ follows immediately from the appropriate sigma models that we construct. Along the way we find several examples of potential trace anomalies that obey the Wess-Zumino consistency conditions, but can be ruled out by a more detailed analysis.
Anomalies, conformal manifolds, and spheres
NASA Astrophysics Data System (ADS)
Gomis, Jaume; Hsin, Po-Shen; Komargodski, Zohar; Schwimmer, Adam; Seiberg, Nathan; Theisen, Stefan
2016-03-01
The two-point function of exactly marginal operators leads to a universal contribution to the trace anomaly in even dimensions. We study aspects of this trace anomaly, emphasizing its interpretation as a sigma model, whose target space {M} is the space of conformal field theories (a.k.a. the conformal manifold). When the underlying quantum field theory is supersymmetric, this sigma model has to be appropriately supersymmetrized. As examples, we consider in some detail {N}=(2,2) and {N}=(0,2) supersymmetric theories in d = 2 and {N}=2 supersymmetric theories in d = 4. This reasoning leads to new information about the conformal manifolds of these theories, for example, we show that the manifold is Kähler-Hodge and we further argue that it has vanishing Kähler class. For {N}=(2,2) theories in d = 2 and {N}=2 theories in d = 4 we also show that the relation between the sphere partition function and the Kähler potential of {M} follows immediately from the appropriate sigma models that we construct. Along the way we find several examples of potential trace anomalies that obey the Wess-Zumino consistency conditions, but can be ruled out by a more detailed analysis.
Anomalies, conformal manifolds, and spheres
Gomis, Jaume; Hsin, Po-Shen; Komargodski, Zohar; ...
2016-03-04
The two-point function of exactly marginal operators leads to a universal contribution to the trace anomaly in even dimensions. We study aspects of this trace anomaly, emphasizing its interpretation as a sigma model, whose target space $M$ is the space of conformal field theories (a.k.a. the conformal manifold). When the underlying quantum field theory is supersymmetric, this sigma model has to be appropriately supersymmetrized. As examples, we consider in some detail $N$ = (2,2) and $N$ = (0,2) supersymmetric theories in d = 2 and $N$ = 2 supersymmetric theories in d = 4. This reasoning leads to new information about the conformal manifolds of these theories; for example, we show that the manifold is Kähler-Hodge and we further argue that it has vanishing Kähler class. For $N$ = (2,2) theories in d = 2 and $N$ = 2 theories in d = 4 we also show that the relation between the sphere partition function and the Kähler potential of $M$ follows immediately from the appropriate sigma models that we construct. Along the way we find several examples of potential trace anomalies that obey the Wess-Zumino consistency conditions, but can be ruled out by a more detailed analysis.
Yousefzadeh, Behrooz; Hodgson, Murray
2012-09-01
A beam-tracing model was used to study the acoustical responses of three empty, rectangular rooms with different boundary conditions. The model is wave-based (accounting for sound phase) and can be applied to rooms with extended-reaction surfaces that are made of multiple layers of solid, fluid, or poroelastic materials-the acoustical properties of these surfaces are calculated using Biot theory. Three room-acoustical parameters were studied in various room configurations: sound strength, reverberation time, and RApid Speech Transmission Index. The main objective was to investigate the effects of modeling surfaces as either local or extended reaction on predicted values of these three parameters. Moreover, the significance of modeling interference effects was investigated, including the study of sound phase-change on surface reflection. Modeling surfaces as of local or extended reaction was found to be significant for surfaces consisting of multiple layers, specifically when one of the layers is air. For multilayers of solid materials with an air-cavity, this was most significant around their mass-air-mass resonance frequencies. Accounting for interference effects made significant changes in the predicted values of all parameters. Modeling phase change on reflection, on the other hand, was found to be relatively much less significant.
Rethinking Inhibition Theory: On the Problematic Status of the Inhibition Theory for Forgetting
ERIC Educational Resources Information Center
Raaijmakers, Jeroen G. W.; Jakab, Emoke
2013-01-01
The standard textbook account of interference and forgetting is based on the assumption that retrieval of a memory trace is affected by competition by other memory traces. In recent years, a number of researchers have questioned this view and have proposed an alternative account of forgetting based on a mechanism of suppression. In this inhibition…
ERIC Educational Resources Information Center
Martínez-Hernández, Cesar; Ulloa-Azpeitia, Ricardo
2017-01-01
Based on the theoretical elements of the instrumental approach to tool use known as Task-Technique-Theory (Artigue, 2002), this paper analyses and discusses the performance of graduate students enrolled in a Teacher Training program. The latter performance relates to tracing tangent lines to the curve of a quadratic function in Dynamic Geometry…
ERIC Educational Resources Information Center
Beach, Derek; Rohlfing, Ingo
2018-01-01
In recent years, there has been increasing interest in the combination of two methods on the basis of set theory. In our introduction and this special issue, we focus on two variants of cross-case set-theoretic methods--"qualitative comparative analysis" (QCA) and typological theory (TT)--and their combination with process tracing (PT).…
Multifold paths of neutrons in the three-beam interferometer detected by a tiny energy kick
NASA Astrophysics Data System (ADS)
Geppert-Kleinrath, Hermann; Denkmayr, Tobias; Sponar, Stephan; Lemmel, Hartmut; Jenke, Tobias; Hasegawa, Yuji
2018-05-01
A neutron optical experiment is presented to investigate the paths taken by neutrons in a three-beam interferometer. In various beam paths of the interferometer, the energy of the neutrons is partially shifted so that faint traces are left along the beam paths. By assigning an operational meaning to "the particle's path," which-path information is extracted from these faint traces with minimal perturbation. The theory is derived by simply following the time evolution of the wave function of the neutrons, which clarifies the observation within the framework of standard quantum mechanics. Which-way information is derived from the intensity, which oscillates sinusoidally in time at different frequencies and is considered to result from the interfering cross terms between the stationary main component and the energy-shifted which-way signals. The final results give experimental evidence that the (partial) wave functions of the neutrons in each beam path are superimposed and present in multiple locations in the interferometer.
Terrain modeling for microwave landing system
NASA Technical Reports Server (NTRS)
Poulose, M. M.
1991-01-01
A powerful analytical approach for evaluating the terrain effects on a microwave landing system (MLS) is presented. The approach combines a multiplate model with a powerful and exhaustive ray tracing technique and an accurate formulation for estimating the electromagnetic fields due to the antenna array in the presence of terrain. Both uniform theory of diffraction (UTD) and impedance UTD techniques have been employed to evaluate these fields. Innovative techniques are introduced at each stage to make the model versatile to handle most general terrain contours and also to reduce the computational requirement to a minimum. The model is applied to several terrain geometries, and the results are discussed.
A study of electrically active traps in AlGaN/GaN high electron mobility transistor
NASA Astrophysics Data System (ADS)
Yang, Jie; Cui, Sharon; Ma, T. P.; Hung, Ting-Hsiang; Nath, Digbijoy; Krishnamoorthy, Sriram; Rajan, Siddharth
2013-10-01
We have studied electron conduction mechanisms and the associated roles of the electrically active traps in the AlGaN layer of an AlGaN/GaN high electron mobility transistor structure. By fitting the temperature-dependent I-V (Current-Voltage) curves to the Frenkel-Poole theory, we have identified two discrete trap energy levels. Multiple I-V measurement traces and constant-current injection experiments all confirm that the main role of the traps in the AlGaN layer is to enhance the current flowing through the AlGaN barrier by trap-assisted electron conduction without causing electron trapping.
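The trap levels above are identified by fitting temperature-dependent I-V data to Frenkel-Poole emission. Below is a minimal sketch of that fitting step, assuming the standard Frenkel-Poole current law and a two-stage linear fit (ln(J/E) versus sqrt(E) at each temperature, then intercept versus 1/T); the trap depth, permittivity, field range, and temperatures are illustrative placeholders, not values from the study.

```python
# Hedged sketch of a Frenkel-Poole (FP) analysis, assuming the standard emission law
# J = C * E * exp(-q(phi_t - sqrt(qE/(pi*eps0*eps_r)))/(kT)).
# All numerical values below are illustrative assumptions.
import numpy as np
from scipy.constants import e, k, epsilon_0

eps_r = 5.5                                   # assumed dynamic permittivity of the barrier
E = np.linspace(0.5e8, 2.0e8, 40)             # electric field across the barrier (V/m)
temps = np.array([250.0, 300.0, 350.0])       # measurement temperatures (K)

def fp_current(E, T, C=1e-7, phi_t=0.35):
    """Frenkel-Poole emission current density for an assumed trap depth phi_t (V)."""
    lowering = np.sqrt(e * E / (np.pi * epsilon_0 * eps_r))    # field-induced barrier lowering (V)
    return C * E * np.exp(-e * (phi_t - lowering) / (k * T))

# Step 1: at each temperature, ln(J/E) is linear in sqrt(E); keep the intercept.
intercepts = []
for T in temps:
    J = fp_current(E, T)                      # stands in for measured I-V data
    slope, intercept = np.polyfit(np.sqrt(E), np.log(J / E), 1)
    intercepts.append(intercept)

# Step 2: the intercept equals ln(C) - e*phi_t/(k*T), so its slope versus 1/T gives phi_t.
slope_vs_invT, _ = np.polyfit(1.0 / temps, np.array(intercepts), 1)
phi_t_fit = -slope_vs_invT * k / e
print(f"extracted trap depth: {phi_t_fit:.2f} eV")   # recovers the assumed 0.35 eV here
```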
An integrated theory of attention and decision making in visual signal detection.
Smith, Philip L; Ratcliff, Roger
2009-04-01
The simplest attentional task, detecting a cued stimulus in an otherwise empty visual field, produces complex patterns of performance. Attentional cues interact with backward masks and with spatial uncertainty, and there is a dissociation in the effects of these variables on accuracy and on response time. A computational theory of performance in this task is described. The theory links visual encoding, masking, spatial attention, visual short-term memory (VSTM), and perceptual decision making in an integrated dynamic framework. The theory assumes that decisions are made by a diffusion process driven by a neurally plausible, shunting VSTM. The VSTM trace encodes the transient outputs of early visual filters in a durable form that is preserved for the time needed to make a decision. Attention increases the efficiency of VSTM encoding, either by increasing the rate of trace formation or by reducing the delay before trace formation begins. The theory provides a detailed, quantitative account of attentional effects in spatial cuing tasks at the level of response accuracy and the response time distributions. (c) 2009 APA, all rights reserved
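To make the link between VSTM encoding and the decision stage concrete, here is a minimal simulation sketch of the general idea rather than the authors' implementation: a diffusion process whose drift is driven by a shunting (saturating) VSTM trace, with attention modeled as a larger trace-formation rate constant. The rate constants, threshold, and noise level are illustrative assumptions.

```python
# Hedged sketch: diffusion decision process driven by a shunting VSTM trace.
# "Attended" trials get a higher encoding rate constant gamma (illustrative values).
import numpy as np

rng = np.random.default_rng(0)

def simulate_trial(gamma, stim_strength=1.0, threshold=1.0, noise_sd=0.8,
                   dt=0.001, max_t=2.0):
    """Return (response time, responded 'present') for one simulated trial."""
    v = 0.0        # VSTM trace strength, grows toward stim_strength (shunting encoding)
    x = 0.0        # accumulated diffusion evidence
    t = 0.0
    while t < max_t:
        v += gamma * (stim_strength - v) * dt                     # saturating trace formation
        x += v * dt + noise_sd * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if abs(x) >= threshold:
            return t, x > 0
    return max_t, x > 0

for label, gamma in [("attended", 8.0), ("unattended", 3.0)]:
    trials = [simulate_trial(gamma) for _ in range(500)]
    rts = np.array([rt for rt, _ in trials])
    acc = np.mean([hit for _, hit in trials])
    print(f"{label}: mean RT = {rts.mean():.3f} s, proportion 'present' = {acc:.2f}")
```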
The loss of episodic memories in retrograde amnesia: single-case and group studies.
Kopelman, M D; Kapur, N
2001-09-29
Retrograde amnesia in neurological disorders is a perplexing and fascinating research topic. The severity of retrograde amnesia is not well correlated with that of anterograde amnesia, and there can be disproportionate impairments of either. Within retrograde amnesia, there are various dissociations which have been claimed-for example, between the more autobiographical (episodic) and more semantic components of memory. However, the associations of different types of retrograde amnesia are also important, and clarification of these issues is confounded by the fact that retrograde amnesia seems to be particularly vulnerable to psychogenic factors. Large frontal and temporal lobe lesions have been postulated as critical in producing retrograde amnesia. Theories of retrograde amnesia have encompassed storage versus access disruption, physiological processes of 'consolidation', the progressive transformation of episodic memories into a more 'semantic' form, and multiple-trace theory. Single-case investigations, group studies and various forms of neuroimaging can all contribute to the resolution of these controversies.
Brust-Renck, Priscila G; Reyna, Valerie F; Wilhelms, Evan A; Wolfe, Christopher R; Widmer, Colin L; Cedillos-Whynott, Elizabeth M; Morant, A Kate
2017-08-01
We used Sharable Knowledge Objects (SKOs) to create an Intelligent Tutoring System (ITS) grounded in Fuzzy-Trace Theory to teach women about obesity prevention: GistFit, getting the gist of healthy eating and exercise. The theory predicts that reliance on gist mental representations (as opposed to verbatim) is more effective in reducing health risks and improving decision making. Technical information was translated into decision-relevant gist representations and gist principles (i.e., healthy values). The SKO was hypothesized to facilitate extracting these gist representations and principles by engaging women in dialogue, "understanding" their responses, and replying appropriately to prompt additional engagement. Participants were randomly assigned to either the obesity prevention tutorial (GistFit) or a control tutorial containing different content using the same technology. Participants were administered assessments of knowledge about nutrition and exercise, gist comprehension, gist principles, behavioral intentions and self-reported behavior. An analysis of engagement in tutorial dialogues and responses to multiple-choice questions to check understanding throughout the tutorial revealed significant correlations between these conversations and scores on subsequent knowledge tests and gist comprehension. Knowledge and comprehension measures correlated with healthier behavior and greater intentions to perform healthy behavior. Differences between GistFit and control tutorials were greater for participants who engaged more fully. Thus, results are consistent with the hypothesis that active engagement with a new gist-based ITS, rather than a passive memorization of verbatim details, was associated with an array of known psychosocial mediators of preventive health decisions, such as knowledge acquisition, and gist comprehension.
Ray Tracing with Virtual Objects.
ERIC Educational Resources Information Center
Leinoff, Stuart
1991-01-01
Introduces the method of ray tracing to analyze the refraction or reflection of real or virtual images from multiple optical devices. Discusses ray-tracing techniques for locating images using convex and concave lenses or mirrors. (MDH)
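The paraxial bookkeeping behind such ray-tracing exercises can be illustrated by chaining the thin-lens equation through two elements, where a virtual object arises whenever the image from the first lens would form beyond the second. The focal lengths and distances below are made-up examples, not from the article.

```python
# Hedged sketch: locating images through two thin lenses, including a virtual object.
def thin_lens_image(f, d_object):
    """Image distance from 1/f = 1/d_o + 1/d_i (positive d_o = real object,
    negative d_o = virtual object; positive d_i = real image)."""
    return 1.0 / (1.0 / f - 1.0 / d_object)

f1, f2 = 10.0, 5.0        # focal lengths (cm), illustrative
separation = 12.0         # distance between the two lenses (cm)
d_o1 = 30.0               # real object 30 cm in front of lens 1

d_i1 = thin_lens_image(f1, d_o1)          # image formed by lens 1
d_o2 = separation - d_i1                  # object distance for lens 2 (negative = virtual object)
d_i2 = thin_lens_image(f2, d_o2)          # final image formed by lens 2

print(f"lens 1 image at {d_i1:.1f} cm -> lens 2 object distance {d_o2:.1f} cm "
      f"({'virtual object' if d_o2 < 0 else 'real object'})")
print(f"final image at {d_i2:.1f} cm ({'real' if d_i2 > 0 else 'virtual'})")
```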
Kopelman, M D; Bright, P
2012-11-01
Andrew Mayes's contribution to the neuropsychology of memory has consisted in steadily teasing out the nature of the memory deficit in the amnesic syndrome. This has been done with careful attention to matters of method at all stages. This particularly applies to his investigations of forgetting rates in amnesia and to his studies of retrograde amnesia. Following a brief outline of his work, the main current theories of retrograde amnesia are considered: consolidation theory, episodic-to-semantic shift theory, and multiple trace theory. Findings across the main studies in Alzheimer dementia are reviewed to illustrate what appears to be consistently found, and what is much more inconsistent. A number of problems and issues in current theories are then highlighted--including the nature of the temporal gradient, correlations with the extent of temporal lobe damage, what we would expect 'normal' remote memory curves to look like, how they would appear in focal retrograde amnesia, and whether we can pinpoint retrograde amnesia to hippocampal/medial temporal damage on the basis of existing studies. A recent study of retrograde amnesia is re-analysed to demonstrate temporal gradients on recollected episodic memories in hippocampal/medial temporal patients. It is concluded that there are two requirements for better understanding of the nature of retrograde amnesia: (i) a tighter, Mayesian attention to method in terms of both the neuropsychology and neuroimaging in investigations of retrograde amnesia; and (ii) acknowledging that there may be multiple factors underlying a temporal gradient, and that episodic and semantic memory show important interdependencies at both encoding and retrieval. Such factors may be critical to understanding what is remembered and what is forgotten from our autobiographical pasts. Copyright © 2012 Elsevier Ltd. All rights reserved.
Framing Effects are Robust to Linguistic Disambiguation: A Critical Test of Contemporary Theory
Chick, Christina F.; Reyna, Valerie F.; Corbin, Jonathan C.
2015-01-01
Theoretical accounts of risky choice framing effects assume that decision makers interpret framing options as extensionally equivalent, such that if 600 lives are at stake, saving 200 implies that 400 die. However, many scholars have argued that framing effects are caused, instead, by filling in pragmatically implied information. This linguistic ambiguity hypothesis is grounded in neo-Gricean pragmatics, information leakage, and schema theory. In two experiments, we conducted a critical test of the linguistic ambiguity hypothesis and its relation to framing. We controlled for this crucial implied information by disambiguating it using instructions and detailed examples, followed by multiple quizzes. After disambiguating missing information, we presented standard framing problems plus truncated versions, varying types of missing information. Truncations were also critical tests of prospect theory and fuzzy trace theory. Participants were not only college students, but also middle-aged adults (who showed similar results). Contrary to the ambiguity hypothesis, participants who interpreted missing information as complementary to stated information none the less showed robust framing effects. Although adding words like “at least” can change interpretations of framing information, this form of linguistic ambiguity is not necessary to observe risky choice framing effects. PMID:26348200
Framing effects are robust to linguistic disambiguation: A critical test of contemporary theory.
Chick, Christina F; Reyna, Valerie F; Corbin, Jonathan C
2016-02-01
Theoretical accounts of risky choice framing effects assume that decision makers interpret framing options as extensionally equivalent, such that if 600 lives are at stake, saving 200 implies that 400 die. However, many scholars have argued that framing effects are caused, instead, by filling in pragmatically implied information. This linguistic ambiguity hypothesis is grounded in neo-Gricean pragmatics, information leakage, and schema theory. In 2 experiments, we conducted critical tests of the linguistic ambiguity hypothesis and its relation to framing. We controlled for this crucial implied information by disambiguating it using instructions and detailed examples, followed by multiple quizzes. After disambiguating missing information, we presented standard framing problems plus truncated versions, varying types of missing information. Truncations were also critical tests of prospect theory and fuzzy trace theory. Participants were not only college students, but also middle-age adults (who showed similar results). Contrary to the ambiguity hypothesis, participants who interpreted missing information as complementary to stated information nonetheless showed robust framing effects. Although adding words like "at least" can change interpretations of framing information, this form of linguistic ambiguity is not necessary to observe risky choice framing effects. (c) 2016 APA, all rights reserved).
Interleaved Practice with Multiple Representations: Analyses with Knowledge Tracing Based Techniques
ERIC Educational Resources Information Center
Rau, Martina A.; Pardos, Zachary A.
2012-01-01
The goal of this paper is to use Knowledge Tracing to augment the results obtained from an experiment that investigated the effects of practice schedules using an intelligent tutoring system for fractions. Specifically, this experiment compared different practice schedules of multiple representations of fractions: representations were presented to…
The spectro-contextual encoding and retrieval theory of episodic memory.
Watrous, Andrew J; Ekstrom, Arne D
2014-01-01
The spectral fingerprint hypothesis, which posits that different frequencies of oscillations underlie different cognitive operations, provides one account for how interactions between brain regions support perceptual and attentive processes (Siegel et al., 2012). Here, we explore and extend this idea to the domain of human episodic memory encoding and retrieval. Incorporating findings from the synaptic to cognitive levels of organization, we argue that spectrally precise cross-frequency coupling and phase-synchronization promote the formation of hippocampal-neocortical cell assemblies that form the basis for episodic memory. We suggest that both the cell-assembly firing patterns and the global pattern of brain oscillatory activity within hippocampal-neocortical networks represent the contents of a particular memory. Drawing upon the ideas of context reinstatement and multiple trace theory, we argue that memory retrieval is driven by internal and/or external factors that recreate the frequency-specific oscillatory patterns that occur during episodic encoding. These ideas are synthesized into a novel model of episodic memory (the spectro-contextual encoding and retrieval theory, or "SCERT") that provides several testable predictions for future research.
Toward a Unified Theory of Context Dependence.
ERIC Educational Resources Information Center
Hanna, Gerald S.; Oaster, Thomas R.
1978-01-01
Traces a major source of confusion in the literature on passage dependence and integrates the relevant concepts into a general theory of context dependence. Sample items and data illustrate practical applications of the theory. (AA)
Tests of a Structural Theory of the Memory Trace.
ERIC Educational Resources Information Center
Jones, Gregory V.
1978-01-01
Jones (1976) has shown that the memory trace resulting from the viewing of a picture corresponds to a "fragment" of that picture. This research shows that the fragmentation hypothesis also correctly represents the recall of memories derived from sentences, i.e., the functional unit of memory, the mnemonic trace, is a fragment of the original item.…
Compressing and querying multiple GPS traces for transportation planning.
DOT National Transportation Integrated Search
2013-07-01
In recent years, there has been a significant increase in the number of vehicles which have been equipped with : GPS devices. These devices generate huge volumes of trace data, and information extracted from these traces : could significantly help tr...
Theory and Practice of Lineage Tracing.
Hsu, Ya-Chieh
2015-11-01
Lineage tracing is a method that delineates all progeny produced by a single cell or a group of cells. The possibility of performing lineage tracing initiated the field of Developmental Biology and continues to revolutionize Stem Cell Biology. Here, I introduce the principles behind a successful lineage-tracing experiment. In addition, I summarize and compare different methods for conducting lineage tracing and provide examples of how these strategies can be implemented to answer fundamental questions in development and regeneration. The advantages and limitations of each method are also discussed. © 2015 AlphaMed Press.
NASA Technical Reports Server (NTRS)
Ukeiley, L.; Varghese, M.; Glauser, M.; Valentine, D.
1991-01-01
A 'lobed mixer' device that enhances mixing through secondary flows and streamwise vorticity is presently studied within the framework of multifractal-measures theory, in order to deepen understanding of velocity time trace data gathered on its operation. Proper orthogonal decomposition-based knowledge of coherent structures has been applied to obtain the generalized fractal dimensions and multifractal spectrum of several proper eigenmodes for data samples of the velocity time traces; this constitutes a marked departure from previous multifractal theory applications to self-similar cascades. In certain cases, a single dimension may suffice to capture the entire spectrum of scaling exponents for the velocity time trace.
Paradigms and Plastic Facts in the History of Valence.
ERIC Educational Resources Information Center
Zavaleta, David
1988-01-01
Traces the development of bonding theory and notes the influence of preconceived theory upon this development. Considers ideas of alchemy, Newton, Dalton, Lewis, and quantum mechanics. Suggests a move away from conservative descriptive approaches of bonding theory. (ML)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillips, Mark C.; Taubman, Matthew S.; Kriesel, Jason M.
2015-02-08
We describe a prototype trace gas sensor designed for real-time detection of multiple chemicals. The sensor uses an external cavity quantum cascade laser (ECQCL) swept over its tuning range of 940-1075 cm-1 (9.30-10.7 µm) at a 10 Hz repetition rate.
Dual Processes in Decision Making and Developmental Neuroscience: A Fuzzy-Trace Model.
Reyna, Valerie F; Brainerd, Charles J
2011-09-01
From Piaget to the present, traditional and dual-process theories have predicted improvement in reasoning from childhood to adulthood, and improvement has been observed. However, developmental reversals (the finding that reasoning biases emerge with development) have also been observed in a growing list of paradigms. We explain how fuzzy-trace theory predicts both improvement and developmental reversals in reasoning and decision making. Drawing on research on logical and quantitative reasoning, as well as on risky decision making in the laboratory and in life, we illustrate how the same small set of theoretical principles apply to typical neurodevelopment, encompassing childhood, adolescence, and adulthood, and to neurological conditions such as autism and Alzheimer's disease. For example, framing effects (the finding that risk preferences shift when the same decisions are phrased in terms of gains versus losses) emerge in early adolescence as gist-based intuition develops. In autistic individuals, who rely less on gist-based intuition and more on verbatim-based analysis, framing biases are attenuated (i.e., they outperform typically developing control subjects). In adults, simple manipulations based on fuzzy-trace theory can make framing effects appear and disappear depending on whether gist-based intuition or verbatim-based analysis is induced. These theoretical principles are summarized and integrated in a new mathematical model that specifies how dual modes of reasoning combine to produce predictable variability in performance. In particular, we show how the most popular and extensively studied model of decision making, prospect theory, can be derived from fuzzy-trace theory by combining analytical (verbatim-based) and intuitive (gist-based) processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adachi, Satoshi; Toda, Mikito; Kubotani, Hiroto
The fixed-trace ensemble of random complex matrices is the fundamental model that excellently describes the entanglement in the quantum states realized in a coupled system by its strongly chaotic dynamical evolution [see H. Kubotani, S. Adachi, M. Toda, Phys. Rev. Lett. 100 (2008) 240501]. The fixed-trace ensemble fully takes into account the conservation of probability for quantum states. The present paper derives for the first time the exact analytical formula of the one-body distribution function of singular values of random complex matrices in the fixed-trace ensemble. The distribution function of singular values (i.e. Schmidt eigenvalues) of a quantum state is so important since it describes characteristics of the entanglement in the state. The derivation of the exact analytical formula utilizes two recent achievements in mathematics, which appeared in the 1990s. The first is the Kaneko theory that extends the famous Selberg integral by inserting a hypergeometric-type weight factor into the integrand to obtain an analytical formula for the extended integral. The second is the Petkovsek-Wilf-Zeilberger theory that calculates definite hypergeometric sums in a closed form.
Contemporary Translation Theories. Second Revised Edition. Topics in Translation 21.
ERIC Educational Resources Information Center
Gentzler, Edwin
This book traces the growth of translation theory from its traditional roots through the recent proliferation of theories, fueled by research in feminism, poststructural, and postcolonial investigations. It examines 5 new approaches: the North American translation workshop, the science of translation, early translation studies, polysystem theory,…
Multiple-Input Multiple-Output (MIMO) Linear Systems Extreme Inputs/Outputs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smallwood, David O.
2007-01-01
A linear structure is excited at multiple points with a stationary normal random process. The response of the structure is measured at multiple outputs. If the autospectral densities of the inputs are specified, the phase relationships between the inputs are derived that will minimize or maximize the trace of the autospectral density matrix of the outputs. If the autospectral densities of the outputs are specified, the phase relationships between the outputs that will minimize or maximize the trace of the input autospectral density matrix are derived. It is shown that other phase relationships and ordinary coherence less than one will result in a trace intermediate between these extremes. Least favorable response and some classes of critical response are special cases of the development. It is shown that the derivation for stationary random waveforms can also be applied to nonstationary random, transients, and deterministic waveforms.
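A small numerical illustration of the central quantity is given below; it is not the report's closed-form derivation. With the input autospectra held fixed, sweeping the relative phase of two fully coherent inputs shows how the trace of the output autospectral density matrix Gyy = H Gxx H^H moves between a minimum and a maximum. The frequency-response matrix and spectra are arbitrary example values.

```python
# Hedged sketch: effect of input phase relationships on trace(Gyy) at one frequency.
import numpy as np

H = np.array([[1.0 + 0.5j, 0.3 - 0.2j],
              [0.4 + 0.1j, 0.8 - 0.6j]])      # example 2x2 frequency response matrix
S11, S22 = 2.0, 1.0                           # specified input autospectra

phases = np.linspace(0.0, 2.0 * np.pi, 361)
traces = []
for phi in phases:
    S12 = np.sqrt(S11 * S22) * np.exp(1j * phi)       # unit coherence, relative phase phi
    Gxx = np.array([[S11, S12], [np.conj(S12), S22]]) # input spectral density matrix
    Gyy = H @ Gxx @ H.conj().T                        # output spectral density matrix
    traces.append(np.real(np.trace(Gyy)))

traces = np.array(traces)
print(f"min trace(Gyy) = {traces.min():.3f} at {np.degrees(phases[traces.argmin()]):.0f} deg")
print(f"max trace(Gyy) = {traces.max():.3f} at {np.degrees(phases[traces.argmax()]):.0f} deg")
```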
Theories of Learning and Their Impact on OPAC Instruction.
ERIC Educational Resources Information Center
Frick, Elizabeth
1989-01-01
Describes four major types of learning theories (behavioral, cognitive, cybernetic, and andragogical); examines pertinent literature for each; and traces links between the literature of learning theory and that of the design of online public access catalog instructional systems. (32 references) (Author/CLB)
The Evolution of Macroeconomic Theory and Implications for Teaching Intermediate Macroeconomics.
ERIC Educational Resources Information Center
Froyen, Richard T.
1996-01-01
Traces the development of macroeconomic theory from John Maynard Keynes to modern endogenous growth theory. Maintains that a combination of interest in growth theory and related policy questions will play a prominent role in macroeconomics in the future. Recommends narrowing the gap between graduate school and undergraduate economics instruction.…
Ryan, Lee; Lin, Chun-Yu; Ketcham, Katie; Nadel, Lynn
2010-01-01
This study examined the involvement of medial temporal lobe, especially the hippocampus, in processing spatial and nonspatial relations using episodic and semantic versions of a relational judgment task. Participants studied object arrays and were tested on different types of relations between pairs of objects. Three prevalent views of hippocampal function were considered. Cognitive map theory (O'Keefe and Nadel (1978) The Hippocampus as a Cognitive Map. USA: Oxford University Press) emphasizes hippocampal involvement in spatial relational tasks. Multiple trace theory (Nadel and Moscovitch (1997) Memory consolidation, retrograde amnesia and the hippocampal complex Curr Opin Neurobiol 7:217-227) emphasizes hippocampal involvement in episodic tasks. Eichenbaum and Cohen's ((2001) From Conditioning to Conscious Recollection: Memory Systems of the Brain. USA: Oxford University Press) relational theory predicts equivalent hippocampal involvement in all relational tasks within both semantic and episodic memory. The fMRI results provided partial support for all three theories, though none of them fit the data perfectly. We observed hippocampal activation during all relational tasks, with increased activation for spatial compared to nonspatial relations, and for episodic compared to semantic relations. The placement of activation along the anterior-posterior axis of the hippocampus also differentiated the conditions. We suggest a view of hippocampal function in memory that incorporates aspects of all three theories. Copyright 2009 Wiley-Liss, Inc.
Incompleteness and limit of security theory of quantum key distribution
NASA Astrophysics Data System (ADS)
Hirota, Osamu; Murakami, Dan; Kato, Kentaro; Futami, Fumio
2012-10-01
It is claimed in many papers that a trace distance d guarantees universal composition security in quantum key distribution (QKD) protocols such as BB84. In this introductory paper, we first explain explicitly what the main misconception is in the claim of unconditional security for QKD theory. In general terms, the cause of the misunderstanding about the security claim is the Lemma in the paper of Renner. It suggests that the generation of a perfect random key is assured with probability (1-d), and that its failure probability is d. Thus, it concludes that the generated key provides a perfectly random key sequence whenever the protocol succeeds, so that QKD provides perfect secrecy for the one-time pad; this is the reason for the composition claim. However, the trace distance (or variational distance) is not the probability of such an event. If d is not small enough, the generated key sequence is never uniform. One therefore needs to reconstruct the evaluation of the trace distance if one wants to use it. One should first go back to indistinguishability theory in the computational-complexity setting and clarify the meaning of the value of the variational distance; the same analysis is necessary for the information-theoretic case. The recent series of papers by H. P. Yuen has given the answer to such questions. In this paper, we give a more concise description of Yuen's theory and clarify that the upper-bound theories for the trace distance by Tomamichel et al. and Hayashi et al. are built on Renner's flawed reasoning and are unsuitable as a security analysis. Finally, we introduce a new macroscopic quantum communication scheme to replace qubit-based QKD.
On-Line Planning and Mapping for Chemical Plume Tracing
2004-06-01
Final report (01/04/2001 - 31/10/2004), Department of Electrical Engineering, University of California. The stated objective was to develop and implement on-vehicle planning and mapping theory and software to find, trace, and map chemical plumes. This objective included accurate…
Time-Based Loss in Visual Short-Term Memory is from Trace Decay, not Temporal Distinctiveness
Ricker, Timothy J.; Spiegel, Lauren R.; Cowan, Nelson
2014-01-01
There is no consensus as to why forgetting occurs in short-term memory tasks. In past work, we have shown that forgetting occurs with the passage of time, but there are two classes of theories that can explain this effect. In the present work, we investigate the reason for time-based forgetting by contrasting the predictions of temporal distinctiveness and trace decay in the procedure in which we have observed such loss, involving memory for arrays of characters or letters across several seconds. The first theory, temporal distinctiveness, predicts that increasing the amount of time between trials will lead to less proactive interference, resulting in less forgetting across a retention interval. In the second theory, trace decay, temporal distinctiveness between trials is irrelevant to the loss over a retention interval. Using visual array change detection tasks in four experiments, we find small proactive interference effects on performance under some specific conditions, but no concomitant change in the effect of a retention interval. We conclude that trace decay is the more suitable class of explanations of the time-based forgetting in short-term memory that we have observed, and we suggest the need for further clarity in what the exact basis of that decay may be. PMID:24884646
Grounded Theory as a "Family of Methods": A Genealogical Analysis to Guide Research
ERIC Educational Resources Information Center
Babchuk, Wayne A.
2011-01-01
This study traces the evolution of grounded theory from a nuclear to an extended family of methods and considers the implications that decision-making based on informed choices throughout all phases of the research process has for realizing the potential of grounded theory for advancing adult education theory and practice. [This paper was…
The Birth of Elementary-Particle Physics.
ERIC Educational Resources Information Center
Brown, Laurie M.; Hoddeson, Lillian
1982-01-01
Traces the origin and development of particle physics, concentrating on the roles of cosmic rays and theory. Includes charts highlighting significant events in the development of cosmic-ray physics and quantum field theory. (SK)
An instance theory of associative learning.
Jamieson, Randall K; Crump, Matthew J C; Hannah, Samuel D
2012-03-01
We present and test an instance model of associative learning. The model, Minerva-AL, treats associative learning as cued recall. Memory preserves the events of individual trials in separate traces. A probe presented to memory contacts all traces in parallel and retrieves a weighted sum of the traces, a structure called the echo. Learning of a cue-outcome relationship is measured by the cue's ability to retrieve a target outcome. The theory predicts a number of associative learning phenomena, including acquisition, extinction, reacquisition, conditioned inhibition, external inhibition, latent inhibition, discrimination, generalization, blocking, overshadowing, overexpectation, superconditioning, recovery from blocking, recovery from overshadowing, recovery from overexpectation, backward blocking, backward conditioned inhibition, and second-order retrospective revaluation. We argue that associative learning is consistent with an instance-based approach to learning and memory.
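The storage-and-echo cycle described above can be sketched compactly. The code below follows MINERVA 2 conventions (probabilistic feature encoding, cubed-similarity activation, and an echo formed as the activation-weighted sum of traces) and is only in the spirit of Minerva-AL; the exact equations, feature coding, and parameter values of the published model may differ.

```python
# Hedged sketch of MINERVA-style instance memory: learning as cued recall.
import numpy as np

rng = np.random.default_rng(1)
L = 0.5   # probability that any given feature is encoded into a trace (illustrative)

def encode(event):
    """Store an imperfect copy of the event: each feature kept with probability L."""
    return event * (rng.random(event.shape) < L)

def similarity(probe, trace):
    """Dot product normalized by the number of features active in either vector."""
    relevant = (probe != 0) | (trace != 0)
    return float(probe @ trace) / max(relevant.sum(), 1)

def echo(probe, memory):
    """Activation-weighted sum of all traces; activation is similarity cubed."""
    acts = np.array([similarity(probe, t) ** 3 for t in memory])
    return acts @ np.vstack(memory)

# Toy coding: features 0-3 represent cue A, features 4-7 represent outcome X.
cue_A   = np.array([1, 1, 1, 1, 0, 0, 0, 0], dtype=float)
outcome = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)

memory = []
for trial in range(1, 11):                 # acquisition: cue A always paired with outcome X
    memory.append(encode(cue_A + outcome))
    out = echo(cue_A, memory)[4:]          # outcome portion of the echo retrieved by the cue
    retrieval = out @ outcome[4:] / (np.linalg.norm(out) * np.linalg.norm(outcome[4:]) + 1e-12)
    print(f"trial {trial}: outcome retrieval = {retrieval:.2f}")
```

With probabilistic encoding, the cue's ability to retrieve the outcome grows over trials, which is the sense in which "learning of a cue-outcome relationship is measured by the cue's ability to retrieve a target outcome."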
Fuzzy-Trace Theory and Lifespan Cognitive Development
Brainerd, C J.; Reyna, Valerie F.
2015-01-01
Fuzzy-trace theory (FTT) emphasizes the use of core theoretical principles, such as the verbatim-gist distinction, to predict new findings about cognitive development that are counterintuitive from the perspective of other theories or of common sense. To the extent that such predictions are confirmed, the range of phenomena that are explained expands without increasing the complexity of the theory's assumptions. We examine research on recent examples of such predictions during four epochs of cognitive development: childhood, adolescence, young adulthood, and late adulthood. During the first two, the featured predictions are surprising developmental reversals in false memory (childhood) and in risky decision making (adolescence). During young adulthood, FTT predicts that a retrieval operation that figures centrally in dual-process theories of memory, recollection, is bivariate rather than univariate. During late adulthood, FTT identifies a retrieval operation, reconstruction, that has been omitted from current theories of normal memory declines in aging and pathological declines in dementia. The theory predicts that reconstruction is a major factor in such declines and that it is able to forecast future dementia. PMID:26644632
Fuzzy-Trace Theory and Lifespan Cognitive Development.
Brainerd, C J; Reyna, Valerie F
2015-12-01
Fuzzy-trace theory (FTT) emphasizes the use of core theoretical principles, such as the verbatim-gist distinction, to predict new findings about cognitive development that are counterintuitive from the perspective of other theories or of common sense. To the extent that such predictions are confirmed, the range of phenomena that are explained expands without increasing the complexity of the theory's assumptions. We examine research on recent examples of such predictions during four epochs of cognitive development: childhood, adolescence, young adulthood, and late adulthood. During the first two, the featured predictions are surprising developmental reversals in false memory (childhood) and in risky decision making (adolescence). During young adulthood, FTT predicts that a retrieval operation that figures centrally in dual-process theories of memory, recollection, is bivariate rather than univariate. During late adulthood, FTT identifies a retrieval operation, reconstruction, that has been omitted from current theories of normal memory declines in aging and pathological declines in dementia. The theory predicts that reconstruction is a major factor in such declines and that it is able to forecast future dementia.
Miklius, Asta; Flower, M.F.J.; Huijsmans, J.P.P.; Mukasa, S.B.; Castillo, P.
1991-01-01
Taal lava series can be distinguished from each other by differences in major and trace element trends and trace element ratios, indicating multiple magmatic systems associated with discrete centers in time and space. On Volcano Island, contemporaneous lava series range from typically calc-alkaline to iron-enriched. Major and trace element variation in these series can be modelled by fractionation of similar assemblages, with early fractionation of titano-magnetite in less iron-enriched series. However, phase compositional and petrographic evidence of mineral-liquid disequilibrium suggests that magma mixing played an important role in the evolution of these series. -from Authors
Beam-tracing model for predicting sound fields in rooms with multilayer bounding surfaces
NASA Astrophysics Data System (ADS)
Wareing, Andrew; Hodgson, Murray
2005-10-01
This paper presents the development of a wave-based room-prediction model for predicting steady-state sound fields in empty rooms with specularly reflecting, multilayer surfaces. A triangular beam-tracing model with phase, and a transfer-matrix approach to model the surfaces, were involved. Room surfaces were modeled as multilayers of fluid, solid, or porous materials. Biot theory was used in the transfer-matrix formulation of the porous layer. The new model consisted of the transfer-matrix model integrated into the beam-tracing algorithm. The transfer-matrix model was validated by comparing predictions with those by theory, and with experiment. The test surfaces were a glass plate, double drywall panels, double steel panels, a carpeted floor, and a suspended-acoustical ceiling. The beam-tracing model was validated in the cases of three idealized room configurations-a small office, a corridor, and a small industrial workroom-with simple boundary conditions. The number of beams, the reflection order, and the frequency resolution required to obtain accurate results were investigated. Beam-tracing predictions were compared with those by a method-of-images model with phase. The model will be used to study sound fields in rooms with local- or extended-reaction multilayer surfaces.
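As a much simpler stand-in for the Biot-theory multilayers used in the model above, the sketch below computes the normal-incidence absorption of a rigid-backed porous layer, treating the porous material as an equivalent fluid via the empirical Delany-Bazley relations and translating the impedance through the layer. The flow resistivity and thickness are illustrative, not values from the paper.

```python
# Hedged sketch: normal-incidence absorption of a rigid-backed porous layer
# (equivalent-fluid Delany-Bazley model, e^{j w t} convention).
import numpy as np

rho0, c0 = 1.21, 343.0                  # air density (kg/m^3) and speed of sound (m/s)

def delany_bazley(f, sigma):
    """Characteristic impedance Zc and complex wavenumber kc of the porous layer."""
    X = rho0 * f / sigma
    Zc = rho0 * c0 * (1 + 0.0571 * X**-0.754 - 1j * 0.087 * X**-0.732)
    kc = (2 * np.pi * f / c0) * (1 + 0.0978 * X**-0.700 - 1j * 0.189 * X**-0.595)
    return Zc, kc

def absorption(f, sigma=10000.0, d=0.05):
    """Normal-incidence absorption coefficient of a layer of thickness d on a rigid wall."""
    Zc, kc = delany_bazley(f, sigma)
    Z_in = -1j * Zc / np.tan(kc * d)    # surface impedance of the rigid-backed fluid layer
    R = (Z_in - rho0 * c0) / (Z_in + rho0 * c0)
    return 1 - abs(R) ** 2

for f in [125, 250, 500, 1000, 2000, 4000]:
    print(f"{f:5d} Hz  alpha = {absorption(f):.2f}")
```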
NASA Astrophysics Data System (ADS)
Sahu, Sunil Kumar; Singh, Reena; Kathiresan, Kandasamy
2016-12-01
Mangroves are a taxonomically diverse group of salt-tolerant, mainly arboreal, flowering plants that grow in tropical and sub-tropical regions and have adapted themselves to thrive in such obdurate surroundings. While evolution is often understood exclusively in terms of adaptation, innovation often begins when a feature adapted for one function is co-opted for a different purpose, and the co-opted features are called exaptations. Thus, one of the fundamental issues is what features of mangroves have evolved through exaptation. We attempt to address these questions through a molecular phylogenetic approach using chloroplast and nuclear markers. First, we determined if these mangrove-specific traits have evolved multiple times in the phylogeny. Once the multiple origins were established, we then looked at related non-mangrove species for characters that could have been co-opted by mangrove species. We also assessed the efficacy of these molecular sequences in distinguishing mangroves at the species level. This study revealed the multiple origins of mangroves, shed light on the ancestral characters that might have led certain lineages of plants to adapt to estuarine conditions, and traced the evolutionary history of mangroves and the hitherto unexplained theory that mangrove traits (aerial roots and viviparous propagules) evolved as a result of exaptation rather than adaptation to saline habitats.
Learning theory and its application to the use of social media in medical education.
Flynn, Leslie; Jalali, Alireza; Moreau, Katherine A
2015-10-01
There is rapidly increasing pressure to employ social media in medical education, but a review of the literature demonstrates that its value and role are uncertain. The aim was to determine whether medical educators have a conceptual framework that informs their use of social media and whether this framework can be mapped to learning theory. Thirty-six participants engaged in an iterative, consensus-building process that identified their conceptual framework and determined if it aligned with one or more learning theories. The results show that the use of social media by the participants could be traced to two dominant theories: Connectivism and Constructivism. They also suggest that many medical educators may not be fully informed of these theories. Medical educators' use of social media can be traced to learning theories, but these theories may not be explicitly utilised in instructional design. It is recommended that formal education (faculty development) around learning theory would further enhance the use of social media in medical education.
Dual Processes in Decision Making and Developmental Neuroscience: A Fuzzy-Trace Model
Reyna, Valerie F.; Brainerd, Charles J.
2011-01-01
From Piaget to the present, traditional and dual-process theories have predicted improvement in reasoning from childhood to adulthood, and improvement has been observed. However, developmental reversals (the finding that reasoning biases emerge with development) have also been observed in a growing list of paradigms. We explain how fuzzy-trace theory predicts both improvement and developmental reversals in reasoning and decision making. Drawing on research on logical and quantitative reasoning, as well as on risky decision making in the laboratory and in life, we illustrate how the same small set of theoretical principles apply to typical neurodevelopment, encompassing childhood, adolescence, and adulthood, and to neurological conditions such as autism and Alzheimer's disease. For example, framing effects (the finding that risk preferences shift when the same decisions are phrased in terms of gains versus losses) emerge in early adolescence as gist-based intuition develops. In autistic individuals, who rely less on gist-based intuition and more on verbatim-based analysis, framing biases are attenuated (i.e., they outperform typically developing control subjects). In adults, simple manipulations based on fuzzy-trace theory can make framing effects appear and disappear depending on whether gist-based intuition or verbatim-based analysis is induced. These theoretical principles are summarized and integrated in a new mathematical model that specifies how dual modes of reasoning combine to produce predictable variability in performance. In particular, we show how the most popular and extensively studied model of decision making, prospect theory, can be derived from fuzzy-trace theory by combining analytical (verbatim-based) and intuitive (gist-based) processes. PMID:22096268
Piaget's Theory of Child Development
ERIC Educational Resources Information Center
Case, Robbie
1972-01-01
This article traces Piaget's theory of child development from its philosophic foundations in Kantian organization and then describes in sequence Piaget's four stages. (A follow-up article on Piaget and educational practice will appear in a later issue.) (JA)
Communication Theory and the Consumer Movement-
ERIC Educational Resources Information Center
Newsom, Doug
1977-01-01
Defines and traces the origins of the consumer movement and uses communication theories to explain the effects of the movement. Available from: Public Relations Review, Ray Hiebert, Dean, College of Journalism, University of Maryland, College Park, MD 20742. (MH)
Differential Models for B-Type Open-Closed Topological Landau-Ginzburg Theories
NASA Astrophysics Data System (ADS)
Babalic, Elena Mirela; Doryn, Dmitry; Lazaroiu, Calin Iuliu; Tavakol, Mehdi
2018-05-01
We propose a family of differential models for B-type open-closed topological Landau-Ginzburg theories defined by a pair (X,W), where X is any non-compact Calabi-Yau manifold and W is any holomorphic complex-valued function defined on X whose critical set is compact. The models are constructed at cochain level using smooth data, including the twisted Dolbeault algebra of polyvector-valued forms and a twisted Dolbeault category of holomorphic factorizations of W. We give explicit proposals for cochain level versions of the bulk and boundary traces and for the bulk-boundary and boundary-bulk maps of the Landau-Ginzburg theory. We prove that most of the axioms of an open-closed TFT (topological field theory) are satisfied on cohomology and conjecture that the remaining two axioms (namely non-degeneracy of bulk and boundary traces and the topological Cardy constraint) are also satisfied.
Quantifying Bell nonlocality with the trace distance
NASA Astrophysics Data System (ADS)
Brito, S. G. A.; Amaral, B.; Chaves, R.
2018-02-01
Measurements performed on distant parts of an entangled quantum state can generate correlations incompatible with classical theories respecting the assumption of local causality. This is the phenomenon known as quantum nonlocality that, apart from its fundamental role, can also be put to practical use in applications such as cryptography and distributed computing. Clearly, developing ways of quantifying nonlocality is an important primitive in this scenario. Here, we propose to quantify the nonlocality of a given probability distribution via its trace distance to the set of classical correlations. We show that this measure is a monotone under the free operations of a resource theory and, furthermore, that it can be computed efficiently with a linear program. We put our framework to use in a variety of relevant Bell scenarios also comparing the trace distance to other standard measures in the literature.
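The linear-programming computation mentioned above can be sketched for the simplest (CHSH) scenario: minimize the L1 distance between a behavior p(a,b|x,y) and the convex hull of local deterministic strategies. The normalization conventions of the published measure may differ from the plain L1/2 used here; the example behavior is the quantum distribution at the Tsirelson bound.

```python
# Hedged sketch: trace/L1 distance from a behavior to the local polytope via a linear program.
import numpy as np
from itertools import product
from scipy.optimize import linprog

# Index the 16 entries of a two-party, two-setting, two-outcome behavior as (a, b, x, y).
entries = list(product(range(2), range(2), range(2), range(2)))

# Local deterministic strategies: lambda = (a0, a1, b0, b1).
strategies = list(product(range(2), repeat=4))
D = np.array([[1.0 if (a == lam[x] and b == lam[2 + y]) else 0.0
               for lam in strategies]
              for (a, b, x, y) in entries])            # shape (16 entries, 16 strategies)

# Quantum behavior saturating the Tsirelson bound of the CHSH inequality.
p = np.array([(1 + ((-1) ** (a + b + x * y)) / np.sqrt(2)) / 4
              for (a, b, x, y) in entries])

n_lam, n_ent = len(strategies), len(entries)
c = np.concatenate([np.zeros(n_lam), np.ones(n_ent)])          # minimize the sum of slacks t
A_ub = np.block([[D, -np.eye(n_ent)], [-D, -np.eye(n_ent)]])   # enforce |D q - p| <= t
b_ub = np.concatenate([p, -p])
A_eq = np.concatenate([np.ones(n_lam), np.zeros(n_ent)]).reshape(1, -1)   # weights sum to 1
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))

print(f"distance to the local set (L1/2) = {res.fun / 2:.4f}")  # strictly positive => nonlocal
```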
Fast solar radiation pressure modelling with ray tracing and multiple reflections
NASA Astrophysics Data System (ADS)
Li, Zhen; Ziebart, Marek; Bhattarai, Santosh; Harrison, David; Grey, Stuart
2018-05-01
Physics-based SRP (Solar Radiation Pressure) models using ray-tracing methods are powerful tools when modelling the forces on complex real-world space vehicles. Currently, high-resolution (1 mm) ray tracing with secondary intersections is done on high-performance computers at UCL (University College London). This study introduces the BVH (Bounding Volume Hierarchy) into the ray-tracing approach for physics-based SRP modelling and makes it possible to run high-resolution analysis on personal computers. The ray tracer is both general and efficient enough to cope with the complex shape of satellites and multiple reflections (three or more, with no upper limit). In this study, the traditional ray-tracing technique is introduced first and the BVH is then integrated into the ray tracing. Four aspects of the ray tracer were tested to assess its performance: runtime, accuracy, the effects of multiple reflections, and the effects of pixel array resolution. Runtime tests on GPS IIR and Galileo IOV (In Orbit Validation) satellites show that the BVH can make the force model computation 30-50 times faster. Comparing the test results for spheres and planes with analytical computations shows that the ray tracer has an absolute accuracy of several nanonewtons. The multiple-reflection effects are investigated both in the number of intersections and in the acceleration for the GPS IIR, Galileo IOV and Sentinel-1 spacecraft. Considering the number of intersections, the 3rd reflection captures 99.12%, 99.14%, and 91.34% of the total reflections for GPS IIR, the Galileo IOV satellite bus, and the Sentinel-1 spacecraft, respectively. In terms of the multiple-reflection effects on the acceleration, the secondary reflection effect for the Galileo IOV satellite and Sentinel-1 can reach 0.2 nm/s² and 0.4 nm/s², respectively. The error percentage in the acceleration magnitude shows that the 3rd reflection should be considered in order to keep it below 0.035%. The pixel array resolution tests show that the dimensions of the components have to be considered when choosing the pixel spacing, so that no components of the satellite are missed in the ray tracing. This paper presents the first systematic and quantitative study of the secondary and higher-order intersection effects. It shows conclusively that the effect is non-negligible for certain classes of mission.
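A stripped-down sketch of the pixel-array idea is given below, with a single flat plate, a single bounce, and no BVH: each pixel carries a slice of the incident momentum flux, and hitting rays deposit force according to absorbed, specularly reflected, and diffusely (Lambertian) re-emitted components. The plate geometry, optical coefficients, pixel spacing, and spacecraft mass are all illustrative assumptions, not values from the paper.

```python
# Hedged sketch: pixel-array ray tracing of SRP on one flat plate (single bounce).
import numpy as np

flux, c = 1361.0, 299792458.0            # solar flux at 1 AU (W/m^2), speed of light (m/s)
alpha, rho_s, rho_d = 0.2, 0.5, 0.3      # absorbed / specular / diffuse fractions (sum to 1)

d = np.array([0.0, 0.0, -1.0])           # ray direction (Sun -> spacecraft)
tilt = np.radians(30.0)
n = np.array([0.0, np.sin(tilt), np.cos(tilt)])     # plate outward normal (faces the Sun)
u = np.array([1.0, 0.0, 0.0]); v = np.cross(n, u)   # in-plane plate axes
half_u, half_v = 1.0, 0.5                           # plate half-dimensions (m)

pixel = 0.005                                       # pixel spacing (m); the paper uses 1 mm
xs, ys = np.meshgrid(np.arange(-1.5, 1.5, pixel), np.arange(-1.5, 1.5, pixel))
origins = np.stack([xs.ravel(), ys.ravel(), np.full(xs.size, 5.0)], axis=1)

t = -(origins @ n) / (d @ n)                        # distance along d to the plate plane
hits = origins + t[:, None] * d
inside = (np.abs(hits @ u) <= half_u) & (np.abs(hits @ v) <= half_v)

# Momentum budget per hitting ray: absorbed + diffuse arrive along d; specular reflection
# transfers 2*rho_s*(d.n) along n; Lambertian re-emission recoils with 2/3 of its momentum along n.
p_pixel = flux * pixel ** 2 / c
per_ray = p_pixel * ((alpha + rho_d) * d + 2 * rho_s * (d @ n) * n - (2.0 / 3.0) * rho_d * n)
force = inside.sum() * per_ray                      # every hitting ray is identical for a flat plate

print(f"SRP force on the plate: {force} N  ({inside.sum()} of {inside.size} rays hit)")
print(f"Acceleration for an assumed 1000 kg spacecraft: {force / 1000.0} m/s^2")
```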
The probability density function (PDF) of Lagrangian Turbulence
NASA Astrophysics Data System (ADS)
Birnir, B.
2012-12-01
The statistical theory of Lagrangian turbulence is derived from the stochastic Navier-Stokes equation. Assuming that the noise in fully-developed turbulence is a generic noise determined by the general theorems in probability, the central limit theorem and the large deviation principle, we are able to formulate and solve the Kolmogorov-Hopf equation for the invariant measure of the stochastic Navier-Stokes equations. The intermittency corrections to the scaling exponents of the structure functions require a multiplicative (multiplying the fluid velocity) noise in the stochastic Navier-Stokes equation. We let this multiplicative noise in the equation consist of a simple (Poisson) jump process and then show how the Feynman-Kac formula produces the log-Poissonian processes found by She and Leveque, Waymire and Dubrulle. These log-Poissonian processes give the intermittency corrections that agree with modern direct Navier-Stokes simulations (DNS) and experiments. The probability density function (PDF) plays a key role when direct Navier-Stokes simulations or experimental results are compared to theory. The statistical theory of turbulence, including the scaling of the structure functions of turbulence, is determined by the invariant measure of the Navier-Stokes equation, and the PDFs for the various statistics (one-point, two-point, N-point) can be obtained by taking the trace of the corresponding invariant measures. Hopf derived in 1952 a functional equation for the characteristic function (Fourier transform) of the invariant measure. In distinction to the nonlinear Navier-Stokes equation, this is a linear functional differential equation. The PDFs obtained from the invariant measures for the velocity differences (two-point statistics) are shown to be the four-parameter generalized hyperbolic distributions found by Barndorff-Nielsen. These PDFs have heavy tails and a convex peak at the origin. A suitable projection of the Kolmogorov-Hopf equation is the differential equation determining the generalized hyperbolic distributions. Then we compare these PDFs with DNS results and experimental data.
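For reference, the generalized hyperbolic density of Barndorff-Nielsen mentioned above can be evaluated directly from modified Bessel functions. The sketch below uses the standard (lambda, alpha, beta, delta, mu) parameterization with illustrative parameter values, not anything fitted to turbulence data.

```python
# Hedged sketch: evaluating the generalized hyperbolic (GH) density and checking normalization.
import numpy as np
from scipy.special import kv          # modified Bessel function of the second kind
from scipy.integrate import quad

def gh_pdf(x, lam, alpha, beta, delta, mu=0.0):
    """Generalized hyperbolic density in the standard (lambda, alpha, beta, delta, mu) form."""
    gamma = np.sqrt(alpha**2 - beta**2)
    q = np.sqrt(delta**2 + (x - mu)**2)
    norm = (gamma / delta)**lam / (np.sqrt(2 * np.pi) * kv(lam, delta * gamma))
    return norm * (q / alpha)**(lam - 0.5) * kv(lam - 0.5, alpha * q) * np.exp(beta * (x - mu))

params = dict(lam=-0.5, alpha=2.0, beta=0.3, delta=1.0)   # lambda = -1/2 is the NIG special case

total, _ = quad(lambda x: gh_pdf(x, **params), -np.inf, np.inf)
print(f"integral of the density = {total:.4f}")            # should be close to 1

# Heavy tails relative to the convex peak at the origin:
for x in [0.0, 2.0, 5.0, 10.0]:
    print(f"x = {x:4.1f}: GH pdf = {gh_pdf(x, **params):.3e}")
```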
Time-based loss in visual short-term memory is from trace decay, not temporal distinctiveness.
Ricker, Timothy J; Spiegel, Lauren R; Cowan, Nelson
2014-11-01
There is no consensus as to why forgetting occurs in short-term memory tasks. In past work, we have shown that forgetting occurs with the passage of time, but there are 2 classes of theories that can explain this effect. In the present work, we investigate the reason for time-based forgetting by contrasting the predictions of temporal distinctiveness and trace decay in the procedure in which we have observed such loss, involving memory for arrays of characters or letters across several seconds. The 1st theory, temporal distinctiveness, predicts that increasing the amount of time between trials will lead to less proactive interference, resulting in less forgetting across a retention interval. In the 2nd theory, trace decay, temporal distinctiveness between trials is irrelevant to the loss over a retention interval. Using visual array change detection tasks in 4 experiments, we find small proactive interference effects on performance under some specific conditions, but no concomitant change in the effect of a retention interval. We conclude that trace decay is the more suitable class of explanations of the time-based forgetting in short-term memory that we have observed, and we suggest the need for further clarity in what the exact basis of that decay may be. PsycINFO Database Record (c) 2014 APA, all rights reserved.
Gibs, J.; Wicklund, A.; Suffet, I.H.
1986-01-01
The 'rule of thumb' that large volumes of water can be sampled for trace organic pollutants by XAD resin columns designed from small-column laboratory studies of pure compounds is examined and shown to be problematic. A theory of multicomponent breakthrough is presented as a frame of reference to help solve the problem and to develop usable criteria to aid the design of resin columns. An important part of the theory is the effect of humic substances on the breakthrough characteristics of multicomponent chemical systems.
The mysteries of remote memory.
Albo, Zimbul; Gräff, Johannes
2018-03-19
Long-lasting memories form the basis of our identity as individuals and lie central in shaping future behaviours that guide survival. Surprisingly, however, our current knowledge of how such memories are stored in the brain and retrieved, as well as the dynamics of the circuits involved, remains scarce despite seminal technical and experimental breakthroughs in recent years. Traditionally, it has been proposed that, over time, information initially learnt in the hippocampus is stored in distributed cortical networks. This process-the standard theory of memory consolidation-would stabilize the newly encoded information into a lasting memory, become independent of the hippocampus, and remain essentially unmodifiable throughout the lifetime of the individual. In recent years, several pieces of evidence have started to challenge this view and indicate that long-lasting memories might already ab ovo be encoded, and subsequently stored in distributed cortical networks, akin to the multiple trace theory of memory consolidation. In this review, we summarize these recent findings and attempt to identify the biologically plausible mechanisms based on which a contextual memory becomes remote by integrating different levels of analysis: from neural circuits to cell ensembles across synaptic remodelling and epigenetic modifications. From these studies, remote memory formation and maintenance appear to occur through a multi-trace, dynamic and integrative cellular process ranging from the synapse to the nucleus, and represent an exciting field of research primed to change quickly as new experimental evidence emerges. This article is part of a discussion meeting issue 'Of mice and mental health: facilitating dialogue between basic and clinical neuroscientists'. © 2018 The Authors.
ERIC Educational Resources Information Center
Hoffman, Sharon C.
2008-01-01
The purpose of this historical review was to trace the credible leadership construct of trustworthiness, integrity, honesty, and consistency in leadership theory development during the last 100 years in the United States. Theory focus, key U.S. pivotal events, and follower importance influenced the construct's occurrence in leadership theory. …
Milne boost from Galilean gauge theory
NASA Astrophysics Data System (ADS)
Banerjee, Rabin; Mukherjee, Pradip
2018-03-01
The physical origin of Milne boost invariance of the Newton-Cartan spacetime is traced to the effect of local Galilean boosts on its metric structure, using Galilean gauge theory. Specifically, we do not require any gauge field to understand Milne boost invariance.
Development and application of social learning theory.
Price, V; Archbold, J
This article traces the development of social learning theory over the last 30 years, relating the developments to clinical nursing practice. Particular attention is focused on the contribution of Albert Bandura, the American psychologist, and his work on modelling.
A new method for automated discontinuity trace mapping on rock mass 3D surface model
NASA Astrophysics Data System (ADS)
Li, Xiaojun; Chen, Jianqin; Zhu, Hehua
2016-04-01
This paper presents an automated discontinuity trace mapping method on a 3D surface model of rock mass. Feature points of discontinuity traces are first detected using the Normal Tensor Voting Theory, which is robust to noisy point cloud data. Discontinuity traces are then extracted from feature points in four steps: (1) trace feature point grouping, (2) trace segment growth, (3) trace segment connection, and (4) redundant trace segment removal. A sensitivity analysis is conducted to identify optimal values for the parameters used in the proposed method. The optimal triangular mesh element size is between 5 cm and 6 cm; the angle threshold in the trace segment growth step is between 70° and 90°; the angle threshold in the trace segment connection step is between 50° and 70°, and the distance threshold should be at least 15 times the mean triangular mesh element size. The method is applied to the excavation face trace mapping of a drill-and-blast tunnel. The results show that the proposed discontinuity trace mapping method is fast and effective and could be used as a supplement to traditional direct measurement of discontinuity traces.
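As a rough illustration of the trace-segment connection step described above, the sketch below joins neighbouring segments when an endpoint gap and a direction angle fall within thresholds in the ranges reported in the abstract; the function name, data structures, and default values are assumptions, not the authors' implementation.

```python
import numpy as np

def connect_segments(segments, angle_thresh_deg=60.0, dist_factor=15.0, mesh_size=0.05):
    """Join two trace segments (each an ordered (k, 3) array of points) when the
    gap between them is small and their directions are similar. Thresholds follow
    the reported ranges (angle 50-70 deg, distance >= 15 x mean mesh element size)."""
    max_dist = dist_factor * mesh_size
    connected, used = [], set()
    for i, a in enumerate(segments):
        if i in used:
            continue
        for j, b in enumerate(segments):
            if j <= i or j in used:
                continue
            gap = np.linalg.norm(a[-1] - b[0])            # tail-to-head distance
            da, db = a[-1] - a[0], b[-1] - b[0]           # segment directions
            cosang = np.dot(da, db) / (np.linalg.norm(da) * np.linalg.norm(db) + 1e-12)
            angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
            if gap <= max_dist and angle <= angle_thresh_deg:
                connected.append(np.vstack([a, b]))       # merge the two segments
                used.update((i, j))
                break
        else:
            connected.append(a)                           # no partner found
    return connected
```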
Retrograde Amnesia for Episodic and Semantic Memories in Amnestic Mild Cognitive Impairment.
De Simone, Maria Stefania; Fadda, Lucia; Perri, Roberta; De Tollis, Massimo; Aloisi, Marta; Caltagirone, Carlo; Carlesimo, Giovanni Augusto
2017-01-01
Retrograde amnesia (RA), which includes loss of memory for past personal events (autobiographical RA) and for acquired knowledge (semantic RA), has been widely documented in patients with amnestic mild cognitive impairment (aMCI). However, previous studies have produced controversial results, particularly concerning the temporal extent of memory impairment. Here we investigated whether, with the onset of hippocampal pathology, age of memory acquisition and retrieval frequency play different roles in modulating the progressive loss of semantic and episodic contents of retrograde memory, respectively. For this purpose, aMCI patients and healthy controls were tested for the ability to recall semantic and autobiographical information related to famous public events as a function of both age of acquisition and retrieval frequency. In aMCI patients, we found that the impairment in recollecting past personal incidents was modulated by the combined action of memory age and retrieval frequency, because older and more frequently retrieved episodes are less susceptible to loss than more recent and less frequently retrieved ones. On the other hand, we found that the loss of semantic information depended only on memory age, because the remoteness of the trace allows for better preservation of the memory. Our results provide evidence that the loss of the two components of retrograde memory is regulated by different mechanisms. This supports the view that diverse neural mechanisms are involved in episodic and semantic memory trace storage and retrieval, as postulated by the Multiple Trace Theory.
Smith, Troy A; Kimball, Daniel R
2012-01-01
Leading theories of false memory predict that veridical and false recall of lists of semantically associated words can be dissociated by varying the presentation speed during study. Specifically, as presentation rate increases from milliseconds to seconds, veridical recall is predicted to increase monotonically, while false recall is predicted to show a rapid rise and then a slow decrease--a pattern shown by McDermott and Watson (2001) in a study using immediate recall tests. In three experiments we tested the generality of the effects of rapid presentation rates on veridical and false memory. In Experiments 1 and 2, participants exhibited high levels of false recall on a delayed recall test, even for very fast stimulus onset asynchronies (SOAs)--contrary to predictions from leading theories of false memory. When we switched to an immediate recall test in Experiment 3, we replicated the pattern predicted by the theories and observed by McDermott and Watson. Follow-up analyses further showed that the relative output position of false recalls is not affected by presentation rate, contrary to predictions from fuzzy trace theory. Implications for theories of false memory, including activation monitoring theory and fuzzy trace theory, are discussed.
Nam, Haewon
2017-01-01
We propose a novel metal artifact reduction (MAR) algorithm for CT images that completes a corrupted sinogram along the metal trace region. When metal implants are located inside the field of view, they create a barrier to the transmitted X-ray beam due to the high attenuation of metals, which significantly degrades image quality. To fill in the metal trace region efficiently, the proposed algorithm uses multiple prior images with residual error compensation in sinogram space. Multiple prior images are generated by applying a recursive active contour (RAC) segmentation algorithm to the pre-corrected image acquired by MAR with linear interpolation, where the number of prior images is controlled by RAC depending on object complexity. A sinogram basis is then acquired by forward projection of the prior images. The metal trace region of the original sinogram is replaced by the linearly combined sinogram of the prior images. An additional correction in the metal trace region is then performed to compensate for residual errors caused by non-ideal data acquisition conditions. The performance of the proposed MAR algorithm is compared with MAR with linear interpolation and the normalized MAR algorithm using simulated and experimental data. The results show that the proposed algorithm outperforms the other MAR algorithms, especially when the object is complex, with multiple bone objects. PMID:28604794
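A minimal sketch of the sinogram-completion idea, assuming the sinogram and prior sinograms are NumPy arrays of the same shape and the metal trace is given as a boolean mask; the least-squares weight fit is an illustrative choice, and the residual-error compensation step of the proposed method is omitted.

```python
import numpy as np

def complete_metal_trace(sinogram, metal_trace_mask, prior_sinograms):
    """Replace the metal-trace region of a measured sinogram with a linear
    combination of forward-projected prior sinograms, keeping original data
    outside the trace. A least-squares fit on the uncorrupted region sets the
    combination weights (an assumed, simplified stand-in for the full method)."""
    outside = ~metal_trace_mask
    A = np.stack([p[outside] for p in prior_sinograms], axis=1)   # (npix, npriors)
    b = sinogram[outside]
    weights, *_ = np.linalg.lstsq(A, b, rcond=None)
    combined = sum(w * p for w, p in zip(weights, prior_sinograms))
    corrected = sinogram.copy()
    corrected[metal_trace_mask] = combined[metal_trace_mask]      # fill the trace
    return corrected
```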
On the Performance Characteristics of Latent-Factor and Knowledge Tracing Models
ERIC Educational Resources Information Center
Klingler, Severin; Käser, Tanja; Solenthaler, Barbara; Gross, Markus
2015-01-01
Modeling student knowledge is a fundamental task of an intelligent tutoring system. A popular approach for modeling the acquisition of knowledge is Bayesian Knowledge Tracing (BKT). Various extensions to the original BKT model have been proposed, among them two novel models that unify BKT and Item Response Theory (IRT). Latent Factor Knowledge…
Decay uncovered in nonverbal short-term memory.
Mercer, Tom; McKeown, Denis
2014-02-01
Decay theory posits that memory traces gradually fade away over the passage of time unless they are actively rehearsed. Much recent work exploring verbal short-term memory has challenged this theory, but there does appear to be evidence for trace decay in nonverbal auditory short-term memory. Numerous discrimination studies have reported a performance decline as the interval separating two tones is increased, consistent with a decay process. However, most of this tone comparison research can be explained in other ways, without reference to decay, and these alternative accounts were tested in the present study. In Experiment 1, signals were employed toward the end of extended retention intervals to ensure that listeners were alert to the presence and frequency content of the memoranda. In Experiment 2, a mask stimulus was employed in an attempt to distinguish between a highly detailed sensory trace and a longer-lasting short-term memory, and the distinctiveness of the stimuli was varied. Despite these precautions, slow-acting trace decay was observed. It therefore appears that the mere passage of time can lead to forgetting in some forms of short-term memory.
Consequences of Diffusion of Innovations.
ERIC Educational Resources Information Center
Goss, Kevin F.
1979-01-01
The article traces the evolution of diffusion theory; illustrates undesirable consequences in a cross-cultural setting; reviews criticisms of several scholars; considers distributional effects and unanticipated consequences for their potential ameliorative impact on diffusion theory; and codifies these factors into a framework for research into consequences…
ERIC Educational Resources Information Center
Marks, Stephen R.
1974-01-01
Durkheim's theory of anomie is traced and argued to be a major development that followed the publication of "Suicide." Recognition of anomie as a macrosociological problem rendered it insoluble by Durkheim's practical-humanistic orientation. In this connection his remedial proposals -- occupational, political, educational, and…
ERIC Educational Resources Information Center
Laidlaw, Toni; Malmo, Cheryl
1991-01-01
Traces roots of feminist therapy and its independence from traditional and prevalent theories and therapy practices. Asserts that Freudian theory and humanistic assumptions are sexist and contribute to powerlessness of women. In contrast, feminist therapy is seen as dealing directly with client-counselor relationships, trust, advocacy, and…
NASA Technical Reports Server (NTRS)
Iesan, D.
1980-01-01
The development of the theory of thermoelasticity, which examines the interactions between the deformation of elastic media and the thermal field, is traced, and the fundamental problems of the theory are presented. Results of recent studies on the subject are reviewed. Emphasis is primarily on media with generalized anisotropy or isotropic media. Thermomechanical problems and their mathematical formulations and solutions are included.
From Myths to Models: The (Re)Production of World Culture in Comparative Education
ERIC Educational Resources Information Center
Silova, Iveta; Brehm, William C.
2015-01-01
This article traces the emergence of the world culture theory in comparative education using critical discourse analysis. By chronicling the emergence and expansion of world culture theory over the past four decades, we highlight the (unintended) limitations and exclusive regimes of thought that have resulted. We argue that the theory's…
Constructing a Grounded Theory of E-Learning Assessment
ERIC Educational Resources Information Center
Alonso-Díaz, Laura; Yuste-Tosina, Rocío
2015-01-01
This study traces the development of a grounded theory of assessment in e-learning environments, a field in need of research to establish the parameters of an assessment that is both reliable and worthy of higher learning accreditation. Using grounded theory as a research method, we studied an e-assessment model that does not require physical…
Tennessee to Texas: Tracing the Evolution Controversy in Public Education
ERIC Educational Resources Information Center
Armenta, Tony; Lane, Kenneth E.
2010-01-01
Darwin's Theory of Evolution has stirred controversy since its inception. Public schools in the United States, pressed by special interest groups on both sides of the controversy, have struggled with how best to teach the theory, if at all. Court cases have dealt with whether states can ban the teaching of evolutionary theory, whether Creationism…
ERIC Educational Resources Information Center
Wright, Ruth; Froehlich, Hildegard
2012-01-01
This article describes Basil Bernstein's theory of the pedagogic device as applied to school music instruction. Showing that educational practices are not personal choices alone, but the result of socio-political mandates, the article traces how education functions as a vehicle for social reproduction. Bernstein called this process the…
ERIC Educational Resources Information Center
Wilks, Duffy
2003-01-01
This review traces the development of counseling theory in relation to the philosophical constructs of free will and determinism. Problems associated with free will are discussed, and an analysis of related theoretical trends and convergent paradigms is provided. Results indicate that no major theory of counseling addresses the free will versus…
Asbridge, Mark
2004-01-01
While much is known about the impact of law and public policy, we know considerably less about their antecedents. Theories of policy adoption suggest that a variety of policy inputs help to shape legislative change. This research considers the enactment of municipal smoking bylaws in Canada between 1970 and 1995. The emergence of second-hand smoke (SHS) as a public health concern has been offered as a viable explanation for the increased enactment of local smoking restrictions, and a number of indicators confirm the rising public health concern around SHS. Using Health Canada data on municipal smoking bylaw enactment in Canada, this paper employs an event history analysis to trace the role of four indicators of the increased recognition of SHS as a public health concern: scientific research, parliamentary debate, print media, and health advocacy. Findings indicate that the print media and health advocacy play the strongest role in explaining the increase in the adoption of municipal smoking bylaws in Canada. Results lend support to the quantitative study of the policy adoption process and to theories of policy making that consider multiple influences on policy adoption.
On Mars too, expect macroweather
NASA Astrophysics Data System (ADS)
Boisvert, Jean-Philippe; Lovejoy, Shaun; Muller, Jan-Peter
2015-04-01
Terrestrial atmospheric and oceanic spectra show drastic transitions at τw ≈ 10 days and τow ≈ 1 year, respectively; this has been theorized as the lifetime of planetary-scale structures. For wind and temperature, the forms of the low- and high-frequency parts of the spectra (macroweather, weather), as well as τw, can be theoretically estimated, the latter depending notably on the solar-induced turbulent energy flux. We extend the theory to other planets and test it using Viking lander and reanalysis data from Mars. When the Martian spectra are scaled by the theoretical amount, they agree very well with their terrestrial atmospheric counterparts. Although the usual interpretation of Martian atmospheric dynamics is highly mechanistic (e.g., wave and tidal explanations are invoked), trace moment analysis of the reanalysis fields shows that the statistics respect the predictions of multiplicative cascade theories well. This shows that statistical scaling can be compatible with conventional deterministic thinking. However, since we are usually interested in statistical knowledge, it is the former, not the latter, that is of primary interest. We discuss the implications for understanding planetary fluid dynamical systems.
Correlative weighted stacking for seismic data in the wavelet domain
Zhang, S.; Xu, Y.; Xia, J.; ,
2004-01-01
Horizontal stacking plays a crucial role in modern seismic data processing, for it not only suppresses random noise and multiple reflections but also provides foundational data for subsequent migration and inversion. However, a number of examples have shown that random noise in adjacent traces exhibits correlation and coherence. Average stacking and weighted stacking based on the conventional correlation function both produce false events caused by such noise. Wavelet transforms and higher-order statistics are very useful methods in modern signal processing. The multiresolution analysis in wavelet theory can decompose a signal on different scales, and higher-order correlation functions can suppress correlated noise, for which the conventional correlation function is of no use. Based on the theory of wavelet transforms and higher-order statistics, a high-order correlative weighted stacking (HOCWS) technique is presented in this paper. Its essence is to stack common-midpoint gathers after normal moveout correction by weights that are calculated through higher-order correlative statistics in the wavelet domain. Synthetic examples demonstrate its advantages in improving the signal-to-noise (S/N) ratio and suppressing correlated random noise.
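To make the weighted-stacking idea concrete, the sketch below weights each NMO-corrected trace by its correlation with a pilot stack before summing; this is a deliberately simplified stand-in, since the paper's HOCWS method computes higher-order correlations in the wavelet domain, which is not reproduced here.

```python
import numpy as np

def weighted_stack(gather):
    """Correlation-weighted stack of an NMO-corrected CMP gather
    (array of shape (n_traces, n_samples)). Each trace is weighted by its
    zero-lag correlation with a pilot (plain average) stack; anti-correlated
    traces are suppressed. Weight choice is illustrative only."""
    pilot = gather.mean(axis=0)
    weights = np.array([np.dot(tr, pilot) for tr in gather])
    weights = np.clip(weights, 0.0, None)          # drop anti-correlated traces
    if weights.sum() == 0:
        return pilot
    weights /= weights.sum()
    return np.tensordot(weights, gather, axes=1)   # weighted sum over traces
```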
Mønster, Jacob G; Samuelsson, Jerker; Kjeldsen, Peter; Rella, Chris W; Scheutz, Charlotte
2014-08-01
Using a dual-species methane/acetylene instrument based on cavity ring-down spectroscopy (CRDS), the dynamic plume tracer dispersion method for quantifying the emission rate of methane was successfully tested in four measurement campaigns: (1) controlled methane and trace gas release with different trace gas configurations, (2) a landfill with unknown emission source locations, (3) a landfill with closely located emission sources, and (4) comparison with a Fourier transform infrared spectroscopy (FTIR) instrument using multiple trace gases for source separation. The new real-time, high-precision instrument can measure methane plumes more than 1.2 km away from small sources (about 5 kg h(-1)) in urban areas, with a measurement frequency allowing plume crossing at normal driving speed. The method can be used for quantification of total methane emissions from diffuse area sources down to 1 kg per hour and can be used to quantify individual sources with the right choice of wind direction and road distance. The placement of the trace gas is important for obtaining correct quantification, and uncertainty of up to 36% can be incurred when the trace gas is not co-located with the methane source. Measurements made at greater distances are less sensitive to errors in trace gas placement, and model calculations showed an uncertainty of less than 5% in both urban and open-country settings when the trace gas was placed 100 m from the source and measurements were made more than 3 km away. Using the ratio of the integrated plume concentrations of tracer gas and methane gives the most reliable results for measurements at various distances to the source, compared to the ratio of the highest concentrations in the plume, the direct concentration ratio, and a Gaussian plume model. Under suitable weather and road conditions, the CRDS system can quantify the emission from different sources located close to each other using only one kind of trace gas, owing to its high time resolution, while the FTIR system can measure multiple trace gases but with a lower time resolution. Copyright © 2014 Elsevier Ltd. All rights reserved.
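A minimal sketch of the plume-integrated ratio calculation described above, assuming background-subtracted mole fractions sampled along a single transect and a known acetylene release rate; the function and variable names are illustrative and do not reproduce the authors' exact processing chain.

```python
import numpy as np

M_CH4, M_C2H2 = 16.04, 26.04          # molar masses, g/mol

def methane_emission_rate(dist_m, ch4_ppb, c2h2_ppb, tracer_release_kg_h):
    """Tracer-dilution estimate of the methane emission rate (kg/h) from one
    plume transect: known tracer release rate times the ratio of the
    plume-integrated, background-corrected concentrations, converted by the
    molar mass ratio."""
    ratio = np.trapz(ch4_ppb, dist_m) / np.trapz(c2h2_ppb, dist_m)
    return tracer_release_kg_h * ratio * (M_CH4 / M_C2H2)
```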
Historical foundations and future directions in macrosystems ecology.
Rose, Kevin C; Graves, Rose A; Hansen, Winslow D; Harvey, Brian J; Qiu, Jiangxiao; Wood, Stephen A; Ziter, Carly; Turner, Monica G
2017-02-01
Macrosystems ecology is an effort to understand ecological processes and interactions at the broadest spatial scales and has potential to help solve globally important social and ecological challenges. It is important to understand the intellectual legacies underpinning macrosystems ecology: How the subdiscipline fits within, builds upon, differs from and extends previous theories. We trace the rise of macrosystems ecology with respect to preceding theories and present a new hypothesis that integrates the multiple components of macrosystems theory. The spatio-temporal anthropogenic rescaling (STAR) hypothesis suggests that human activities are altering the scales of ecological processes, resulting in interactions at novel space-time scale combinations that are diverse and predictable. We articulate four predictions about how human actions are "expanding", "shrinking", "speeding up" and "slowing down" ecological processes and interactions, and thereby generating new scaling relationships for ecological patterns and processes. We provide examples of these rescaling processes and describe ecological consequences across terrestrial, freshwater and marine ecosystems. Rescaling depends in part on characteristics including connectivity, stability and heterogeneity. Our STAR hypothesis challenges traditional assumptions about how the spatial and temporal scales of processes and interactions operate in different types of ecosystems and provides a lens through which to understand macrosystem-scale environmental change. © 2016 John Wiley & Sons Ltd/CNRS.
Democracy's Aristocrat: The Gifted Child in America, 1910-1960.
ERIC Educational Resources Information Center
Hildenbrand, Suzanne
The author traces the gifted education movement in the United States from the beginnings in the early 1900s of the intelligence testing movement. Societal conceptions about the ignorance of the masses fed the movement. The emergence of gifted child theory is traced to Lewis Terman and Leta Hollingworth. Terman's association of mental ability with…
Nonexotic matter wormholes in a trace of the energy-momentum tensor squared gravity
NASA Astrophysics Data System (ADS)
Moraes, P. H. R. S.; Sahoo, P. K.
2018-01-01
Wormholes are tunnels connecting two different points in space-time. In Einstein's general relativity theory, wormholes are expected to be filled by exotic matter, i.e., matter that does not satisfy the energy conditions and may have negative density. In this paper, we propose the achievement of wormhole solutions with no need for exotic matter. To do so, we consider a gravity theory whose gravitational action contains linear and quadratic terms in the trace of the energy-momentum tensor. We show that, within this formalism, it is indeed possible to obtain nonexotic matter wormhole solutions.
Entanglement entropy between real and virtual particles in ϕ4 quantum field theory
NASA Astrophysics Data System (ADS)
Ardenghi, Juan Sebastián
2015-04-01
The aim of this work is to compute the entanglement entropy of real and virtual particles by rewriting the generating functional of ϕ4 theory as a mean value between states and observables defined through the correlation functions. The von Neumann definition of entropy can then be applied to these quantum states and, in particular, to the partial traces taken over the internal or external degrees of freedom. This procedure can be carried out at each order in the perturbation expansion, showing that the entanglement entropy for real and virtual particles behaves as ln(m0). In particular, the entanglement entropy is computed at first order for the correlation function of two external points, showing that the mutual information is identical to the external entropy and that the conditional entropies are negative over the whole domain of m0. In turn, from the definition of the quantum states, it is possible to obtain general relations among the total traces of different quantum states of a ϕr theory. Finally, the possibility of taking partial traces over external degrees of freedom is discussed, which implies the introduction of observables that measure the space-time points where an interaction occurs.
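For reference, the von Neumann entropy of a partial trace invoked in the abstract is the standard textbook definition, quoted here only as background:

```latex
% Reduced density matrix from a partial trace, and its von Neumann entropy.
\[
  \rho_A = \operatorname{Tr}_B\,\rho, \qquad
  S(\rho_A) = -\operatorname{Tr}\!\left(\rho_A \ln \rho_A\right).
\]
```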
Explicit formulae for Yang-Mills-Einstein amplitudes from the double copy
Chiodaroli, Marco; Günaydin, Murat; Johansson, Henrik; ...
2017-07-03
Using the double-copy construction of Yang-Mills-Einstein theories formulated in our earlier work, we obtain compact presentations for single-trace Yang-Mills-Einstein tree amplitudes with up to five external gravitons and an arbitrary number of gluons. These are written as linear combinations of color-ordered Yang-Mills trees, where the coefficients are given by color/kinematics-satisfying numerators in a Yang-Mills + φ3 theory. The construction outlined in this paper holds in general dimension and extends straightforwardly to supergravity theories. For one, two, and three external gravitons, our expressions give identical or simpler presentations of amplitudes already constructed through string-theory considerations or the scattering equations formalism. Our results are based on color/kinematics duality and gauge invariance, and strongly hint at a recursive structure underlying the single-trace amplitudes with an arbitrary number of gravitons. We also present explicit expressions for all-loop single-graviton Einstein-Yang-Mills amplitudes in terms of Yang-Mills amplitudes and, through gauge invariance, derive new all-loop amplitude relations for Yang-Mills theory.
Lu, Shaoyou; Ren, Lu; Fang, Jianzhang; Ji, Jiajia; Liu, Guihua; Zhang, Jianqing; Zhang, Huimin; Luo, Ruorong; Lin, Kai; Fan, Ruifang
2016-05-01
Many heavy trace elements are carcinogenic and increase the incidence of cancer. However, a comprehensive study of the correlation between multiple trace elements and DNA oxidative damage is still lacking. The aim of this study is to investigate the relationships between the body burden of multiple trace elements and DNA oxidative stress in college students in Guangzhou, China. Seventeen trace elements in urine samples were determined by inductively coupled plasma-mass spectrometry (ICP-MS). Urinary 8-hydroxy-2'-deoxyguanosine (8-OHdG), a biomarker of DNA oxidative stress, was also measured using liquid chromatography-tandem mass spectrometry (LC-MS/MS). The concentrations of six essential elements, including manganese (Mn), copper (Cu), nickel (Ni), selenium (Se), strontium (Sr), and molybdenum (Mo), and five non-essential elements, including arsenic (As), cadmium (Cd), aluminum (Al), antimony (Sb), and thallium (Tl), were found to be significantly correlated with urinary 8-OHdG levels. Moreover, urinary levels of Ni, Se, Mo, As, Sr, and Tl were strongly correlated with 8-OHdG concentration (P < 0.01). Environmental exposure and dietary intake of these trace elements may play important roles in DNA oxidative damage in the population of Guangzhou, China.
ERIC Educational Resources Information Center
Matthews, P. H.
A survey of the history of linguistic theory concerning grammar in the United States traces the development of theory since 1910. It begins with a general historical review of American linguistics. The subsequent three chapters focus on grammar. The first of these deals with morphology, beginning with Leonard Bloomfield's ideas in both his early…
Hypertext Theory: Rethinking and Reformulating What We Know, Web 2.0
ERIC Educational Resources Information Center
Baehr, Craig; Lang, Susan M.
2012-01-01
This article traces the influences of hypertext theory throughout the various genres of online publication in technical communication. It begins with a look back at some of the important concepts and theorists writing about hypertext theory from the post-World War II era, to the early years of the World Wide Web 2.0, and the very differing notions…
Fujita, Masahiko
2016-03-01
Lesions of the cerebellum result in large errors in movements. The cerebellum adaptively controls the strength and timing of motor command signals depending on the internal and external environments of movements. The present theory describes how the cerebellar cortex can control signals for accurate and timed movements. A model network of the cerebellar Golgi and granule cells is shown to be equivalent to a multiple-input (from mossy fibers) hierarchical neural network with a single hidden layer of threshold units (granule cells) that receive a common recurrent inhibition (from a Golgi cell). The weighted sum of the hidden unit signals (Purkinje cell output) is theoretically analyzed regarding the capability of the network to perform two types of universal function approximation. The hidden units begin firing as the excitatory inputs exceed the recurrent inhibition. This simple threshold feature leads to the first approximation theory, and the network final output can be any continuous function of the multiple inputs. When the input is constant, this output becomes stationary. However, when the recurrent unit activity is triggered to decrease or the recurrent inhibition is triggered to increase through a certain mechanism (metabotropic modulation or extrasynaptic spillover), the network can generate any continuous signals for a prolonged period of change in the activity of recurrent signals, as the second approximation theory shows. By incorporating the cerebellar capability of two such types of approximations to a motor system, in which learning proceeds through repeated movement trials with accompanying corrections, accurate and timed responses for reaching the target can be adaptively acquired. Simple models of motor control can solve the motor error vs. sensory error problem, as well as the structural aspects of credit (or error) assignment problem. Two physiological experiments are proposed for examining the delay and trace conditioning of eyelid responses, as well as saccade adaptation, to investigate this novel idea of cerebellar processing. Copyright © 2015 Elsevier Ltd. All rights reserved.
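To illustrate the network structure described above in the simplest possible terms, the toy sketch below treats granule cells as threshold(-linear) hidden units that fire once their mossy-fiber drive exceeds a common Golgi-cell inhibition, with the Purkinje cell output as a weighted sum; all weights, sizes, and the inhibition level are arbitrary assumptions, not parameters from the paper.

```python
import numpy as np

def purkinje_output(mossy_inputs, W_in, w_pc, golgi_inhibition):
    """Granule cells receive weighted mossy-fiber input and fire (threshold-linear
    here, one possible reading of the threshold units) only above a shared
    Golgi-cell inhibition; the Purkinje cell output is their weighted sum."""
    drive = W_in @ mossy_inputs                              # excitatory drive
    granule = np.maximum(drive - golgi_inhibition, 0.0)      # thresholded firing
    return w_pc @ granule                                    # Purkinje readout

# Example: 3 mossy-fiber inputs, 50 granule cells (illustrative sizes)
rng = np.random.default_rng(0)
W_in = rng.normal(size=(50, 3))
w_pc = rng.normal(size=50)
print(purkinje_output(np.array([0.2, 0.8, 0.5]), W_in, w_pc, golgi_inhibition=0.3))
```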
Postretrieval new learning does not reliably induce human memory updating via reconsolidation.
Hardwicke, Tom E; Taqi, Mahdi; Shanks, David R
2016-05-10
Reconsolidation theory proposes that retrieval can destabilize an existing memory trace, opening a time-dependent window during which that trace is amenable to modification. Support for the theory is largely drawn from nonhuman animal studies that use invasive pharmacological or electroconvulsive interventions to disrupt a putative postretrieval restabilization ("reconsolidation") process. In human reconsolidation studies, however, it is often claimed that postretrieval new learning can be used as a means of "updating" or "rewriting" existing memory traces. This proposal warrants close scrutiny because the ability to modify information stored in the memory system has profound theoretical, clinical, and ethical implications. The present study aimed to replicate and extend a prominent 3-day motor-sequence learning study [Walker MP, Brakefield T, Hobson JA, Stickgold R (2003) Nature 425(6958):616-620] that is widely cited as a convincing demonstration of human reconsolidation. However, in four direct replication attempts (n = 64), we did not observe the critical impairment effect that has previously been taken to indicate disruption of an existing motor memory trace. In three additional conceptual replications (n = 48), we explored the broader validity of reconsolidation-updating theory by using a declarative recall task and sequences similar to phone numbers or computer passwords. Rather than inducing vulnerability to interference, memory retrieval appeared to aid the preservation of existing sequence knowledge relative to a no-retrieval control group. These findings suggest that memory retrieval followed by new learning does not reliably induce human memory updating via reconsolidation.
Rethinking the Question of Quality in Art.
ERIC Educational Resources Information Center
Ewens, Thomas
1994-01-01
Discusses the concept of quality in art from the standpoint of the theory of mediation. Traces the idea of quality from Aristotelian criticism to Gagnepain's theory of mediation. Concludes that mediation aesthetics seek inspiration and quality only from the art work, not its contemporary meaning. (CFR)
Blanton, Hart; Jaccard, James
2006-01-01
Theories that posit multiplicative relationships between variables are common in psychology. A. G. Greenwald et al. recently presented a theory that explicated relationships between group identification, group attitudes, and self-esteem. Their theory posits a multiplicative relationship between concepts when predicting a criterion variable. Greenwald et al. suggested analytic strategies to test their multiplicative model that researchers might assume are appropriate for testing multiplicative models more generally. The theory and analytic strategies of Greenwald et al. are used as a case study to show the strong measurement assumptions that underlie certain tests of multiplicative models. It is shown that the approach used by Greenwald et al. can lead to declarations of theoretical support when the theory is wrong as well as rejection of the theory when the theory is correct. A simple strategy for testing multiplicative models that makes weaker measurement assumptions than the strategy proposed by Greenwald et al. is suggested and discussed.
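As background for readers, a multiplicative model of this general form is usually tested with a product term in a regression; the sketch below shows that standard test on simulated data (it is not Greenwald et al.'s data or the specific analytic strategy the article critiques, and statsmodels is an assumed dependency).

```python
import numpy as np
import statsmodels.api as sm

# Fit criterion = b0 + b1*X + b2*Z + b3*(X*Z) + e on simulated data;
# b3 estimates the multiplicative (interaction) component.
rng = np.random.default_rng(1)
n = 200
x, z = rng.normal(size=n), rng.normal(size=n)
y = 0.5 * x + 0.3 * z + 0.4 * x * z + rng.normal(size=n)

X = sm.add_constant(np.column_stack([x, z, x * z]))
fit = sm.OLS(y, X).fit()
print(fit.params)   # last coefficient is the product-term estimate
```

The article's point, of course, is that interpreting such a product term rests on strong (interval-level) measurement assumptions; the sketch only shows the mechanics of the test.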
Development of Concepts in the History of Quantum Theory
ERIC Educational Resources Information Center
Heisenberg, Werner
1975-01-01
Traces the development of quantum theory from the concept of the discrete stationary states, to the generalized concept of state, to the search for the elementary particle. States that the concept of the elementary particle should be replaced by the concept of a fundamental symmetry. (MLH)
ERIC Educational Resources Information Center
Klein, H. Arthur
Holography is a process which numbers among its many applications the creation of holograms--unique three dimensional photographs that show spatial relations and shifts just as they exist in reality. This book recounts the history of holography, tracing its development from Euclid's theory of light rays through Huygens' theory of wave motion to…
A Super Contribution to Vocational Theory: Work Values.
ERIC Educational Resources Information Center
Zytowski, Donald G.
1994-01-01
Traces influence of Donald Super in introducing work values into career development/vocational theory. Reviews conceptualization, taxonomy, and assessment of work values. Presents research bearing on Super's "onion model," representing his views on relationship of work values to other affective variables. Reviews research regarding functional role…
The Literate Lives of Chamorro Women in Modern Guam
ERIC Educational Resources Information Center
Santos-Bamba, Sharleen J.Q.
2010-01-01
This ethnographic study traces the language and literacy attitudes, perceptions, and practices of three generations of indigenous Chamorro women in modern Guam. Through the lens of postcolonial theory, cultural literacy, intergenerational transmission theory, community of practice, and language and identity, this study examines how literacy is…
ERIC Educational Resources Information Center
Ohlsson, Stellan
Recent theoretical developments in cognitive psychology imply both a need and a possibility for methodological development. In particular, the theory of problem solving proposed by Allen Newell and Herbert A. Simon (1972) provides the rationale for a new empirical method for the processing of think-aloud protocols--trace analysis. A detailed…
2012-01-01
We propose a tripartite biochemical mechanism for memory. Three physiologic components are involved, namely, the neuron (individual and circuit), the surrounding neural extracellular matrix, and the various trace metals distributed within the matrix. The binding of a metal cation affects a corresponding nanostructure (shrinking, twisting, expansion) and the dielectric sensibility of the chelating node (address) within the matrix lattice, sensed by the neuron. The neural extracellular matrix serves as an electro-elastic lattice, wherein neurons manipulate multiple trace metals (n > 10) to encode, store, and decode cognitive information. The proposed mechanism explains the brain's low energy requirements and high storage capacity, described in multiples of Avogadro's number (NA = 6 × 10^23). Supportive evidence correlates memory loss with trace metal toxicity or deficiency, with breakdown in the delivery/transport of metals to the matrix, or with its degradation. Inherited diseases revolving around dysfunctional trace metal metabolism and memory dysfunction include Alzheimer's disease (Al, Zn, Fe), Wilson's disease (Cu), thalassemia (Fe), and autism (metallothionein). The tripartite mechanism points to the electro-elastic interactions of neurons with trace metals distributed within the neural extracellular matrix as the molecular underpinning of "synaptic plasticity" affecting short-term memory, long-term memory, and forgetting. PMID:23050060
Application of zinc isotope tracer technology in tracing soil heavy metal pollution
NASA Astrophysics Data System (ADS)
Norbu, Namkha; Wang, Shuguang; Xu, Yan; Yang, Jianqiang; Liu, Qiang
2017-08-01
In recent years, soil heavy metal pollution, especially zinc pollution, has become increasingly serious. Because of the complexity of this problem, preventing and treating soil pollution requires accurately and quickly identifying and controlling the pollution sources. With the development of stable isotope tracer technology, it has become possible to determine zinc isotope composition. Based on the theory of the zinc isotope tracer technique, and drawing on recent domestic and international literature on zinc isotope measurement by multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS), this paper summarizes the latest research results on zinc isotope pollution tracing and, in view of the deficiencies and open problems of previous research, offers an outlook on zinc isotope fractionation mechanisms, database establishment, and multi-tracer solutions.
NASA Technical Reports Server (NTRS)
Dorband, John E.
1987-01-01
Generating graphics to faithfully represent information can be a computationally intensive task. A way of using the Massively Parallel Processor to generate images by ray tracing is presented. This technique uses sort computation, a method of performing generalized routing interspersed with computation on a single-instruction-multiple-data (SIMD) computer.
Mental Space Theory and Misunderstanding
ERIC Educational Resources Information Center
Liu, Hui; Gao, Yueqin
2010-01-01
This essay conducts explanatory research on misunderstanding (MIS) within the framework of mental space theory, to demonstrate the cognitive operating process of MIS in people's social interaction and to explore the deep causes lying behind the phenomenon. Through text analysis, the author elaborates on the generating process of MIS, thus tracing cognitive…
Influences on Preservice Teacher Socialization: A Qualitative Study
ERIC Educational Resources Information Center
Marks, Melissa J.
2007-01-01
This qualitative two-year study traces the changes in beliefs and actions of four preservice teachers through the final two years of their university education program. Dialectical Theory of Socialization and Cognitive Dissonance Theory provide the theoretical framework. The findings show that three main factors affect the transfer of learning…
The Implications of Modern Approaches to Language for Teacher Training.
ERIC Educational Resources Information Center
Williams, Huw
1984-01-01
Connections between recent developments in theories about language, learning theory, and language teaching are traced from Chomsky's work elaborating the distinction between competence and performance. The evolution of the concepts of function and notion from the study of how language and communication come together in linguistic philosophy is…
Why Realism Seems So Naive: Romanticism, Professionalism, and Contemporary Critical Theory.
ERIC Educational Resources Information Center
Fischer, Michael
1979-01-01
Discusses the emergence of a new kind of criticism which has as its philosophical starting point the rejection of mimesis, and traces the process leading up to the development of this critical theory--a process which began in English criticism of the romantic period. (DD)
An Autoethnography of Masculinities: Flexibility and Flexing in Guyland
ERIC Educational Resources Information Center
Sweet, Joseph D.
2017-01-01
This autoethnography traces the author's shifting masculine identities as they have evolved across time and contexts. This piece splices journals and blogs from the author's past with prevailing masculinities theory, spectral data (Nordstrom, 2013), post-structural feminist theory and the author's present gender identity to investigate what can be…
A Perspective on the History of Process and Outcome Research in Counseling Psychology.
ERIC Educational Resources Information Center
Hill, Clara E.; Corbett, Maureen M.
1993-01-01
Traces development of process and outcome research from before foundation of counseling psychology in 1946 to present. Describes influence of Carl Rogers's theory, behavior, psychoanalytic, systems, interpersonal, and social influence theories. Covers Eysenck's challenge to efficacy of psychotherapy; uniformity myth that process and outcome are…
From Weber to Parsons and Shutz: The Eclipse of History in Modern Social Theory.
ERIC Educational Resources Information Center
Zaret, David
1980-01-01
Compares the relationship between theoretical synthesis and historical research in light of research by Max Weber, Talcott Parsons, and Alfred Schutz. Traces theoretical developments within one subfield of sociology (action theory) and relates these developments to research problems confronting contemporary theoretical work in sociology. (DB)
Formula for the rms blur circle radius of Wolter telescope based on aberration theory
NASA Technical Reports Server (NTRS)
Shealy, David L.; Saha, Timo T.
1990-01-01
A formula for the rms blur circle for Wolter telescopes has been derived using the transverse ray aberration expressions of Saha (1985), Saha (1984), and Saha (1986). The resulting formula for the rms blur circle radius over an image plane and a formula for the surface of best focus, based on third-, fifth-, and seventh-order aberration theory, predict results in good agreement with exact ray tracing. It has also been shown that one of the two terms in the empirical formula of VanSpeybroeck and Chase (1972) for the rms blur circle radius of a Wolter I telescope can be justified by the aberration theory results. Numerical results are given comparing the rms blur radius and the surface of best focus vs. the half-field angle computed by skew ray tracing and from analytical formulas for grazing-incidence Wolter I-II telescopes and a normal-incidence Cassegrain telescope.
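For readers comparing ray-trace output to analytic formulas of this kind, the sketch below computes the rms blur circle radius from traced ray intercepts using the generic centroid-based definition; it is not the paper's closed-form aberration expression, and the function name is an assumption.

```python
import numpy as np

def rms_blur_radius(x, y):
    """RMS blur circle radius of ray intercepts (x, y) on an image plane,
    measured about the spot centroid."""
    dx, dy = x - x.mean(), y - y.mean()
    return np.sqrt(np.mean(dx**2 + dy**2))
```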
Einstein-Yang-Mills scattering amplitudes from scattering equations
NASA Astrophysics Data System (ADS)
Cachazo, Freddy; He, Song; Yuan, Ellis Ye
2015-01-01
We present the building blocks that can be combined to produce tree-level S-matrix elements of a variety of theories with various spins mixed in arbitrary dimensions. The new formulas for the scattering of n massless particles are given by integrals over the positions of n points on a sphere restricted to satisfy the scattering equations. As applications, we obtain all single-trace amplitudes in Einstein-Yang-Mills (EYM) theory, and generalizations to include scalars. Also in EYM but extended by a B-field and a dilaton, we present all double-trace gluon amplitudes. The building blocks are made of Pfaffians and Parke-Taylor-like factors of subsets of particle labels.
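For orientation, the "Parke-Taylor-like factors" referred to above take, in standard scattering-equations (CHY) notation for an ordering of punctures σ_i on the sphere, the familiar cyclic form quoted below; this is standard background rather than the paper's new building blocks.

```latex
% Parke-Taylor factor for the ordering (1,2,...,n) in the CHY formalism.
\[
  \mathrm{PT}(1,2,\dots,n) \;=\;
  \frac{1}{(\sigma_1-\sigma_2)(\sigma_2-\sigma_3)\cdots(\sigma_n-\sigma_1)}.
\]
```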
Holographic reconstruction of AdS exchanges from crossing symmetry
Alday, Luis F.; Bissi, Agnese; Perlmutter, Eric
2017-08-31
Motivated by AdS/CFT, we address the following outstanding question in large N conformal field theory: given the appearance of a single-trace operator in the O × O OPE of a scalar primary O, what is its total contribution to the vacuum four-point function ⟨OOOO⟩ as dictated by crossing symmetry? We solve this problem in 4d conformal field theories at leading order in 1/N. Viewed holographically, this provides a field theory reconstruction of crossing-symmetric, four-point exchange amplitudes in AdS5. Our solution takes the form of a resummation of the large spin solution to the crossing equations, supplemented by corrections at finite spin, required by crossing. The method can be applied to the exchange of operators of arbitrary twist τ and spin s, although it vastly simplifies for even-integer twist, where we give explicit results. The output is the set of OPE data for the exchange of all double-trace operators [OO]n,ℓ. We find that the double-trace anomalous dimensions γn,ℓ are negative, monotonic and convex functions of ℓ, for all n and all ℓ > s. This constitutes a holographic signature of bulk causality and classical dynamics of even-spin fields. We also find that the "derivative relation" between double-trace anomalous dimensions and OPE coefficients does not hold in general, and derive the explicit form of the deviation in several cases. Finally, we study large n limits of γn,ℓ, relevant for the Regge and bulk-point regimes.
A Biographic Comparison Tracing the Origin of Their Ideas of Jean Piaget and Lev Vygotsky.
ERIC Educational Resources Information Center
Pass, Susan
This paper compares the early life, background, and education of Jean Piaget and Lev Vygotsky. It makes the case that an adaptation of the curve developed by C. Quigley can be used to trace the motivations of both Piaget and Vygotsky in creating their respective theories. The analysis also reveals the adversity that each man faced. Although they…
Theory of Multiple Intelligences: Is It a Scientific Theory?
ERIC Educational Resources Information Center
Chen, Jie-Qi
2004-01-01
This essay discusses the status of multiple intelligences (MI) theory as a scientific theory by addressing three issues: the empirical evidence Gardner used to establish MI theory, the methodology he employed to validate MI theory, and the purpose or function of MI theory.
Integrating Multiple Intelligences in EFL/ESL Classrooms
ERIC Educational Resources Information Center
Bas, Gokhan
2008-01-01
This article deals with the integration of the theory of Multiple Intelligences in EFL/ESL classrooms. In this study, the theory of multiple intelligences is first presented briefly, followed by a discussion of its integration into English classrooms. The intelligence types in MI theory are discussed, along with some possible ways of applying these intelligence types…
A trace ratio maximization approach to multiple kernel-based dimensionality reduction.
Jiang, Wenhao; Chung, Fu-lai
2014-01-01
Most dimensionality reduction techniques are based on one metric or one kernel; hence it is necessary to select an appropriate kernel for kernel-based dimensionality reduction. Multiple kernel learning for dimensionality reduction (MKL-DR) has recently been proposed to learn a kernel from a set of base kernels, which are seen as different descriptions of the data. As MKL-DR does not involve regularization, it might be ill-posed under some conditions, and consequently its applications are hindered. This paper proposes a multiple kernel learning framework for dimensionality reduction based on regularized trace ratio, termed MKL-TR. Our method aims at learning a transformation into a space of lower dimension and a corresponding kernel from the given base kernels, among which some may not be suitable for the given data. The solutions for the proposed framework can be found based on trace ratio maximization. The experimental results demonstrate its effectiveness on benchmark datasets, including text, image, and sound datasets, in supervised, unsupervised, and semi-supervised settings. Copyright © 2013 Elsevier Ltd. All rights reserved.
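A generic sketch of the trace-ratio maximization at the core of such methods, using the standard iterative eigen-decomposition scheme for max tr(VᵀAV)/tr(VᵀBV) over orthonormal V; the kernel-weight updates and regularization of the full MKL-TR method are not shown, and the matrices A, B are assumed symmetric inputs.

```python
import numpy as np

def trace_ratio(A, B, dim, n_iter=50, tol=1e-8):
    """Find an orthonormal projection V (d x dim) maximizing
    tr(V'AV)/tr(V'BV) by alternating between updating the ratio lambda and
    taking the top eigenvectors of A - lambda*B."""
    d = A.shape[0]
    V = np.eye(d)[:, :dim]
    lam = 0.0
    for _ in range(n_iter):
        lam_new = np.trace(V.T @ A @ V) / np.trace(V.T @ B @ V)
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
        w, U = np.linalg.eigh(A - lam * B)        # symmetric eigen-decomposition
        V = U[:, np.argsort(w)[::-1][:dim]]       # eigenvectors of largest eigenvalues
    return V, lam
```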
Quantum Hamiltonian identification from measurement time traces.
Zhang, Jun; Sarovar, Mohan
2014-08-22
Precise identification of parameters governing quantum processes is a critical task for quantum information and communication technologies. In this Letter, we consider a setting where system evolution is determined by a parametrized Hamiltonian, and the task is to estimate these parameters from temporal records of a restricted set of system observables (time traces). Based on the notion of system realization from linear systems theory, we develop a constructive algorithm that provides estimates of the unknown parameters directly from these time traces. We illustrate the algorithm and its robustness to measurement noise by applying it to a one-dimensional spin chain model with variable couplings.
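To illustrate the system-realization idea the abstract invokes, the sketch below implements the classical Eigensystem Realization Algorithm, which recovers a discrete-time linear system from an impulse-response time trace; this is background for the realization concept, not the authors' Hamiltonian-identification algorithm, and the SISO setting and variable names are assumptions.

```python
import numpy as np

def era(markov, n_states):
    """Eigensystem Realization Algorithm (SISO): from Markov parameters
    markov[k] ~ C A^k B, build Hankel matrices, truncate their SVD, and
    return a state-space realization (A, B, C) of order n_states."""
    m = len(markov) // 2
    H0 = np.array([[markov[i + j] for j in range(m)] for i in range(m)])      # Hankel
    H1 = np.array([[markov[i + j + 1] for j in range(m)] for i in range(m)])  # shifted
    U, s, Vt = np.linalg.svd(H0)
    U, s, Vt = U[:, :n_states], s[:n_states], Vt[:n_states, :]
    S_half, S_half_inv = np.diag(np.sqrt(s)), np.diag(1.0 / np.sqrt(s))
    A = S_half_inv @ U.T @ H1 @ Vt.T @ S_half_inv
    B = (S_half @ Vt)[:, :1]
    C = (U @ S_half)[:1, :]
    return A, B, C
```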
Consilience and Life History Theory: From Genes to Brain to Reproductive Strategy
ERIC Educational Resources Information Center
Figueredo, Aurelio Jose; Vasquez, Geneva; Brumbach, Barbara H.; Schneider, Stephanie M. R.; Sefcek, Jon A.; Tal, Ilanit R.; Hill, Dawn; Wenner, Christopher J.; Jacobs, W. Jake
2006-01-01
We describe an integrated theory of individual differences that traces the behavioral development of life history from genes to brain to reproductive strategy. We provide evidence that a single common factor, the K-Factor, underpins a variety of life-history parameters, including an assortment of sexual, reproductive, parental, familial, and…
Students' Development in Theory and Practice: The Doubtful Role of Research
ERIC Educational Resources Information Center
Egan, Kieran
2005-01-01
In this article, Kieran Egan contests the scientific foundations of Piaget's developmental theories and the scientific basis of much educational research. In so doing, he pushes researchers and practitioners alike to rethink the centrality of Piaget's tenets to teaching and learning. Egan traces the history of the developmental literature that…
Peripatetic and Euclidean theories of the visual ray.
Jones, A
1994-01-01
The visual ray of Euclid's Optica is endowed with properties that reveal the concept to be an abstraction of a specific physical account of vision. The evolution of a physical theory of vision compatible with the Euclidean model can be traced in Peripatetic writings of the late fourth and third centuries B.C.
The Role of the Mass Media in Shaping Public Opinion.
ERIC Educational Resources Information Center
Porter, Michael J.
This discussion of agenda setting reviews early theories of mass communication and traces the beginnings of agenda setting theory to the 1968 United States presidential campaign, during which researchers found a high correlation between what the media were saying about issues and what the people thought were important issues. The results of more…
The Throws: Contemporary Theory, Technique and Training.
ERIC Educational Resources Information Center
Wilt, Fred, Ed.
This compilation of articles covers the subject of four throwing events--shot put, discus throw, hammer throw, and javelin throw. The history of the art and science of throwing is traced from ancient to modern times in the introduction. Theories on training and techniques of throwing are presented in essays contributed by coaches from the United…
"Bildung"--A Construction of a History of Philosophy of Education
ERIC Educational Resources Information Center
Horlacher, Rebekka
2004-01-01
The paper examines the "prehistory" in the 18th century of the theory of "Bildung". Pedagogical historiography commonly traces the theory back to the influence of Anthony Ashley Cooper, third Earl of Shaftesbury, who is held to be the founder of the concept of "innere Bildung", on the grounds that Shaftesbury's…
Towards a Theoretical Basis for Programs of Student Behavior.
ERIC Educational Resources Information Center
Howick, William H.
The historical background, principles, and practices of two major theories concerning student behavior are described. Theory A is religiously based and can be traced back to the biblical "Garden of Eden." It views human nature as fundamentally evil, the school as a means of both controlling and overcoming the child's innate propensities to…
Strange Bedfellows: The New Neoliberalism of Catholic Schooling in the United States
ERIC Educational Resources Information Center
Burke, Kevin J.
2012-01-01
The article utilizes critical social theory and critical religious theory to examine the emergent and historically aberrant alignment between Catholic schools and neoliberal market-based reforms in the United States. The author traces the historical split between Catholic and public schooling, attending to the role of the litigious in shaping…
Landmarks in the Literature: Two Views of Human Nature.
ERIC Educational Resources Information Center
Bordin, Edward S.
1981-01-01
The author contrasts the psychological theories of Carl Rogers and B. F. Skinner as developed in their books, "Client-Centered Therapy" and "Science and Human Behavior," published in the early 1950s. He traces the continuing impact of these two theories on psychology, education, and the debate between humanism and science. (SJL)
Analyzing Digital Library Initiatives: 5S Theory Perspective
ERIC Educational Resources Information Center
Isah, Abdulmumin; Mutshewa, Athulang; Serema, Batlang; Kenosi, Lekoko
2015-01-01
This article traces the historical development of Digital Libraries (DLs), examines some DL initiatives in developed and developing countries and uses 5S Theory as a lens for analyzing the focused DLs. The analysis shows that present-day systems, in both developed and developing nations, are essentially content and user centric, with low level…
Queering Place: The Intersection of Feminist Body Theory and Australian Aboriginal Collaboration
ERIC Educational Resources Information Center
Somerville, Margaret
2016-01-01
In this article the author used an auto-ethnographic philosophical approach to construct a fragile history of the present. Margaret Somerville reports doing this through tracing key moments and movements of queering feminist poststructural theory and evolving a queering method of body/place writing through her embeddedness in Aboriginal stories.…
ERIC Educational Resources Information Center
Viens, Julie; Kallenbach, Silja
2001-01-01
Dr. Howard Gardner's introduction of multiple intelligences theory (MI theory) in 1983 generated considerable interest in the educational community. Multiple intelligences was a provocative new theory, claiming at least seven relatively independent intelligences. MI theory presented a conception of intelligence that was in marked contrast to the…
Spudich, Paul A.; Chiou, Brian
2015-01-01
We present a two-dimensional system of generalized coordinates for use with geometrically complex fault ruptures that are neither straight nor continuous. The coordinates are a generalization of the conventional strike-normal and strike-parallel coordinates of a single straight fault. The presented conventions and formulations are applicable to a single curved trace, as well as multiple traces representing the rupture of branching faults or noncontiguous faults. An early application of our generalized system is in the second round of the Next Generation of Ground-Motion Attenuation Model project for the Western United States (NGA-West2), where the coordinates were used in the characterization of hanging-wall effects. We further improve the NGA-West2 strike-parallel formulation for multiple rupture traces with a more intuitive definition of the nominal strike direction. We also derive an analytical expression for the gradient of the generalized strike-normal coordinate. The direction of this gradient may be used as the strike-normal direction in the study of polarization effects on ground motions.
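A toy numerical sketch of the underlying idea (an illustration only, not the Spudich-Chiou formulation): treat the generalized strike-normal coordinate of a site as its nearest distance to a piecewise-linear rupture trace, and estimate the gradient, whose direction serves as a local strike-normal, by finite differences. The trace coordinates below are made up.

    import numpy as np

    # Hypothetical piecewise-linear rupture trace (two non-collinear segments).
    trace = np.array([[0.0, 0.0], [5.0, 0.0], [9.0, 3.0]])

    def nearest_distance(p, trace):
        """Unsigned distance from point p to a piecewise-linear trace."""
        best = np.inf
        for a, b in zip(trace[:-1], trace[1:]):
            ab, ap = b - a, p - a
            s = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
            best = min(best, np.linalg.norm(p - (a + s * ab)))
        return best

    def grad(p, trace, h=1e-5):
        """Finite-difference gradient; its direction plays the role of a local strike-normal."""
        gx = (nearest_distance(p + [h, 0], trace) - nearest_distance(p - [h, 0], trace)) / (2 * h)
        gy = (nearest_distance(p + [0, h], trace) - nearest_distance(p - [0, h], trace)) / (2 * h)
        return np.array([gx, gy])

    site = np.array([3.0, 2.0])
    print(nearest_distance(site, trace), grad(site, trace))  # distance 2.0, gradient ~ (0, 1)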
Quantum theory of multiple-input-multiple-output Markovian feedback with diffusive measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chia, A.; Wiseman, H. M.
2011-07-15
Feedback control engineers have been interested in multiple-input-multiple-output (MIMO) extensions of single-input-single-output (SISO) results of various kinds due to their rich mathematical structure and practical applications. An outstanding problem in quantum feedback control is the extension of the SISO theory of Markovian feedback by Wiseman and Milburn [Phys. Rev. Lett. 70, 548 (1993)] to multiple inputs and multiple outputs. Here we generalize the SISO homodyne-mediated feedback theory to allow for multiple inputs, multiple outputs, and arbitrary diffusive quantum measurements. We thus obtain a MIMO framework which resembles the SISO theory and whose additional mathematical structure is highlighted by the extensive use of vector-operator algebra.
The Theory of Multiple Intelligences.
ERIC Educational Resources Information Center
Gardner, Howard
1987-01-01
The multiple intelligence theory is based on cultural contexts, biological analysis, developmental theories, and a vertical theory of faculties. Seven intelligences are identified: linguistic, logical mathematical, musical, spatial, bodily kinesthetic, interpersonal, and intrapersonal. The theory's educational implications are described,…
Maternal empathy, family chaos, and the etiology of borderline personality disorder.
Golomb, A; Ludolph, P; Westen, D; Block, M J; Maurer, P; Wiss, F C
1994-01-01
Psychoanalytic writers have traced the etiology of borderline personality disorder (BPD) to a preoedipal disturbance in the mother-child relationship. Despite the prevalence of theories focusing on the role of mothering in the development of BPD, few empirical studies have tested the hypothesis that borderlines were the recipients of unempathic mothering. The current preliminary study compared 13 mothers of borderline adolescents with 13 mothers of normal adolescents. This study found that mothers of borderlines tended to conceive of their children egocentrically, as need-gratifying objects, rather than as individuals with distinct and evolving personalities. This study also found that the mothers of borderlines reported raising their daughters in extremely chaotic families struggling to cope with multiple hardships, including divorce and financial worries. The stressful environmental circumstances reported by the mothers likely affected the borderline daughters directly as well as the mothers' ability to parent effectively and empathically. The results of this study suggest that, as predicted by psychoanalytic theory, a problematic mother-child relationship may play a significant role in the genesis of borderline pathology; however, the life circumstances that contextualize the mother-child relationship also need to be considered when accounting for the etiology of BPD.
Miniature modified Faraday cup for micro electron beams
Teruya, Alan T.; Elmer, John W.; Palmer, Todd A.; Walton, Chris C.
2008-05-27
A micro beam Faraday cup assembly includes a refractory metal layer containing an odd number of thin, radially positioned traces. Some of the radially positioned traces are located at the edge of the micro modified Faraday cup body and some are located in the central portion of the micro modified Faraday cup body. Each set of traces is connected to a separate data acquisition channel to form multiple independent diagnostic networks. The data obtained from the two diagnostic networks are combined and input into a computed tomography algorithm to reconstruct the beam shape, size, and power density distribution.
W. J. Massman
2004-01-01
Atmospheric trace gas fluxes measured with an eddy covariance sensor that detects a constituent's density fluctuations within the in situ air need to include terms resulting from concurrent heat and moisture fluxes, the so-called 'density' or 'WPL corrections' (Webb et al. 1980). The theory behind these additional terms is well established. But...
On multivariate trace inequalities of Sutter, Berta, and Tomamichel
NASA Astrophysics Data System (ADS)
Lemm, Marius
2018-01-01
We consider a family of multivariate trace inequalities recently derived by Sutter, Berta, and Tomamichel. These inequalities generalize the Golden-Thompson inequality and Lieb's triple matrix inequality to an arbitrary number of matrices in a way that features complex matrix powers (i.e., certain unitaries). We show that their inequalities can be rewritten as an n-matrix generalization of Lieb's original triple matrix inequality. The complex matrix powers are replaced by resolvents and appropriate maximally entangled states. We expect that the technically advantageous properties of resolvents, in particular for perturbation theory, can be of use in applications of the n-matrix inequalities, e.g., for analyzing the performance of the rotated Petz recovery map in quantum information theory and for removing the unitaries altogether.
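For orientation, the two- and three-matrix cases being generalized are the following standard inequalities (quoted here for context, not results of the paper): for Hermitian matrices $H_1$, $H_2$, $H_3$,

    \operatorname{Tr} e^{H_1+H_2} \;\le\; \operatorname{Tr}\!\left(e^{H_1} e^{H_2}\right) \quad \text{(Golden-Thompson)},

    \operatorname{Tr} e^{H_1+H_2+H_3} \;\le\; \int_0^\infty \operatorname{Tr}\!\left[\, e^{H_1}\,\big(e^{-H_2}+t\big)^{-1} e^{H_3}\,\big(e^{-H_2}+t\big)^{-1} \right] dt \quad \text{(Lieb)}.

The resolvent $(e^{-H_2}+t)^{-1}$ appearing in Lieb's inequality is the kind of object the abstract refers to when it replaces complex matrix powers by resolvents.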
ERIC Educational Resources Information Center
Blanton, Hart; Jaccard, James
2006-01-01
Theories that posit multiplicative relationships between variables are common in psychology. A. G. Greenwald et al. recently presented a theory that explicated relationships between group identification, group attitudes, and self-esteem. Their theory posits a multiplicative relationship between concepts when predicting a criterion variable.…
A reservoir of time constants for memory traces in cortical neurons
Bernacchia, Alberto; Seo, Hyojung; Lee, Daeyeol; Wang, Xiao-Jing
2011-01-01
According to reinforcement learning theory of decision making, reward expectation is computed by integrating past rewards with a fixed timescale. By contrast, we found that a wide range of time constants is available across cortical neurons recorded from monkeys performing a competitive game task. By recognizing that reward modulates neural activity multiplicatively, we found that one or two time constants of reward memory can be extracted for each neuron in prefrontal, cingulate, and parietal cortex. These timescales ranged from hundreds of milliseconds to tens of seconds, according to a power-law distribution, which is consistent across areas and reproduced by a “reservoir” neural network model. These neuronal memory timescales were weakly but significantly correlated with those of the monkeys' decisions. Our findings suggest a flexible memory system, where neural subpopulations with distinct sets of long or short memory timescales may be selectively deployed according to the task demands. PMID:21317906
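A simplified illustration of how one reward-memory timescale might be read out per neuron (a sketch assuming an additive reward modulation, unlike the authors' multiplicative model, with made-up parameter values): regress activity on exponentially filtered past rewards and pick the filter timescale that fits best.

    import numpy as np

    rng = np.random.default_rng(0)
    n_trials, true_tau = 2000, 8.0
    rewards = rng.integers(0, 2, n_trials).astype(float)   # 0/1 reward on each trial

    # Exponentially filtered reward trace with timescale tau (in units of trials).
    def filtered(rewards, tau):
        trace, out = 0.0, []
        for r in rewards:
            trace = trace * np.exp(-1.0 / tau) + r
            out.append(trace)
        return np.array(out)

    # Simulated activity: baseline plus reward-memory signal plus noise.
    activity = 5.0 + 2.0 * filtered(rewards, true_tau) + rng.normal(0.0, 1.0, n_trials)

    # Recover the neuron's memory timescale by scanning candidate taus for the best fit.
    taus = np.linspace(1.0, 30.0, 200)
    r2 = [np.corrcoef(activity, filtered(rewards, tau))[0, 1] ** 2 for tau in taus]
    print(f"estimated tau = {taus[int(np.argmax(r2))]:.1f} trials (true tau = {true_tau})")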
Acetonitrile Ion Suppression in Atmospheric Pressure Ionization Mass Spectrometry
NASA Astrophysics Data System (ADS)
Colizza, Kevin; Mahoney, Keira E.; Yevdokimov, Alexander V.; Smith, James L.; Oxley, Jimmie C.
2016-11-01
Efforts to analyze trace levels of cyclic peroxides by liquid chromatography/mass spectrometry gave evidence that acetonitrile suppressed ion formation. Further investigations extended this discovery to ketones, linear peroxides, esters, and possibly many other types of compounds, including triazole and menadione. Direct ionization suppression caused by acetonitrile was observed for multiple adduct types in both electrospray ionization and atmospheric pressure chemical ionization. The addition of only 2% acetonitrile significantly decreased the sensitivity of analyte response. Efforts to identify the mechanism were made using various nitriles. The ion suppression was reduced by substitution of an acetonitrile hydrogen with an electron-withdrawing group, but was exacerbated by electron-donating or steric groups adjacent to the nitrile. Although current theory does not explain this phenomenon, we propose that polar interactions between the various functionalities and the nitrile may be forming neutral aggregates that manifest as ionization suppression.
Stable-isotope analysis: a neglected tool for placing parasites in food webs.
Sabadel, A J M; Stumbo, A D; MacLeod, C D
2018-02-28
Parasites are often overlooked in the construction of food webs, despite their ubiquitous presence in almost every type of ecosystem. Researchers who do recognize their importance often struggle to include parasites using classical food-web theory, mainly due to the parasites' multiple hosts and life stages. A novel approach using compound-specific stable-isotope analysis promises to provide considerable insight into the energetic exchanges of parasite and host, which may solve some of the issues inherent in incorporating parasites using a classical approach. Understanding the role of parasites within food webs, and tracing the associated biomass transfers, are crucial to constructing new models that will expand our knowledge of food webs. This mini-review focuses on stable-isotope studies published in the past decade, and introduces compound-specific stable-isotope analysis as a powerful, but underutilized, newly developed tool that may answer many unresolved questions regarding the role of parasites in food webs.
NASA Astrophysics Data System (ADS)
Faedi, F.; Gómez Maqueo Chew, Y.; Fossati, L.; Pollacco, D.; McQuillan, A.; Hebb, L.; Chaplin, W. J.; Aigrain, S.
2013-04-01
The wealth of information rendered by Kepler planets and planet candidates is indispensable for statistically significant studies of distinct planet populations, in both single and multiple systems. Empirical evidence suggests that Kepler's planet population shows different physical properties as compared to the bulk of known exoplanets. The SOAPS project aims to shed light on the formation, migration, and architecture of Kepler's planets. By measuring v sin i accurately for Kepler hosts with rotation periods measured from their high-precision light curves, we will assess the alignment of the planetary orbit with respect to the stellar spin axis. This degree of alignment traces the formation history and evolution of the planetary systems, and thus allows us to distinguish between different proposed migration theories. SOAPS will increase by a factor of 2 the number of spin-orbit alignment measurements, pushing the parameter space down to the SuperEarth domain. Here we present our preliminary results.
ERIC Educational Resources Information Center
Kallenbach, Silja; Viens, Julie
The Adult Multiple Intelligences (AMI) Study investigated how multiple intelligences (MI) theory can support instruction and assessment in adult literacy education across different adult learning contexts. Two interwoven qualitative research projects focused on applying MI theory in practice. One involved 10 teacher-conducted and AMI…
NASA Astrophysics Data System (ADS)
Nazarian, Robert H.; Legg, Sonya
2017-10-01
When internal waves interact with topography, such as continental slopes, they can transfer wave energy to local dissipation and diapycnal mixing. Submarine canyons comprise approximately ten percent of global continental slopes, and can enhance the local dissipation of internal wave energy, yet parameterizations of canyon mixing processes are currently missing from large-scale ocean models. As a first step in the development of such parameterizations, we conduct a parameter space study of M2 tidal-frequency, low-mode internal waves interacting with idealized V-shaped canyon topographies. Specifically, we examine the effects of varying the canyon mouth width, shape and slope of the thalweg (line of lowest elevation). This effort is divided into two parts. In the first part, presented here, we extend the theory of 3-dimensional internal wave reflection to a rotated coordinate system aligned with our idealized V-shaped canyons. Based on the updated linear internal wave reflection solution that we derive, we construct a ray tracing algorithm which traces a large number of rays (the discrete analog of a continuous wave) into the canyon region where they can scatter off topography. Although a ray tracing approach has been employed in other studies, we have, for the first time, used ray tracing to calculate changes in wavenumber and ray density which, in turn, can be used to calculate the Froude number (a measure of the likelihood of instability). We show that for canyons of intermediate aspect ratio, large spatial envelopes of instability can form in the presence of supercritical sidewalls. Additionally, the canyon height and length can modulate the Froude number. The second part of this study, a diagnosis of internal wave scattering in continental slope canyons using both numerical simulations and this ray tracing algorithm, as well as a test of robustness of the ray tracing, is presented in the companion article.
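The criticality notion that governs where wave energy focuses can be illustrated with the standard linear dispersion relation (a generic sketch with assumed parameter values, not the authors' rotated-coordinate 3-D solution): an internal-wave ray has slope sqrt((omega^2 - f^2)/(N^2 - omega^2)), and topography is supercritical when its slope exceeds the ray slope.

    import numpy as np

    omega = 1.41e-4          # M2 tidal frequency (rad/s)
    f     = 1.03e-4          # Coriolis frequency near 45 degrees latitude (rad/s)
    N     = 2.0e-3           # assumed buoyancy frequency (rad/s)

    # Linear internal-wave ray slope from the dispersion relation.
    ray_slope = np.sqrt((omega**2 - f**2) / (N**2 - omega**2))

    def criticality(topo_slope):
        """>1: supercritical (back-reflection), <1: subcritical, ~1: critical."""
        return topo_slope / ray_slope

    for s in (0.01, ray_slope, 0.2):
        print(f"topographic slope {s:.3f}: criticality {criticality(s):.2f}")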
The Mismeasure of Monkeys: Education Policy Research and the Evolution of Social Capital
ERIC Educational Resources Information Center
Gearin, B.
2017-01-01
This conceptual history traces the rise of "social capital" from the theories of James Coleman and Pierre Bourdieu to its eventual adoption in fields such as primatology and evolutionary psychology. It argues that the earliest theories of social capital were formulated in response to a growing perception that education was an economic…
Theory Development and Application in Higher Education Research: Tribes and Territories
ERIC Educational Resources Information Center
Tight, Malcolm
2015-01-01
This paper examines the idea of tribes and territories, as an example of a theory developed and applied within higher education research of relevance to higher education policy. It traces the origins and meaning of the term, reviews its application by higher education researchers and discusses the issues it raises and the critiques it has…
Psychology of Aging in America: A Historical Account of Theoretical Developments.
ERIC Educational Resources Information Center
Rogers, Sharon; Luepnitz, Roy
This document traces theoretical developments in the psychology of aging during the last 50 years. The concept of theory is discussed as well as the bringing together of theories to form a model. After summarizing the early beginnings of American interest in aging, the work of major theoreticians is explored including Hall (senescence), Thorndike…
Theory Application in Higher Education Research: The Case of Communities of Practice
ERIC Educational Resources Information Center
Tight, Malcolm
2015-01-01
This article examines communities of practice as an example of a theory applied within higher education research. It traces its origins and meaning, reviews its application by higher education researchers and discusses the issues it raises and the critiques it has attracted. This article concludes that while, like all theoretical frameworks,…
On the Nature of Applied Linguistics: Theory and Practice Relationships from a Critical Perspective
ERIC Educational Resources Information Center
Sánchez, William
2007-01-01
This article explores the relationships between Applied Linguistics and other related disciplines concerning language use and language teaching issues. It seeks to trace the changes in the view of the relationship between theory and practice in Applied Linguistics, to explain the reason for those changes, and to discuss the implications for…
Theoretical Shifts: Tracing the Transactional Turn in Scholarship on Reading Education
ERIC Educational Resources Information Center
Martinez-Schaum, Allison
2009-01-01
In the academic world, citations can provide insights into the impact of a particular theoretical orientation on scholarship in a field of study, showing epistemological shifts as numbers of citations to that theory increase or diminish. This study explored the impact of the transactional theory of reading, articulated by Louise Rosenblatt, (e.g.,…
Standard errors in forest area
Joseph McCollum
2002-01-01
I trace the development of standard error equations for forest area, beginning with the theory behind double sampling and the variance of a product. The discussion shifts to the particular problem of forest area - at which time the theory becomes relevant. There are subtle difficulties in figuring out which variance of a product equation should be used. The equations...
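The "variance of a product" referred to is presumably Goodman's classical result: for independent estimators $\hat{X}$ and $\hat{Y}$ with means $\mu_X$, $\mu_Y$,

    \operatorname{Var}(\hat{X}\hat{Y}) \;=\; \mu_X^2\,\operatorname{Var}(\hat{Y}) \;+\; \mu_Y^2\,\operatorname{Var}(\hat{X}) \;+\; \operatorname{Var}(\hat{X})\,\operatorname{Var}(\hat{Y}).

In double sampling for forest area, $\hat{X}$ might be an estimated forest proportion and $\hat{Y}$ an estimated total area (an assumed correspondence, not stated in the abstract); dependence between the two estimators is one source of the subtleties the author mentions.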
ERIC Educational Resources Information Center
Forster, Alan Mark; Pilcher, Nick; Tennant, Stuart; Murray, Mike; Craig, Nigel; Copping, Alex
2017-01-01
From the mid-20th century, construction and engineering pedagogy and curricula have moved from long-held traditional experiential apprenticeship approaches to one ostensibly decoupling practice and theory. This paper traces this decoupling and explores modern-day opportunities and challenges for recoupling university education with industry practice.…
Performance Factors Analysis -- A New Alternative to Knowledge Tracing
ERIC Educational Resources Information Center
Pavlik, Philip I., Jr.; Cen, Hao; Koedinger, Kenneth R.
2009-01-01
Knowledge tracing (KT)[1] has been used in various forms for adaptive computerized instruction for more than 40 years. However, despite its long history of application, it is difficult to use in domain model search procedures, has not been used to capture learning where multiple skills are needed to perform a single action, and has not been used…
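As published by Pavlik, Cen, and Koedinger, PFA replaces KT's hidden-state model with a logistic regression over counts of prior successes and failures on each knowledge component (KC). A minimal sketch with made-up KC names and parameter values:

    import math

    def pfa_probability(kcs, successes, failures, beta, gamma, rho):
        """PFA: logit(p) = sum over relevant KCs of (beta_j + gamma_j*s_j + rho_j*f_j)."""
        m = sum(beta[j] + gamma[j] * successes[j] + rho[j] * failures[j] for j in kcs)
        return 1.0 / (1.0 + math.exp(-m))

    # Hypothetical parameters for two knowledge components.
    beta  = {"slope": -0.5, "intercept": -0.2}   # KC easiness (some variants use item difficulty here)
    gamma = {"slope":  0.3, "intercept":  0.2}   # credit per prior success
    rho   = {"slope":  0.1, "intercept":  0.05}  # (smaller) credit per prior failure

    # A step exercising both KCs, given each KC's prior success/failure counts.
    p = pfa_probability(["slope", "intercept"],
                        successes={"slope": 3, "intercept": 1},
                        failures={"slope": 1, "intercept": 2},
                        beta=beta, gamma=gamma, rho=rho)
    print(f"predicted P(correct) = {p:.2f}")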
Teng, Xian; Pei, Sen; Morone, Flaviano; Makse, Hernán A
2016-10-26
Identifying the most influential spreaders that maximize information flow is a central question in network theory. Recently, a scalable method called "Collective Influence (CI)" has been put forward through collective influence maximization. In contrast to heuristic methods evaluating nodes' significance separately, the CI method inspects the collective influence of multiple spreaders. Although CI applies to the influence maximization problem in the percolation model, it is still important to examine its efficacy in realistic information spreading. Here, we examine real-world information flow in various social and scientific platforms including the American Physical Society, Facebook, Twitter and LiveJournal. Since empirical data cannot be directly mapped to ideal multi-source spreading, we leverage the behavioral patterns of users extracted from data to construct "virtual" information spreading processes. Our results demonstrate that the set of spreaders selected by CI can induce a larger scale of information propagation. Moreover, local measures such as the number of connections or citations are not necessarily the deterministic factors of nodes' importance in realistic information spreading. This result has significance for ranking scientists in scientific networks like the APS, where the commonly used number of citations can be a poor indicator of the collective influence of authors in the community.
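The CI score in question (Morone and Makse's collective influence at radius ℓ) multiplies a node's reduced degree by the sum of reduced degrees on the frontier of its ℓ-ball. A small sketch using networkx; the karate-club graph is a stand-in, not the paper's data:

    import networkx as nx

    def collective_influence(G, node, ell=2):
        """CI_l(i) = (k_i - 1) * sum over nodes at distance l of (k_j - 1)."""
        dist = nx.single_source_shortest_path_length(G, node, cutoff=ell)
        frontier = [j for j, d in dist.items() if d == ell]
        return (G.degree(node) - 1) * sum(G.degree(j) - 1 for j in frontier)

    G = nx.karate_club_graph()   # stand-in for a real social network
    ranked = sorted(G.nodes, key=lambda n: collective_influence(G, n), reverse=True)
    print("top-5 spreaders by CI:", ranked[:5])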
The effects of dorsal bundle lesions on serial and trace conditioning.
Tsaltas, E; Preston, G C; Gray, J A
1983-12-01
The performance of rats with neurotoxic lesions of the dorsal ascending noradrenergic bundle (DB) was compared with that of sham-operated control animals under two behavioural conditions. Animals with DB lesions were slower than controls to acquire a classically-conditioned emotional response (conditioned suppression) with a trace interval interposed between the clicker conditioned stimulus (CS) and the shock reinforcer. However, if the latter half of the trace interval was filled by a second stimulus, a light, the DB-lesioned animals acquired conditioned suppression to the clicker faster than did controls under the same conditions. These results are discussed in terms of the attentional theory of DB function.
Scalar gravitational waves in the effective theory of gravity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mottola, Emil
As a low energy effective field theory, classical General Relativity receives an infrared relevant modification from the conformal trace anomaly of the energy-momentum tensor of massless, or nearly massless, quantum fields. The local form of the effective action associated with the trace anomaly is expressed in terms of a dynamical scalar field that couples to the conformal factor of the spacetime metric, allowing it to propagate over macroscopic distances. Linearized around flat spacetime, this semi-classical EFT admits scalar gravitational wave solutions in addition to the transversely polarized tensor waves of the classical Einstein theory. The amplitude of the scalar wave modes, as well as their energy and energy flux, which are positive and contain a monopole moment, are computed. As a result, astrophysical sources for scalar gravitational waves are considered, with the excited gluonic condensates in the interiors of neutron stars in merger events with other compact objects likely to provide the strongest burst signals.
Risk Taking Under the Influence: A Fuzzy-Trace Theory of Emotion in Adolescence
Rivers, Susan E.; Reyna, Valerie F.; Mills, Britain
2008-01-01
Fuzzy-trace theory explains risky decision making in children, adolescents, and adults, incorporating social and cultural factors as well as differences in impulsivity. Here, we provide an overview of the theory, including support for counterintuitive predictions (e.g., when adolescents “rationally” weigh costs and benefits, risk taking increases, but it decreases when the core gist of a decision is processed). Then, we delineate how emotion shapes adolescent risk taking—from encoding of representations of options, to retrieval of values/principles, to application of those values/principles to representations of options. Our review indicates that: (i) Gist representations often incorporate emotion including valence, arousal, feeling states, and discrete emotions; and (ii) Emotion determines whether gist or verbatim representations are processed. We recommend interventions to reduce unhealthy risk-taking that inculcate stable gist representations, enabling adolescents to identify quickly and automatically danger even when experiencing emotion, which differs sharply from traditional approaches emphasizing deliberation and precise analysis. PMID:19255597
2010-12-02
This monograph examines military theory and doctrine, including interwar doctrine in historical perspective and Adaptive Campaigning, drawing on Carl von Clausewitz (On War, trans. Michael Howard and Peter Paret), who suggested using history as a tool to provide a lens through which second-order effects may be traced through an understanding of the environment.
Multiple Intelligences, the Mozart Effect, and Emotional Intelligence: A Critical Review
ERIC Educational Resources Information Center
Waterhouse, Lynn
2006-01-01
This article reviews evidence for multiple intelligences theory, the Mozart effect theory, and emotional intelligence theory and argues that despite their wide currency in education these theories lack adequate empirical support and should not be the basis for educational practice. Each theory is compared to theory counterparts in cognitive…
ERIC Educational Resources Information Center
Rank, Mark R.; LeCroy, Craig W.
1983-01-01
Examines the complementarity of three often-used theories in family research: social exchange theory, symbolic interactionism, and conflict theory. Provides a case example in which a multiple perspective is applied to a problem of marital discord. Discusses implications for the clinician. (Author/WAS)
Colour-dressed hexagon tessellations for correlation functions and non-planar corrections
NASA Astrophysics Data System (ADS)
Eden, Burkhard; Jiang, Yunfeng; le Plat, Dennis; Sfondrini, Alessandro
2018-02-01
We continue the study of four-point correlation functions by the hexagon tessellation approach initiated in [38] and [39]. We consider planar tree-level correlation functions in N=4 supersymmetric Yang-Mills theory involving two non-protected operators. We find that, in order to reproduce the field theory result, it is necessary to include SU(N) colour factors in the hexagon formalism; moreover, we find that the hexagon approach as it stands is naturally tailored to the single-trace part of correlation functions, and does not account for multi-trace admixtures. We discuss how to compute correlators involving double-trace operators, as well as more general 1/N effects; in particular we compute the whole next-to-leading order in the large-N expansion of tree-level BMN two-point functions by tessellating a torus with punctures. Finally, we turn to the issue of "wrapping", Lüscher-like corrections. We show that SU(N) colour-dressing reproduces an earlier empirical rule for incorporating single-magnon wrapping, and we provide a direct interpretation of such wrapping processes in terms of N=2 supersymmetric Feynman diagrams.
Chau, Lily S.; Prakapenka, Alesia V.; Zendeli, Liridon; Davis, Ashley S.; Galvez, Roberto
2014-01-01
Studies utilizing general learning and memory tasks have suggested the importance of neocortical structural plasticity for memory consolidation. However, these learning tasks typically result in learning of multiple different tasks over several days of training, making it difficult to determine the synaptic time course mediating each learning event. The current study used trace-eyeblink conditioning to determine the time course for neocortical spine modification during learning. With eyeblink conditioning, subjects are presented with a neutral, conditioned stimulus (CS) paired with a salient, unconditioned stimulus (US) to elicit an unconditioned response (UR). With multiple CS-US pairings, subjects learn to associate the CS with the US and exhibit a conditioned response (CR) when presented with the CS. In trace conditioning, there is a stimulus-free interval between the CS and the US. Utilizing trace-eyeblink conditioning with whisker stimulation as the CS (whisker-trace-eyeblink: WTEB), previous findings have shown that primary somatosensory (barrel) cortex is required for both acquisition and retention of the trace association. Additionally, prior findings demonstrated that WTEB acquisition results in an expansion of the cytochrome oxidase whisker representation and synaptic modification in layer IV of barrel cortex. To further explore these findings and determine the time course for neocortical learning-induced spine modification, the present study utilized WTEB conditioning to examine Golgi-Cox-stained neurons in layer IV of barrel cortex. Findings from this study demonstrated a training-dependent spine proliferation in layer IV of barrel cortex during trace associative learning. Furthermore, the finding that filopodia-like spines exhibited a pattern similar to the overall spine density further suggests that reorganization of synaptic contacts sets the foundation for learning-induced neocortical modifications across the different neocortical layers. PMID:24760074
UAV Communication Management and Coordination for Multitarget Tracking
2009-02-26
Covariance growth is penalized by a weighted trace penalty (WTP) term, defined as the product of the current covariance trace and the minimum distance to observability (MDO), which enters the terminal cost (expected cost-to-go, ECTG). The report presents results with the WTP for the ECTG and then extends the WTP to the multiple-UAV case with coordinated sensor motion.
Follow-up of serious offender patients in the community: multiple methods of tracing.
Jamieson, Elizabeth; Taylor, Pamela J
2002-01-01
Longitudinal studies of people with mental disorder are important in understanding outcome and intervention effects but attrition rates can be high. This study aimed to evaluate use of multiple record sources to trace, over 12 years, a one-year discharge cohort of high-security hospital patients. Everyone leaving such a hospital in 1984 was traced until a census date of 31 December 1995. Data were collected from several national databases (Office for National Statistics (ONS), Home Office (HO) Offenders' Index, Police National Computer Records, the Electoral Roll) and by hand-searching responsible agency records (HO, National Health Service). Using all methods, only three of the 204 patients had no follow-up information. Home Office Mental Health Unit data were an excellent source, but only for people still under discharge restrictions (<50% after eight years). Sequential tracing of hospital placements for people never or no longer under such restrictions was laborious and also produced only group-specific yield. The best indicator of community residence was ONS information on general practitioner (GP/primary care) registration. The electoral roll was useful when other sources were exhausted. Follow-up of offenders/offender-patients has generally focused on event data, such as re-offending. People untraced by that method alone, however, are unlikely to be lost to follow-up on casting a wider records net. Using multiple records, attrition at the census was 38%, but, after certain assumptions, reduced further to 5%.
The International Postal Network and Other Global Flows as Proxies for National Wellbeing.
Hristova, Desislava; Rutherford, Alex; Anson, Jose; Luengo-Oroz, Miguel; Mascolo, Cecilia
2016-01-01
The digital exhaust left by flows of physical and digital commodities provides a rich measure of the nature, strength and significance of relationships between countries in the global network. With this work, we examine how these traces and the network structure can reveal the socioeconomic profile of different countries. We take into account multiple international networks of physical and digital flows, including the previously unexplored international postal network. By measuring the position of each country in the Trade, Postal, Migration, International Flights, IP and Digital Communications networks, we are able to build proxies for a number of crucial socioeconomic indicators such as GDP per capita and the Human Development Index ranking along with twelve other indicators used as benchmarks of national well-being by the United Nations and other international organisations. In this context, we have also proposed and evaluated a global connectivity degree measure applying multiplex theory across the six networks that accounts for the strength of relationships between countries. We conclude by showing how countries with shared community membership over multiple networks have similar socioeconomic profiles. Combining multiple flow data sources can help understand the forces which drive economic activity on a global level. Such an ability to infer proxy indicators in a context of incomplete information is extremely timely in light of recent discussions on measurement of indicators relevant to the Sustainable Development Goals.
Vazquez-Leal, H.; Jimenez-Fernandez, V. M.; Benhammouda, B.; Filobello-Nino, U.; Sarmiento-Reyes, A.; Ramirez-Pinero, A.; Marin-Hernandez, A.; Huerta-Chua, J.
2014-01-01
We present a homotopy continuation method (HCM) for finding multiple operating points of nonlinear circuits composed of devices modelled by using piecewise linear (PWL) representations. We propose an adaptation of the modified spheres path tracking algorithm to trace the homotopy trajectories of PWL circuits. In order to assess the benefits of this proposal, four nonlinear circuits composed of piecewise-linear modelled devices are analysed to determine their multiple operating points. The results show that HCM can find multiple solutions within a single homotopy trajectory. Furthermore, we take advantage of the fact that homotopy trajectories are PWL curves to replace the multidimensional interpolation and fine-tuning stages of the path tracking algorithm with a simple and highly accurate procedure based on the parametric straight-line equation. PMID:25184157
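The parametric straight-line step can be illustrated with a small sketch (an illustration with made-up path points, not the authors' modified-spheres algorithm): once the traced homotopy path is known to be piecewise linear, every operating point at the target homotopy level lambda = 1 is obtained by linear interpolation between the path points that bracket it; a folded path that crosses lambda = 1 several times yields several operating points.

    import numpy as np

    # Hypothetical traced path points (lambda, x) along a folded PWL homotopy trajectory.
    path = np.array([[0.0, 0.0], [0.7, 0.4], [1.3, 0.9], [0.8, 1.5], [1.2, 2.2]])

    def solutions_at(path, lam=1.0):
        """Interpolate every bracketing segment of the PWL path at homotopy level lam."""
        sols = []
        for (l0, x0), (l1, x1) in zip(path[:-1], path[1:]):
            if min(l0, l1) <= lam <= max(l0, l1) and l0 != l1:
                t = (lam - l0) / (l1 - l0)                # parametric straight-line equation
                sols.append(x0 + t * (x1 - x0))
        return sols

    print(solutions_at(path))   # three operating points from one folded trajectory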
ERIC Educational Resources Information Center
Beuls, Katrien
2013-01-01
Construction Grammar (CxG) is a well-established linguistic theory that takes the notion of a construction as the basic unit of language. Yet, because the potential of this theory for language teaching or SLA has largely remained ignored, this paper demonstrates the benefits of adopting the CxG approach for modelling a student's linguistic…
Theory Development and Application in Higher Education Research: The Case of Academic Drift
ERIC Educational Resources Information Center
Tight, Malcolm
2015-01-01
This article examines the case of academic drift, as an example of a theory developed and applied within higher education research. It traces the origins and meaning of the term, reviews its application by higher education researchers, and discusses the issues it raises and the critiques it has attracted. It concludes that academic drift is at the…
Ohtsuka, Masahiro; Muto, Shunsuke; Tatsumi, Kazuyoshi; Kobayashi, Yoshinori; Kawata, Tsunehiro
2016-04-01
The occupation sites and the occupancies of trace dopants in La/Co co-doped Sr-M-type ferrite, SrFe12O19, were quantitatively and precisely determined by beam-rocking energy-dispersive X-ray spectroscopy (EDXS) on the basis of electron-channeling effects. Because the Co atoms, in particular, are expected to substitute partially at the five crystallographically inequivalent sites, which could be key parameters in improving the magneto-crystalline anisotropy, it is difficult yet intriguing to discover their occupation sites and occupancies without using the methods of large-scale facilities, such as neutron diffraction and synchrotron radiation. In the present study, we tackled this problem by applying an extended statistical atom location by channeling enhanced microanalysis method, using conventional transmission electron microscopy, EDXS and dynamical electron elastic/inelastic scattering theories. The results show that the key occupation sites of Co were the 2a, 4f1 and 12k sites. The quantified occupancies of Co were consistent with those of the previous study, which involved a combination of neutron diffraction and extended X-ray absorption fine structure analysis, as well as energetics considerations based on first-principles calculations.
The Hippocampus Remains Activated over the Long Term for the Retrieval of Truly Episodic Memories
Harand, Caroline; Bertran, Françoise; La Joie, Renaud; Landeau, Brigitte; Mézenge, Florence; Desgranges, Béatrice; Peigneux, Philippe; Eustache, Francis; Rauchs, Géraldine
2012-01-01
The role of the hippocampus in declarative memory consolidation is a matter of intense debate. We investigated the neural substrates of memory retrieval for recent and remote information using functional magnetic resonance imaging (fMRI). 18 young, healthy participants learned a series of pictures. Then, during two fMRI recognition sessions, 3 days and 3 months later, they had to determine whether they recognized or not each picture using the “Remember/Know” procedure. Presentation of the same learned images at both delays allowed us to track the evolution of memories and distinguish consistently episodic memories from those that were initially episodic and then became familiar or semantic over time and were retrieved without any contextual detail. Hippocampal activation decreased over time for initially episodic, later semantic memories, but remained stable for consistently episodic ones, at least in its posterior part. For both types of memories, neocortical activations were observed at both delays, notably in the ventromedial prefrontal and anterior cingulate cortices. These activations may reflect a gradual reorganization of memory traces within neural networks. Our data indicate maintenance and strengthening of hippocampal and cortico-cortical connections in the consolidation and retrieval of episodic memories over time, in line with the Multiple Trace theory (Nadel and Moscovitch, 1997). At variance, memories becoming semantic over time consolidate through strengthening of cortico-cortical connections and progressive disengagement of the hippocampus. PMID:22937055
Sonnemann, Eckart
2008-10-01
The introduction of sequentially rejective multiple test procedures (Einot and Gabriel, 1975; Naik, 1975; Holm, 1977; Holm, 1979) has caused considerable progress in the theory of multiple comparisons. Emphasizing the closure of multiple tests we give a survey of the general theory and its recent results in applications. Some new applications are given including a discussion of the connection with the theory of confidence regions.
Newton's Experimentum Crucis Reconsidered
ERIC Educational Resources Information Center
Holtsmark, Torger
1970-01-01
Certain terminological inconsistencies in the teaching of optical theory at the elementary level are traced back to Newton who derived them from Euclidean geometrical optics. Discusses this terminological ambiguity which influenced later textbooks. (LS)
Extremal black holes in dynamical Chern-Simons gravity
NASA Astrophysics Data System (ADS)
McNees, Robert; Stein, Leo C.; Yunes, Nicolás
2016-12-01
Rapidly rotating black hole (BH) solutions in theories beyond general relativity (GR) play a key role in experimental gravity, as they allow us to compute observables in extreme spacetimes that deviate from the predictions of GR. Such solutions are often difficult to find in beyond-general-relativity theories due to the inclusion of additional fields that couple to the metric nonlinearly and non-minimally. In this paper, we consider rotating BH solutions in one such theory, dynamical Chern-Simons (dCS) gravity, where the Einstein-Hilbert action is modified by the introduction of a dynamical scalar field that couples to the metric through the Pontryagin density. We treat dCS gravity as an effective field theory and work in the decoupling limit, where corrections are treated as small perturbations from GR. We perturb about the maximally rotating Kerr solution, the so-called extremal limit, and develop mathematical insight into the analysis techniques needed to construct solutions for generic spin. First we find closed-form, analytic expressions for the extremal scalar field, and then determine the trace of the metric perturbation, giving both in terms of Legendre decompositions. Retaining only the first three and four modes in the Legendre representation of the scalar field and the trace, respectively, suffices to ensure a fidelity of over 99% relative to full numerical solutions. The leading-order mode in the Legendre expansion of the trace of the metric perturbation contains a logarithmic divergence at the extremal Kerr horizon, which is likely to be unimportant as it occurs inside the perturbed dCS horizon. The techniques employed here should enable the construction of analytic, closed-form expressions for the scalar field and metric perturbations on a background with arbitrary rotation.
Radiative double copy for Einstein-Yang-Mills theory
NASA Astrophysics Data System (ADS)
Chester, David
2018-04-01
Recently, a double-copy formalism was used to calculate gravitational radiation from classical Yang-Mills radiation solutions. This work shows that the Yang-Mills theory coupled to a biadjoint scalar field admits a radiative double copy that agrees with solutions in the Einstein-Yang-Mills theory at the lowest finite order. Within this context, the trace-reversed metric perturbation $\bar{h}_{\mu\nu}$ is a natural double copy of the gauge boson $A^a_\mu$. This work provides additional evidence that solutions in gauge and gravity theories are related, even though their respective Lagrangians and nonlinear equations of motion appear to be different.
Are memory traces localized or distributed?
Thompson, R F
1991-01-01
Evidence supports the view that "memory traces" are formed in the hippocampus and in the cerebellum in classical conditioning of discrete behavioral responses (e.g. eyeblink conditioning). In the hippocampus, learning results in long-lasting increases in excitability of pyramidal neurons that appear to be localized to these neurons (i.e. changes in membrane properties and receptor function). However, these learning-altered pyramidal neurons are distributed widely throughout CA3 and CA1. Although it plays a key role in certain aspects of classical conditioning, the hippocampus is not necessary for learning and memory of the basic conditioned responses. The cerebellum and its associated brain stem circuitry, on the other hand, does appear to be essential (necessary and sufficient) for learning and memory of the conditioned response. Evidence to date is most consistent with a localized trace in the interpositus nucleus and multiple localized traces in cerebellar cortex, each involving relatively large ensembles of neurons. Perhaps "procedural" memory traces are relatively localized and "declarative" traces more widely distributed.
NASA Astrophysics Data System (ADS)
Wang, Zu-liang; Zhang, Ting; Xie, Shi-yang
2017-01-01
In order to improve the efficiency of agricultural product tracing and reduce tracking and monitoring costs, quality tracking and tracing of agricultural products based on Radio-Frequency Identification (RFID) technology is studied, and a tracing and tracking model is set up. A three-layer structure model is established to realize high-quality traceability and tracking of agricultural products. To solve collision problems between multiple RFID tags and improve identification efficiency, a new reservation slot allocation mechanism is proposed, and its parameters are analyzed and optimized by numerical simulation.
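The abstract does not detail the proposed reservation mechanism; purely as a baseline illustration, the framed slotted ALOHA anti-collision round that such mechanisms typically improve upon can be simulated as follows (all tag counts and frame sizes are hypothetical):

    import random

    def framed_slotted_aloha_round(n_tags, frame_size):
        """One read round: tags pick slots at random; singly-occupied slots are identified."""
        slots = [0] * frame_size
        for _ in range(n_tags):
            slots[random.randrange(frame_size)] += 1
        identified = sum(1 for s in slots if s == 1)
        collided = sum(1 for s in slots if s > 1)
        return identified, collided

    random.seed(1)
    for frame in (16, 32, 64):
        ok, coll = framed_slotted_aloha_round(n_tags=40, frame_size=frame)
        print(f"frame={frame:3d}: identified {ok} tags, {coll} collision slots")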
Low-Power Architecture for an Optical Life Gas Analyzer
NASA Technical Reports Server (NTRS)
Pilgrim, Jeffrey; Vakhtin, Andrei
2012-01-01
Analog and digital electronic control architecture has been combined with an operating methodology for an optical trace gas sensor platform that allows very low power consumption while providing four independent gas measurements in essentially real time, as well as a user interface and digital data storage and output. The implemented design eliminates the cross-talk between the measurement channels while maximizing the sensitivity, selectivity, and dynamic range for each measured gas. The combination provides for battery operation on a simple camcorder battery for as long as eight hours. The custom, compact, rugged, self-contained design specifically targets applications of optical major constituent and trace gas detection for multiple gases using multiple lasers and photodetectors in an integrated package.
ERIC Educational Resources Information Center
Annamma, Subini Ancy; Ferri, Beth A.; Connor, David J.
2018-01-01
In this review, we explore how intersectionality has been engaged with through the lens of disability critical race theory (DisCrit) to produce new knowledge. In this chapter, we (1) trace the intellectual lineage for developing DisCrit, (2) review the body of interdisciplinary scholarship incorporating DisCrit to date, and (3) propose the future…
1993-01-01
This thesis on ethical decision making in business ("Ethical Decision Making in the Trenches") examines particular-case intuitionism, a position opposed to utilitarianism, Kantianism, and the divine-command theory, traces the origin of the intuitionist idea, and closes with a general Kantian criticism of the utilitarian ethic.
ERIC Educational Resources Information Center
Lee, Joseph J.; Murphy, John; Baker, Amanda
2015-01-01
This study traces the reception history of Freeman and Johnson's (1998) widely cited article dedicated to theory and practices of second language teacher education (SLTE). It illuminates the degree to which that article has impacted SLTE theory, research, and potentially instructional practices. The reception study analysis is based on a data set…
First-principles multiple-barrier diffusion theory. The case study of interstitial diffusion in CdTe
Yang, Ji-Hui; Park, Ji-Sang; Kang, Joongoo; ...
2015-02-17
The diffusion of particles in solid-state materials generally involves several sequential thermal-activation processes. However, present diffusion coefficient theory deals only with a single barrier; that is, it lacks an accurate description of multiple-barrier diffusion. Here, we develop a general diffusion coefficient theory for multiple-barrier diffusion. Using our diffusion theory and first-principles calculated hopping rates for each barrier, we calculate the diffusion coefficients of Cd, Cu, Te, and Cl interstitials in CdTe for their full multiple-barrier diffusion pathways. As a result, we found that the calculated diffusivity agrees well with the experimental measurement, thus justifying our theory, which is general for many other systems.
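A back-of-envelope sketch of why a single-barrier expression is insufficient (not the paper's general theory, and all barrier heights below are hypothetical): when hops over several barriers must occur in sequence, the mean waiting times add, so the effective rate is dominated by the highest barrier.

    import numpy as np

    kB = 8.617e-5                      # Boltzmann constant, eV/K
    T = 600.0                          # temperature, K
    nu = 1.0e13                        # assumed attempt frequency, 1/s
    barriers_eV = [0.35, 0.80, 0.50]   # hypothetical sequential migration barriers

    rates = nu * np.exp(-np.array(barriers_eV) / (kB * T))   # Arrhenius hop rates
    k_eff = 1.0 / np.sum(1.0 / rates)                         # sequential hops: waiting times add
    print(f"individual rates (1/s): {rates}")
    print(f"effective sequential rate (1/s): {k_eff:.3e}")    # dominated by the 0.80 eV barrier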
ERIC Educational Resources Information Center
Cooper, M. L.
1970-01-01
This short biography of Fresnel traces his early education, his work as an engineer and his theories and discoveries in optics. The importance of Fresnel's ideas on diffraction, interference and double refraction are discussed. Bibliography. (LC)
CYBER 200 Applications Seminar
NASA Technical Reports Server (NTRS)
Gary, J. P. (Compiler)
1984-01-01
Applications suited for the CYBER 200 digital computer are discussed. Various areas of application are discussed, including meteorology, algorithms, fluid dynamics, Monte Carlo methods, petroleum, electronic circuit simulation, biochemistry, lattice gauge theory, economics, and ray tracing.
Four-body correlation embedded in antisymmetrized geminal power wave function.
Kawasaki, Airi; Sugino, Osamu
2016-12-28
We extend Coleman's antisymmetrized geminal power (AGP) to develop a wave function theory that can incorporate up to four-body correlation in a region of strong correlation. To facilitate the variational determination of the wave function, the total energy is rewritten in terms of the traces of geminals. This novel trace formula is applied to a simple model system consisting of a one-dimensional Hubbard ring with a site of strong correlation. Our scheme significantly improves the result obtained by the AGP-configuration interaction scheme of Uemura et al. and also achieves more efficient compression of the degrees of freedom of the wave function. We regard the result as a step toward a first-principles wave function theory for a strongly correlated point defect or adsorbate embedded in an AGP-based mean-field medium.
Multiple internal seal ring micro-electro-mechanical system vacuum package
NASA Technical Reports Server (NTRS)
Shcheglov, Kirill V. (Inventor); Wiberg, Dean V. (Inventor); Hayworth, Ken J. (Inventor); Yee, Karl Y. (Inventor); Bae, Youngsam (Inventor); Challoner, A. Dorian (Inventor); Peay, Chris S. (Inventor)
2007-01-01
A Multiple Internal Seal Ring (MISR) Micro-Electro-Mechanical System (MEMS) vacuum package that hermetically seals MEMS devices using MISR. The method bonds a capping plate having metal seal rings to a base plate having metal seal rings by wafer bonding the capping plate wafer to the base plate wafer. Bulk electrodes may be used to provide conductive paths between the seal rings on the base plate and the capping plate. All seals are made using only metal-to-metal seal rings deposited on the polished surfaces of the base plate and capping plate wafers. However, multiple electrical feed-through metal traces are provided by fabricating via holes through the capping plate for electrical connection from the outside of the package through the via-holes to the inside of the package. Each metal seal ring serves the dual purposes of hermetic sealing and providing the electrical feed-through metal trace.
Multiple internal seal ring micro-electro-mechanical system vacuum packaging method
NASA Technical Reports Server (NTRS)
Hayworth, Ken J. (Inventor); Yee, Karl Y. (Inventor); Shcheglov, Kirill V. (Inventor); Bae, Youngsam (Inventor); Wiberg, Dean V. (Inventor); Challoner, A. Dorian (Inventor); Peay, Chris S. (Inventor)
2008-01-01
A Multiple Internal Seal Ring (MISR) Micro-Electro-Mechanical System (MEMS) vacuum packaging method that hermetically seals MEMS devices using MISR. The method bonds a capping plate having metal seal rings to a base plate having metal seal rings by wafer bonding the capping plate wafer to the base plate wafer. Bulk electrodes may be used to provide conductive paths between the seal rings on the base plate and the capping plate. All seals are made using only metal-to-metal seal rings deposited on the polished surfaces of the base plate and capping plate wafers. However, multiple electrical feed-through metal traces are provided by fabricating via holes through the capping plate for electrical connection from the outside of the package through the via-holes to the inside of the package. Each metal seal ring serves the dual purposes of hermetic sealing and providing the electrical feed-through metal trace.
NASA Astrophysics Data System (ADS)
Phillips, Mark C.; Taubman, Matthew S.; Kriesel, Jason
2015-01-01
We describe a prototype trace gas sensor designed for real-time detection of multiple chemicals. The sensor uses an external cavity quantum cascade laser (ECQCL) swept over its tuning range of 940-1075 cm-1 (9.30-10.7 μm) at a 10 Hz repetition rate. The sensor was designed for operation in multiple modes, including gas sensing within a multi-pass Herriott cell and intracavity absorption sensing using the ECQCL compliance voltage. In addition, the ECQCL compliance voltage was used to reduce effects of long-term drifts in the ECQCL output power. The sensor was characterized for noise, drift, and detection of chemicals including ammonia, methanol, ethanol, isopropanol, Freon-134a, Freon-152a, and diisopropyl methylphosphonate (DIMP). We also present use of the sensor for mobile detection of ammonia downwind of cattle facilities, in which concentrations were recorded at 1-s intervals.
System-wide versus component-specific trust using multiple aids.
Keller, David; Rice, Stephen
2010-01-01
Previous research in operator trust toward automated aids has focused primarily on single aids. The current study focuses on how operator trust is affected by the presence of multiple aids. Two competing theories of multiple-aid trust are presented. A component-specific trust theory predicts that operators will differentially place their trust in automated aids that vary in reliability. A system-wide trust theory predicts that operators will treat multiple imperfect aids as one "system" and merge their trust across aids despite differences in the aids' reliability. A simulated flight task was used to test these theories, whereby operators performed a pursuit tracking task while concurrently monitoring multiple system gauges that were augmented with perfect or imperfect automated aids. The results showed that system-wide trust theory best predicted the data; operators merged their trust across both aids, behaving toward a perfectly reliable aid in the same manner as they did toward unreliable aids.
The Perceptions of STEM from Eighth-Grade African-American Girls in a High-Minority Middle School
NASA Astrophysics Data System (ADS)
Hare, LaChanda N.
Even with the existence of STEM curriculum and STEM programs that target women and minorities, African-American females still lag behind other ethnic groups in STEM fields. Reasons for the underrepresentation of females in STEM fields can be traced back to the early years of schooling. The purpose of this study was to identify the factors that impact African-American females' perspectives of STEM subjects and STEM careers. An explanatory sequential mixed-methods approach was used for data collection with a survey, focus group, and interview. Forty students (12 male, 28 female) from different ethnic groups were surveyed. The focus group and interview sessions consisted of 21 African-American females from two distinct groups: those enrolled in the school's STEM program (STEM) and those who were not enrolled in the STEM program (Non-STEM). Self-efficacy theory and social cognitive career theory served as the theoretical constructs guiding the data analysis. Multiple regression results showed that outcome expectation and personal disposition had the greatest influence on the females' interest in STEM content and STEM careers. Results from the qualitative portion of the study revealed that the learning environment and STEM self-efficacy had a significant impact on African-American females' interest in STEM.
Quantum mechanics: The Bayesian theory generalized to the space of Hermitian matrices
NASA Astrophysics Data System (ADS)
Benavoli, Alessio; Facchini, Alessandro; Zaffalon, Marco
2016-10-01
We consider the problem of gambling on a quantum experiment and enforce rational behavior by a few rules. These rules yield, in the classical case, the Bayesian theory of probability via duality theorems. In our quantum setting, they yield the Bayesian theory generalized to the space of Hermitian matrices. This very theory is quantum mechanics: in fact, we derive all its four postulates from the generalized Bayesian theory. This implies that quantum mechanics is self-consistent. It also leads us to reinterpret the main operations in quantum mechanics as probability rules: Bayes' rule (measurement), marginalization (partial tracing), independence (tensor product). To say it with a slogan, we obtain that quantum mechanics is the Bayesian theory in the complex numbers.
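To make the abstract's reinterpretation concrete, the sketch below shows the two quantum operations it maps onto probability rules: the Born rule (the measurement analogue of Bayes' rule) and the partial trace (marginalization). This is standard density-matrix arithmetic in numpy, offered as an illustration of the operations named in the abstract, not code or notation from the paper itself.

```python
# Illustrative sketch of two operations the abstract reinterprets as probability
# rules: the Born rule (measurement) and the partial trace (marginalization).
# Standard density-matrix manipulations in numpy, not the authors' formalism.
import numpy as np

def born_probability(rho, effect):
    """P(outcome) = Tr(rho * E) for a density matrix rho and POVM effect E."""
    return np.real(np.trace(rho @ effect))

def partial_trace_B(rho_AB, dim_A, dim_B):
    """Marginalize out subsystem B: (Tr_B rho)_{ij} = sum_k rho_{(i,k),(j,k)}."""
    rho = rho_AB.reshape(dim_A, dim_B, dim_A, dim_B)
    return np.einsum('ikjk->ij', rho)

# Two-qubit maximally entangled state |phi+><phi+|
phi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho_AB = np.outer(phi, phi.conj())

proj0 = np.diag([1.0, 0.0])                     # measure qubit A with |0><0|
E = np.kron(proj0, np.eye(2))
print(born_probability(rho_AB, E))              # 0.5
print(partial_trace_B(rho_AB, 2, 2))            # maximally mixed state I/2
```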
The role of the medial prefrontal cortex in trace fear extinction
Kwapis, Janine L.; Jarome, Timothy J.
2015-01-01
The extinction of delay fear conditioning relies on a neural circuit that has received much attention and is relatively well defined. Whether this established circuit also supports the extinction of more complex associations, however, is unclear. Trace fear conditioning is a better model of complex relational learning, yet the circuit that supports extinction of this memory has received very little attention. Recent research has indicated that trace fear extinction requires a different neural circuit than delay extinction; trace extinction requires the participation of the retrosplenial cortex, but not the amygdala, as noted in a previous study. Here, we tested the roles of the prelimbic and infralimbic regions of the medial prefrontal cortex in trace and delay fear extinction by blocking NMDA receptors during extinction learning. We found that the prelimbic cortex is necessary for trace, but not for delay fear extinction, whereas the infralimbic cortex is involved in both types of extinction. These results are consistent with the idea that trace fear associations require plasticity in multiple cortical areas for successful extinction. Further, the infralimbic cortex appears to play a role in extinction regardless of whether the animal was initially trained in trace or delay conditioning. Together, our results provide new information about how the neural circuits supporting trace and delay fear extinction differ. PMID:25512576
Inadequate Evidence for Multiple Intelligences, Mozart Effect, and Emotional Intelligence Theories
ERIC Educational Resources Information Center
Waterhouse, Lynn
2006-01-01
I (Waterhouse, 2006) argued that, because multiple intelligences, the Mozart effect, and emotional intelligence theories have inadequate empirical support and are not consistent with cognitive neuroscience findings, these theories should not be applied in education. Proponents countered that their theories had sufficient empirical support, were…
A first proposal for a general description model of forensic traces
NASA Astrophysics Data System (ADS)
Lindauer, Ina; Schäler, Martin; Vielhauer, Claus; Saake, Gunter; Hildebrandt, Mario
2012-06-01
In recent years, the amount of digitally captured traces at crime scenes has increased rapidly. There are various kinds of such traces, like pick marks on locks, latent fingerprints on various surfaces, as well as different micro traces. Those traces differ from each other not only in kind but also in the information they provide. Every kind of trace has its own properties (e.g., minutiae for fingerprints, or raking traces for locks), but there are also large amounts of metadata which all traces have in common, like location, time and other additional information in relation to crime scenes. For selected types of crime scene traces, type-specific databases already exist, such as the ViCLAS for sexual offences, the IBIS for ballistic forensics or the AFIS for fingerprints. These existing forensic databases differ strongly in their trace description models. For forensic experts it would be beneficial to work with only one database capable of handling all possible forensic traces acquired at a crime scene. This is especially the case when different kinds of traces are interrelated (e.g., fingerprints and ballistic marks on a bullet casing). Unfortunately, current research on interrelated traces as well as on general forensic data models and structures is not mature enough to build such an encompassing forensic database. Nevertheless, recent advances in the field of contact-less scanning make it possible to acquire different kinds of traces with the same device. As a result, the data describing these traces are structured similarly, which simplifies the design of a general forensic data model for different kinds of traces. In this paper we introduce a first common description model for different forensic trace types. Furthermore, for selected trace types, we apply the phases of the well-established database schema development process to transfer expert knowledge from the corresponding forensic fields into an extendible, database-driven, generalised forensic description model. The trace types considered here are fingerprint traces, traces at locks, micro traces and ballistic traces. Based on these basic trace types, combined traces (multiple or overlapping fingerprints, fingerprints on bullet casings, etc.) and partial traces are also considered.
Conceptual Developments of 20th Century Field Theories
NASA Astrophysics Data System (ADS)
Cao, Tian Yu
1998-06-01
This volume provides a broad synthesis of conceptual developments of twentieth century field theories, from the general theory of relativity to quantum field theory and gauge theory. The book traces the foundations and evolution of these theories within a historio-critical context. Theoretical physicists and students of theoretical physics will find this a valuable account of the foundational problems of their discipline that will help them understand the internal logic and dynamics of theoretical physics. It will also provide professional historians and philosophers of science, particularly philosophers of physics, with a conceptual basis for further historical, cultural and sociological analysis of the theories discussed. Finally, the scientifically qualified general reader will find in this book a deeper analysis of contemporary conceptions of the physical world than can be found in popular accounts of the subject.
Conceptual Developments of 20th Century Field Theories
NASA Astrophysics Data System (ADS)
Cao, Tian Yu
1997-02-01
This volume provides a broad synthesis of conceptual developments of twentieth century field theories, from the general theory of relativity to quantum field theory and gauge theory. The book traces the foundations and evolution of these theories within a historio-critical context. Theoretical physicists and students of theoretical physics will find this a valuable account of the foundational problems of their discipline that will help them understand the internal logic and dynamics of theoretical physics. It will also provide professional historians and philosophers of science, particularly philosophers of physics, with a conceptual basis for further historical, cultural and sociological analysis of the theories discussed. Finally, the scientifically qualified general reader will find in this book a deeper analysis of contemporary conceptions of the physical world than can be found in popular accounts of the subject.
The history of imitation in learning theory: the language acquisition process.
Kymissis, E; Poulson, C L
1990-01-01
The concept of imitation has undergone different analyses in the hands of different learning theorists throughout the history of psychology. From Thorndike's connectionism to Pavlov's classical conditioning, Hull's monistic theory, Mowrer's two-factor theory, and Skinner's operant theory, there have been several divergent accounts of the conditions that produce imitation and the conditions under which imitation itself may facilitate language acquisition. In tracing the roots of the concept of imitation in the history of learning theory, the authors conclude that generalized imitation, as defined and analyzed by operant learning theorists, is a sufficiently robust formulation of learned imitation to facilitate a behavior-analytic account of first-language acquisition. PMID:2230633
Solar test of Dirac's large number hypothesis. [multiplicative creation model for solar evolution
NASA Technical Reports Server (NTRS)
Chin, C.-W.; Stothers, R.
1975-01-01
An investigation is conducted regarding the implications of Dirac's theories (1973, 1974) concerning the creation of new matter. It is found that Dirac's theory of multiplicative creation, unlike his theory of additive creation, is consistent with known facts about the Sun. According to the theory of additive creation, matter is formed uniformly throughout space. The concept of multiplicative creation implies that existing matter multiplies itself in proportion to the amount of matter already present.
NASA Astrophysics Data System (ADS)
Iwakoshi, Takehisa; Hirota, Osamu
2014-10-01
This study tests an interpretation in quantum key distribution (QKD) according to which the trace distance between the distributed quantum state and the ideal mixed state is a maximum failure probability of the protocol. Around 2004, this interpretation was proposed and standardized to satisfy both key uniformity in the context of universal composability and an operational meaning for the failure probability of the key extraction. However, the proposal has not been concretely verified for many years, while H. P. Yuen and O. Hirota have cast doubt on this interpretation since 2009. To assess the interpretation, a physical random number generator was employed to evaluate key uniformity in QKD. We calculated the statistical distance, which corresponds to the trace distance in quantum theory after a quantum measurement is made, and compared it with the failure probability to determine whether universal composability was obtained. As a result, the statistical distance between the probability distribution of the physical random numbers and the ideal uniform distribution was very large. We also explain why the trace distance is not suitable for guaranteeing security in QKD from the viewpoint of quantum binary decision theory.
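The comparison the abstract describes reduces, after measurement, to a classical statistical (total variation) distance between the observed key distribution and the ideal uniform one. The sketch below computes that quantity from samples; it is an illustration of the standard definition, with an invented biased generator, not the authors' evaluation code.

```python
# Minimal sketch: the statistical (total variation) distance between an empirical
# distribution of random numbers and the ideal uniform distribution. For commuting
# (post-measurement, diagonal) states this coincides with the quantum trace distance
# D(rho, sigma) = (1/2) Tr|rho - sigma|. Illustrative only, not the authors' code.
import numpy as np

def statistical_distance(p, q):
    """d(P, Q) = (1/2) * sum_i |p_i - q_i|."""
    return 0.5 * np.abs(np.asarray(p) - np.asarray(q)).sum()

def empirical_vs_uniform(samples, n_outcomes):
    counts = np.bincount(samples, minlength=n_outcomes).astype(float)
    p_emp = counts / counts.sum()
    p_uniform = np.full(n_outcomes, 1.0 / n_outcomes)
    return statistical_distance(p_emp, p_uniform)

# Toy usage: 8-bit "keys" drawn from a slightly biased generator (made-up bias).
rng = np.random.default_rng(0)
weights = np.linspace(1.2, 0.8, 256)
biased = rng.choice(256, size=100_000, p=weights / weights.sum())
print(empirical_vs_uniform(biased, 256))   # nonzero distance reveals the bias
```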
ERIC Educational Resources Information Center
Peariso, Jamon F.
2008-01-01
Howard Gardner's Multiple Intelligences (MI) theory has been widely accepted in the field of education for the past two decades. Most educators have been subjected to the MI theory and to the many issues that its implementation in the classroom brings. This is often done without ever looking at or being presented the critic's view or research on…
Paxton, Alexandra; Griffiths, Thomas L
2017-10-01
Today, people generate and store more data than ever before as they interact with both real and virtual environments. These digital traces of behavior and cognition offer cognitive scientists and psychologists an unprecedented opportunity to test theories outside the laboratory. Despite general excitement about big data and naturally occurring datasets among researchers, three "gaps" stand in the way of their wider adoption in theory-driven research: the imagination gap, the skills gap, and the culture gap. We outline an approach to bridging these three gaps while respecting our responsibilities to the public as participants in and consumers of the resulting research. To that end, we introduce Data on the Mind (http://www.dataonthemind.org), a community-focused initiative aimed at meeting the unprecedented challenges and opportunities of theory-driven research with big data and naturally occurring datasets. We argue that big data and naturally occurring datasets are most powerfully used to supplement, not supplant, traditional experimental paradigms in order to understand human behavior and cognition, and we highlight emerging ethical issues related to the collection, sharing, and use of these powerful datasets.
Seismic wavefield propagation in 2D anisotropic media: Ray theory versus wave-equation simulation
NASA Astrophysics Data System (ADS)
Bai, Chao-ying; Hu, Guang-yi; Zhang, Yan-teng; Li, Zhong-sheng
2014-05-01
Although ray theory is based on a high-frequency approximation to the elastic wave equation, ray-theoretical and wave-equation simulation methods should serve as mutual checks and hence be developed jointly; in practice, however, they have progressed largely in parallel and independently. For this reason, in this paper we take an alternative approach and mutually verify and test the computational accuracy and solution correctness of both the ray theory (the multistage irregular shortest-path method) and the wave-equation simulation methods (the staggered finite difference method and the pseudo-spectral method) in anisotropic VTI and TTI media. Through analysis and comparison of wavefield snapshots, common-source gather profiles and synthetic seismograms, we are able not only to verify the accuracy and correctness of each method, at least for kinematic features, but also to better understand the kinematic and dynamic features of wave propagation in anisotropic media. The results show that both the staggered finite difference method and the pseudo-spectral method yield the same results even for complex anisotropic media (such as a fault model); the multistage irregular shortest-path method is capable of predicting kinematic features similar to those of the wave-equation simulation methods, so the two approaches can be used to test each other for methodological accuracy and solution correctness. In addition, with the aid of the ray tracing results, it is easy to identify the multi-phases (or multiples) in the wavefield snapshots, common-source gather sections and synthetic seismograms predicted by the wave-equation simulation methods, which is a key issue for later seismic applications.
NASA Astrophysics Data System (ADS)
Maeno, Tsuyoshi; Ueyama, Hiroya; Iida, Michihira; Fujiwara, Osamu
It is well known that electromagnetic disturbances in vehicle-mounted radios are mainly caused by conducted noise currents flowing through wiring-harnesses from vehicle-mounted printed circuit boards (PCBs) whose common ground patterns contain slits. To suppress the noise current outflows from PCBs of this kind, we previously measured noise current outflows from simple two-layer PCBs having two parallel signal traces and different ground patterns with/without slits, which revealed that making slits with open ends on the ground patterns in parallel with the traces can reduce the conducted noise currents. In the present study, using FDTD simulation, we investigated the reduction characteristics of the FM-band cross-talk noise levels between two parallel signal traces for eighteen PCBs, which have different ground patterns with/without slits parallel to the traces and dielectric layers of different thickness. As a result, we found that the slits provide a cross-talk reduction of 3.6-5.3 dB, while the cross-talk between signal traces is reduced in inverse proportion to the square of the dielectric-layer thickness and in proportion to the square of the trace interval, which can be explained quantitatively by an inductive coupling theory.
Resource Theory of Superposition
NASA Astrophysics Data System (ADS)
Theurer, T.; Killoran, N.; Egloff, D.; Plenio, M. B.
2017-12-01
The superposition principle lies at the heart of many nonclassical properties of quantum mechanics. Motivated by this, we introduce a rigorous resource theory framework for the quantification of superposition of a finite number of linearly independent states. This theory is a generalization of resource theories of coherence. We determine the general structure of operations which do not create superposition, find a fundamental connection to unambiguous state discrimination, and propose several quantitative superposition measures. Using this theory, we show that trace decreasing operations can be completed for free which, when specialized to the theory of coherence, resolves an outstanding open question and is used to address the free probabilistic transformation between pure states. Finally, we prove that linearly independent superposition is a necessary and sufficient condition for the faithful creation of entanglement in discrete settings, establishing a strong structural connection between our theory of superposition and entanglement theory.
NASA Astrophysics Data System (ADS)
Bai, Chao-Ying; Huang, Guo-Jiao; Li, Xiao-Ling; Zhou, Bing; Greenhalgh, Stewart
2013-11-01
To overcome the deficiency of some current grid-/cell-based ray tracing algorithms, which are only able to handle first arrivals or primary reflections (or conversions) in anisotropic media, we have extended the functionality of the multistage irregular shortest-path method to 2-D/3-D tilted transversely isotropic (TTI) media. The new approach is able to track multiple transmitted/reflected/converted arrivals composed of any kind of combinations of transmissions, reflections and mode conversions. The basic principle is that the seven parameters (five elastic parameters plus two polar angles defining the tilt of the symmetry axis) of the TTI media are sampled at primary nodes, and the group velocity values at secondary nodes are obtained by tri-linear interpolation of the primary nodes across each cell, from which the group velocities of the three wave modes (qP, qSV and qSH) are calculated. Finally, we conduct grid-/cell-based wave front expansion to trace multiple transmitted/reflected/converted arrivals from one region to the next. The results of calculations in uniform anisotropic media indicate that the numerical results agree with the analytical solutions except in directions of SV-wave triplications, at which only the lowest velocity value is selected at the singularity points by the multistage irregular shortest-path anisotropic ray tracing method. This verifies the accuracy of the methodology. Several simulation results show that the new method is able to efficiently and accurately approximate situations involving continuous velocity variations and undulating discontinuities, and that it is suitable for any combination of multiple transmitted/reflected/converted arrival tracking in TTI media of arbitrary strength and tilt. Crosshole synthetic traveltime tomographic tests have been performed, which highlight the importance of using such code when the medium is distinctly anisotropic.
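The core idea behind grid-/cell-based shortest-path ray tracing is that first-arrival traveltimes are shortest paths on a graph whose edge weights are segment length divided by velocity. The sketch below shows that idea in its simplest isotropic 2-D form using Dijkstra's algorithm; it is a pedagogical toy, not the multistage irregular shortest-path code, which additionally handles TTI anisotropy, secondary nodes, and reflected/converted later arrivals.

```python
# Simplified sketch of the shortest-path idea behind grid-based ray tracing:
# first-arrival traveltimes as shortest paths on an 8-connected grid graph.
# Isotropic toy only; the paper's method handles anisotropy and later arrivals.
import heapq
import numpy as np

def first_arrival_times(velocity, src):
    """velocity: 2-D array (grid units per second, unit grid spacing);
    src: (row, col) source node. Returns traveltimes to every node via Dijkstra."""
    ny, nx = velocity.shape
    times = np.full((ny, nx), np.inf)
    times[src] = 0.0
    heap = [(0.0, src)]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    while heap:
        t, (i, j) = heapq.heappop(heap)
        if t > times[i, j]:
            continue                     # stale heap entry
        for di, dj in offsets:
            ni, nj = i + di, j + dj
            if 0 <= ni < ny and 0 <= nj < nx:
                dist = np.hypot(di, dj)  # unit grid spacing
                slowness = 0.5 * (1.0 / velocity[i, j] + 1.0 / velocity[ni, nj])
                nt = t + dist * slowness
                if nt < times[ni, nj]:
                    times[ni, nj] = nt
                    heapq.heappush(heap, (nt, (ni, nj)))
    return times

# Toy model: a faster layer at depth bends the fastest paths downward.
v = np.ones((50, 100)) * 2000.0
v[25:, :] = 4000.0
print(first_arrival_times(v, (0, 0))[0, -1])   # traveltime to the far surface node
```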
Meteorites, the Moon and the History of Geology.
ERIC Educational Resources Information Center
Marvin, Ursula B.
1986-01-01
Traces the historical events that linked geology with the planetary sciences. Reviews the origins of meteoritics as a modern science and highlights the advances made in this area. Discusses lunar related theories and research. (ML)
Modeling, Materials, and Metrics: The Three-m Approach to FCS Signature Solutions
2002-05-07
calculations. These multiple levels will be incorporated into the MuSES software. The four levels are described as follows:
* Radiosity - Deterministic...view-factor-based, all-diffuse solution. Very fast. Independent of user position.
* Directional Reflectivity - Radiosity with directional incident...target and environment facets (view factor with BRDF). Last ray cast bounce = radiosity solution.
* Multi-bounce path trace - Rays traced from observer
Leiner, Claude; Nemitz, Wolfgang; Schweitzer, Susanne; Kuna, Ladislav; Wenzl, Franz P; Hartmann, Paul; Satzinger, Valentin; Sommer, Christian
2016-03-20
We show that with an appropriate combination of two optical simulation techniques-classical ray-tracing and the finite difference time domain method-an optical device containing multiple diffractive and refractive optical elements can be accurately simulated in an iterative simulation approach. We compare the simulation results with experimental measurements of the device to discuss the applicability and accuracy of our iterative simulation procedure.
Pilic, Denisa; Höfs, Carolin; Weitmann, Sandra; Nöh, Frank; Fröhlich, Thorsten; Skopnik, Heino; Köhler, Henrik; Wenzl, Tobias G; Schmidt-Choudhury, Anjona
2011-09-01
Assessment of intra- and interobserver agreement in multiple intraluminal impedance (MII) measurement between investigators from different institutions. Twenty-four 18- to 24-hour MII tracings were randomly chosen from 4 different institutions (6 per center). Software-aided automatic analysis was performed. Each result was validated by 2 independent investigators from the 4 different centers (4 investigator combinations). For intraobserver agreement, 6 measurements were analyzed twice by the same investigator. Agreement between investigators was calculated using the Cohen kappa coefficient. Interobserver agreement: 13 measurements showed perfect agreement (kappa > 0.8); 9 had substantial (kappa 0.61-0.8), 1 moderate (kappa 0.41-0.6), and 1 fair agreement (kappa 0.11-0.4). The median kappa value was 0.83. Intraobserver agreement: 5 tracings showed perfect and 1 showed substantial agreement. The median kappa value was 0.88. Most measurements showed substantial to perfect intra- and interobserver agreement. Still, we found a few outliers, presumably caused by poorer signal quality in some tracings rather than being observer dependent. An improvement in analysis results may be achieved by using a standard analysis protocol, a standardized method for judging tracing quality, better training options for method users, and more interaction between investigators from different institutions.
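The agreement statistic used here, Cohen's kappa, compares observed agreement with the agreement expected by chance. The sketch below implements the standard formula for two raters; the example categories and ratings are invented for illustration and are not the study's analysis software or data.

```python
# Minimal sketch of Cohen's kappa for agreement between two investigators rating
# the same tracings into categories (standard formula, not the study's software).
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    n = len(ratings_a)
    assert n == len(ratings_b) and n > 0
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n       # p_o
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(freq_a) | set(freq_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)  # p_e
    return (observed - expected) / (1.0 - expected)

# Toy usage: two investigators classifying 10 impedance events (made-up labels).
a = ["reflux", "reflux", "swallow", "reflux", "gas", "swallow", "reflux", "gas", "swallow", "reflux"]
b = ["reflux", "swallow", "swallow", "reflux", "gas", "swallow", "reflux", "gas", "swallow", "reflux"]
print(round(cohens_kappa(a, b), 2))   # ~0.84, i.e. perfect-to-substantial agreement
```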
Comparing Multiple Discrepancies Theory to Affective Models of Subjective Wellbeing
ERIC Educational Resources Information Center
Blore, Jed D.; Stokes, Mark A.; Mellor, David; Firth, Lucy; Cummins, Robert A.
2011-01-01
The Subjective Wellbeing (SWB) literature is replete with competing theories detailing the mechanisms underlying the construction and maintenance of SWB. The current study aimed to compare and contrast two of these approaches: multiple discrepancies theory (MDT) and an affective-cognitive theory of SWB. MDT posits SWB to be the result of perceived…
In Situ Trace Element Analysis of an Allende Type B1 CAI: EK-459-5-1
NASA Technical Reports Server (NTRS)
Jeffcoat, C. R.; Kerekgyarto, A.; Lapen, T. J.; Andreasen, R.; Righter, M.; Ross, D. K.
2014-01-01
Variations in refractory major and trace element composition of calcium, aluminum-rich inclusions (CAIs) provide constraints on physical and chemical conditions and processes in the earliest stages of the Solar System. Previous work indicates that CAIs have experienced complex histories involving, in many cases, multiple episodes of condensation, evaporation, and partial melting. We have analyzed major and trace element abundances in two core to rim transects of the melilite mantle as well as interior major phases of a Type B1 CAI (EK-459-5-1) from Allende by electron probe micro-analyzer (EPMA) and laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) to investigate the behavior of key trace elements with a primary focus on the REEs Tm and Yb.
Catchings, R.D.; Rymer, M.J.; Goldman, M.R.; Prentice, C.S.; Sickler, R.R.
2013-01-01
The San Francisco Public Utilities Commission is seismically retrofitting the water delivery system at San Andreas Lake, San Mateo County, California, where the reservoir intake system crosses the San Andreas Fault (SAF). The near-surface fault location and geometry are important considerations in the retrofit effort. Because the SAF trends through highly distorted Franciscan mélange and beneath much of the reservoir, the exact trace of the 1906 surface rupture is difficult to determine from surface mapping at San Andreas Lake. Based on surface mapping, it also is unclear if there are additional fault splays that extend northeast or southwest of the main surface rupture. To better understand the fault structure at San Andreas Lake, the U.S. Geological Survey acquired a series of seismic imaging profiles across the SAF at San Andreas Lake in 2008, 2009, and 2011, when the lake level was near historical lows and the surface traces of the SAF were exposed for the first time in decades. We used multiple seismic methods to locate the main 1906 rupture zone and fault splays within about 100 meters northeast of the main rupture zone. Our seismic observations are internally consistent, and our seismic indicators of faulting generally correlate with fault locations inferred from surface mapping. We also tested the accuracy of our seismic methods by comparing our seismically located faults with surface ruptures mapped by Schussler (1906) immediately after the April 18, 1906 San Francisco earthquake of approximate magnitude 7.9; our seismically determined fault locations were highly accurate. Near the reservoir intake facility at San Andreas Lake, our seismic data indicate the main 1906 surface rupture zone consists of at least three near-surface fault traces. Movement on multiple fault traces can have appreciable engineering significance because, unlike movement on a single strike-slip fault trace, differential movement on multiple fault traces may exert compressive and extensional stresses on built structures within the fault zone. Such differential movement and resulting distortion of built structures appear to have occurred between fault traces at the gatewell near the southern end of San Andreas Lake during the 1906 San Francisco earthquake (Schussler, 1906). In addition to the three fault traces within the main 1906 surface rupture zone, our data indicate at least one additional fault trace (or zone) about 80 meters northeast of the main 1906 surface rupture zone. Because ground shaking also can damage structures, we used fault-zone guided waves to investigate ground shaking within the fault zones relative to ground shaking outside the fault zones. Peak ground velocity (PGV) measurements from our guided-wave study indicate that ground shaking is greater at each of the surface fault traces, varying with the frequency of the seismic data and the wave type (P versus S). S-wave PGV increases by as much as 5–6 times at the fault traces relative to areas outside the fault zone, and P-wave PGV increases by as much as 3–10 times. Assuming shaking increases linearly with increasing earthquake magnitude, these data suggest strong shaking may pose a significant hazard to built structures that extend across the fault traces. Similarly complex fault structures likely underlie other strike-slip faults (such as the Hayward, Calaveras, and Silver Creek Faults) that intersect structures of the water delivery system, and these fault structures similarly should be investigated.
NASA Astrophysics Data System (ADS)
Zhang, Xiaoxing; Li, Yi; Xiao, Song; Tian, Shuangshuang; Deng, Zaitao; Tang, Ju
2017-08-01
C3F7CN has been the focus of the alternative gas research field over the past two years because of its excellent insulation properties and environmental characteristics. Experimental studies on its insulation performance have made many achievements. However, few studies on the formation mechanism of the decomposition components exist. A discussion of the decomposition characteristics of insulating media will provide guidance for scientific experimental research and the work that must be completed before further engineering application. In this study, the decomposition mechanism of C3F7CN in the presence of trace H2O under discharge was calculated based on the density functional theory and transition state theory. The reaction heat, Gibbs free energy, and activation energy of different decomposition pathways were investigated. The ionization parameters and toxicity of C3F7CN and various decomposition products were analyzed from the molecular structure perspective. The formation mechanism of the C3F7CN discharge decomposition components and the influence of trace water were evaluated. This paper confirms that C3F7CN has excellent decomposition characteristics, which provide theoretical support for later experiments and related engineering applications. However, the existence of trace water has a negative impact on C3F7CN’s insulation performance. Thus, strict trace water content standards should be developed to ensure dielectric insulation and the safety of maintenance personnel.
Barnard, Philip; deLahunta, Scott
2017-01-01
Two long-term sci-art research projects are described and positioned in the broader conceptual landscape of interdisciplinary collaboration. Both projects were aimed at understanding and augmenting choreographic decision-making and both were grounded in research conducted within a leading contemporary dance company. In each case, the work drew upon methods and theory from the cognitive sciences, and both had a direct impact on the way in which the company made new work. In the synthesis presented here the concept of an audit trace is introduced. Audit traces identify how specific classes of knowledge are used and transformed not only within the arts or sciences but also when arts practice is informed by science or when arts practice informs science.
Understanding Biogenic and Anthropogenic Trace Gas Variations Measured Near Cool, CA in June 2010
NASA Astrophysics Data System (ADS)
Klein, B. Z.; Flowers, B. A.; Gorkowski, K.; Dubey, M. K.; Knighton, W. B.; Floerchinger, C.; Herndon, S. C.; Fast, J. D.; Zaveri, R. A.
2011-12-01
Trace gas signatures produced by forested and urban areas differ greatly. Forested areas are dominated by gases produced during photosynthesis and respiration: CO2 and volatile organic compounds (VOCs) including terpenes and isoprene. Urban areas are heavily influenced by vehicle exhaust emissions and have elevated levels of CO, NOx and aromatic hydrocarbons such as benzene. Ozone is produced as a byproduct of both of these sources; it is produced when NOx from urban areas reacts with either anthropogenic or biogenic hydrocarbons. The Carbonaceous Aerosol and Radiative Effects Study (CARES) campaign was conducted during June 2010, in part to observe the evolution of urban air masses as they mix into rural locations and to better understand anthropogenic-biogenic photochemical interactions. The campaign included two ground-based sampling sites, one in Sacramento, CA (T0) and one downwind, approximately 70km NE, rurally located near Cool, CA (T1). In situ measurements of CO2, CO, O3, NO and multiple different VOCs were performed at the T1 site during the study, and are analyzed here to gain insights into the chemistry and transport of these trace gases. Comparisons between these trace gases coupled with transport modeling is used to delineate biogenic and anthropogenic sources. Additionally, comparisons between trace gases produced predominately by biogenic sources provide valuable information on how meteorology affects their production. Two atmospheric models (HYSPLIT back-trajectories and WRF forecasts) are used to predict transport episodes, where polluted air masses from the Sacramento or more distant San Francisco areas are transported to Cool. The two models display significant overlap for eleven different transport episodes during the study period. Both models also agree on two transport-free multiple-day periods. By examining the periods during which the models are in agreement, we are able to characterize with high certainty the trace gas signatures of local biogenic sources and also the significance of short-range transported anthropogenic trace gases.
Giles, Tracey M; de Lacey, Sheryl; Muir-Cochrane, Eimear
2016-01-01
Grounded theory method has been described extensively in the literature. Yet, the varying processes portrayed can be confusing for novice grounded theorists. This article provides a worked example of the data analysis phase of a constructivist grounded theory study that examined family presence during resuscitation in acute health care settings. Core grounded theory methods are exemplified, including initial and focused coding, constant comparative analysis, memo writing, theoretical sampling, and theoretical saturation. The article traces the construction of the core category "Conditional Permission" from initial and focused codes, subcategories, and properties, through to its position in the final substantive grounded theory.
ERIC Educational Resources Information Center
McNamee, Paul; Madden, Dave; McNamee, Frank; Wall, John; Hurst, Alan; Vrasidas, Charalambos; Chanquoy, Lucile; Baccino, Thierry; Acar, Emrah; Onwy-Yazici, Ela; Jordan, Ann
2009-01-01
This paper describes an ongoing EU project concerned with developing an instructional design framework for virtual classes (VC) that is based on Howard Gardner's (1983) theory of Multiple Intelligences (MI). The psychological theory of Multiple Intelligences (Gardner 1983) has received much credence within instructional design since its inception and has been…
Sabbah, Shai; Hawryshyn, Craig W
2013-07-04
Two competing theories have been advanced to explain the evolution of multiple cone classes in vertebrate eyes. These two theories have important, but different, implications for our understanding of the design and tuning of vertebrate visual systems. The 'contrast theory' proposes that multiple cone classes evolved in shallow-water fish to maximize the visual contrast of objects against diverse backgrounds. The competing 'flicker theory' states that multiple cone classes evolved to eliminate the light flicker inherent in shallow-water environments through antagonistic neural interactions, thereby enhancing object detection. However, the selective pressures that have driven the evolution of multiple cone classes remain largely obscure. We show that two critical assumptions of the flicker theory are violated. We found that the amplitude and temporal frequency of flicker vary over the visible spectrum, precluding its cancellation by simple antagonistic interactions between the output signals of cones. Moreover, we found that the temporal frequency of flicker matches the frequency where sensitivity is maximal in a wide range of fish taxa, suggesting that the flicker may actually enhance the detection of objects. Finally, using modeling of the chromatic contrast between fish pattern and background under flickering illumination, we found that the spectral sensitivity of cones in a cichlid focal species is optimally tuned to maximize the visual contrast between fish pattern and background, instead of to produce a flicker-free visual signal. The violation of its two critical assumptions substantially undermines support for the flicker theory as originally formulated. While this alone does not support the contrast theory, comparison of the contrast and flicker theories revealed that the visual system of our focal species was tuned as predicted by the contrast theory rather than by the flicker theory (or by some combination of the two). Thus, these findings challenge key assumptions of the flicker theory, leaving the contrast theory as the most parsimonious and tenable account of the evolution of multiple cone classes.
NASA Astrophysics Data System (ADS)
Teng, Xian; Pei, Sen; Morone, Flaviano; Makse, Hernán A.
2016-10-01
Identifying the most influential spreaders that maximize information flow is a central question in network theory. Recently, a scalable method called "Collective Influence (CI)" has been put forward through collective influence maximization. In contrast to heuristic methods that evaluate nodes' significance separately, the CI method inspects the collective influence of multiple spreaders. Although CI applies to the influence maximization problem in the percolation model, it is still important to examine its efficacy in realistic information spreading. Here, we examine real-world information flow in various social and scientific platforms including the American Physical Society, Facebook, Twitter and LiveJournal. Since empirical data cannot be directly mapped to ideal multi-source spreading, we leverage the behavioral patterns of users extracted from data to construct "virtual" information spreading processes. Our results demonstrate that the set of spreaders selected by CI can induce a larger scale of information propagation. Moreover, local measures such as the number of connections or citations are not necessarily the deterministic factors of nodes' importance in realistic information spreading. This result has significance for ranking scientists in scientific networks like the APS, where the commonly used number of citations can be a poor indicator of the collective influence of authors in the community.
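For readers unfamiliar with the CI measure, Morone and Makse define it as CI_l(i) = (k_i - 1) * sum over nodes j on the frontier of the ball of radius l around i of (k_j - 1), with spreaders chosen greedily by repeatedly removing the top-CI node. The sketch below implements that definition with networkx under the stated assumption; it does not reproduce the paper's empirical spreading evaluation.

```python
# Sketch of Collective Influence: CI_l(i) = (k_i - 1) * sum_{j in dB(i,l)} (k_j - 1),
# where dB(i,l) is the frontier of the ball of radius l around node i, followed by
# greedy selection of spreaders. Illustrative; not the paper's evaluation pipeline.
import networkx as nx

def collective_influence(G, node, radius=2):
    degrees = dict(G.degree())
    dist = nx.single_source_shortest_path_length(G, node, cutoff=radius)
    boundary = [j for j, d in dist.items() if d == radius]
    return (degrees[node] - 1) * sum(degrees[j] - 1 for j in boundary)

def top_spreaders(G, k=5, radius=2):
    """Greedy CI heuristic: pick the highest-CI node, remove it, repeat."""
    H = G.copy()
    chosen = []
    for _ in range(min(k, H.number_of_nodes())):
        best = max(H.nodes, key=lambda n: collective_influence(H, n, radius))
        chosen.append(best)
        H.remove_node(best)
    return chosen

# Toy usage on a synthetic scale-free network.
G = nx.barabasi_albert_graph(1000, 3, seed=1)
print(top_spreaders(G, k=5))
```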
Status of Multi-beam Long Trace-profiler Development
NASA Technical Reports Server (NTRS)
Gubarev, Mikhail V.; Merthe, Daniel J.; Kilaru, Kiranmayee; Kester, Thomas; Ramsey, Brian; McKinney, Wayne R.; Takacs, Peter Z.; Dahir, A.; Yashchuk, Valeriy V.
2013-01-01
The multi-beam long trace profiler (MB-LTP) is under development at NASA's Marshall Space Flight Center. The traditional LTP scans the surface under test with a single laser beam, directly measuring the surface figure slope errors. While capable of exceptional surface slope accuracy, single-beam LTP scanning has a slow measuring speed. Metrology efficiency can be increased by replacing the single laser beam with multiple beams that can scan a section of the test surface at a single instance. The increase in speed with such a system would be almost proportional to the number of laser beams. Progress on the development of the multi-beam long trace profiler is presented.
A BASIC program for the removal of noise from reaction traces using Fourier filtering.
Brittain, T
1989-04-01
Software for the removal of noise from reaction curves using the principle of Fourier filtering has been written in BASIC to execute on a PC. The program inputs reaction traces which are subjected to a rotation-inversion process, to produce functions suitable for Fourier analysis. Fourier transformation into the frequency domain is followed by multiplication of the transform by a rectangular filter function, to remove the noise frequencies. Inverse transformation then yields a noise-reduced reaction trace suitable for further analysis. The program is interactive at each stage and could easily be modified to remove noise from a range of input data types.
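The pipeline the abstract describes (condition the trace, transform to the frequency domain, apply a rectangular filter, transform back) translates directly into a few lines of numpy. In the sketch below the original "rotation-inversion" pre-processing is approximated by removing the linear trend between the end points so the trace is FFT-friendly; that reading is an assumption, not a reproduction of the BASIC code.

```python
# Sketch in Python/numpy of the Fourier-filtering pipeline the abstract describes:
# condition the reaction trace, transform to the frequency domain, multiply by a
# rectangular (low-pass) filter, and inverse transform. The "rotation-inversion"
# step is approximated by end-point detrending (an assumption about its intent).
import numpy as np

def fourier_filter(trace, keep_fraction=0.05):
    """Return a noise-reduced copy of `trace`, keeping only the lowest
    `keep_fraction` of frequency bins (rectangular filter)."""
    n = len(trace)
    ramp = np.linspace(trace[0], trace[-1], n)      # linear trend between end points
    detrended = trace - ramp                        # now periodic-friendly
    spectrum = np.fft.rfft(detrended)
    cutoff = max(1, int(keep_fraction * len(spectrum)))
    spectrum[cutoff:] = 0.0                         # rectangular filter: drop high frequencies
    smoothed = np.fft.irfft(spectrum, n)
    return smoothed + ramp                          # restore the trend

# Toy usage: an exponential reaction trace buried in noise.
t = np.linspace(0, 10, 1000)
clean = 1.0 - np.exp(-0.5 * t)
noisy = clean + np.random.default_rng(0).normal(0, 0.05, t.size)
filtered = fourier_filter(noisy, keep_fraction=0.03)
print(np.abs(filtered - clean).max())               # much smaller than the noise amplitude
```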
Global multiplicity of dietary standards for trace elements.
Freeland-Graves, Jeanne H; Lee, Jane J
2012-06-01
Consistent guidelines across the world for dietary standards of trace elements remain elusive. Harmonization of dietary standards has been suggested by international agencies to facilitate consistency in food and nutrition policies and international trade. Yet significant barriers exist to standardize recommendations on a global basis, such as vast differences in geography, food availability and transport; cultural, social and economic constraints, and biological diversity. Simple commonality is precluded further by the variety of terminologies among countries and regions related to diet. Certain unions have created numerous nutritional descriptive categories for standards, while other large countries are limited to only a few. This paper will explore the global multiplicity of dietary standards and efforts for harmonization. Copyright © 2012 Elsevier GmbH. All rights reserved.
USDA-ARS?s Scientific Manuscript database
Quantification of regional greenhouse gas (GHG) fluxes is essential for establishing mitigation strategies and evaluating their effectiveness. Here, we used multiple top-down approaches and multiple trace gas observations at a tall tower to estimate GHG regional fluxes and evaluate the GHG fluxes de...
Examining Multiple Dimensions of Word Knowledge for Content Vocabulary Understanding
ERIC Educational Resources Information Center
Cervetti, Gina N.; Tilson, Jennifer L.; Castek, Jill; Bravo, Marco A.; Trainin, Guy
2012-01-01
This study traces the development of a vocabulary measure designed to assess multiple types of word knowledge. The assessment, which was administered in conjunction with a science unit about weather and the water cycle for third-and-fourth graders, included items for six knowledge types--recognition, definition, classification/example, context,…
Comparison of CAM-Chem with Trace Gas Measurements from Airborne Field Campaigns from 2009-2016.
NASA Astrophysics Data System (ADS)
Schauffler, S.; Atlas, E. L.; Kinnison, D. E.; Lamarque, J. F.; Saiz-Lopez, A.; Navarro, M. A.; Donets, V.; Blake, D. R.; Blake, N. J.
2016-12-01
Trace gas measurements collected during seven field campaigns, two with multiple deployments, will be compared with the NCAR CAM-Chem model to evaluate the model performance over multiple years. The campaigns include HIPPO (2009-2011), pole-to-pole observations in the Pacific on the NSF/NCAR GV over multiple seasons; SEAC4RS (Aug./Sept. 2013) in the central and southern U.S. and western Gulf of Mexico on the NASA ER-2 and DC8; ATTREX (2011-2015) on the NASA Global Hawk over multiple seasons and locations; CONTRAST (Jan./Feb. 2014) in the western Pacific on the NSF/NCAR GV; VIRGAS (Oct. 2015) in the south central US and western Gulf of Mexico on the NASA WB-57; ORCAS (Jan./Feb. 2016) over the Southern Ocean on the NSF/NCAR GV; and POSIDON (Oct. 2016) in the western Pacific on the NASA WB-57. We will focus on along-flight-track comparisons with the model and will also examine comparisons of vertical distributions and various tracer-tracer correlations.
Theory and Practice in Feminist Therapy
ERIC Educational Resources Information Center
Thomas, Susan Amelia
1977-01-01
Traces the development of feminist therapy. Discusses the lack of definitions and systematic studies in the literature. Reports on a research study, based on interviews with feminist therapists, which explores the nature and practice of this emerging mode of therapy. (Author/SMR)
NASA Astrophysics Data System (ADS)
Waxler, R.; Talmadge, C. L.; Blom, P.
2009-12-01
Theory predicts that for ground to ground infrasound propagation along paths which travel downwind, relative to the stratospheric jet, there is a shadow zone which ends about 200 km from the source where the first return from the stratosphere strikes the earth. With increasing range the single stratospheric arrival splits into two distinct arrivals, a fast arrival with the trace velocity of the effective sound speed at the stratopause, and a slower arrival with the trace velocity of the sound speed on the ground. To test the theory we have deployed eight infrasound arrays along an approximate line directly west of the site of the US Navy's Trident Missile rocket motor eliminations. The arrays were deployed during the summer of 2009 spaced roughly 10 km apart along a segment from 180 to 260 km west of the site. Comparisons between the theoretical predictions and the received data will be presented.
Control of Structure in Conventional Friction Stir Welds through a Kinematic Theory of Metal Flow
NASA Technical Reports Server (NTRS)
Rubisoff, H.A.; Schneider, J.A.; Nunes, A.C.
2009-01-01
In friction stir welding (FSW), a rotating pin is translated along a weld seam so as to stir the sides of the seam together. Metal is prevented from flowing up the pin, which would result in plowing/cutting instead of welding, by a shoulder on the pin. In conventional FSW, the weld metal rests on an "anvil", which supports the heavy "plunge" load on the tool. In this study, both tungsten wires embedded along the faying surfaces and copper plating on the faying surfaces were used to trace the flow of AA2219 weld metal around the C-FSW tool. The effect of tool rotational speed, travel speed, plunge load, and pin thread pitch on the resulting weld metal flow was evaluated. Plan, longitudinal, and transverse section x-ray radiographs were examined to trace the metal flow paths. The results are interpreted in terms of a kinematic theory of metal flow in FSW.
Effective Vaccine Communication during the Disneyland Measles Outbreak
Broniatowski, David Andre; Hilyard, Karen M.; Dredze, Mark
2016-01-01
Vaccine refusal rates have increased in recent years, highlighting the need for effective risk communication, especially over social media. Fuzzy-trace theory predicts that individuals encode bottom-line meaning ("gist") and statistical information ("verbatim") in parallel and that articles expressing a clear gist will be most compelling. We coded news articles (n=4,686) collected during the 2014–2015 Disneyland measles outbreak for content including statistics, stories, or opinions containing bottom-line gists regarding vaccines and vaccine-preventable illnesses. We measured the extent to which articles were compelling by how frequently they were shared on Facebook. The most widely shared articles expressed bottom-line opinions, although articles containing statistics were also more likely to be shared than articles lacking statistics. Stories had limited impact on Facebook shares. Results support fuzzy-trace theory's predictions regarding the distinct yet parallel impact of categorical gist and statistical verbatim information on public health communication. PMID:27179915
Effective vaccine communication during the disneyland measles outbreak.
Broniatowski, David A; Hilyard, Karen M; Dredze, Mark
2016-06-14
Vaccine refusal rates have increased in recent years, highlighting the need for effective risk communication, especially over social media. Fuzzy-trace theory predicts that individuals encode bottom-line meaning ("gist") and statistical information ("verbatim") in parallel and that articles expressing a clear gist will be most compelling. We coded news articles (n=4581) collected during the 2014-2015 Disneyland measles outbreak for content including statistics, stories, or bottom-line gists regarding vaccines and vaccine-preventable illnesses. We measured the extent to which articles were compelling by how frequently they were shared on Facebook. The most widely shared articles expressed bottom-line gists, although articles containing statistics were also more likely to be shared than articles lacking statistics. Stories had limited impact on Facebook shares. Results support fuzzy-trace theory's predictions regarding the distinct yet parallel impact of categorical gist and statistical verbatim information on public health communication. Copyright © 2016 Elsevier Ltd. All rights reserved.
Trace anomaly and invariance under transformation of units
NASA Astrophysics Data System (ADS)
Namavarian, Nadereh
2017-05-01
Paying attention to conformal invariance as the invariance under local transformations of units of measure, we take a conformal-invariant quantum field as a quantum matter theory in which one has the freedom to choose the values of units of mass, length, and time arbitrarily at each point. To be able to have this view, it is necessary that the background on which the quantum field is based be conformal invariant as well. Consequently, defining the unambiguous expectation value of the energy-momentum tensor of such a quantum field through the Wald renormalizing prescription necessitates breaking down the conformal symmetry of the background. Then, noticing the field equations suitable for describing the backreaction effect, we show that the existence of the "trace anomaly," known for indicating the brokenness of conformal symmetry in quantum field theory, can also indicate the above "gravitational" conformal symmetry brokenness.
Exploring the Application of Multiple Intelligences Theory to Career Counseling
ERIC Educational Resources Information Center
Shearer, C. Branton; Luzzo, Darrell Anthony
2009-01-01
This article demonstrates the practical value of applying H. Gardner's (1993) theory of multiple intelligences (MI) to the practice of career counseling. An overview of H. Gardner's MI theory is presented, and the ways in which educational and vocational planning can be augmented by the integration of MI theory in career counseling contexts are…
Improvements in aircraft extraction programs
NASA Technical Reports Server (NTRS)
Balakrishnan, A. V.; Maine, R. E.
1976-01-01
Flight data from an F-8 Corsair and a Cessna 172 were analyzed to demonstrate specific improvements in the LRC parameter extraction computer program. The Cramer-Rao bounds were shown to provide a satisfactory relative measure of goodness of parameter estimates. They were not used as an absolute measure because of an inherent uncertainty within a multiplicative factor, traced in turn to the uncertainty in the noise bandwidth in the statistical theory of parameter estimation. The measure was also derived on an entirely nonstatistical basis, thereby also yielding an interpretation of the significance of off-diagonal terms in the dispersion matrix. The distinction between linear and non-linear coefficients was shown to be important in its implications for a recommended order of parameter iteration. Techniques for improving convergence generally were developed and tested on flight data. In particular, an easily implemented modification incorporating a gradient search was shown to improve initial estimates and thus remove a common cause of lack of convergence.
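To make the Cramer-Rao discussion concrete: for an output-error model with Gaussian measurement noise of variance sigma^2, the Fisher information is F = (1/sigma^2) J^T J, where J holds the output sensitivities with respect to the parameters, and the dispersion (covariance) bound is F^{-1}; an uncertain sigma^2 (noise bandwidth) rescales the bound by a multiplicative factor, as the abstract notes. The sketch below illustrates this standard calculation on a made-up exponential model; it is not the LRC extraction program.

```python
# Sketch of the Cramer-Rao bound: F = (1/sigma^2) * J^T J for Gaussian output noise,
# with parameter covariance bounded by F^{-1}. Illustrative, not the LRC program.
import numpy as np

def cramer_rao_bound(jacobian, noise_variance):
    """jacobian: (n_samples, n_params) sensitivity matrix; returns the CRB covariance."""
    fisher = jacobian.T @ jacobian / noise_variance
    return np.linalg.inv(fisher)

# Toy usage: fitting y = a * exp(-b * t); sensitivities are
# dy/da = exp(-b t) and dy/db = -a t exp(-b t).
t = np.linspace(0, 5, 200)
a, b, sigma2 = 2.0, 0.7, 0.01
J = np.column_stack([np.exp(-b * t), -a * t * np.exp(-b * t)])
cov = cramer_rao_bound(J, sigma2)
std = np.sqrt(np.diag(cov))
corr = cov[0, 1] / (std[0] * std[1])   # off-diagonal term read as a correlation
print(std, corr)
```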
Factors affecting graded and ungraded memory loss following hippocampal lesions.
Winocur, Gordon; Moscovitch, Morris; Sekeres, Melanie J
2013-11-01
This review evaluates three current theories--Standard Consolidation (Squire & Wixted, 2011), Overshadowing (Sutherland, Sparks, & Lehmann, 2010), and Multiple Trace-Transformation (Winocur, Moscovitch, & Bontempi, 2010)--in terms of their ability to account for the role of the hippocampus in recent and remote memory in animals. Evidence, based on consistent findings from tests of spatial memory and memory for acquired food preferences, favours the transformation account, but this conclusion is undermined by inconsistent results from studies that measured contextual fear memory, probably the most commonly used test of hippocampal involvement in anterograde and retrograde memory. Resolution of this issue may depend on exercising greater control over critical factors (e.g., contextual environment, amount of pre-exposure to the conditioning chamber, the number and distribution of foot-shocks) that can affect the representation of the memory shortly after learning and over the long-term. Research strategies aimed at characterizing the neural basis of long-term consolidation/transformation, as well as other outstanding issues are discussed. Copyright © 2013 Elsevier Inc. All rights reserved.
Wintemute, Garen J.; Webster, Daniel W.
2010-01-01
While many handguns are used in crime each year in the USA, most are not. We conducted this study to identify factors present at the time of a handgun’s most recent retail sale that were associated with its subsequent use in crime under circumstances suggesting that the handgun had been trafficked—purchased with the intent of diverting it to criminal use. Handguns acquired in multiple-gun purchases were of particular interest. Using data for 180,321 handguns purchased from federally licensed retailers in California in 1996, we studied attributes of the handguns, the retailers selling them, the purchasers, and the sales transactions. Our outcome measure was a handgun’s recovery by a police agency, followed by a gun ownership trace, conducted by the Bureau of Alcohol, Tobacco, Firearms and Explosives, that determined (a) that the recovery had occurred within 3 years of the handgun’s most recent purchase from a licensed retailer and (b) that the person who possessed the gun when it was recovered by police was not its most recent purchaser. Altogether, 722 handguns were recovered and had trace results that met the additional criteria. Handguns acquired in multiple-gun, same-day transactions were more likely to be traced than were single-purchase handguns (odds ratio [OR] 1.33, 95% confidence intervals [CI] 1.08 to 1.63). This was not the case for multiple-purchase handguns defined more broadly as multiple handguns purchased by one individual over any 30-day period as used in “one-gun-a-month” laws. Bivariate regressions indicated increased risk of a handgun being traced when it sold new for $150 or less (OR 4.28, 95% CI 3.59 to 5.11) or had been purchased by a woman (OR 2.02, 95% CI 1.62 to 2.52). Handguns sold by retailers who also had a relatively high proportion (≥2%) of purchases denied because the prospective purchasers were prohibited from owning firearms were more likely to be traced than were those sold by other retailers (OR 4.09, 95% CI 3.39 to 4.94). These findings persisted in multivariate analyses. Our findings suggest specific strategies for intervention to prevent gun violence. PMID:20354912
NASA Astrophysics Data System (ADS)
Smieska, Louisa M.; Mullett, Ruth; Ferri, Laurent; Woll, Arthur R.
2017-07-01
We present trace-element and composition analysis of azurite pigments in six illuminated manuscript leaves, dating from the thirteenth to sixteenth century, using synchrotron-based, large-area x-ray fluorescence (SR-XRF) and diffraction (SR-XRD) mapping. SR-XRF mapping reveals several trace elements correlated with azurite, including arsenic, zirconium, antimony, barium, and bismuth, that appear in multiple manuscripts but were not always detected by point XRF. Within some manuscript leaves, variations in the concentration of trace elements associated with azurite coincide with distinct regions of the illuminations, suggesting systematic differences in azurite preparation or purification. Variations of the trace element concentrations in azurite are greater among different manuscript leaves than the variations within each individual leaf, suggesting the possibility that such impurities reflect distinct mineralogical/geologic sources. SR-XRD maps collected simultaneously with the SR-XRF maps confirm the identification of azurite regions and are consistent with impurities found in natural mineral sources of azurite. In general, our results suggest the feasibility of using azurite trace element analysis for provenance studies of illuminated manuscript fragments, and demonstrate the value of XRF mapping in non-destructive determination of trace element concentrations within a single pigment.
Co-digestion of manure and industrial waste--The effects of trace element addition.
Nordell, Erik; Nilsson, Britt; Nilsson Påledal, Sören; Karisalmi, Kaisa; Moestedt, Jan
2016-01-01
Manure is one of the most common substrates for biogas production. Manure from dairy and swine animals is often considered to stabilize the biogas process by contributing the nutrients and trace elements the process needs. In this study two lab-scale reactors were used to evaluate the effects of trace element addition during co-digestion of manure from swine and dairy animals with industrial waste. The substrate used contained high background concentrations of both cobalt and nickel, which are considered to be the most important trace elements. In the reactor receiving additional trace elements, the volatile fatty acid (VFA) concentration was 89% lower than in the control reactor. The lower VFA concentration contributed to a more fully digested digestate, and thus lower methane emissions during subsequent storage. In addition, the biogas production rate increased by 24% and the biogas yield by 10%, both as a result of the additional trace elements at high organic loading rates. All in all, even though 50% of the feedstock consisted of manure, trace element addition resulted in multiple positive effects and a more reliable process with a stable, high yield. Copyright © 2015 Elsevier Ltd. All rights reserved.
Filler, Guido; Felder, Sarah
2014-08-01
In end-stage chronic kidney disease (CKD), pediatric nephrologists must consider the homeostasis of the multiple water-soluble ions that are influenced by renal replacement therapy (RRT). While certain ions such as potassium and calcium are closely monitored, little is known about the handling of trace elements in pediatric dialysis. RRT may lead to accumulation of toxic trace elements, either due to insufficient elimination or due to contamination, or to excessive removal of essential trace elements. However, trace elements are not routinely monitored in dialysis patients and no mechanism for these deficits or toxicities has been established. This review summarizes the handling of trace elements, with particular attention to pediatric data. The best data describe lead and indicate that there is a higher prevalence of elevated lead (Pb, atomic number 82) levels in children on RRT when compared to adults. Lead is particularly toxic in neurodevelopment and lead levels should therefore be monitored. Monitoring of zinc (Zn, atomic number 30) and selenium (Se, atomic number 34) may be indicated in the monitoring of all pediatric dialysis patients to reduce morbidity from deficiency. Prospective studies evaluating the impact of abnormal trace elements and the possible therapeutic value of intervention are required.
Neural population-level memory traces in the mouse hippocampus.
Chen, Guifen; Wang, L Phillip; Tsien, Joe Z
2009-12-16
One of the fundamental goals in neuroscience is to elucidate the formation and retrieval of the brain's associative memory traces in real-time. Here, we describe real-time neural ensemble transient dynamics in the mouse hippocampal CA1 region and demonstrate their relationships with behavioral performance during both learning and recall. We employed the classic trace fear conditioning paradigm involving a neutral tone followed by a mild foot-shock 20 seconds later. Our large-scale recording and decoding methods revealed that conditioned tone responses and tone-shock association patterns were not present in CA1 during the first pairing, but emerged quickly after multiple pairings. These encoding patterns showed increased immediate-replay, correlating tightly with increased immediate-freezing during learning. Moreover, during contextual recall, these patterns reappeared in tandem six to fourteen times per minute, again correlating tightly with behavioral recall. Upon traced tone recall, while various fear memories were retrieved, the shock traces exhibited a unique recall-peak around the 20-second trace interval, further signifying the memory of the time of the expected shock. Therefore, our study has revealed various real-time associative memory traces during learning and recall in CA1, and demonstrates that real-time memory traces can be decoded on a moment-to-moment basis over any single trial.
Chiu, Chi-yang; Jung, Jeesun; Chen, Wei; Weeks, Daniel E; Ren, Haobo; Boehnke, Michael; Amos, Christopher I; Liu, Aiyi; Mills, James L; Ting Lee, Mei-ling; Xiong, Momiao; Fan, Ruzong
2017-01-01
To analyze next-generation sequencing data, multivariate functional linear models are developed for a meta-analysis of multiple studies to connect genetic variant data to multiple quantitative traits adjusting for covariates. The goal is to take the advantage of both meta-analysis and pleiotropic analysis in order to improve power and to carry out a unified association analysis of multiple studies and multiple traits of complex disorders. Three types of approximate F -distributions based on Pillai–Bartlett trace, Hotelling–Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants. Simulation analysis is performed to evaluate false-positive rates and power of the proposed tests. The proposed methods are applied to analyze lipid traits in eight European cohorts. It is shown that it is more advantageous to perform multivariate analysis than univariate analysis in general, and it is more advantageous to perform meta-analysis of multiple studies instead of analyzing the individual studies separately. The proposed models require individual observations. The value of the current paper can be seen at least for two reasons: (a) the proposed methods can be applied to studies that have individual genotype data; (b) the proposed methods can be used as a criterion for future work that uses summary statistics to build test statistics to meta-analyze the data. PMID:28000696
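The three test statistics named above are standard multivariate analysis-of-variance quantities. The sketch below is a generic illustration (Python), not the authors' functional-linear-model implementation, of how they follow from the hypothesis and error sums-of-squares-and-cross-products matrices.

```python
import numpy as np

def multivariate_trace_tests(H, E):
    """Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda.

    H : (q, q) hypothesis SSCP matrix for the tested genetic variants
    E : (q, q) error (residual) SSCP matrix, where q = number of traits
    """
    # Eigenvalues of E^{-1} H summarize the strength of trait-variant association.
    lam = np.linalg.eigvals(np.linalg.solve(E, H)).real
    pillai = np.sum(lam / (1.0 + lam))          # trace of H (H + E)^{-1}
    hotelling_lawley = np.sum(lam)              # trace of H E^{-1}
    wilks = np.prod(1.0 / (1.0 + lam))          # det(E) / det(H + E)
    return pillai, hotelling_lawley, wilks
```

Each statistic is then referred to an approximate F distribution, with degrees of freedom depending on the numbers of traits, variants, and samples, to obtain the association tests described in the abstract.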
Audiences for the Theory of Multiple Intelligences
ERIC Educational Resources Information Center
Gardner, Howard
2004-01-01
In his closing comments, Howard Gardner discusses the various audiences that have emerged over the years for the theory of multiple intelligences. Under that rubric, he places the various papers in this issue and speculates about future lines of work on MI theory.
Environmental Mycobiome Modifiers of Inflammation and Fibrosis in Systemic Sclerosis
2016-09-01
TUBB), and ribosomal proteins), while others are considered specific to SSc despite trace-level detection in controls. For example, multiple SSc...Strong reactivity was seen against all five proteins in SSc with only trace levels detected in controls (Fig. 3a), indicating widespread immune...sequences in SSc RNA-seq data was used to detect microbial sequences in human tissues in an unbiased, quantitative manner. Our studies suggest that
2007-06-01
particle accelerators cannot run unless enough network bandwidth is available to absorb their data streams. DOE scientists running simulations routinely...send tuples to TelegraphCQ. To simulate a less-powerful machine, I increased the playback rate of the trace by a factor of 10 and reduced the query...III CPUs and 1.5 GB of main memory. To simulate using a less powerful embedded CPU, I wrote a program that would “play back” the trace at a multiple
Neural and mental hierarchies.
Wiest, Gerald
2012-01-01
The history of the sciences of the human brain and mind has been characterized from the beginning by two parallel traditions. The prevailing theory, which still influences the way current neuroimaging techniques interpret brain function, can be traced back to classical localizational theories, which in turn go back to early phrenological theories. The other approach has its origins in the hierarchical neurological theories of Hughlings-Jackson, which were influenced by the philosophical conceptions of Herbert Spencer. Another hallmark of the hierarchical tradition, which is also inherent to psychoanalytic metapsychology, is its deeply evolutionary perspective, taking both ontogenetic and phylogenetic trajectories into consideration. This article provides an outline of hierarchical concepts in the brain and mind sciences, which contrast with current cognitivistic and non-hierarchical theories in the neurosciences.
Explanatory and illustrative visualization of special and general relativity.
Weiskopf, Daniel; Borchers, Marc; Ertl, Thomas; Falk, Martin; Fechtig, Oliver; Frank, Regine; Grave, Frank; King, Andreas; Kraus, Ute; Müller, Thomas; Nollert, Hans-Peter; Rica Mendez, Isabel; Ruder, Hanns; Schafhitzel, Tobias; Schär, Sonja; Zahn, Corvin; Zatloukal, Michael
2006-01-01
This paper describes methods for explanatory and illustrative visualizations used to communicate aspects of Einstein's theories of special and general relativity, their geometric structure, and of the related fields of cosmology and astrophysics. Our illustrations target a general audience of laypersons interested in relativity. We discuss visualization strategies, motivated by physics education and the didactics of mathematics, and describe what kind of visualization methods have proven to be useful for different types of media, such as still images in popular science magazines, film contributions to TV shows, oral presentations, or interactive museum installations. Our primary approach is to adopt an egocentric point of view: The recipients of a visualization participate in a visually enriched thought experiment that allows them to experience or explore a relativistic scenario. In addition, we often combine egocentric visualizations with more abstract illustrations based on an outside view in order to provide several presentations of the same phenomenon. Although our visualization tools often build upon existing methods and implementations, the underlying techniques have been improved by several novel technical contributions like image-based special relativistic rendering on GPUs, special relativistic 4D ray tracing for accelerating scene objects, an extension of general relativistic ray tracing to manifolds described by multiple charts, GPU-based interactive visualization of gravitational light deflection, as well as planetary terrain rendering. The usefulness and effectiveness of our visualizations are demonstrated by reporting on experiences with, and feedback from, recipients of visualizations and collaborators.
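One core ingredient of image-based special relativistic rendering is the aberration of light directions between frames. The snippet below is an illustrative sketch (Python), not the authors' GPU code, of the standard aberration formula; the exact sign convention depends on whether the angle refers to the propagation or the viewing direction, and the Doppler and headlight (intensity) corrections that a full renderer also applies are omitted here.

```python
import math

def aberrate(theta, beta):
    """Transform the polar angle (radians, measured from the direction of motion)
    of a light ray from the rest frame into the frame of an observer moving with
    speed beta (in units of c), using cos t' = (cos t - beta) / (1 - beta cos t)."""
    cos_theta_prime = (math.cos(theta) - beta) / (1.0 - beta * math.cos(theta))
    return math.acos(max(-1.0, min(1.0, cos_theta_prime)))
```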
Toward a genealogy of culture.
Rendon, M
2001-12-01
Using psychoanalytic theory, this paper attempts to trace the natural history of the phenomenon designated as Culture. It postulates that psychoanalysis, a product of the Hegelian philosophical revolution, is still one of the best instruments to understand Culture. It traces the origins of culture as postulated by Freud and the pioneer anthropologists and its course from early and evolved religion through humanism, science, and finally postmodernism. It emphasizes the dialectical concepts in psychoanalysis and reviews summarily those psychoanalysts that, according to the author, have had a major impact on the study of culture: Freud, Horney, and Lacan.
Nonrelativistic trace and diffeomorphism anomalies in particle number background
NASA Astrophysics Data System (ADS)
Auzzi, Roberto; Baiguera, Stefano; Nardelli, Giuseppe
2018-04-01
Using the heat kernel method, we compute nonrelativistic trace anomalies for Schrödinger theories in flat spacetime, with a generic background gauge field for the particle number symmetry, both for a free scalar and a free fermion. The result is genuinely nonrelativistic, and it has no counterpart in the relativistic case. Contrary to naive expectations, the anomaly is not gauge invariant; this is similar to the nongauge covariance of the non-Abelian relativistic anomaly. We also show that, in the same background, the gravitational anomaly for a nonrelativistic scalar vanishes.
Traces of Lorentz symmetry breaking in a hydrogen atom at ground state
NASA Astrophysics Data System (ADS)
Borges, L. H. C.; Barone, F. A.
2016-02-01
Some traces of a specific Lorentz symmetry breaking scenario in the ground state of the hydrogen atom are investigated. We use standard Rayleigh-Schrödinger perturbation theory in order to obtain the corrections to the ground state energy and the wave function. It is shown that an induced four-pole moment arises, due to the Lorentz symmetry breaking. The model considered is the one studied in Borges et al. (Eur Phys J C 74:2937, 2014), where the Lorentz symmetry is broken in the electromagnetic sector.
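For reference, the first-order Rayleigh-Schrödinger corrections used in such calculations take the generic textbook form below; here H' stands for the Lorentz-violating perturbation, whose explicit form is given in the cited work and is not reproduced here.

```latex
E_0^{(1)} = \langle \psi_0^{(0)} \,|\, H' \,|\, \psi_0^{(0)} \rangle ,
\qquad
| \psi_0^{(1)} \rangle = \sum_{n \neq 0}
\frac{\langle \psi_n^{(0)} \,|\, H' \,|\, \psi_0^{(0)} \rangle}
     {E_0^{(0)} - E_n^{(0)}} \; | \psi_n^{(0)} \rangle .
```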
Meillère, Alizée; Brischoux, François; Bustamante, Paco; Michaud, Bruno; Parenteau, Charline; Marciau, Coline; Angelier, Frédéric
2016-10-01
In a rapidly urbanizing world, trace element pollution may represent a threat to human health and wildlife, and it is therefore crucial to assess both exposure levels and associated effects of trace element contamination on urban vertebrates. In this study, we investigated the impact of urbanization on trace element contamination and stress physiology in a wild bird species, the common blackbird (Turdus merula), along an urbanization gradient (from rural to moderately urbanized areas). Specifically, we described the contamination levels of blackbirds by 4 non-essential (Ag, Cd, Hg, Pb) and 9 essential trace elements (As, Co, Cr, Cu, Fe, Mn, Ni, Se, Zn), and explored the putative disrupting effects of the non-essential element contamination on corticosterone levels (a hormonal proxy for environmental challenges). We found that non-essential trace element burden (Cd and Pb specifically) increased with increasing urbanization, indicating significant trace element contamination even in medium-sized cities and suburban areas. Interestingly, the increased feather non-essential trace element concentrations were also associated with elevated feather corticosterone levels, suggesting that urbanization probably constrains birds and that this effect may be mediated by trace element contamination. Future experimental studies are now required to disentangle the influence of multiple urban-related constraints on corticosterone levels and to specifically test the influence of each of these trace elements on corticosterone secretion. Copyright © 2016 Elsevier B.V. All rights reserved.
Aspects of Higher Spin Symmetry and its Breaking
NASA Astrophysics Data System (ADS)
Zhiboedov, Alexander
This thesis explores different aspects of higher spin symmetry and its breaking in the context of Quantum Field Theory, AdS/CFT and String Theory. In chapter 2, we study the constraints imposed by the existence of a single higher spin conserved current on a three-dimensional conformal field theory (CFT). A single higher spin conserved current implies the existence of an infinite number of higher spin conserved currents. The correlation functions of the stress tensor and the conserved currents are then shown to be equal to those of a free field theory, namely a theory of N free bosons or free fermions. This is an extension of the Coleman-Mandula theorem to CFTs, which do not have a conventional S-matrix. In chapter 3, we consider three-dimensional conformal field theories that have a higher spin symmetry that is slightly broken. The theories have a large N limit, in the sense that the operators separate into single-trace and multi-trace and obey the usual large N factorization properties. We assume that the only single-trace operators are the higher spin currents plus an additional scalar. Using the slightly broken higher spin symmetry we constrain the three-point functions of the theories to leading order in N. We show that there are two families of solutions. One family can be realized as a theory of N fermions with an O(N) Chern-Simons gauge field, the other as a theory of N bosons plus the Chern-Simons gauge field. In chapter 4, we consider several aspects of unitary higher-dimensional conformal field theories. We investigate the dimensions of spinning operators via the crossing equations in the light-cone limit. We find that, in a sense, CFTs become free at large spin and 1/s is a weak coupling parameter. The spectrum of CFTs enjoys additivity: if two twists τ₁, τ₂ appear in the spectrum, there are operators whose twists are arbitrarily close to τ₁ + τ₂. We characterize how τ₁ + τ₂ is approached at large spin by solving the crossing equations analytically. Applications include the 3d Ising model, theories with a gravity dual, SCFTs, and patterns of higher spin symmetry breaking. In chapter 5, we consider higher derivative corrections to the graviton three-point coupling within a weakly coupled theory of gravity. We devise a thought experiment involving a high energy scattering process which leads to causality violation if the graviton three-point vertex contains the additional structures. This violation cannot be fixed by adding conventional particles with spins J ≤ 2. But it can be fixed by adding an infinite tower of extra massive particles with higher spins, J > 2. In AdS theories this implies a constraint on the conformal anomaly coefficients, (a−c)/c ≲ 1/Δ_gap², in terms of Δ_gap, the dimension of the lightest single-particle operator with spin J > 2. For inflation, or de Sitter-like solutions, it indicates the existence of massive higher spin particles if the gravity wave non-gaussianity deviates significantly from the one computed in the Einstein theory.
1997-04-01
to tracing historical trends in archaeological method and theory). The literature summarized here is extensive and is not accessible widely to the...of new significance assessment models. The more specific objectives in undertaking this literary review and interpretive analysis of archaeological...method and theory characteristic of the ’New Archaeology’ of the late 1960s. Once these ideas had made their way into the early literature on
Laboratory specimens and genetic privacy: evolution of legal theory.
Lewis, Michelle Huckaby
2013-03-01
Although laboratory specimens are an important resource for biomedical research, controversy has arisen when research has been conducted without the knowledge or consent of the individuals who were the source of the specimens. This paper summarizes the most important litigation regarding the research use of laboratory specimens and traces the evolution of legal theory from property claims to claims related to genetic privacy interests. © 2013 American Society of Law, Medicine & Ethics, Inc.
Controversial Medical Treatments of Learning Disabilities
ERIC Educational Resources Information Center
Sieben, Robert L.
1977-01-01
The author presents a critical review of popular medical treatments for children with learning disabilities, including dietary treatment (food additives theories, brain allergies, hypoglycemia, megavitamin therapy, and trace mineral tests) and neurophysiologic retraining (patterning, sensory integrative therapy, and optometric training). (IM)
Binary Sequences for Spread-Spectrum Multiple-Access Communication
1977-08-01
Massey, J. L., and Uhran, J. J., Jr., "Sub-baud coding," Proceedings of the Thirteenth Annual Allerton Conference on Circuit and System Theory, pp. 539...sequences in a multiple access environment," Proceedings of the Thirteenth Annual Allerton Conference on Circuit and System Theory, pp. 21-27, October...34 Proceedings of the Thirteenth Annual Allerton Conference on Circuit and System Theory, pp. 548-559, October 1975. Yao, K., "Performance bounds on
The multiple Coulomb scattering of very heavy charged particles.
Wong, M; Schimmerling, W; Phillips, M H; Ludewigt, B A; Landis, D A; Walton, J T; Curtis, S B
1990-01-01
An experiment was performed at the Lawrence Berkeley Laboratory BEVALAC to measure the multiple Coulomb scattering of 650-MeV/A uranium nuclei in 0.19 radiation lengths of a Cu target. Differential distributions in the projected multiple scattering angle were measured in the vertical and horizontal planes using silicon position-sensitive detectors to determine particle trajectories before and after target scattering. The results were compared with the multiple Coulomb scattering theories of Fermi and Molière, and with a modification of the Fermi theory, using a Monte Carlo simulation. These theories were in excellent agreement with experiment at the 2 sigma level. The best quantitative agreement is obtained with the Gaussian distribution predicted by the modified Fermi theory.
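A rough feel for the width of the projected scattering distribution can be obtained from the Highland (classic PDG) Gaussian parameterization sketched below; this is an approximation related to, but not identical with, the Fermi and Molière theories evaluated in the paper, and the parameter names are illustrative.

```python
import math

def highland_theta0(p_mev_c, beta, z, x_over_X0):
    """Gaussian width (radians) of the projected multiple-scattering angle,
    classic Highland/PDG parameterization.

    p_mev_c   : momentum in MeV/c
    beta      : particle velocity in units of c
    z         : projectile charge number (e.g. 92 for uranium)
    x_over_X0 : target thickness in radiation lengths
    """
    return (13.6 / (beta * p_mev_c)) * abs(z) * math.sqrt(x_over_X0) \
           * (1.0 + 0.038 * math.log(x_over_X0))
```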
How Does the Sparse Memory “Engram” Neurons Encode the Memory of a Spatial–Temporal Event?
Guan, Ji-Song; Jiang, Jun; Xie, Hong; Liu, Kai-Yuan
2016-01-01
Episodic memory in human brain is not a fixed 2-D picture but a highly dynamic movie serial, integrating information at both the temporal and the spatial domains. Recent studies in neuroscience reveal that memory storage and recall are closely related to the activities in discrete memory engram (trace) neurons within the dentate gyrus region of hippocampus and the layer 2/3 of neocortex. More strikingly, optogenetic reactivation of those memory trace neurons is able to trigger the recall of naturally encoded memory. It is still unknown how the discrete memory traces encode and reactivate the memory. Considering a particular memory normally represents a natural event, which consists of information at both the temporal and spatial domains, it is unknown how the discrete trace neurons could reconstitute such enriched information in the brain. Furthermore, as the optogenetic-stimuli induced recall of memory did not depend on firing pattern of the memory traces, it is most likely that the spatial activation pattern, but not the temporal activation pattern of the discrete memory trace neurons encodes the memory in the brain. How does the neural circuit convert the activities in the spatial domain into the temporal domain to reconstitute memory of a natural event? By reviewing the literature, here we present how the memory engram (trace) neurons are selected and consolidated in the brain. Then, we will discuss the main challenges in the memory trace theory. In the end, we will provide a plausible model of memory trace cell network, underlying the conversion of neural activities between the spatial domain and the temporal domain. We will also discuss on how the activation of sparse memory trace neurons might trigger the replay of neural activities in specific temporal patterns. PMID:27601979
How Does the Sparse Memory "Engram" Neurons Encode the Memory of a Spatial-Temporal Event?
Guan, Ji-Song; Jiang, Jun; Xie, Hong; Liu, Kai-Yuan
2016-01-01
Episodic memory in human brain is not a fixed 2-D picture but a highly dynamic movie serial, integrating information at both the temporal and the spatial domains. Recent studies in neuroscience reveal that memory storage and recall are closely related to the activities in discrete memory engram (trace) neurons within the dentate gyrus region of hippocampus and the layer 2/3 of neocortex. More strikingly, optogenetic reactivation of those memory trace neurons is able to trigger the recall of naturally encoded memory. It is still unknown how the discrete memory traces encode and reactivate the memory. Considering a particular memory normally represents a natural event, which consists of information at both the temporal and spatial domains, it is unknown how the discrete trace neurons could reconstitute such enriched information in the brain. Furthermore, as the optogenetic-stimuli induced recall of memory did not depend on firing pattern of the memory traces, it is most likely that the spatial activation pattern, but not the temporal activation pattern of the discrete memory trace neurons encodes the memory in the brain. How does the neural circuit convert the activities in the spatial domain into the temporal domain to reconstitute memory of a natural event? By reviewing the literature, here we present how the memory engram (trace) neurons are selected and consolidated in the brain. Then, we will discuss the main challenges in the memory trace theory. In the end, we will provide a plausible model of memory trace cell network, underlying the conversion of neural activities between the spatial domain and the temporal domain. We will also discuss on how the activation of sparse memory trace neurons might trigger the replay of neural activities in specific temporal patterns.
NASA Astrophysics Data System (ADS)
Kobayashi, Satoru; Tanelli, Simone; Im, Eastwood
2005-12-01
Effects of multiple scattering on reflectivity are studied for millimeter wavelength weather radars. A time-independent vector theory, including up to second-order scattering, is derived for a single layer of hydrometeors of a uniform density and a uniform diameter. In this theory, spherical waves with a Gaussian antenna pattern are used to calculate ladder and cross terms in the analytical scattering theory. The former terms represent the conventional multiple scattering, while the latter terms cause backscattering enhancement in both the copolarized and cross-polarized components. As the optical thickness of the hydrometeor layer increases, the differences from the conventional plane wave theory become more significant, and essentially, the reflectivity of multiple scattering depends on the ratio of mean free path to radar footprint radius. These results must be taken into account when analyzing radar reflectivity for use in remote sensing.
Orchestrating Multiple Intelligences
ERIC Educational Resources Information Center
Moran, Seana; Kornhaber, Mindy; Gardner, Howard
2006-01-01
Education policymakers often go astray when they attempt to integrate multiple intelligences theory into schools, according to the originator of the theory, Howard Gardner, and his colleagues. The greatest potential of a multiple intelligences approach to education grows from the concept of a profile of intelligences. Each learner's intelligence…
Barnard, Philip; deLahunta, Scott
2017-01-01
Two long-term sci–art research projects are described and positioned in the broader conceptual landscape of interdisciplinary collaboration. Both projects were aimed at understanding and augmenting choreographic decision-making and both were grounded in research conducted within a leading contemporary dance company. In each case, the work drew upon methods and theory from the cognitive sciences, and both had a direct impact on the way in which the company made new work. In the synthesis presented here the concept of an audit trace is introduced. Audit traces identify how specific classes of knowledge are used and transformed not only within the arts or sciences but also when arts practice is informed by science or when arts practice informs science. PMID:29308084
Combatting Electoral Traces: The Dutch Tempest Discussion and Beyond
NASA Astrophysics Data System (ADS)
Pieters, Wolter
In the Dutch e-voting debate, the crucial issue leading to the abandonment of all electronic voting machines was compromising radiation, or tempest: it would be possible to eavesdrop on the choice of the voter by capturing the radiation from the machine. Other countries, however, do not seem to be bothered by this risk. In this paper, we use actor-network theory to analyse the socio-technical origins of the Dutch tempest issue in e-voting, and we introduce concepts for discussing its implications for e-voting beyond the Netherlands. We introduce the term electoral traces to denote any physical, digital or social evidence of a voter’s choices in an election. From this perspective, we provide a framework for risk classification as well as an overview of countermeasures against such traces.
["A male view?" Texts on feminism film theory].
Lippert, R
1994-11-01
The author traces the course taken by psychoanalytically oriented feminist film theory from its beginnings in the late seventies. She situates its origins in the Anglo-American debate about the exclusion of female subjectivity from the cinema and the new awareness of the problem of the cinematic mise-en-scène of the gaze, of "visual pleasure". First, massive criticism was levelled at the exclusively male/patriarchal gaze of the viewer; then emphasis centred on the specifically female gaze as a category in aesthetic theory. Ultimately, psychoanalytic feminist film theory has turned its attention to films for women, melodramas and early movies in an attempt to capture the respective historical forms of female subjectivity that they reflect.
Context-sensitive trace inlining for Java.
Häubl, Christian; Wimmer, Christian; Mössenböck, Hanspeter
2013-12-01
Method inlining is one of the most important optimizations in method-based just-in-time (JIT) compilers. It widens the compilation scope and therefore allows optimizing multiple methods as a whole, which increases the performance. However, if method inlining is used too frequently, the compilation time increases and too much machine code is generated. This has negative effects on the performance. Trace-based JIT compilers only compile frequently executed paths, so-called traces, instead of whole methods. This may result in faster compilation, less generated machine code, and better optimized machine code. In the previous work, we implemented a trace recording infrastructure and a trace-based compiler for [Formula: see text], by modifying the Java HotSpot VM. Based on this work, we evaluate the effect of trace inlining on the performance and the amount of generated machine code. Trace inlining has several major advantages when compared to method inlining. First, trace inlining is more selective than method inlining, because only frequently executed paths are inlined. Second, the recorded traces may capture information about virtual calls, which simplify inlining. A third advantage is that trace information is context sensitive so that different method parts can be inlined depending on the specific call site. These advantages allow more aggressive inlining while the amount of generated machine code is still reasonable. We evaluate several inlining heuristics on the benchmark suites DaCapo 9.12 Bach, SPECjbb2005, and SPECjvm2008 and show that our trace-based compiler achieves an up to 51% higher peak performance than the method-based Java HotSpot client compiler. Furthermore, we show that the large compilation scope of our trace-based compiler has a positive effect on other compiler optimizations such as constant folding or null check elimination.
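The sketch below is illustrative pseudologic only, not the Java HotSpot trace compiler described in the paper; all names and thresholds are hypothetical. It captures the idea that the inlining decision is made per call site, using the trace recorded in that specific context, rather than per callee method.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RecordedTrace:
    execution_count: int        # how often this path ran in the caller's context
    code_size_estimate: int     # estimated machine code size if inlined

@dataclass
class CallSite:
    caller_count: int                       # executions of the surrounding caller trace
    recorded_trace: Optional[RecordedTrace] # trace recorded for this specific call site

def should_inline(site: CallSite, size_budget: int, min_rel_freq: float = 0.05) -> bool:
    """Context-sensitive decision: inline only a hot, small recorded trace."""
    t = site.recorded_trace
    if t is None:                           # no profile for this context: do not inline
        return False
    hot = t.execution_count >= min_rel_freq * site.caller_count
    fits = t.code_size_estimate <= size_budget
    return hot and fits                     # cold paths at the same site remain calls
```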
Muscarinic receptors in amygdala control trace fear conditioning.
Baysinger, Amber N; Kent, Brianne A; Brown, Thomas H
2012-01-01
Intelligent behavior requires transient memory, which entails the ability to retain information over short time periods. A newly-emerging hypothesis posits that endogenous persistent firing (EPF) is the neurophysiological foundation for aspects or types of transient memory. EPF is enabled by the activation of muscarinic acetylcholine receptors (mAChRs) and is triggered by suprathreshold stimulation. EPF occurs in several brain regions, including the lateral amygdala (LA). The present study examined the role of amygdalar mAChRs in trace fear conditioning, a paradigm that requires transient memory. If mAChR-dependent EPF selectively supports transient memory, then blocking amygdalar mAChRs should impair trace conditioning, while sparing delay and context conditioning, which presumably do not rely upon transient memory. To test the EPF hypothesis, LA was bilaterally infused, prior to trace or delay conditioning, with either a mAChR antagonist (scopolamine) or saline. Computerized video analysis quantified the amount of freezing elicited by the cue and by the training context. Scopolamine infusion profoundly reduced freezing in the trace conditioning group but had no significant effect on delay or context conditioning. This pattern of results was uniquely anticipated by the EPF hypothesis. The present findings are discussed in terms of a systems-level theory of how EPF in LA and several other brain regions might help support trace fear conditioning.
ERIC Educational Resources Information Center
Boonma, Malai; Phaiboonnugulkij, Malinee
2014-01-01
This article argues for the need to propose the theoretical framework of Multiple Intelligences theory (MI) and to provide a suitable answer to doubts about its place in foreign language teaching. The article addresses the application of MI theory, drawing on various sources from Howard Gardner and the authors who revised this theory for use in the…
Vertex shading of the three-dimensional model based on ray-tracing algorithm
NASA Astrophysics Data System (ADS)
Hu, Xiaoming; Sang, Xinzhu; Xing, Shujun; Yan, Binbin; Wang, Kuiru; Dou, Wenhua; Xiao, Liquan
2016-10-01
The ray tracing algorithm is one of the research hotspots in photorealistic graphics. It is an important light-and-shadow technology in many industries that work with three-dimensional (3D) structure, such as aerospace, games, and video. Unlike the traditional method of pixel shading based on ray tracing, a novel ray tracing algorithm is presented that colors and renders the vertices of the 3D model directly. Rendering quality is related to the degree of subdivision of the 3D model. A good light-and-shade effect is achieved by using a quad-tree data structure to adaptively subdivide a triangle according to the brightness difference of its vertices. A uniform grid algorithm is adopted to improve rendering efficiency. Moreover, the rendering time is independent of the screen resolution. In theory, as long as the subdivision of the model is fine enough, effects equivalent to those of pixel shading will be obtained. In practice, our application can strike a compromise between efficiency and effectiveness.
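The sketch below illustrates the kind of brightness-driven adaptive subdivision described above; it is not the authors' implementation, and `shade` stands for a hypothetical per-vertex shading function (for example, a ray-traced color intensity).

```python
def subdivide(tri, shade, threshold, max_depth):
    """Refine a triangle until the brightness difference among its vertices falls
    below `threshold` or `max_depth` is reached. `tri` is a tuple of three vertices;
    each vertex is a tuple of coordinates. Returns a list of triangles."""
    a, b, c = tri
    brightness = [shade(v) for v in (a, b, c)]
    if max_depth == 0 or max(brightness) - min(brightness) <= threshold:
        return [tri]
    mid = lambda p, q: tuple((pi + qi) / 2.0 for pi, qi in zip(p, q))
    ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
    out = []
    for sub in ((a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)):  # 1-to-4 split
        out.extend(subdivide(sub, shade, threshold, max_depth - 1))
    return out
```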
Multiple Intelligence Theory for Gifted Education: Criticisms and Implications
ERIC Educational Resources Information Center
Calik, Basak; Birgili, Bengi
2013-01-01
This paper scrutinizes giftedness and gifted learners in light of multiple intelligence theory with regard to coaching young scientists. Multiple intelligence theory is one of the pluralistic theories of intelligence, and it supports viewing individuals as active participants in teaching and learning processes, which corresponds with the applications of…
ERIC Educational Resources Information Center
Abes, Elisa S.
2009-01-01
This article is an exploration of possibilities and methodological considerations for using multiple theoretical perspectives in research that challenges inequitable power structures in student development theory. Specifically, I explore methodological considerations when partnering queer theory and constructivism in research on lesbian identity…
An Holistic Approach for Counsellors: Embracing Multiple Intelligences
ERIC Educational Resources Information Center
Booth, Rosslyn; O'Brien, Patrick John
2008-01-01
This paper explores a range of therapeutic modalities used by counsellors of children and positions those modalities within Gardner's theory of multiple intelligences. Research by O'Brien ("Gardner's theory of multiple intelligence and its implications for the counselling of children." Unpublished doctoral dissertation, Queensland University of…
NASA Astrophysics Data System (ADS)
Oral, I.; Dogan, O.
2007-04-01
The aim of this study is to determine the effect of course materials based on Multiple Intelligence Theory on the learning process of different intelligence groups. The results showed that materials prepared according to Multiple Intelligence Theory have a considerable effect on the students' learning process. This effect was particularly evident in the student groups with musical-rhythmic, verbal-linguistic, interpersonal-social and naturalist intelligences.
Poisson traces, D-modules, and symplectic resolutions
NASA Astrophysics Data System (ADS)
Etingof, Pavel; Schedler, Travis
2018-03-01
We survey the theory of Poisson traces (or zeroth Poisson homology) developed by the authors in a series of recent papers. The goal is to understand this subtle invariant of (singular) Poisson varieties, conditions for it to be finite-dimensional, its relationship to the geometry and topology of symplectic resolutions, and its applications to quantizations. The main technique is the study of a canonical D-module on the variety. In the case the variety has finitely many symplectic leaves (such as for symplectic singularities and Hamiltonian reductions of symplectic vector spaces by reductive groups), the D-module is holonomic, and hence, the space of Poisson traces is finite-dimensional. As an application, there are finitely many irreducible finite-dimensional representations of every quantization of the variety. Conjecturally, the D-module is the pushforward of the canonical D-module under every symplectic resolution of singularities, which implies that the space of Poisson traces is dual to the top cohomology of the resolution. We explain many examples where the conjecture is proved, such as symmetric powers of du Val singularities and symplectic surfaces and Slodowy slices in the nilpotent cone of a semisimple Lie algebra. We compute the D-module in the case of surfaces with isolated singularities and show it is not always semisimple. We also explain generalizations to arbitrary Lie algebras of vector fields, connections to the Bernstein-Sato polynomial, relations to two-variable special polynomials such as Kostka polynomials and Tutte polynomials, and a conjectural relationship with deformations of symplectic resolutions. In the appendix we give a brief recollection of the theory of D-modules on singular varieties that we require.
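For orientation, the basic objects can be stated compactly; these are standard definitions consistent with the survey's terminology. For a Poisson algebra A, Poisson traces are the linear functionals annihilating all brackets, i.e. the dual of the zeroth Poisson homology:

```latex
\mathrm{HP}_0(A) \;=\; A / \{A, A\},
\qquad
\{\text{Poisson traces}\} \;=\; \{\, \varphi \in A^{*} : \varphi(\{f,g\}) = 0 \ \ \forall f,g \in A \,\}
\;\cong\; \mathrm{HP}_0(A)^{*}.
```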
Early Tests of Piagetian Theory Through World War II.
Beins, Bernard C
2016-01-01
Psychologists recognized the importance of Jean Piaget's theory from its inception. Within a year of the appearance of his first book translated into English, The Language and Thought of the Child (J. Piaget, 1926) , it had been reviewed and welcomed; shortly thereafter, psychologists began testing the tenets of the theory empirically. The author traces the empirical testing of his theory in the 2 decades following publication of his initial book. A review of the published literature through the World War II era reveals that the research resulted in consistent failure to support the theoretical mechanisms that Piaget proposed. Nonetheless, the theory ultimately gained traction to become the bedrock of developmental psychology. Reasons for its persistence may include a possible lack of awareness by psychologists about the lack of empirical support, its breadth and complexity, and a lack of a viable alternate theory. As a result, the theory still exerts influence in psychology even though its dominance has diminished.
On the history of biological theories of homosexuality.
Herrn, R
1995-01-01
Biological theories of homosexuality fit into the discourse on reproduction and sexuality that began in the nineteenth century. They arose in the context of the early homosexual rights movement, with its claim for natural rights, and the psychiatric discussions about sexual perversions. With the classification of homosexuality as a distinct category, homosexuals were excluded from the "normal". Biological theories of homosexuality were attempts not only to explain its causes, but also to maintain the exclusion of homosexuals as the "other". Biological explanations can be categorized as genetic, constitutional, endocrinological, and ethological. On the one hand, biological theories were used in the struggle for homosexual rights. On the other hand, they were used to "cure" homosexuals. Every theory led to a specific therapy. This paper points out the roots of this thinking, traces the development of various theories, and shows the utilization of biological theories in treating homosexuality.
Darwinism: Evolution or Revolution?
ERIC Educational Resources Information Center
Holt, Niles R.
1989-01-01
Maintains that Darwin's theory of evolution was more than a science versus religion debate; rather it was a revolutionary concept that influenced numerous social and political ideologies and movements throughout western history. Traces the impact of Darwin's work historically, utilizing a holistic approach. (RW)
Theory of Intergovernmental Grants and Local Government
ERIC Educational Resources Information Center
Rittenoure, R. Lynn; Pluta, Joseph E.
1977-01-01
The article prepares the ground for an investigation designed to trace the economic effects of intergovernmental transfers by examining the motivations for the expenditure behavior of local governments and anticipates local responses to revenue sharing, both general and special. (Author/NQ)
ERIC Educational Resources Information Center
Berrell, Michael M.; Macpherson, R. J. S.
1995-01-01
Traces the different paradigmatic pathways followed by educational sociology and educational administration. Educational sociology has followed ideostructural, interpretive, and psychosocial paradigms, with emergent holistic critical perspectives and sociobiological materialism. Educational administration has had one dominant tradition,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karsten, S.G.
1987-01-01
Labor and capital are usually considered as the primary factors of production, the costs of which are of utmost importance. In contrast, nature (including all natural resources), as the essential third factor, is disregarded. She is generally assumed to be always available, self-regenerating, and to be exploited without long-term costs. In other words, she is more or less viewed as a constant. Hans Immler's new treatise represents an important contribution in that he emphasized the role and function of the natural environment, and its neglect, in the formulation of theories of value and their long-term consequences on contemporary economic theories and on the person and society. This essay traces Immler's evaluation with extensive quotations - especially with regard to Physiocracy and the classical economists - of nature's role and function, or their neglect, in the formulation of theories of value through the writings of Aristotle, St. Thomas Aquinas, William Petty, John Locke, Adam Smith, David Ricardo, Karl Marx, and others - all dealt with in Part 1 of his book - and Francois Quesnay and the Physiocrats - the topic of Part 2.
The young Ramón y Cajal as a cell-theory dissenter.
Iturbe, Ulises; Pretó, Juli; Lazcano, Antonio
2008-06-01
The intellectual development of scientists normally traverses several different phases as they mature in their professions. In many cases, strong support of certain ideas and theories gives way to more critical, productive views that set the stage for major theories and discoveries. This appears to have been the case of Santiago Ramón y Cajal (1852-1934). In his youth, he supported the protoplasmic theory of life, and as he matured he maintained a critical, yet open view of the cell theory, which postulated that life phenomena could not take place below the cellular level. In later years, however, an older and wiser Ramón y Cajal abandoned all traces of dissent and joined in fully supporting a refined version of cell theory, to which his own discoveries significantly contributed.
Quantization of noncompact coverings and its physical applications
NASA Astrophysics Data System (ADS)
Ivankov, Petr
2018-02-01
A rigorous algebraic definition of noncommutative coverings is developed. In the case of commutative algebras this definition is equivalent to the classical definition of topological coverings of locally compact spaces. The theory has the following nontrivial applications: • Coverings of continuous trace algebras, • Coverings of noncommutative tori, • Coverings of the quantum SU(2) group, • Coverings of foliations, • Coverings of isospectral deformations of Spin manifolds. The theory supplies the rigorous definition of noncommutative Wilson lines.
Palatini formulation of f(R, T) gravity theory, and its cosmological implications
NASA Astrophysics Data System (ADS)
Wu, Jimin; Li, Guangjie; Harko, Tiberiu; Liang, Shi-Dong
2018-05-01
We consider the Palatini formulation of f(R, T) gravity theory, in which a non-minimal coupling between the Ricci scalar and the trace of the energy-momentum tensor is introduced, by considering the metric and the affine connection as independent field variables. The field equations and the equations of motion for massive test particles are derived, and we show that the independent connection can be expressed as the Levi-Civita connection of an auxiliary, energy-momentum trace dependent metric, related to the physical metric by a conformal transformation. Similar to the metric case, the field equations impose the non-conservation of the energy-momentum tensor. We obtain the explicit form of the equations of motion for massive test particles in the case of a perfect fluid, and the expression of the extra force, which is identical to the one obtained in the metric case. The thermodynamic interpretation of the theory is also briefly discussed. We investigate in detail the cosmological implications of the theory, and we obtain the generalized Friedmann equations of f(R, T) gravity in the Palatini formulation. Cosmological models with Lagrangians of the type f = R − α²/R + g(T) and f = R + α²R² + g(T) are investigated. These models lead to evolution equations whose solutions describe accelerating Universes at late times.
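Schematically, and with conventions (units and the coupling constant κ) chosen here for illustration rather than taken from the paper, this class of theories has an action of the form below, where in the Palatini approach the Ricci scalar R(g, Γ) is built from an independent connection Γ and T is the trace of the matter energy-momentum tensor.

```latex
S \;=\; \frac{1}{2\kappa^{2}} \int f\!\big(R(g,\Gamma),\, T\big)\, \sqrt{-g}\;\mathrm{d}^{4}x
\;+\; \int L_{m}\, \sqrt{-g}\;\mathrm{d}^{4}x ,
\qquad
T \;=\; g^{\mu\nu} T_{\mu\nu}.
```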
Applying Multiple Intelligences
ERIC Educational Resources Information Center
Christodoulou, Joanna A.
2009-01-01
The ideas of multiple intelligences introduced by Howard Gardner of Harvard University more than 25 years ago have taken form in many ways, both in schools and in other sometimes-surprising settings. The silver anniversary of Gardner's learning theory provides an opportunity to reflect on the ways multiple intelligences theory has taken form and…
Multiple-layer printed-wiring trace connector
NASA Technical Reports Server (NTRS)
Pizzeck, D. E.
1977-01-01
The nickel-plated spring-steel foil connector is a hollow pin with a lengthwise slit that is inserted into improperly plated-through holes. The edges of the connector make positive contact with the copper pads within the hole.
2007-05-01
sufficient for explaining how theory-of-mind emerges in normally developing children. As confirmation of its plausibility, our theory explains the... autism. While there are a number of different substrate elements that we believe are operative during theory-of-mind computations, three elements in...
Torres-Dowdall, J.; Farmer, A.H.; Abril, M.; Bucher, E.H.; Ridley, I.
2010-01-01
Trace-element analysis has been suggested as a tool for the study of migratory connectivity because (1) trace-element abundance varies spatially in the environment, (2) trace elements are assimilated into animals' tissues through the diet, and (3) current technology permits the analysis of multiple trace elements in a small tissue sample, allowing the simultaneous exploration of several elements. We explored the potential of trace elements (B, Na, Mg, Al, Si, P, S, K, Ca, Ti, Cr, Mn, Ni, Cu, Zn, As, Sr, Cs, Hg, Tl, Pb, Bi, Th, and U) to clarify the migratory connectivity of shorebirds that breed in North America and winter in southern South America. We collected 66 recently replaced secondary feathers from Red Knots (Calidris canutus) at three sites in Patagonia and 76 from White-rumped Sandpipers (C. fuscicollis) at nine sites across Argentina. There were significant differences in trace-element abundance in shorebird feathers grown at different nonbreeding sites, and annual variability within a site was small compared to variability among sites. Across Argentina, there was no large-scale gradient in trace elements. The lack of such a gradient restricts the application of this technique to questions concerning the origin of shorebirds to a small number of discrete sites. Furthermore, our results, including three additional species, the Pectoral Sandpiper (C. melanotos), Wilson's Phalarope (Phalaropus tricolor), and the Collared Plover (Charadrius collaris), suggest that trace-element profiles change as feathers age. Temporal instability of trace-element values could undermine their application to the study of migratory connectivity in shorebirds. © The Cooper Ornithological Society 2010.
Coincidence and covariance data acquisition in photoelectron and -ion spectroscopy. I. Formal theory
NASA Astrophysics Data System (ADS)
Mikosch, Jochen; Patchkovskii, Serguei
2013-10-01
We derive a formal theory of noisy Poisson processes with multiple outcomes. We obtain simple, compact expressions for the probability distribution function of arbitrarily complex composite events and its moments. We illustrate the utility of the theory by analyzing properties of coincidence and covariance photoelectron-photoion detection involving single-ionization events. The results and techniques introduced in this work are directly applicable to more general coincidence and covariance experiments, including multiple ionization and multiple-ion fragmentation pathways.
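To make the flavor of such Poisson statistics concrete, here is a small illustrative Python sketch (not the authors' formalism): it simulates electron-ion coincidence counting over many laser shots, where the number of ionization events per shot is Poisson distributed, and shows how false coincidences grow with the mean event rate. The rate values and detection efficiencies are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def coincidence_rates(mean_events_per_shot, n_shots=100_000,
                      eff_electron=0.6, eff_ion=0.3):
    """Simulate electron-ion coincidence counting for a Poissonian source.

    Each shot produces k ~ Poisson(mean_events_per_shot) ionization events;
    each event is detected as an electron with prob. eff_electron and as an
    ion with prob. eff_ion, independently.  A shot is scored as a coincidence
    if at least one electron AND one ion are seen; it is a 'false' coincidence
    if the detected electron and ion come only from different events.
    """
    k = rng.poisson(mean_events_per_shot, size=n_shots)
    true_c = false_c = 0
    for events in k:
        e = rng.random(events) < eff_electron   # electron detected per event
        i = rng.random(events) < eff_ion        # ion detected per event
        if e.any() and i.any():
            if (e & i).any():
                true_c += 1                     # a same-event pair exists
            else:
                false_c += 1                    # only cross-event pairs
    return true_c / n_shots, false_c / n_shots

for mu in (0.1, 0.5, 2.0):
    t, f = coincidence_rates(mu)
    print(f"mean events/shot = {mu:4.1f}  true = {t:.4f}  false = {f:.4f}")
```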
Global advances in selenium research from theory to application
USDA-ARS?s Scientific Manuscript database
Selenium is without question one of the most influential naturally occurring trace elements for biological systems worldwide. The multi-faceted connections between the environment, food crops, human and animal health, and selenium's function through selenoprotein activity have been well characterized....
ERIC Educational Resources Information Center
Selkin, James
1983-01-01
Traces the development of the suicide prevention movement since the 1897 publication of Emile Durkheim's book "Suicide." Durkheim's theory of suicide is outlined, and implications for contemporary suicide prevention efforts are identified and discussed. Future trends in the development of suicide prevention centers are outlined.…
An Overview of Intelligence Testing.
ERIC Educational Resources Information Center
White, Margaret B.; Hall, Alfred E.
1980-01-01
This article briefly traces the development of intelligence testing from its beginnings in 1905 with Alfred Binet; cites the intelligence theories of Spearman, Thurstone, and Guilford; and examines current objections to intelligence tests in terms of what they test and how they are interpreted. (SJL)
ERIC Educational Resources Information Center
Lambert, Linda; And Others
This book introduces the concept of leadership as the facilitation of constructivist reciprocal processes among participants in an educational community. Chapter 1, "Learning and Leading Theory: A Century in the Making," (Deborah Walker and Linda Lambert) traces the dynamic history of learning and leading during this century, concluding…
An Historical Perspective on the Theory and Practice of Soil Mechanical Analysis.
ERIC Educational Resources Information Center
Miller, W. P.; And Others
1988-01-01
Traces the history of soil mechanical analysis. Evaluates this history in order to place current concepts in perspective, from both a research and teaching viewpoint. Alternatives to traditional separation techniques for use in soils teaching laboratories are discussed. (TW)
Ou, Yanqiu; Bloom, Michael S; Nie, Zhiqiang; Han, Fengzhen; Mai, Jinzhuang; Chen, Jimei; Lin, Shao; Liu, Xiaoqing; Zhuang, Jian
2017-09-01
Prenatal exposure to toxic trace elements, including heavy metals, is an important public health concern. Few studies have assessed if individual and multiple trace elements simultaneously affect cardiac development. The current study evaluated the association between maternal blood lead (Pb), cadmium (Cd), chromium (Cr), copper (Cu), mercury (Hg), and selenium (Se) levels and congenital heart defects (CHDs) in offspring. This hospital-based case-control study included 112 case and 107 control infants. Maternal peripheral blood draw was made during gestational weeks 17-40 and used to determine trace element levels by inductively coupled plasma mass spectrometry. Multivariable logistic regression was used to assess associations and interactions between individual and multiple trace elements and fetal CHDs, adjusted for maternal age, parity, education, newborn gender, migrant, folic acid or multivitamin intake, cigarette smoking, maternal prepregnancy body mass index, and time of sample collection. Control participants had medians of 2.61μg/dL Pb, 1.76μg/L Cd, 3.57μg/L Cr, 896.56μg/L Cu, 4.17μg/L Hg, and 186.47μg/L Se in blood. In a model including all measured trace elements and adjusted for confounders, high levels of maternal Pb (OR=12.09, 95% CI: 2.81, 51.97) and Se (OR=0.25, 95% CI: 0.08, 0.77) were harmful and protective predictors of CHDs, respectively, with positive and negative interactions suggested for Cd with Pb and Se with Pb, respectively. Similar associations were detected for subgroups of CHDs, including conotruncal defects, septal defects, and right ventricle outflow tract obstruction. Our results suggest that even under the current standard for protecting human health (10μg/dL), Pb exposure poses an important health threat. These data can be used for developing interventions and identifying high-risk pregnancies. Copyright © 2017. Published by Elsevier Ltd.
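For readers unfamiliar with the modelling approach, the following hedged Python sketch illustrates a multivariable logistic regression with an interaction term of the kind described; the variable names, the synthetic data, and the Pb×Cd interaction shown are illustrative assumptions, not the study's data or exact model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300

# Synthetic stand-in for maternal blood metal levels (log-transformed) and a
# binary congenital heart defect (CHD) outcome -- for illustration only.
df = pd.DataFrame({
    "log_pb": rng.normal(1.0, 0.4, n),
    "log_cd": rng.normal(0.5, 0.4, n),
    "log_se": rng.normal(5.2, 0.2, n),
    "maternal_age": rng.normal(29, 4, n),
})
logit_p = (-1.0 + 1.2 * df.log_pb - 0.8 * (df.log_se - 5.2)
           + 0.5 * df.log_pb * df.log_cd)
df["chd"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Multivariable logistic regression with a Pb x Cd interaction term,
# adjusted for maternal age (a stand-in for the study's covariate set).
model = smf.logit(
    "chd ~ log_pb + log_cd + log_se + log_pb:log_cd + maternal_age",
    data=df).fit(disp=False)
print(model.summary())
print("\nOdds ratios:\n", np.exp(model.params))
```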
ERIC Educational Resources Information Center
Ko, James; Cheng, Yin Cheong; Lee, Theodore Tai Hoi
2016-01-01
Purpose: The purpose of this paper is to trace the development of school autonomy and accountability and related multiple changes and impacts in key areas of school education in Hong Kong since implementing school-based management (SBM) from 1990s. Design/methodology/approach: To explore the evolution and the uniqueness of autonomy and…
Mendis, Nilmini; McBride, Peter; Faucher, Sébastien P
2015-01-01
Legionella pneumophila (Lp) is the etiological agent responsible for Legionnaires' disease, a potentially fatal pulmonary infection. Lp lives and multiplies inside protozoa in a variety of natural and man-made water systems prior to human infection. Fraquil, a defined freshwater medium, was used as a highly reproducible medium to study the behaviour of Lp in water. Adopting a reductionist approach, Fraquil was used to study the impact of temperature, pH and trace metal levels on the survival and subsequent intracellular multiplication of Lp in Acanthamoeba castellanii, a freshwater protozoan and a natural host of Legionella. We show that temperature has a significant impact on the short- and long-term survival of Lp, but that the bacterium retains intracellular multiplication potential for over six months in Fraquil. Moreover, incubation in Fraquil at pH 4.0 resulted in a rapid decline in colony forming units, but was not detrimental to intracellular multiplication. In contrast, variations in trace metal concentrations had no impact on either survival or intracellular multiplication in amoeba. Our data show that Lp is a resilient bacterium in the water environment, remaining infectious to host cells after six months under the nutrient-deprived conditions of Fraquil.
Weiser, Armin A; Thöns, Christian; Filter, Matthias; Falenski, Alexander; Appel, Bernd; Käsbohrer, Annemarie
2016-01-01
FoodChain-Lab is modular open-source software for trace-back and trace-forward analysis in food-borne disease outbreak investigations. Development of FoodChain-Lab has been driven by a need for appropriate software in several food-related outbreaks in Germany since 2011. The software allows integrated data management, data linkage, enrichment and visualization as well as interactive supply chain analyses. Identification of possible outbreak sources or vehicles is facilitated by calculation of tracing scores for food-handling stations (companies or persons) and food products under investigation. The software also supports consideration of station-specific cross-contamination, analysis of geographical relationships, and topological clustering of the tracing network structure. FoodChain-Lab has been applied successfully in previous outbreak investigations, for example during the 2011 EHEC outbreak and the 2013/14 European hepatitis A outbreak. The software is most useful in complex, multi-area outbreak investigations where epidemiological evidence may be insufficient to discriminate between multiple implicated food products. The automated analysis and visualization components would be of greater value if trading information on food ingredients and compound products was more easily available.
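As a toy illustration of the general idea of trace-back scoring (a simplified sketch, not FoodChain-Lab's actual algorithm; the station names and deliveries are invented), each station in a supply-chain graph can be scored by how many of the implicated outbreak locations it can reach through its deliveries:

```python
import networkx as nx

# Hypothetical delivery network: edges point from supplier to recipient.
deliveries = [
    ("FarmA", "PackerX"), ("FarmB", "PackerX"), ("FarmB", "PackerY"),
    ("PackerX", "Retail1"), ("PackerX", "Retail2"),
    ("PackerY", "Retail2"), ("PackerY", "Retail3"),
]
outbreak_locations = {"Retail1", "Retail2"}   # stations linked to cases

G = nx.DiGraph(deliveries)

def traceback_score(graph, station, cases):
    """Fraction of implicated locations reachable downstream of a station."""
    reachable = nx.descendants(graph, station) | {station}
    return len(reachable & cases) / len(cases)

scores = {s: traceback_score(G, s, outbreak_locations) for s in G.nodes}
for station, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{station:8s}  score = {score:.2f}")
```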
A novel tracing method for the segmentation of cell wall networks.
De Vylder, Jonas; Rooms, Filip; Dhondt, Stijn; Inze, Dirk; Philips, Wilfried
2013-01-01
Cell wall networks, which are important for plant growth analysis, organ studies, and related applications, are a common subject of research in biology. In order to automate the detection of individual cells in such cell wall networks, we propose a new segmentation algorithm. The proposed method is a network tracing algorithm that exploits prior knowledge of the network structure. The method is applicable to multiple microscopy modalities such as fluorescence, but also to images captured using non-invasive microscopes such as differential interference contrast (DIC) microscopes.
Tracing Multiple Generations of Active Galactic Nucleus Feedback in the Core of Abell 262
2009-06-01
…Virgo cluster reveal a series of filaments, which trace regions that are thought… …L. Sarazin, L. D. Anderson, Gopal-Krishna, E. M. Douglass, and N. E. Kassim (Naval Research Laboratory; Interferometrics Inc.; Institute for Astrophysical Research).
Progress on Artist Improvements
1988-11-01
…subroutines used in that version. Another area of concern has been the often observed roughness of the h'(f) traces. The ARTIST autoscaling often… [Figure 10: ARTIST printout to show the current version number] 4.0 SMOOTHING OF THE AUTOSCALED TRACES: The…
Infant word recognition: Insights from TRACE simulations.
Mayor, Julien; Plunkett, Kim
2014-02-01
The TRACE model of speech perception (McClelland & Elman, 1986) is used to simulate results from the infant word recognition literature, to provide a unified, theoretical framework for interpreting these findings. In a first set of simulations, we demonstrate how TRACE can reconcile apparently conflicting findings suggesting, on the one hand, that consonants play a pre-eminent role in lexical acquisition (Nespor, Peña & Mehler, 2003; Nazzi, 2005), and on the other, that there is a symmetry in infant sensitivity to vowel and consonant mispronunciations of familiar words (Mani & Plunkett, 2007). In a second series of simulations, we use TRACE to simulate infants' graded sensitivity to mispronunciations of familiar words as reported by White and Morgan (2008). An unexpected outcome is that TRACE fails to demonstrate graded sensitivity for White and Morgan's stimuli unless the inhibitory parameters in TRACE are substantially reduced. We explore the ramifications of this finding for theories of lexical development. Finally, TRACE mimics the impact of phonological neighbourhoods on early word learning reported by Swingley and Aslin (2007). TRACE offers an alternative explanation of these findings in terms of mispronunciations of lexical items rather than imputing word learning to infants. Together these simulations provide an evaluation of Developmental (Jusczyk, 1993) and Familiarity (Metsala, 1999) accounts of word recognition by infants and young children. The findings point to a role for both theoretical approaches whereby vocabulary structure and content constrain infant word recognition in an experience-dependent fashion, and highlight the continuity in the processes and representations involved in lexical development during the second year of life.
McCluney, Kevin E; Sabo, John L
2010-12-31
Fluxes of carbon, nitrogen, and water between ecosystem components and organisms have great impacts across levels of biological organization. Although much progress has been made in tracing carbon and nitrogen, difficulty remains in tracing water sources from the ecosystem to animals and among animals (the "water web"). Naturally occurring, non-radioactive isotopes of hydrogen and oxygen in water provide a potential method for tracing water sources. However, using this approach for terrestrial animals is complicated by a change in water isotopes within the body due to differences in activity of heavy and light isotopes during cuticular and transpiratory water losses. Here we present a technique to use stable water isotopes to estimate the mean mix of water sources in a population by sampling a group of sympatric animals over time. Strong correlations between H and O isotopes in the body water of animals collected over time provide linear patterns of enrichment that can be used to predict a mean mix of water sources useful in standard mixing models to determine relative source contribution. Multiple temperature and humidity treatment levels do not greatly alter these relationships, thus having little effect on our ability to estimate this population-level mix of water sources. We show evidence for the validity of using multiple samples of animal body water, collected across time, to estimate the isotopic mix of water sources in a population and more accurately trace water sources. The ability to use isotopes to document patterns of animal water use should be a great asset to biologists globally, especially those studying drylands, droughts, streamside areas, irrigated landscapes, and the effects of climate change.
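The two-source mixing calculation that such source estimates feed into can be written in a few lines; this hedged Python sketch uses made-up δ²H values for stream and prey water sources and for the inferred animal source mix, not the study's data.

```python
def two_source_fraction(delta_mix, delta_source1, delta_source2):
    """Fraction of the mixture drawn from source 1 in a two-source mixing model.

    delta_mix = f * delta_source1 + (1 - f) * delta_source2  ->  solve for f.
    """
    return (delta_mix - delta_source2) / (delta_source1 - delta_source2)

# Hypothetical d2H values (per mil): stream water vs. water in prey tissue,
# and the population-level source mix estimated from the enrichment line.
d2h_stream, d2h_prey, d2h_mix = -60.0, -20.0, -35.0
f_stream = two_source_fraction(d2h_mix, d2h_stream, d2h_prey)
print(f"Estimated fraction of body water from stream water: {f_stream:.2f}")
# -> 0.38, i.e. roughly 38% stream water and 62% prey water under these numbers.
```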
Wolfe, Christopher R.; Reyna, Valerie F.; Widmer, Colin L.; Cedillos, Elizabeth M.; Fisher, Christopher R.; Brust-Renck, Priscila G.; Weil, Audrey M.
2014-01-01
Background: Many healthy women consider genetic testing for breast cancer risk, yet BRCA testing issues are complex. Objective: To determine whether an intelligent tutor, BRCA Gist, grounded in fuzzy-trace theory (FTT), increases gist comprehension and knowledge about genetic testing for breast cancer risk, improving decision-making. Design: In two experiments, 410 healthy undergraduate women were randomly assigned to one of three groups: an online module using a web-based tutoring system (BRCA Gist) that uses artificial intelligence technology, a second group that read highly similar content from the NCI web site, and a third that completed an unrelated tutorial. Intervention: BRCA Gist applied fuzzy-trace theory and was designed to help participants develop gist comprehension of topics relevant to decisions about BRCA genetic testing, including how breast cancer spreads, inherited genetic mutations, and base rates. Measures: We measured content knowledge, gist comprehension of decision-relevant information, interest in testing, and genetic risk and testing judgments. Results: Control knowledge scores ranged from 54% to 56%, NCI improved significantly to 65% and 70%, and BRCA Gist improved significantly more, to 75% and 77%, p<.0001. BRCA Gist scored higher on gist comprehension than NCI and control, p<.0001. The control genetic risk-assessment mean was 48% correct; BRCA Gist (61%) and NCI (56%) were significantly higher, p<.0001. BRCA Gist participants recommended less testing for women without risk factors (not good candidates) (24% and 19%) than controls (50% in both experiments) and NCI (32%, Experiment 2), p<.0001. BRCA Gist testing interest was lower than controls', p<.0001. Limitations: BRCA Gist has not been tested with older women from diverse groups. Conclusions: Intelligent tutors, such as BRCA Gist, are scalable, cost-effective ways of helping people understand complex issues, improving decision-making. PMID:24829276
Integral Methodological Pluralism in Science Education Research: Valuing Multiple Perspectives
ERIC Educational Resources Information Center
Davis, Nancy T.; Callihan, Laurie P.
2013-01-01
This article examines the multiple methodologies used in educational research and proposes a model that includes all of them as contributing to understanding educational contexts and research from multiple perspectives. The model, based on integral theory (Wilber in a theory of everything. Shambhala, Boston, 2000) values all forms of research as…
Efficacy in Teaching through "Multiple Intelligence" Instructional Strategies
ERIC Educational Resources Information Center
Tamilselvi, B.; Geetha, D.
2015-01-01
Multiple intelligence is the theory that "people are smart in more ways than one has immense implication for educators". Howard Gardner proposed a new view of intelligence that is rapidly being incorporated in school curricula. In his theory of Multiple Intelligences, Gardner expanded the concept of intelligence with such areas as music,…
Do Age and Sex of School Students Make Significant Difference in Their Multiple Intelligences?
ERIC Educational Resources Information Center
Ravi, R.; Vedapriya, S. Gethsi
2009-01-01
Multiple Intelligences are a new educational theory proposed by Howard Gardner in 1983. Multiple intelligences describe an array of different kinds of intelligences exhibited by human beings. This theory consists of verbal-linguistic, logical and mathematics, visual and spatial, bodily kinesthetic, musical-rhythmic, intrapersonal, interpersonal,…
ERIC Educational Resources Information Center
Green, Crystal D.
2010-01-01
This action research study investigated the perceptions that student participants had on the development of a career exploration model and a career exploration project. The Holland code theory was the primary assessment used for this research study, in addition to the Multiple Intelligences theory and the identification of a role model for the…
Schumm, Walter R
2004-02-01
Differential risk theory, a subset of social exchange and equity theories, is proposed as an explanation for stigma towards homosexuals and as a basis for normative preferences for heterosexual marriage. Numerous gender differences involved in long-term relationships require members of such close relationships to assume greater interpersonal and social risks and thus costs, compared to same-gender relationships. Without compensating rewards or reduced costs, heterosexual relationships would be unfairly disadvantaged. Resistance to making gay marriage normative and/or equivalent legally to heterosexual marriage may be traced, rather than to homophobia, to societal attempts to maintain equity between classes of relationships characterized by inherent differential risks.
Pedagogical Affordances of Multiple External Representations in Scientific Processes
ERIC Educational Resources Information Center
Wu, Hsin-Kai; Puntambekar, Sadhana
2012-01-01
Multiple external representations (MERs) have been widely used in science teaching and learning. Theories such as dual coding theory and cognitive flexibility theory have been developed to explain why the use of MERs is beneficial to learning, but they do not provide much information on pedagogical issues such as how and in what conditions MERs…
"g" and the Measurement of Multiple Intelligences: A Response to Gardner
ERIC Educational Resources Information Center
Visser, Beth A.; Ashton, Michael C.; Vernon, Philip A.
2006-01-01
Gardner [Gardner, H. (2006-this issue). On failing to grasp the core of MI theory: A response to Visser et al. "Intelligence"] criticized some aspects of our empirical examination [Visser, B. A., Ashton, M. C., & Vernon, P. A. (2006-this issue). Beyond "g": Putting multiple intelligences theory to the test. "Intelligence"] of his "Theory of…
ERIC Educational Resources Information Center
Kul, Marat
2015-01-01
After Gardner had introduced the Multiple Intelligence (MI) theory, many researchers tried to find out the possibilities of applying this theory in the education domain. Moreover, the effects of different kinds of athletic applications on intelligence development within the framework of this theory have also been under investigation. This study…
ERIC Educational Resources Information Center
Hanafin, Joan
2014-01-01
This paper presents findings from an action research project that investigated the application of Multiple Intelligences (MI) theory in classrooms and schools. It shows how MI theory was used in the project as a basis for suggestions to generate classroom practices; how participating teachers evaluated the project; and how teachers responded to…
Dramatic Playing beyond the Theory of Multiple Intelligences
ERIC Educational Resources Information Center
Guss, Faith Gabrielle
2005-01-01
Related to aspects of drama and theatre education, I search beyond the findings about symbolic play set forth by Dr Howard Gardner in "Frames of mind. The theory of multiple intelligences". Despite the inspiration for and solidarity with arts educators that emanate from his theory, I sensed that it did not provide a full picture of the complex…
ERIC Educational Resources Information Center
Calvin-Campbell, Karole
This paper explores the similarities between Orff's Schulwerk, Montessori's philosophy, and Gardner's theory of Multiple Intelligences in an effort to explore how to best teach a child. In the late 19th century, specific learning theories began to emerge. Maria Montessori and Carl Orff each developed innovative teaching theories during the first…
[The development of Rein van Bemmelen's (1904-1983) undation theory: forty years of Dutch geology].
Barzilay, Willemjan
2009-01-01
The Dutch geologist Rein van Bemmelen was the greatest opponent of plate tectonics in the Netherlands. He lived and worked during an important period in the history of the earth sciences. He studied geology when Wegener's theory was introduced and enthusiastically received in the Netherlands, and he worked as a geologist during the period in which, after Wegener's theory had been rejected in the Netherlands, several Dutch geologists came up with their own theories to explain the origin of continents and oceans, and in which plate tectonics was introduced in the Netherlands. He proposed his own theory, the undation theory, at the beginning of the 1930s and kept developing it during the following years. He continued to do so until his death in 1983. The history of the undation theory thus sheds light on the history of geology in the Netherlands. I will trace the history of geology in the Netherlands using Rein van Bemmelen and his undation theory as a lens.
Structure of UV divergences in maximally supersymmetric gauge theories
NASA Astrophysics Data System (ADS)
Kazakov, D. I.; Borlakov, A. T.; Tolkachev, D. M.; Vlasenko, D. E.
2018-06-01
We consider the UV divergences up to sub-subleading order for the four-point on-shell scattering amplitudes in D = 8 supersymmetric Yang-Mills theory in the planar limit. We trace how the leading, subleading, etc. divergences appear in all orders of perturbation theory. The structure of these divergences is typical for any local quantum field theory, independently of renormalizability. We show how the generalized renormalization group equations allow one to evaluate the leading, subleading, etc. contributions in all orders of perturbation theory starting from one-, two-, etc. loop diagrams respectively. We then focus on the subtraction scheme dependence of the results and show that, in full analogy with renormalizable theories, the scheme dependence can be absorbed into the redefinition of the couplings. The only difference is that the role of the couplings is played by dimensionless combinations like g²s² or g²t², where s and t are the Mandelstam variables.
An Exploration of the Source of Key Concepts and Principles in Froebel's Educational Theory.
ERIC Educational Resources Information Center
Lee, Sang-Wook; Evans, Roy
1996-01-01
Traces the origins of Friedrich Froebel's views on children, curriculum, pedagogics, and didactics; the underpinnings of his beliefs concerning the nature of knowledge; and the kinds of knowledge and dispositions he believed were appropriate for children to acquire. (MDM)
ERIC Educational Resources Information Center
Davis, Robert A.
2015-01-01
The 2014 INPE McLaughlin Lecture explores the emergent concept of the "postliberal" and the increasing frequency of its formal and informal uses in the languages of educational theory and practice. It traces the origins of the term "postliberal" to certain strains of modern Christian theology, maps its migration into liberal…
Building bridges between the physical and biological sciences.
Ninham, B W; Boström, M
2005-12-16
This paper attempts to identify major conceptual issues that have inhibited the application of physical chemistry to problems in the biological sciences. We will trace out where theories went wrong, how to repair the present foundations, and discuss current progress toward building a better dialogue.
Darwin and Developmental Psychology: Past and Present.
ERIC Educational Resources Information Center
Charlesworth, William R.
1992-01-01
Darwin's weak influence on developmental psychology is traced. It is explained by (1) developmentalists' commitment to an ideology of meliorism; (2) conceptual issues relating to ontogeny and phylogeny; and (3) methodological problems. Suggests that developmentalists use evolutionary theory as a heuristic for structuring new research. (BC)
Walzer, Andreas; Schausberger, Peter
2013-01-01
Intraguild (IG) prey is commonly confronted with multiple IG predator species. However, the IG predation (IGP) risk for prey is not only dependent on the predator species, but also on inherent (intraspecific) characteristics of a given IG predator such as its life-stage, sex or gravidity and the associated prey needs. Thus, IG prey should have evolved the ability to integrate multiple IG predator cues, which should allow both inter- and intraspecific threat-sensitive anti-predator responses. Using a guild of plant-inhabiting predatory mites sharing spider mites as prey, we evaluated the effects of single and combined cues (eggs and/or chemical traces left by a predator female on the substrate) of the low risk IG predator Neoseiulus californicus and the high risk IG predator Amblyseius andersoni on time, distance and path shape parameters of the larval IG prey Phytoseiulus persimilis. IG prey discriminated between traces of the low and high risk IG predator, with and without additional presence of their eggs, indicating interspecific threat-sensitivity. The behavioural changes were manifest in distance moved, activity and path shape of IG prey. The cue combination of traces and eggs of the IG predators conveyed other information than each cue alone, allowing intraspecific threat-sensitive responses by IG prey apparent in changed velocities and distances moved. We argue that graded responses to single and combined IG predator cues are adaptive due to minimization of acceptance errors in IG prey decision making. PMID:23750040
McGeown, Sarah P; Gray, Eleanor A; Robinson, Jamey L; Dewhurst, Stephen A
2014-06-01
Two experiments investigated the cognitive skills that underlie children's susceptibility to semantic and phonological false memories in the Deese/Roediger-McDermott procedure (Deese, 1959; Roediger & McDermott, 1995). In Experiment 1, performance on the Verbal Similarities subtest of the British Ability Scales (BAS) II (Elliott, Smith, & McCulloch, 1997) predicted correct and false recall of semantic lures. In Experiment 2, performance on the Yopp-Singer Test of Phonemic Segmentation (Yopp, 1988) did not predict correct recall, but inversely predicted the false recall of phonological lures. Auditory short-term memory was a negative predictor of false recall in Experiment 1, but not in Experiment 2. The findings are discussed in terms of the formation of gist and verbatim traces as proposed by fuzzy trace theory (Reyna & Brainerd, 1998) and the increasing automaticity of associations as proposed by associative activation theory (Howe, Wimmer, Gagnon, & Plumpton, 2009). Copyright © 2014 Elsevier B.V. All rights reserved.
You, Heejo; Magnuson, James S
2018-06-01
This article describes a new Python distribution of TISK, the time-invariant string kernel model of spoken word recognition (Hannagan et al. in Frontiers in Psychology, 4, 563, 2013). TISK is an interactive-activation model similar to the TRACE model (McClelland & Elman in Cognitive Psychology, 18, 1-86, 1986), but TISK replaces most of TRACE's reduplicated, time-specific nodes with theoretically motivated time-invariant, open-diphone nodes. We discuss the utility of computational models as theory development tools, the relative merits of TISK as compared to other models, and the ways in which researchers might use this implementation to guide their own research and theory development. We describe a TISK model that includes features that facilitate in-line graphing of simulation results, integration with standard Python data formats, and graph and data export. The distribution can be downloaded from https://github.com/maglab-uconn/TISK1.0 .
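To give a concrete sense of the time-invariant units involved, the following Python sketch enumerates the open-diphone units of a phoneme string, i.e., all ordered phoneme pairs regardless of adjacency, as we understand the TISK representation; it is an illustration, not code from the TISK distribution, and the example phoneme codings are arbitrary assumptions.

```python
from itertools import combinations

def open_diphones(phonemes):
    """Return the set of ordered phoneme pairs (first before second), adjacency not required.

    Such 'open diphones' are time-invariant: the same unit is activated wherever
    in the input the pair occurs, unlike TRACE's reduplicated time-specific nodes.
    """
    return {(a, b) for a, b in combinations(phonemes, 2)}

# Example: rough phoneme codings of "cat" and "cats".
print(sorted(open_diphones(["k", "ae", "t"])))
# [('ae', 't'), ('k', 'ae'), ('k', 't')]
print(sorted(open_diphones(["k", "ae", "t", "s"])))
```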
Bernard J. Wood Receives 2013 Harry H. Hess Medal: Citation
NASA Astrophysics Data System (ADS)
Hofmann, Albrecht W.
2014-01-01
As Harry Hess recognized over 50 years ago, mantle melting is the fundamental motor for planetary evolution and differentiation. Melting generates the major divisions of crust, mantle, and core. The distribution of chemical elements between solids, melts, and gaseous phases is fundamental to understanding these differentiation processes. Bernie Wood, together with Jon Blundy, has combined experimental petrology and physicochemical theory to revolutionize the understanding of the distribution of trace elements between melts and solids in the Earth. Knowledge of these distribution laws allows the reconstruction of the source compositions of the melts (deep in Earth's interior) from their abundances in volcanic rocks. Bernie's theoretical treatment relates the elastic strain of the lattice caused by the substitution of a trace element in a crystal to the ionic radius and charge of this element. This theory, and its experimental calibrations, brought order to a literature of badly scattered, rather chaotic experimental data that allowed no satisfactory quantitative modeling of melting processes in the mantle.
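The lattice strain treatment referred to here is commonly written in the Brice/Blundy-Wood form sketched below; this Python snippet evaluates it for a set of hypothetical cation radii. The parameter values (D0, r0, E) are invented for illustration and are not Wood's calibrations.

```python
import numpy as np

R_GAS = 8.314          # J/(mol K)
N_A = 6.022e23         # 1/mol

def lattice_strain_D(r_i, D0, r0, E_M, T):
    """Lattice strain model for a trace-element partition coefficient.

    r_i : ionic radius of the substituting cation (m)
    D0  : partition coefficient of the ideally sized cation (radius r0)
    r0  : optimum radius of the crystal site (m)
    E_M : effective Young's modulus of the site (Pa)
    T   : temperature (K)
    """
    dr = r_i - r0
    strain = 4.0 * np.pi * E_M * N_A * (0.5 * r0 * dr**2 + dr**3 / 3.0)
    return D0 * np.exp(-strain / (R_GAS * T))

# Hypothetical 2+ site at 1500 K (illustrative numbers only).
radii_angstrom = {"Mg": 0.72, "Ca": 1.00, "Sr": 1.18, "Ba": 1.35}
for name, r in radii_angstrom.items():
    D = lattice_strain_D(r * 1e-10, D0=2.0, r0=1.0e-10, E_M=250e9, T=1500.0)
    print(f"D({name}) ~ {D:.3g}")
```

With these made-up parameters the ideally sized cation partitions most strongly and misfit cations are progressively excluded, which is the qualitative behavior the lattice strain theory brought to the scattered experimental data.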
On the measurement of airborne, angular-dependent sound transmission through supercritical bars.
Shaw, Matthew D; Anderson, Brian E
2012-10-01
The coincidence effect is manifested by maximal sound transmission at angles at which trace wave number matching occurs. Coincidence effect theory is well-defined for unbounded thin plates using plane-wave excitation. However, experimental results for finite bars are known to diverge from theory near grazing angles. Prior experimental work has focused on pulse excitation. An experimental setup has been developed to observe coincidence using continuous-wave excitation and phased-array methods. Experimental results with an aluminum bar exhibit maxima at the predicted angles, showing that coincidence is observable using continuous waves. Transmission near grazing angles is seen to diverge from infinite plate theory.
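For orientation, the trace wave number matching condition can be evaluated directly; this Python sketch computes the coincidence angle for a thin homogeneous plate using textbook bending-wave theory. The aluminum properties and thickness are illustrative assumptions, and a finite bar will deviate from this infinite-plate estimate, which is part of the paper's point.

```python
import numpy as np

def coincidence_angle_deg(freq_hz, thickness, E, rho, nu, c_air=343.0):
    """Angle (from normal) at which trace matching occurs for a thin plate.

    Bending-wave phase speed: c_B = sqrt(omega) * (B / (rho*h))**0.25,
    with bending stiffness B = E*h**3 / (12*(1 - nu**2)).
    Trace matching: sin(theta) = c_air / c_B, possible only when c_B >= c_air
    (i.e. above the critical frequency -- the 'supercritical' regime).
    """
    omega = 2.0 * np.pi * freq_hz
    B = E * thickness**3 / (12.0 * (1.0 - nu**2))
    c_bending = np.sqrt(omega) * (B / (rho * thickness)) ** 0.25
    sin_theta = c_air / c_bending
    return np.degrees(np.arcsin(sin_theta)) if sin_theta <= 1.0 else np.nan

# Illustrative: 6 mm aluminum plate (E ~ 69 GPa, rho ~ 2700 kg/m^3, nu ~ 0.33).
for f in (2000, 5000, 10000, 20000):
    theta = coincidence_angle_deg(f, 6e-3, 69e9, 2700.0, 0.33)
    if np.isnan(theta):
        print(f"{f:6d} Hz  below critical frequency (no coincidence)")
    else:
        print(f"{f:6d} Hz  coincidence angle = {theta:5.1f} deg")
```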
Design and research of built-in sample cell with multiple optical reflections
NASA Astrophysics Data System (ADS)
Liu, Jianhui; Wang, Shuyao; Lv, Jinwei; Liu, Shuyang; Zhou, Tao; Jia, Xiaodong
2017-10-01
In the field of trace gas measurement, tunable diode laser absorption spectroscopy (TDLAS) is widely used in industrial process and trace gas pollution monitoring because of its high sensitivity, high selectivity and rapid detection. The Herriott cell is a common form of multiple-reflection sample cell; its structure is relatively simple, and it is widely applied in trace gas absorption spectroscopy. In practical situations the gas composition is complex, and long-term continuous testing can pollute and corrode the mirrors in the sample cell to varying degrees. If the mirrors are not cleaned in time, detection accuracy suffers. To address this problem in harsh-environment detection, this paper presents a design for a built-in sample cell that avoids contact between the gas and the mirrors, thereby effectively reducing corrosion and contamination. If the optics do become contaminated, the built-in optical sample cell can be directly replaced, and it is easily disassembled and cleaned. The advantages of this design include a long optical path, high precision, and cost savings.
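The sensitivity benefit of a long optical path rests on the Beer-Lambert law; this short Python sketch, with invented absorption parameters, shows how the fractional absorption of a weak line grows with the effective path length achieved by multiple reflections.

```python
import numpy as np

def fractional_absorption(alpha, concentration, path_length):
    """Beer-Lambert: I/I0 = exp(-alpha * C * L); return the absorbed fraction."""
    return 1.0 - np.exp(-alpha * concentration * path_length)

# Illustrative numbers: a hypothetical line strength, a trace gas at 10 ppm,
# and single-pass vs. multi-pass effective path lengths in a 0.3 m cell.
alpha = 1.0e-3          # (ppm*m)^-1, assumed absorption coefficient
conc_ppm = 10.0
for passes in (1, 30, 100):
    L_eff = passes * 0.3
    frac = fractional_absorption(alpha, conc_ppm, L_eff)
    print(f"{passes:3d} passes (L = {L_eff:5.1f} m): absorbed fraction = {frac:.4f}")
```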
A multiple reader scoring system for Nasal Potential Difference parameters.
Solomon, George M; Liu, Bo; Sermet-Gaudelus, Isabelle; Fajac, Isabelle; Wilschanski, Michael; Vermeulen, Francois; Rowe, Steven M
2017-09-01
Nasal Potential Difference (NPD) is a biomarker of CFTR activity used to diagnose CF and monitor experimental therapies. Limited studies have been performed to assess agreement between expert readers of NPD interpretation using a scoring algorithm. We developed a standardized scoring algorithm for "interpretability" and "confidence" of PD (potential difference) measures, and sought to determine the degree of agreement on NPD parameters between trained readers. There was excellent agreement for interpretability between NPD readers for CF tracings and fair agreement for normal tracings, but only slight agreement on interpretability for indeterminate tracings. Amongst interpretable tracings, excellent correlation of mean scores was observed for Ringer's baseline PD, Δamiloride, and ΔCl-free+isoproterenol. There was slight agreement regarding confidence in the interpretable PD tracings, resulting in divergence of the Ringer's, Δamiloride, and ΔCl-free+isoproterenol PDs between "high" and "low" confidence CF tracings. A multi-reader process with adjudication is important for scoring NPDs for diagnosis and in monitoring of CF clinical trials. Copyright © 2017 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.
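Agreement terms such as "excellent", "fair", and "slight" typically map onto chance-corrected statistics like Cohen's kappa; the sketch below, with made-up reader calls, shows how such a statistic is computed in Python. It illustrates the kind of analysis involved, not the study's actual scoring data.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical interpretability calls from two trained readers on 12 tracings
# (1 = interpretable, 0 = not interpretable).
reader_a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1]
reader_b = [1, 1, 0, 1, 0, 0, 1, 1, 1, 1, 1, 1]

kappa = cohen_kappa_score(reader_a, reader_b)
print(f"Cohen's kappa between readers: {kappa:.2f}")
# Common rule of thumb: <0.20 slight, 0.21-0.40 fair, 0.41-0.60 moderate,
# 0.61-0.80 substantial, >0.80 almost perfect (Landis & Koch, 1977).
```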
[Modern concepts of trauma care and multiple trauma management in oral and maxillofacial region].
Tan, Yinghui
2015-06-01
Multiple trauma management requires the application of modern trauma care theories. Optimal treatment results can be achieved by reinforcing cooperation and formulating a treatment plan together with other disciplines. Based on modern theories in trauma care and our understanding of their key points, this paper analyzes the injury assessment strategies and methods in oral and maxillofacial multiple trauma management. Moreover, this paper discusses operating time and other influencing factors, as well as proposed definitive surgical timing and indications, in the comprehensive management of oral and maxillofacial multiple trauma patients with associated injuries in other body parts. We hope that this paper can help stomatological physicians deepen their understanding of modern trauma care theories and improve their capacity and results in the treatment of oral and maxillofacial multiple trauma.
ERIC Educational Resources Information Center
Kallenbach, Silja, Ed.; Viens, Julie, Ed.
This document contains nine papers from a systematic, classroom-based study of multiple intelligences (MI) theory in different adult learning contexts during which adult educators from rural and urban areas throughout the United States conducted independent inquiries into the question of how MI theory can support instruction and assessment in…
NASA Astrophysics Data System (ADS)
Sergeant, C.; Vesvres, M. H.; Devès, G.; Guillou, F.
2005-04-01
In the central nervous system, metallic cations are involved in oligodendrocyte maturation and myelinogenesis. Moreover, metallic cations have been associated with pathogenesis, particularly multiple sclerosis and malignant gliomas. The brain is vulnerable to either a deficit or an excess of available trace elements. The relationship between trace metals and myelinogenesis is important for understanding a severe human pathology: multiple sclerosis, which still lacks an effective treatment. One approach to understanding this disease has used mutant or transgenic mice presenting myelin deficiency or excess. But to date, the concentrations of trace metals and mineral elements in white and gray matter areas of the wild-type brain are unknown. The aim of this study is to establish the reference concentrations of trace metals (iron, copper and zinc) and minerals (potassium and calcium) in the white and gray matter of the mouse cerebellum and corpus callosum. The brains of four different genetic mouse strains (C57Black6/SJL, C57Black6/D2, SJL and C3H) were analyzed. The freeze-dried samples were prepared to allow PIXE (proton-induced X-ray emission) and RBS (Rutherford backscattering spectrometry) analyses with the nuclear microprobe in Bordeaux. The results obtained give the first reference values. Furthermore, one of the four strains tested exhibited differences in calcium, iron and zinc concentrations in the white matter.
What Learning Systems do Intelligent Agents Need? Complementary Learning Systems Theory Updated.
Kumaran, Dharshan; Hassabis, Demis; McClelland, James L
2016-07-01
We update complementary learning systems (CLS) theory, which holds that intelligent agents must possess two learning systems, instantiated in mammals in the neocortex and hippocampus. The first gradually acquires structured knowledge representations while the second quickly learns the specifics of individual experiences. We broaden the role of replay of hippocampal memories in the theory, noting that replay allows goal-dependent weighting of experience statistics. We also address recent challenges to the theory and extend it by showing that recurrent activation of hippocampal traces can support some forms of generalization and that neocortical learning can be rapid for information that is consistent with known structure. Finally, we note the relevance of the theory to the design of artificial intelligent agents, highlighting connections between neuroscience and machine learning. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Yazdani, Mohsen
Transient electromagnetic scattering by a radially uniaxial dielectric sphere is explored using three well-known methods: Debye series, Mie series, and ray tracing theory. In the first approach, the general solutions for the impulse and step responses of a uniaxial sphere are evaluated using the inverse Laplace transformation of the generalized Mie series solution. Following high frequency scattering solution of a large uniaxial sphere, the Mie series summation is split into the high frequency (HF) and low frequency terms where the HF term is replaced by its asymptotic expression allowing a significant reduction in computation time of the numerical Bromwich integral. In the second approach, the generalized Debye series for a radially uniaxial dielectric sphere is introduced and the Mie series coefficients are replaced by their equivalent Debye series formulations. The results are then applied to examine the transient response of each individual Debye term allowing the identification of impulse returns in the transient response of the uniaxial sphere. In the third approach, the ray tracing theory in a uniaxial sphere is investigated to evaluate the propagation path as well as the arrival time of the ordinary and extraordinary returns in the transient response of the uniaxial sphere. This is achieved by extracting the reflection and transmission angles of a plane wave obliquely incident on the radially oriented air-uniaxial and uniaxial-air boundaries, and expressing the phase velocities as well as the refractive indices of the ordinary and extraordinary waves in terms of the incident angle, optic axis and propagation direction. The results indicate a satisfactory agreement between Debye series, Mie series and ray tracing methods.
Deconfinement in Yang-Mills Theory through Toroidal Compactification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simic, Dusan; Unsal, Mithat; /Stanford U., Phys. Dept. /SLAC
2011-08-12
We introduce field theory techniques through which the deconfinement transition of four-dimensional Yang-Mills theory can be moved to a semi-classical domain where it becomes calculable using two-dimensional field theory. We achieve this through a double-trace deformation of toroidally compactified Yang-Mills theory on R² × S¹_L × S¹_β. At large N, fixed L, and arbitrary β, the thermodynamics of the deformed theory is equivalent to that of ordinary Yang-Mills theory at leading order in the large N expansion. At fixed N, small L, and a range of β, the deformed theory maps to a two-dimensional theory with electric and magnetic (order and disorder) perturbations, analogs of which appear in planar spin-systems and statistical physics. We show that in this regime the deconfinement transition is driven by the competition between electric and magnetic perturbations in this two-dimensional theory. This appears to support the scenario proposed by Liao and Shuryak regarding the magnetic component of the quark-gluon plasma at RHIC.
Zebrafish in the sea of mineral (iron, zinc, and copper) metabolism
Zhao, Lu; Xia, Zhidan; Wang, Fudi
2014-01-01
Iron, copper, zinc, and eight other minerals are classified as essential trace elements because they are present in minute in vivo quantities and are essential for life. Because either excess or insufficient levels of trace elements can be detrimental to life (causing human diseases such as iron-deficiency anemia, hemochromatosis, Menkes syndrome and Wilson's disease), the endogenous levels of trace minerals must be tightly regulated. Many studies have demonstrated the existence of systems that maintain trace element homeostasis, and these systems are highly conserved in multiple species ranging from yeast to mice. As a model for studying trace mineral metabolism, the zebrafish is indispensable to researchers. Several large-scale mutagenesis screens have been performed in zebrafish, and these screens led to the identification of a series of metal transporters and the generation of several mutagenesis lines, providing an in-depth functional analysis at the system level. Moreover, because of their developmental advantages, zebrafish have also been used in mineral metabolism-related chemical screens and toxicology studies. Here, we systematically review the major findings of trace element homeostasis studies using the zebrafish model, with a focus on iron, zinc, copper, selenium, manganese, and iodine. We also provide a homology analysis of trace mineral transporters in fish, mice and humans. Finally, we discuss the evidence that zebrafish is an ideal experimental tool for uncovering novel mechanisms of trace mineral metabolism and for improving approaches to treat mineral imbalance-related diseases. PMID:24639652
NASA Astrophysics Data System (ADS)
Gerhard, Christoph; Adams, Geoff
2015-10-01
Geometric optics is at the heart of optics teaching. Some of us may remember using pins and string to test the simple lens equation at school. Matters get more complex at undergraduate/postgraduate levels as we are introduced to paraxial rays, real rays, wavefronts, aberration theory and much more. Software is essential for the later stages, and the right software can profitably be used even at school. We present two free PC programs, which have been widely used in optics teaching, and have been further developed in close cooperation with lecturers/professors in order to address the current content of the curricula for optics, photonics and lasers in higher education. PreDesigner is a single thin lens modeller. It illustrates the simple lens law with construction rays and then allows the user to include field size and aperture. Sliders can be used to adjust key values with instant graphical feedback. This tool thus represents a helpful teaching medium for the visualization of basic interrelations in optics. WinLens3DBasic can model multiple thin or thick lenses with real glasses. It shows the system focii, principal planes, nodal points, gives paraxial ray trace values, details the Seidel aberrations, offers real ray tracing and many forms of analysis. It is simple to reverse lenses and model tilts and decenters. This tool therefore provides a good base for learning lens design fundamentals. Much work has been put into offering these features in ways that are easy to use, and offer opportunities to enhance the student's background understanding.
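For the simple lens law mentioned above, a few lines of Python reproduce the kind of instant feedback such tools provide; this is a generic thin-lens sketch using the real-is-positive sign convention, not code from PreDesigner or WinLens3DBasic.

```python
def thin_lens_image(focal_length, object_distance):
    """Gaussian thin-lens equation 1/f = 1/s_o + 1/s_i (real-is-positive signs).

    Returns (image_distance, transverse_magnification).  A negative image
    distance indicates a virtual image on the object side of the lens.
    """
    if object_distance == focal_length:
        raise ValueError("object at the focal point: image at infinity")
    s_i = 1.0 / (1.0 / focal_length - 1.0 / object_distance)
    magnification = -s_i / object_distance
    return s_i, magnification

# Example: f = 50 mm lens, object at 200 mm and at 30 mm (inside the focal length).
for s_o in (200.0, 30.0):
    s_i, m = thin_lens_image(50.0, s_o)
    print(f"s_o = {s_o:5.1f} mm -> s_i = {s_i:7.1f} mm, magnification = {m:+.2f}")
```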
Theory of Multiple Coulomb Scattering from Extended Nuclei
DOE R&D Accomplishments Database
Cooper, L. N.; Rainwater, J.
1954-08-01
Two independent methods are described for calculating the multiple scattering distribution for projected angle scattering resulting when very high energy charged particles traverse a thick scatterer. The results are compared with the theories of Moliere and Olbert.
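A later, widely used Gaussian approximation for the projected multiple-scattering angle is the Highland/PDG parameterization; the sketch below uses that modern formula purely for orientation. It is not the Molière-style treatment developed by the authors, and the beam and scatterer parameters are illustrative assumptions.

```python
import numpy as np

def highland_theta0(p_mev, beta, z, x_over_X0):
    """RMS projected multiple-scattering angle (radians), Highland/PDG form:

    theta0 = (13.6 MeV / (beta*c*p)) * z * sqrt(x/X0) * [1 + 0.038 ln(x/X0)]
    """
    return (13.6 / (beta * p_mev)) * abs(z) * np.sqrt(x_over_X0) \
        * (1.0 + 0.038 * np.log(x_over_X0))

# Illustrative: a 1 GeV/c singly charged particle (beta ~ 1) traversing
# scatterers of various thicknesses expressed in radiation lengths.
for x in (0.01, 0.1, 1.0):
    theta0 = highland_theta0(p_mev=1000.0, beta=1.0, z=1, x_over_X0=x)
    print(f"x/X0 = {x:5.2f}: theta0 ~ {1e3 * theta0:.2f} mrad")
```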
Theory of time-dependent rupture in the Earth
NASA Technical Reports Server (NTRS)
Das, S.; Scholz, C. H.
1980-01-01
Fracture mechanics is used to develop a theory of earthquake mechanism which includes the phenomenon of subcritical crack growth. The following phenomena are predicted: slow earthquakes, multiple events, delayed multiple events (doublets), postseismic rupture growth and afterslip, foreshocks, and aftershocks. The theory predicts a nucleation stage prior to an earthquake, and suggests a physical mechanism by which one earthquake may 'trigger' another.
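The subcritical crack growth ingredient in such theories is usually expressed as a power-law dependence of crack velocity on the stress intensity factor; a hedged sketch of the standard (Charles-type) form is given below, noting that the specific law and exponents used by the authors may differ.

```latex
% Hedged sketch: a commonly used subcritical crack growth law.
% v   : crack-tip velocity,  K : stress intensity factor at the crack tip,
% K_0 : reference stress intensity,  K_c : critical value (fracture toughness),
% n   : stress-corrosion index (typically large, so growth accelerates sharply).
\[
  v \;=\; \frac{\mathrm{d}a}{\mathrm{d}t}
    \;=\; v_0 \left( \frac{K}{K_0} \right)^{n},
  \qquad K < K_c ,
\]
% with dynamic (seismic) rupture taking over once K reaches K_c.
```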
Huang, Yu-tin; Johansson, Henrik
2013-04-26
We show that three-dimensional supergravity amplitudes can be obtained as double copies of either three-algebra super-Chern-Simons matter theory or two-algebra super-Yang-Mills theory when either theory is organized to display the color-kinematics duality. We prove that only helicity-conserving four-dimensional gravity amplitudes have nonvanishing descendants when reduced to three dimensions, implying the vanishing of odd-multiplicity S-matrix elements, in agreement with Chern-Simons matter theory. We explicitly verify the double-copy correspondence at four and six points for N = 12,10,8 supergravity theories and discuss its validity for all multiplicity.
Ostrogradsky in theories with multiple fields
NASA Astrophysics Data System (ADS)
de Rham, Claudia; Matas, Andrew
2016-06-01
We review how the (absence of) Ostrogradsky instability manifests itself in theories with multiple fields. It has recently been appreciated that when multiple fields are present, the existence of higher derivatives may not automatically imply the existence of ghosts. We discuss the connection with gravitational theories like massive gravity and beyond Horndeski which manifest higher derivatives in some formulations and yet are free of the Ostrogradsky ghost. We also examine an interesting new class of Extended Scalar-Tensor Theories of gravity which has been recently proposed. We show that for a subclass of these theories, the tensor modes are either not dynamical or are infinitely strongly coupled. Among the remaining theories for which the tensor modes are well defined, there is one new model that is not field-redefinable to Horndeski via a conformal and disformal transformation but that does require the vacuum to break Lorentz invariance. We discuss the implications for the effective field theory of dark energy and the stability of the theory. In particular we find that if we restrict ourselves to the Extended Scalar-Tensor class of theories for which the tensors are well-behaved and the scalar is free from gradient or ghost instabilities on FLRW then we recover Horndeski up to field redefinitions.
NASA Astrophysics Data System (ADS)
Sharman, Elizabeth R.; Taylor, Bruce E.; Minarik, William G.; Dubé, Benoît; Wing, Boswell A.
2015-06-01
We examine models for volcanogenic massive sulfide (VMS) mineralization in the ~2.7-Ga Noranda camp, Abitibi subprovince, Superior Province, Canada, using a combination of multiple sulfur isotope and trace element data from ore sulfide minerals. The Noranda camp is a well-preserved, VMS deposit-rich area that is thought to represent a collapsed volcanic caldera. Due to its economic value, the camp has been studied extensively, providing a robust geological framework within which to assess the new data presented in this study. We explore previously proposed controls on mineralization within the Noranda camp and, in particular, the exceptional Au-rich Horne and Quemont deposits. We present multiple sulfur isotope and trace element compositional data for sulfide separates representing 25 different VMS deposits and "showings" within the Noranda camp. Multiple sulfur isotope data for this study have δ34SV-CDT values of between -1.9 and +2.5 ‰, and Δ33SV-CDT values of between -0.59 and -0.03 ‰. We interpret the negative Δ33S values to be due to a contribution of sulfur that originated as seawater sulfate to form the ore sulfides of the Noranda camp VMS deposits. The contribution of seawater sulfate increased with the collapse and subsequent evolution of the Noranda caldera, an inference supported by select trace and major element analyses. In particular, higher concentrations of Se occur in samples with Δ33S values closer to 0 ‰, as well as lower Fe/Zn ratios in sphalerite, suggesting lower pressures and temperatures of formation. We also report a relationship between average Au grade and Δ33S values within Au-rich VMS deposits of the Noranda camp, whereby higher gold grades are associated with near-zero Δ33S values. From this, we infer a dominance of igneous sulfur in the gold-rich deposits, either leached from the volcanic pile and/or directly degassed from an associated intrusion.
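For reference, the Δ33S notation used throughout is conventionally defined as the deviation from the mass-dependent fractionation relation; a sketch of the standard definition follows, assuming the commonly used reference exponent of 0.515.

```latex
% Standard capital-delta notation for minor sulfur isotopes (values in per mil):
\[
  \Delta^{33}\mathrm{S}
  \;=\;
  \delta^{33}\mathrm{S}
  \;-\;
  1000 \times \left[ \left( 1 + \frac{\delta^{34}\mathrm{S}}{1000} \right)^{0.515} - 1 \right],
\]
% so purely mass-dependent fractionation gives Delta33S near zero, and the
% negative values reported here indicate a mass-independent sulfur component,
% such as Archean seawater sulfate.
```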
Bakić-Mirić, Natasa
2010-01-01
The theory of multiple intelligences (MI) is considered an innovation in learning the English language because it helps students develop all eight intelligences, which represent the ways people understand the world around them, solve problems and learn. They are: verbal/linguistic, logical/mathematical, visual/spatial, bodily/kinaesthetic, musical/rhythmic, interpersonal, intrapersonal and naturalist. Also, by focusing on problem-solving activities, teachers who implement the theory of multiple intelligences encourage students not only to build on their existing language knowledge but also to learn new content and skills. The objective of this study has been to determine the importance of implementing the theory of multiple intelligences in the English language course syllabus at the University of Nis Medical School. We describe the ways in which the theory of multiple intelligences has been implemented in the English language course syllabus, particularly in one lecture for junior-year students of pharmacy at the University of Nis Medical School. The English language final exam results from February 2009, when compared with the final exam results from June 2007 prior to the implementation of MI theory, showed the following: out of 80 junior-year students of pharmacy, 40 obtained grade 10 (outstanding), 16 obtained grade 9 (excellent), 11 obtained grade 8 (very good), 4 obtained grade 7 (good) and 9 obtained grade 6 (pass). No student failed. The implementation of the theory of multiple intelligences in the English language course syllabus at the University of Nis Medical School has had a positive impact on learning the English language and has increased students' interest in language learning. Generally speaking, this theory offers a better understanding of students' intelligence and a greater appreciation of their strengths. It provides numerous opportunities for students to use and develop all eight intelligences, not just the few they excel in prior to enrolling in a university or college.
Moral Education and the Perils of Developmentalism.
ERIC Educational Resources Information Center
Carr, David
2002-01-01
Discusses the conception of moral formation. Traces progress to moral maturity through well-defined stages of cognitive, conative, and/or affective growth. Explains that the logical status of developmental theories is not clear. Argues that the accounts are more evaluative than descriptive. Explores the problematic moral educational implications of this…
The Functionalist Tradition and Communication Theory.
ERIC Educational Resources Information Center
Burrowes, Carl Patrick
This paper traces the development of the functionalist position chronologically through its major permutations, from the defining contributions of Emile Durkheim, Bronislaw Malinowski, and A. R. Radcliffe-Brown in its anthropological phase through its development in American sociology by Talcott Parsons and Robert K. Merton to its explicit…
Alienation, Mass Society and Mass Culture.
ERIC Educational Resources Information Center
Dam, Hari N.
This monograph examines the nature of alienation in mass society and mass culture. Conceptually based on the "Gemeinschaft-Gesellschaft" paradigm of sociologist Ferdinand Tonnies, discussion traces the concept of alienation as it appears in the philosophies of Hegel, Marx, Kierkegaard, Sartre, and others. Dwight Macdonald's "A Theory of Mass…
What if Learning Analytics Were Based on Learning Science?
ERIC Educational Resources Information Center
Marzouk, Zahia; Rakovic, Mladen; Liaqat, Amna; Vytasek, Jovita; Samadi, Donya; Stewart-Alonso, Jason; Ram, Ilana; Woloshen, Sonya; Winne, Philip H.; Nesbit, John C.
2016-01-01
Learning analytics are often formatted as visualisations developed from traced data collected as students study in online learning environments. Optimal analytics inform and motivate students' decisions about adaptations that improve their learning. We observe that designs for learning often neglect theories and empirical findings in learning…
The Development of Motivational Thought in the Study of Curiosity.
ERIC Educational Resources Information Center
Vidler, Derek C.
1981-01-01
Presents an overview of the development of motivational thought in the study of exploratory behavior and curiosity. Traces the way in which concepts of curiosity were considered from the perspectives of instinct and drive-reduction theories to the more recent notions of optimal stimulation. (Author)
ERIC Educational Resources Information Center
Sovik, Nils
1980-01-01
A description is given of an experiment investigating the applicability of a cybernetic theory in teaching children psychomotor skills. Results showed a learning effect in copying for younger subjects, in tracing for older subjects, and in tracking for all subjects. (GK)
The Production of "Proper Cheating" in Online Examinations within Technological Universities
ERIC Educational Resources Information Center
Kitto, Simon; Saltmarsh, Sue
2007-01-01
This paper uses poststructuralist theories of governmentality, agency, consumption and Barry's (2001) concept of Technological Societies, as a heuristic framework to trace the role of online education technologies in the instantiation of subjectification processes within contemporary Australian universities. This case study of the unintended…
Vygotsky's Psychology: A Biography of Ideas.
ERIC Educational Resources Information Center
Kozulin, Alex
Noting that the previous two decades have seen Lev Vygotsky's psychology become highly influential while the psychology of other theoretical giants has faded, this book provides a major intellectual biography about Vygotsky's theories and their relationship to twentieth-century Russian and Western intellectual culture. The book traces Vygotsky's…
Emergent Bilinguals: Framing Students as Statistical Data?
ERIC Educational Resources Information Center
Koyama, Jill; Menken, Kate
2013-01-01
Immigrant youth who are designated as English language learners in American schools--whom we refer to as "emergent bilinguals"--are increasingly framed by numerical calculations. Utilizing the notion of assemblage from actor-network theory (ANT), we trace how emergent bilinguals are discursively constructed by officials, administrators,…
Lee, L.; Helsel, D.
2005-01-01
Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
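The ROS idea lends itself to a compact illustration. The sketch below is a simplified Python stand-in (not the authors' S/R tools): it fits a lognormal model to the detected observations and imputes the censored ones from that fit. The single-detection-limit simplification, the example data, and the function name `simple_ros` are illustrative assumptions only.

```python
import numpy as np
from scipy import stats

def simple_ros(values, censored):
    """Simplified regression-on-order-statistics (ROS) sketch.

    values   : reported concentrations (the detection limit is recorded for censored entries)
    censored : boolean array, True where the value is a "<DL" result

    NOTE: full ROS for multiply censored data (e.g. Helsel's tools) uses
    Hirsch-Stedinger plotting positions; this sketch is a single-DL
    simplification for illustration only.
    """
    values = np.asarray(values, float)
    censored = np.asarray(censored, bool)
    n = len(values)

    # Rank all observations together, then assign Weibull plotting positions.
    order = np.argsort(values, kind="stable")
    pp = np.empty(n)
    pp[order] = np.arange(1, n + 1) / (n + 1.0)

    # Fit log(concentration) against normal quantiles using detected values only.
    q = stats.norm.ppf(pp)
    slope, intercept, *_ = stats.linregress(q[~censored], np.log(values[~censored]))

    # Impute censored observations from the fitted line at their plotting positions.
    imputed = values.copy()
    imputed[censored] = np.exp(intercept + slope * q[censored])
    return imputed.mean(), imputed.std(ddof=1)

mean, sd = simple_ros([0.5, 0.5, 1.2, 3.4, 0.8, 5.1],
                      [True, True, False, False, False, False])
print(mean, sd)
```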
Bond, John W; Weart, Jocelyn R
2017-05-01
Recovery, profiling, and speculative searching of trace DNA (not attributable to a body fluid/cell type) over a twelve-month period in a U.S. Crime Laboratory and U.K. police force are compared. Results show greater numbers of U.S. firearm-related items submitted for analysis compared with the U.K., where the greatest numbers were submitted from burglary or vehicle offenses. U.S. multiple recovery techniques (double swabbing) occurred mainly during laboratory examination, whereas the majority of U.K. multiple recovery techniques occurred at the scene. No statistical difference was observed for useful profiles from single or multiple recovery. Database loading of interpretable profiles was most successful for U.K. items related to burglary or vehicle offenses. Database associations (matches) represented 7.0% of all U.S. items and 13.1% of all U.K. items. The U.K. strategy for burglary and vehicle examination demonstrated that careful selection of both items and sampling techniques is crucial to obtaining the observed results. © 2016 American Academy of Forensic Sciences.
Advanced ECCD based NTM control in closed-loop operation at ASDEX Upgrade (AUG)
NASA Astrophysics Data System (ADS)
Reich, Matthias; Barrera-Orte, Laura; Behler, Karl; Bock, Alexander; Giannone, Louis; Maraschek, Marc; Poli, Emanuele; Rapson, Chris; Stober, Jörg; Treutterer, Wolfgang
2012-10-01
In high performance plasmas, Neoclassical Tearing Modes (NTMs) are regularly observed at reactor-grade beta-values. They limit the achievable normalized beta, which is undesirable because fusion performance scales as beta squared. The method of choice for controlling and avoiding NTMs at AUG is the deposition of ECCD inside the magnetic island for stabilization in real-time (rt). Our approach to tackling such complex control problems using real-time diagnostics allows rigorous optimization of all subsystems. Recent progress in rt-equilibrium reconstruction (< 3.5 ms), rt-localization of NTMs (< 8 ms) and rt beam tracing (< 25 ms) allows closed-loop feedback operation using multiple movable mirrors as the ECCD deposition actuator. The rt-equilibrium uses function parametrization or a fast Grad-Shafranov solver with an option to include rt-MSE measurements. The island localization is based on a correlation of ECE and filtered Mirnov signals. The rt beam-tracing module provides the deposition locations of multiple gyrotrons and their derivatives with respect to actuator position. The "MHD controller" finally drives the actuators. Results utilizing closed-loop operation with multiple gyrotrons and their effect on NTMs are shown.
Computational models for the analysis of three-dimensional internal and exhaust plume flowfields
NASA Technical Reports Server (NTRS)
Dash, S. M.; Delguidice, P. D.
1977-01-01
This paper describes computational procedures developed for the analysis of three-dimensional supersonic ducted flows and multinozzle exhaust plume flowfields. The models/codes embodying these procedures cater to a broad spectrum of geometric situations via the use of multiple reference plane grid networks in several coordinate systems. Shock capturing techniques are employed to trace the propagation and interaction of multiple shock surfaces while the plume interface, separating the exhaust and external flows, and the plume external shock are discretely analyzed. The computational grid within the reference planes follows the trace of streamlines to facilitate the incorporation of finite-rate chemistry and viscous computational capabilities. Exhaust gas properties consist of combustion products in chemical equilibrium. The computational accuracy of the models/codes is assessed via comparisons with exact solutions, results of other codes and experimental data. Results are presented for the flows in two-dimensional convergent and divergent ducts, expansive and compressive corner flows, flow in a rectangular nozzle and the plume flowfields for exhausts issuing out of single and multiple rectangular nozzles.
NASA Astrophysics Data System (ADS)
Sato, Haruo; Hayakawa, Toshihiko
2014-10-01
Short-period seismograms of earthquakes are complex, especially beneath volcanoes, where the S wave mean free path is short and low-velocity bodies composed of melt or fluid are expected, in addition to random velocity inhomogeneities, as scattering sources. Resonant scattering inherent in a low-velocity body traps and releases waves with a delay time. Focusing on this delay-time phenomenon, we have to consider multiple resonant scattering processes seriously. Since wave phases are complex in such a scattering medium, the radiative transfer theory has often been used to synthesize the variation of mean square (MS) amplitude of waves; however, resonant scattering has not been well adopted in the conventional radiative transfer theory. Here, as a simple mathematical model, we study the sequence of isotropic resonant scattering of a scalar wavelet by low-velocity spheres at low frequencies, where the internal velocity is supposed to be low enough. We first derive the total scattering cross-section per time for each order of scattering as the convolution kernel representing the decaying scattering response. Then, for a random and uniform distribution of such identical resonant isotropic scatterers, we build the propagator of the MS amplitude by using causality, a geometrical spreading factor and the scattering loss. Using those propagators and convolution kernels, we formulate the radiative transfer equation for a spherically impulsive radiation from a point source. The synthesized MS amplitude time trace shows a dip just after the direct arrival, a delayed swelling, and then a decaying tail at large lapse times. The delayed swelling is a prominent effect of resonant scattering. The space distribution of the synthesized MS amplitude shows a swelling near the source region, and it becomes bell-shaped, like a diffusion solution, at large lapse times.
Rorres, Chris; Romano, Maria; Miller, Jennifer A; Mossey, Jana M; Grubesic, Tony H; Zellner, David E; Smith, Gary
2018-06-01
Contact tracing is a crucial component of the control of many infectious diseases, but is an arduous and time consuming process. Procedures that increase the efficiency of contact tracing increase the chance that effective controls can be implemented sooner and thus reduce the magnitude of the epidemic. We illustrate a procedure using Graph Theory in the context of infectious disease epidemics of farmed animals in which the epidemics are driven mainly by the shipment of animals between farms. Specifically, we created a directed graph of the recorded shipments of deer between deer farms in Pennsylvania over a timeframe and asked how the properties of the graph could be exploited to make contact tracing more efficient should Chronic Wasting Disease (a prion disease of deer) be discovered in one of the farms. We show that the presence of a large strongly connected component in the graph has a significant impact on the number of contacts that can arise. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
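A minimal sketch of the graph-theoretic idea, using the networkx library and a hypothetical shipment edge list rather than the authors' Pennsylvania deer-farm data. It shows how a strongly connected component, together with forward and backward reachability from an index farm, bounds the set of farms a trace must cover.

```python
import networkx as nx

# Hypothetical directed shipment records: (source_farm, destination_farm).
shipments = [("A", "B"), ("B", "C"), ("C", "A"),   # a strongly connected cluster
             ("C", "D"), ("D", "E"), ("F", "A")]

G = nx.DiGraph(shipments)

# Farms in the same strongly connected component can all reach each other,
# so finding disease on any one of them implicates the whole component.
largest_scc = max(nx.strongly_connected_components(G), key=len)

# Forward trace: farms reachable from the index farm by shipments.
# Backward trace: farms from which the index farm could have received animals.
index_farm = "A"
forward_contacts = nx.descendants(G, index_farm)
backward_contacts = nx.ancestors(G, index_farm)

print(sorted(largest_scc), sorted(forward_contacts), sorted(backward_contacts))
```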
Entanglement entropy and the colored Jones polynomial
NASA Astrophysics Data System (ADS)
Balasubramanian, Vijay; DeCross, Matthew; Fliss, Jackson; Kar, Arjun; Leigh, Robert G.; Parrikar, Onkar
2018-05-01
We study the multi-party entanglement structure of states in Chern-Simons theory created by performing the path integral on 3-manifolds with linked torus boundaries, called link complements. For gauge group SU(2), the wavefunctions of these states (in a particular basis) are the colored Jones polynomials of the corresponding links. We first review the case of U(1) Chern-Simons theory where these are stabilizer states, a fact we use to re-derive an explicit formula for the entanglement entropy across a general link bipartition. We then present the following results for SU(2) Chern-Simons theory: (i) The entanglement entropy for a bipartition of a link gives a lower bound on the genus of surfaces in the ambient S^3 separating the two sublinks. (ii) All torus links (namely, links which can be drawn on the surface of a torus) have a GHZ-like entanglement structure — i.e., partial traces leave a separable state. By contrast, through explicit computation, we test in many examples that hyperbolic links (namely, links whose complements admit hyperbolic structures) have W-like entanglement — i.e., partial traces leave a non-separable state. (iii) Finally, we consider hyperbolic links in the complexified SL(2,C) Chern-Simons theory, which is closely related to 3d Einstein gravity with a negative cosmological constant. In the limit of small Newton constant, we discuss how the entanglement structure is controlled by the Neumann-Zagier potential on the moduli space of hyperbolic structures on the link complement.
Hunt, Pamela S.; Barnet, Robert C.
2015-01-01
Experience-produced deficits in trace conditioning and context conditioning have been useful tools for examining the role of the hippocampus in learning. It has also been suggested that learning in these tasks is especially vulnerable to neurotoxic effects of alcohol during key developmental periods such as adolescence. In five experiments we systematically examined the presence and source of age-dependent vulnerability to the memory-disrupting effects of acute ethanol in trace conditioning and contextual fear conditioning. In Experiment 1a pre-training ethanol disrupted trace conditioning more strongly in adolescent (postnatal day, PD30-35) than adult rats (PD65-75). In Experiment 1b when pre-training ethanol was accompanied by pre-test ethanol no deficit in trace conditioning was observed in adolescents, suggesting that state-dependent retrieval failure mediated ethanol's disruption of trace conditioning at this age. Experiments 2a and 2b examined the effect of ethanol pretreatment on context conditioning. Here, adult but not adolescent rats were impaired in conditioned freezing to context cues. Experiment 2c explored state-dependency of this effect. Pre-training ethanol continued to disrupt context conditioning in adults even when ethanol was also administered prior to test. Collectively these findings reveal clear age-dependent and task-dependent vulnerabilities in ethanol's disruptive effects on hippocampus-dependent memory. Adolescents were more disrupted by ethanol in trace conditioning than adults, and adults were more disrupted by ethanol in context conditioning than adolescents. We suggest that adolescents may be more susceptible to changes in internal state (state-dependent retrieval failure) than adults and that ethanol disrupted performance in trace and context conditioning through different mechanisms. Relevance of these findings to theories of hippocampus function is discussed. PMID:26192910
The contrasting roles of Planck's constant in classical and quantum theories
NASA Astrophysics Data System (ADS)
Boyer, Timothy H.
2018-04-01
We trace the historical appearance of Planck's constant in physics, and we note that initially the constant did not appear in connection with quanta. Furthermore, we emphasize that Planck's constant can appear in both classical and quantum theories. In both theories, Planck's constant sets the scale of atomic phenomena. However, the roles played in the foundations of the theories are sharply different. In quantum theory, Planck's constant is crucial to the structure of the theory. On the other hand, in classical electrodynamics, Planck's constant is optional, since it appears only as the scale factor for the (homogeneous) source-free contribution to the general solution of Maxwell's equations. Since classical electrodynamics can be solved while taking the homogeneous source-free contribution in the solution as zero or non-zero, there are naturally two different theories of classical electrodynamics, one in which Planck's constant is taken as zero and one where it is taken as non-zero. The textbooks of classical electromagnetism present only the version in which Planck's constant is taken to vanish.
Quantum criticality and duality in the Sachdev-Ye-Kitaev/AdS2 chain
NASA Astrophysics Data System (ADS)
Jian, Shao-Kai; Xian, Zhuo-Yu; Yao, Hong
2018-05-01
We show that the quantum critical point (QCP) between a diffusive metal and ferromagnetic (or antiferromagnetic) phases in the SYK chain has a gravitational description corresponding to the double-trace deformation in an AdS2 chain. Specifically, by studying a double-trace deformation of a Z2 scalar in an AdS2 chain where the Z2 scalar is dual to the order parameter in the SYK chain, we find that the susceptibility and renormalization group equation describing the QCP in the SYK chain can be exactly reproduced in the holographic model. Our results suggest that the infrared geometry in the gravity theory dual to the diffusive metal of the SYK chain is also an AdS2 chain. We further show that the transition in SYK model captures universal information about double-trace deformation in generic black holes with near horizon AdS2 space-time.
Adsorption of humic acids and trace metals in natural waters
NASA Technical Reports Server (NTRS)
Leung, W. H.
1982-01-01
Studies concerning the interactions between suspended hydrous iron oxide and dissolved humic acids and trace metals are reported. As a major component of dissolved organic matter that readily adsorbs at the solid/water interface, humic acids may play a very important role in the organometallic geochemistry of suspended sediments and in determining the fate and distribution of trace metals, pesticides and anions in natural water systems. Most of the solid phases in natural waters contain oxides and hydroxides. The simplest promising theory to describe interactions at the hydrous iron oxide interface is the surface complex formation model. In this model, the adsorption of humic acids on hydrous iron oxide may be interpreted as complex formation of the organic bases (humic acid oxyanions) with surface Fe ions. Adsorption measurements were made in both fresh water and seawater. Attempts have been made to fit our data to the Langmuir adsorption isotherm. Adsorption equilibrium constants were determined.
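As a rough companion to the isotherm-fitting step mentioned above, the sketch below fits hypothetical adsorption data to a Langmuir isotherm with scipy; the data values, units and parameter names are made up for illustration and are not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c_eq, q_max, k_ads):
    """Langmuir isotherm: adsorbed amount as a function of equilibrium concentration."""
    return q_max * k_ads * c_eq / (1.0 + k_ads * c_eq)

# Hypothetical equilibrium concentrations (mg/L) and adsorbed amounts (mg/g).
c_eq = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
q_obs = np.array([0.8, 1.4, 2.2, 3.3, 4.0, 4.4])

# Nonlinear least-squares fit yields the monolayer capacity and the
# adsorption equilibrium constant.
(q_max, k_ads), _ = curve_fit(langmuir, c_eq, q_obs, p0=[5.0, 0.5])
print(f"q_max ~ {q_max:.2f} mg/g, K ~ {k_ads:.2f} L/mg")
```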
NASA Astrophysics Data System (ADS)
Smith, J. P.; Muller, A. C.
2013-05-01
Predicting the fate and distribution of anthropogenic-sourced trace metals in riverine and estuarine systems is challenging due to multiple and varying source functions and dynamic physicochemical conditions. Between July 2011 and November 2012, sediment and water column samples were collected from over 20 sites in the tidal-fresh Potomac River estuary, Washington, DC near the outfall of the Blue Plains Advanced Wastewater Treatment Plant (BPWTP) for measurement of select trace metals. Field observations of water column parameters (conductivity, temperature, pH, turbidity) were also made at each sampling site. Trace metal concentrations were normalized to the "background" composition of the river, determined from control sites, in order to investigate the distribution of BPWTP-sourced metals in local Potomac River receiving waters. Temporal differences in the observed distribution of trace metals were attributed to changes in the relative contribution of metals from different sources (wastewater, riverine, other) coupled with differences in the physicochemical conditions of the water column. Results show that normalizing near-source concentrations to the background composition of the water body, and also to key environmental parameters, can aid in predicting the fate and distribution of anthropogenic-sourced trace metals in dynamic riverine and estuarine systems like the tidal-fresh Potomac River.
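A minimal sketch of the background-normalization step, under the assumption that normalization means taking the ratio of near-source to control-site concentrations; the element names and values are hypothetical, not the study's data.

```python
# Hypothetical trace-metal concentrations (ug/L) near the outfall and at
# upstream control ("background") sites; metals and values are placeholders.
near_outfall = {"Cu": 4.2, "Zn": 18.0, "Pb": 1.1}
background   = {"Cu": 1.5, "Zn": 9.0,  "Pb": 0.9}

# Normalizing to background turns raw concentrations into enrichment ratios;
# ratios well above 1 point to a local (e.g. wastewater) source rather than
# the riverine background.
enrichment = {metal: near_outfall[metal] / background[metal] for metal in near_outfall}
print(enrichment)
```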
Tlale, Lebapotswe; Frasso, Rosemary; Kgosiesele, Onalenna; Selemogo, Mpho; Mothei, Quirk; Habte, Dereje; Steenhoff, Andrew
2016-01-01
Introduction: TB contact tracing rates remain low in high burden settings and reasons for this are not well known. We describe factors that influence health care workers' (HCW) implementation of TB contact tracing (CT) in a high TB burden district of Botswana. Methods: Data were collected using questionnaires and in-depth interviews in 31 of the 52 health facilities in Kweneng East Health District. Responses were summarized using summary statistics and comparisons between HCW groups were done using parametric or non-parametric tests as per normality of the data distribution. Results: One hundred and four HCWs completed questionnaires. Factors that influenced HCW TB contact tracing were their knowledge, attitudes and practices as well as personal factors including decreased motivation and lack of commitment. Patient factors included living further away from the clinic, unknown residential address and high rates of migration and mobility. Administrative factors included staff shortages, lack of transport, poor reporting of TB cases and poor medical infrastructure e.g. suboptimal laboratory services. A national HCW strike and a restructuring of the health system emerged as additional factors during in-depth interviews of TB coordinators. Conclusion: Multiple factors lead to poor TB contact tracing in this district. Interventions to increase TB contact tracing will be informed by these findings. PMID:27800084
Tlale, Lebapotswe; Frasso, Rosemary; Kgosiesele, Onalenna; Selemogo, Mpho; Mothei, Quirk; Habte, Dereje; Steenhoff, Andrew
2016-01-01
TB contact tracing rates remain low in high burden settings and reasons for this are not well known. We describe factors that influence health care workers' (HCW) implementation of TB contact tracing (CT) in a high TB burden district of Botswana. Data were collected using questionnaires and in-depth interviews in 31 of the 52 health facilities in Kweneng East Health District. Responses were summarized using summary statistics and comparisons between HCW groups were done using parametric or non-parametric tests as per normality of the data distribution. One hundred and four HCWs completed questionnaires. Factors that influenced HCW TB contact tracing were their knowledge, attitudes and practices as well as personal factors including decreased motivation and lack of commitment. Patient factors included living further away from the clinic, unknown residential address and high rates of migration and mobility. Administrative factors included staff shortages, lack of transport, poor reporting of TB cases and poor medical infrastructure e.g. suboptimal laboratory services. A national HCW strike and a restructuring of the health system emerged as additional factors during in-depth interviews of TB coordinators. Multiple factors lead to poor TB contact tracing in this district. Interventions to increase TB contact tracing will be informed by these findings.
van Rosmalen, Lenny; van der Horst, Frank C P; van der Veer, René
2016-02-01
John Bowlby is generally regarded as the founder of attachment theory, with the help of Mary Ainsworth. Through her Uganda and Baltimore studies Ainsworth provided empirical evidence for attachment theory, and she contributed the notion of the secure base and exploratory behavior, the Strange Situation Procedure and its classification system, and the notion of maternal sensitivity. On closer scrutiny, many of these contributions appear to be heavily influenced by William Blatz and his security theory. Even though Blatz's influence on Ainsworth has been generally acknowledged, this article, partly based on understudied correspondence from several personal archives, is the first to show which specific parts of attachment theory can be traced back directly to Blatz and his security theory. When Ainsworth started working with Bowlby in the 1950s, around the time he turned to evolutionary theory for an explanation of his findings, she integrated much of Blatzian security theory into Bowlby's theory in the making and used her theoretical and practical experience to enrich attachment theory. Even though Blatz is hardly mentioned nowadays, several of his ideas live on in attachment theory. (c) 2016 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Raman, Barani; Meier, Douglas; Shenoy, Rupa; Benkstein, Kurt; Semancik, Steve
2011-09-01
We describe progress on an array-based microsensor approach employed for detecting trace levels of toxic industrial chemicals (TICs) in air-based backgrounds with varied levels of humidity, and with occasional introduction of aggressive interferents. Our MEMS microhotplate arrays are populated with multiple chemiresistive sensing materials, and all elements are programmed to go through extensive temperature cycling over repetitive cycles with lengths of approximately 20 s. Under such operation, analytically-rich data streams are produced containing the required information for target recognition.
Dual-wavelength quantum cascade laser for trace gas spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jágerská, J.; Tuzson, B.; Mangold, M.
2014-10-20
We demonstrate a sequentially operating dual-wavelength quantum cascade laser with electrically separated laser sections, emitting single-mode at 5.25 and 6.25 μm. Based on a single waveguide ridge, this laser represents a considerable asset to optical sensing and trace gas spectroscopy, as it allows probing multiple gas species with spectrally distant absorption features using conventional optical setups without any beam combining optics. The laser capability was demonstrated in simultaneous NO and NO2 detection, reaching sub-ppb detection limits and selectivity comparable to conventional high-end spectroscopic systems.
Poreh, Amir; Winocur, Gordon; Moscovitch, Morris; Backon, Matti; Goshen, Elinor; Ram, Zvi; Feldman, Zeev
2006-01-01
AD, a 45-year-old man, presented with a severe and global anterograde amnesia following surgery for removal of a colloid cyst. Structural neuroimaging confirmed bilateral lesions to the fornix and a small lesion in the basal forebrain. Testing for remote episodic memory of autobiographical events, and for remote semantic memory of personal and public events, and of famous people, revealed that AD had a severe retrograde amnesia for autobiographical episodes that covered his entire lifetime, and a time-limited retrograde amnesia for semantic memory. Because the fornix and basal forebrain lesions disrupted major afferent and efferent pathways of the hippocampus, it was concluded that the integrity of the hippocampus and its projections are needed to retain and/or recover autobiographical memories no matter how old they are. By contrast, hippocampal contribution to semantic memory is time-limited. These findings were interpreted as consistent with Multiple Trace Theory, which holds that the hippocampal system is essential for recovering contextually rich memories no matter how old they are, but is not needed for recovering semantic memories.
Tuning the cosmological constant, broken scale invariance, unitarity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Förste, Stefan; Manz, Paul; Physikalisches Institut der Universität Bonn,Nussallee 12, 53115 Bonn
2016-06-10
We study gravity coupled to a cosmological constant and a scale but not conformally invariant sector. In Minkowski vacuum, scale invariance is spontaneously broken. We consider small fluctuations around the Minkowski vacuum. At the linearised level we find that the trace of metric perturbations receives a positive or negative mass squared contribution. However, only for the Fierz-Pauli combination is the theory free of ghosts. The mass term for the trace of metric perturbations can be cancelled by explicitly breaking scale invariance. This reintroduces fine-tuning. Models based on four form field strength show similarities with explicit scale symmetry breaking due to quantisation conditions.
Development of the False-Memory Illusion
ERIC Educational Resources Information Center
Brainerd, C. J.; Forrest, T. J.; Karibian, D.; Reyna, V. F.
2006-01-01
The counterintuitive developmental trend in the Deese-Roediger-McDermott (DRM) illusion (that false-memory responses increase with age) was investigated in learning-disabled and nondisabled children from the 6- to 14-year-old age range. Fuzzy-trace theory predicts that because there are qualitative differences in how younger versus older children…
The Hidden Meaning of Inner Speech.
ERIC Educational Resources Information Center
Pomper, Marlene M.
This paper is concerned with the inner speech process, its relationship to thought and behavior, and its theoretical and educational implications. The paper first defines inner speech as a bridge between thought and written or spoken language and traces its development. Second, it investigates competing theories surrounding the subject with an…
The Leadership for Dignity of All: Thom's "Resolved Christianity."
ERIC Educational Resources Information Center
Thom, Douglas J.
This book explores the spiritual dimension of leadership. Its discussion of leadership includes formal leading within education. The book has six chapters. The first two chapters trace the development of leadership thought/theory and practice and discuss phenomena that are present on a continuing basis in society and particularly within…
Selkin, J
1983-01-01
The development of the suicide prevention movement since the 1897 publication of Emile Durkheim's book Suicide is briefly traced. Durkheim's theory of suicide is outlined, and implications for contemporary suicide prevention efforts are identified and discussed. Future trends in the development of suicide prevention centers and in the national organization of suicidology are outlined.
Are Learning Organizations Pragmatic?
ERIC Educational Resources Information Center
Cavaleri, Steven A.
2008-01-01
Purpose: The purpose of this paper is to evaluate the future prospects of the popular concept known as the learning organization; to trace the influence of philosophical pragmatism on the learning organization and to consider its potential impact on the future; and to emphasize how pragmatic theories have shaped the development of Deming's total…
Sequential Ideal-Observer Analysis of Visual Discriminations.
ERIC Educational Resources Information Center
Geisler, Wilson S.
1989-01-01
A new analysis, based on the concept of the ideal observer in signal detection theory, is described. It allows: tracing of the flow of discrimination information through the initial physiological stages of visual processing for arbitrary spatio-chromatic stimuli, and measurement of the information content of said visual stimuli. (TJH)
The State, Television, and Political Power in Brazil.
ERIC Educational Resources Information Center
de Lima, Venicio A.
1988-01-01
Using Antonio Gramsci's theory of politics, traces the rise of the Brazilian media giant, TV Globo, and examines how its owner (Roberto Marinho) has become the key mediator between the Brazilian "ruling bloc" and the rest of the country in the construction and maintenance of cultural and political hegemony. (JK)
Dinetah: Navajo History. Volume II.
ERIC Educational Resources Information Center
Roessel, Robert A., Jr.
Using archaeological data, written chronicles of Spanish explorers and missionaries, and oral narratives and legends, the book traces the history of the Navajo people to their original homeland, Dinetah, located primarily off the present reservation in an area south and east of Farmington, New Mexico. The book discusses various theories on Navajo…
Education and the Political Community.
ERIC Educational Resources Information Center
Peden, Joseph R.
This paper traces the ideology (assertions, theories, and aims) of public schooling from Plato through the first Prussian state school system under Bismarck, through Adam Smith and John Stuart Mill. It contends that public schooling contradicts and works to destroy the United States' libertarian traditions of freedom and self-rule. Though not…
Recent Evolution of the Introductory Curriculum in Computing.
ERIC Educational Resources Information Center
Tucker, Allen B.; Garnick, David K.
1991-01-01
Traces the evolution of introductory computing courses for undergraduates based on the Association for Computing Machinery (ACM) guidelines published in "Curriculum 78." Changes in the curricula are described, including the role of discrete mathematics and theory; and the need for a broader model for designing introductory courses is…
Gestalt Therapy: Its Inheritance from Gestalt Psychology.
ERIC Educational Resources Information Center
Yontef, Gary M.
When adequately elaborated, the basic method of Gestalt therapy can be traced to the phenomenological field theory of Gestalt psychology. Gestalt therapy differs from Gestalt psychology not because of a difference in philosophy or method, but because of different contexts; the clinical context has different demands than those of basic research.…
The Mind's Staircase: Exploring the Conceptual Underpinnings of Children's Thought and Knowledge.
ERIC Educational Resources Information Center
Case, Robbie; And Others
This book, which contains 19 chapters, examines children's learning processes in light of reconceptualizations of Piagetian theory. Part one traces the theoretical question underpinning this examination and includes three chapters: "General and Specific Views of the Mind, its Structure, and its Development"; "A Neo-Piagetian…
Sibling Relationships and Influences in Childhood and Adolescence
ERIC Educational Resources Information Center
McHale, Susan M.; Updegraff, Kimberly A.; Whiteman, Shawn D.
2012-01-01
The authors review the literature on sibling relationships in childhood and adolescence, starting by tracing themes from foundational research and theory and then focusing on empirical research during the past 2 decades. This literature documents siblings' centrality in family life, sources of variation in sibling relationship qualities, and the…
Autistic Children in Public School.
ERIC Educational Resources Information Center
Schopler, Eric; Bristol, Marie
Intended for public school administrators and regular classroom teachers, the report discusses the nature of autistic children and examines aspects of successful educational programs for them. The historical background is traced down from Itard's wild boy through theories of faulty parental conditioning, to current thought on the causes of autism.…
Policy as Performance: Tracing the Rituals of Racism
ERIC Educational Resources Information Center
Schick, Carol
2011-01-01
This article examines the relations between two contrasting education phenomena that occur generally and that have come to light in the geographic location where the author teaches and works. This first phenomenon is the proliferation of interest in issues of diversity and equity through education policies, theories, practices, and initiatives.…
Pedagogical Documentation as a Lens for Examining Equality in Early Childhood Education
ERIC Educational Resources Information Center
Paananen, Maiju; Lipponen, Lasse
2018-01-01
In this paper, we consider pedagogical quality particularly as equal opportunities for participating in decision-making in preschool. Relying on Ferraris' [2013. "Documentality: Why it is necessary to leave traces." New York: Fordham University Press] theory of Documentality, we demonstrate how pedagogical documentation can contribute to…
ERIC Educational Resources Information Center
Garrison, Kevin
2014-01-01
Technical communication's attempt to prioritize theories of scholarship and pedagogy has resulted in several authors contributing a three-dimensional framework to approach technology: the instrumental perspective, the critical humanist perspective, and the user-centered perspective [1-3]. This article traces connections between this framework for…
Toward a Dialogic Theory of Public Relations.
ERIC Educational Resources Information Center
Kent, Michael L.; Taylor, Maureen
2002-01-01
Explains the concept of dialogue in order to reduce the ambiguity that surrounds the use of the term. Seeks to make the concept of dialogue more accessible for scholars and practitioners interested in relationship building. Traces the roots of dialogue, identifies several over-arching tenets, and provides three ways that organizations can…
A Brief Look at the History of Probability and Statistics.
ERIC Educational Resources Information Center
Lightner, James E.
1991-01-01
The historical development of probability theory is traced from its early origins in games of chance through its mathematical foundations in the work of Pascal and Fermat. The roots of statistics are also presented beginning with early actuarial developments through the work of Laplace, Gauss, and others. (MDH)
ERIC Educational Resources Information Center
Tamura, Eileen H.
2011-01-01
While narrative history has been the prevailing mode in historical scholarship, its preeminence has not gone unquestioned. In the 1980s, the role of narrative in historical writing was "the subject of extraordinarily intense debate." The historical backdrop of this debate can be traced to the preceding two decades, when four groups of thinkers…
Does Tracing Worked Examples Enhance Geometry Learning?
ERIC Educational Resources Information Center
Hu, Fang-Tzu; Ginns, Paul; Bobis, Janette
2014-01-01
Cognitive load theory seeks to generate novel instructional designs through a focus on human cognitive architecture including a limited working memory; however, the potential for enhancing learning through non-visual or non-auditory working memory channels is yet to be evaluated. This exploratory experiment tested whether explicit instructions to…
Neo-Keynesian Economics Today.
ERIC Educational Resources Information Center
Shackleton, J. R.
1987-01-01
Traces the development of post-Keynesian economic theories and examines the arguments which surround current neo-Keynesian thought. Argues for an eclecticism which recognizes that both supply-side and demand-side factors have a role to play in determining levels of output and employment. Useful charts and diagrams are included. (Author/DH)
Simple Derivation of the Lindblad Equation
ERIC Educational Resources Information Center
Pearle, Philip
2012-01-01
The Lindblad equation is an evolution equation for the density matrix in quantum theory. It is the general linear, Markovian, form which ensures that the density matrix is Hermitian, trace 1, positive and completely positive. Some elementary examples of the Lindblad equation are given. The derivation of the Lindblad equation presented here is…
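For reference, the standard (Gorini-Kossakowski-Sudarshan-Lindblad) form of the equation described in this abstract is reproduced below; this is the textbook statement, not the particular derivation the paper presents.

```latex
% Standard GKSL/Lindblad master equation for the density matrix \rho, with
% Hamiltonian H and jump operators L_k (taking \hbar = 1); this is the general
% linear, Markovian evolution that keeps \rho Hermitian, trace one, and
% completely positive.
\frac{d\rho}{dt} = -\,i\,[H,\rho]
  + \sum_{k} \left( L_k\, \rho\, L_k^{\dagger}
  - \tfrac{1}{2}\left\{ L_k^{\dagger} L_k,\ \rho \right\} \right)
```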
Competing and Contested Discourses on Citizenship and Civic Praxis
ERIC Educational Resources Information Center
Koyama, Jill
2017-01-01
In this paper, I utilize complementary features of critical discourse analysis (CDA) and Actor-Network Theory (ANT) to trace and investigate issues of power, materiality, and reproduction embedded within notions of citizenship and civic engagement. I interrogate the often narrow and conservative political and public discourses in Arizona that…
Contexts, Cultures, Learning: Contemporary Understandings
ERIC Educational Resources Information Center
Peim, Nick; Hodkinson, Phil
2007-01-01
This paper addresses the general significance of the collection. It briefly and broadly traces the relation of the project's theoretical concerns to its purposes and its positioned nature. These concerns and this positioning are connected with tendencies in contemporary thought in social science theory and in research philosophy. The project's…
Multiple Intelligences: Its Tensions and Possibilities
ERIC Educational Resources Information Center
Eisner, Elliot W.
2004-01-01
This article explores the tensions between Howard Gardner's theory of multiple intelligences and current educational policies emphasizing standardized and predictable outcomes. The article situates Gardner's theory within the historical interests among psychometricians in identifying those core processes that constitute human intelligence.…
Montessori and Gardner's Theory of Multiple Intelligences.
ERIC Educational Resources Information Center
Vardin, Patricia A.
2003-01-01
Reviews Gardner's theory of multiple intelligences. Shows how Maria Montessori and Howard Gardner drew similar conclusions regarding human capacity and potential. Examines how Gardner's eight intelligences and underlying core operations lie at the heart of the Montessori exercises and activities. (KB)
Fine-scale detection of pollutants by a benthic marine jellyfish.
Epstein, Hannah E; Templeman, Michelle A; Kingsford, Michael J
2016-06-15
Local sources of pollution can vary immensely on small geographic scales and short time frames due to differences in runoff and adjacent land use. This study examined the rate of uptake and retention of trace metals in Cassiopea maremetens, a benthic marine jellyfish, over a short time frame and in the presence of multiple pollutants. This study also validated the ability of C. maremetens to take up metals in the field. Experimental manipulation demonstrated that metal accumulation in jellyfish tissue began within 24 h of exposure to treated water, with a trend toward higher accumulation in the presence of multiple pollutants. C. maremetens was found to take up trace metals in the field and provide unique signatures among locations. This fine-scale detection and rapid accumulation of metals in jellyfish tissue can have major implications for both biomonitoring and the trophic transfer of pollutants through local ecosystems. Copyright © 2016 Elsevier Ltd. All rights reserved.
Polarized reflectance and transmittance properties of windblown sea surfaces.
Mobley, Curtis D
2015-05-20
Generation of random sea surfaces using wave variance spectra and Fourier transforms is formulated in a way that guarantees conservation of wave energy and fully resolves wave height and slope variances. Monte Carlo polarized ray tracing, which accounts for multiple scattering between light rays and wave facets, is used to compute effective Mueller matrices for reflection and transmission of air- or water-incident polarized radiance. Irradiance reflectances computed using a Rayleigh sky radiance distribution, sea surfaces generated with Cox-Munk statistics, and unpolarized ray tracing differ by 10%-18% compared with values computed using elevation- and slope-resolving surfaces and polarized ray tracing. Radiance reflectance factors, as used to estimate water-leaving radiance from measured upwelling and sky radiances, are shown to depend on sky polarization, and improved values are given.
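The following is a heavily simplified, scalar stand-in for the facet ray trace described above: it samples surface slopes from the Cox-Munk wind-speed relation and averages unpolarized Fresnel reflectance over facets for a single incidence direction, ignoring the polarization, multiple scattering, shadowing and slope/elevation resolution that the paper's method accounts for. The wind-speed relation, refractive index and sample sizes are standard but stated here as assumptions.

```python
import numpy as np

def fresnel_unpolarized(cos_i, n=1.34):
    """Unpolarized Fresnel reflectance of an air-to-water interface."""
    cos_i = np.clip(cos_i, 1e-6, 1.0)
    sin_t = np.sqrt(np.clip(1.0 - cos_i**2, 0.0, 1.0)) / n   # Snell's law
    cos_t = np.sqrt(1.0 - sin_t**2)
    rs = (cos_i - n * cos_t) / (cos_i + n * cos_t)
    rp = (n * cos_i - cos_t) / (n * cos_i + cos_t)
    return 0.5 * (rs**2 + rp**2)

def mean_glint_reflectance(theta_sun_deg, wind_speed, n_samples=100_000, seed=0):
    """Crude single-scattering, unshadowed estimate over Cox-Munk facet slopes."""
    rng = np.random.default_rng(seed)
    mss = 0.003 + 0.00512 * wind_speed            # Cox-Munk total mean-square slope
    zx, zy = rng.normal(0.0, np.sqrt(mss / 2.0), (2, n_samples))  # facet slopes
    normals = np.stack([-zx, -zy, np.ones(n_samples)], axis=1)
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    theta = np.radians(theta_sun_deg)
    d = np.array([np.sin(theta), 0.0, -np.cos(theta)])            # downward sun ray
    cos_i = np.clip(-(normals @ d), 0.0, 1.0)                     # local incidence angle
    return fresnel_unpolarized(cos_i).mean()

print(mean_glint_reflectance(theta_sun_deg=40.0, wind_speed=5.0))
```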
Efficient calculation of luminance variation of a luminaire that uses LED light sources
NASA Astrophysics Data System (ADS)
Goldstein, Peter
2007-09-01
Many luminaires have an array of LEDs that illuminate a lenslet-array diffuser in order to create the appearance of a single, extended source with a smooth luminance distribution. Designing such a system is challenging because luminance calculations for a lenslet array generally involve tracing millions of rays per LED, which is computationally intensive and time-consuming. This paper presents a technique for calculating an on-axis luminance distribution by tracing only one ray per LED per lenslet. A multiple-LED system is simulated with this method, and with Monte Carlo ray-tracing software for comparison. Accuracy improves, and computation time decreases by at least five orders of magnitude with this technique, which has applications in LED-based signage, displays, and general illumination.
Working Memory, Age, Crew Downsizing, System Design and Training
2000-08-01
(Radvansky and Zacks, 1997). As authors have noted when attempting to make sense of a... models of cognitive function and workload (cf. Baddeley and Gathercole, 1993). The ability to... Some models of human information processing (Pashler, 1998) place... perceived demand. Accurate "Situation Models" (Johnson-... bodies of information to be processed or multiple results... major bottleneck in human performance... multiple traces from different headings and the...
James A. Stevens; Claire A. Montgomery
2002-01-01
In this report, multiresource research is described as it has coevolved with forest policy objectives: from managing for single or dominant uses, to managing for compatible multiple forest uses, to sustaining ecosystem health on the forest. The evolution of analytical methods for multiresource research is traced from impact analysis to multiresource modeling, and...
Multifunctional Metallosupramolecular Materials
2011-02-28
supramolecular polymers based on 16 and Zn(NTf2)2 using small-angle X-ray scattering (SAXS) and transmission electron microscopy (TEM), carried out by... The SAXS data (Figure 13a) show multiple strong Bragg diffraction maxima at integer multiples of the scattering vector of the primary diffraction... a minor amount of residual double bonds in the poly(ethylene-co-butylene) core. The metallopolymers 16·[Zn(NTf2)2]x exhibit similar traces, but do
Simultaneous nano-tracking of multiple motor proteins via spectral discrimination of quantum dots.
Kakizuka, Taishi; Ikezaki, Keigo; Kaneshiro, Junichi; Fujita, Hideaki; Watanabe, Tomonobu M; Ichimura, Taro
2016-07-01
Simultaneous nanometric tracking of multiple motor proteins was achieved by combining multicolor fluorescent labeling of target proteins and imaging spectroscopy, revealing dynamic behaviors of multiple motor proteins at the sub-diffraction-limit scale. Using quantum dot probes of distinct colors, we experimentally verified the localization precision to be a few nanometers at temporal resolution of 30 ms or faster. One-dimensional processive movement of two heads of a single myosin molecule and multiple myosin molecules was successfully traced. Furthermore, the system was modified for two-dimensional measurement and applied to tracking of multiple myosin molecules. Our approach is useful for investigating cooperative movement of proteins in supramolecular nanomachinery.
Simultaneous nano-tracking of multiple motor proteins via spectral discrimination of quantum dots
Kakizuka, Taishi; Ikezaki, Keigo; Kaneshiro, Junichi; Fujita, Hideaki; Watanabe, Tomonobu M.; Ichimura, Taro
2016-01-01
Simultaneous nanometric tracking of multiple motor proteins was achieved by combining multicolor fluorescent labeling of target proteins and imaging spectroscopy, revealing dynamic behaviors of multiple motor proteins at the sub-diffraction-limit scale. Using quantum dot probes of distinct colors, we experimentally verified the localization precision to be a few nanometers at temporal resolution of 30 ms or faster. One-dimensional processive movement of two heads of a single myosin molecule and multiple myosin molecules was successfully traced. Furthermore, the system was modified for two-dimensional measurement and applied to tracking of multiple myosin molecules. Our approach is useful for investigating cooperative movement of proteins in supramolecular nanomachinery. PMID:27446684
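A minimal sketch of the nanometric-localization idea, assuming the position of a quantum-dot image is taken as the centre of a fitted Gaussian profile; the profile, noise level and pixel size below are made up, and the actual method combines such localization with spectral discrimination of the dots.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, x0, sigma, offset):
    """1D Gaussian profile used as a stand-in for a point-spread function."""
    return amp * np.exp(-(x - x0) ** 2 / (2.0 * sigma ** 2)) + offset

# Hypothetical 1D intensity profile across a quantum-dot image (in pixels).
pixels = np.arange(0, 15)
rng = np.random.default_rng(0)
counts = gaussian(pixels, amp=200.0, x0=7.3, sigma=1.8, offset=10.0) + rng.normal(0, 3, pixels.size)

# Sub-pixel (and, given a calibrated pixel size, nanometre-scale) position comes
# from the fitted centre rather than the brightest pixel.
popt, _ = curve_fit(gaussian, pixels, counts, p0=[150.0, 7.0, 2.0, 0.0])
centre_px = popt[1]
centre_nm = centre_px * 65.0   # assuming ~65 nm per pixel, purely illustrative
print(centre_px, centre_nm)
```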
NASA Astrophysics Data System (ADS)
Shi, Shengxian; Ding, Junfei; New, T. H.; Soria, Julio
2017-07-01
This paper presents a dense ray tracing reconstruction technique for a single light-field camera-based particle image velocimetry. The new approach pre-determines the location of a particle through inverse dense ray tracing and reconstructs the voxel value using multiplicative algebraic reconstruction technique (MART). Simulation studies were undertaken to identify the effects of iteration number, relaxation factor, particle density, voxel-pixel ratio and the effect of the velocity gradient on the performance of the proposed dense ray tracing-based MART method (DRT-MART). The results demonstrate that the DRT-MART method achieves higher reconstruction resolution at significantly better computational efficiency than the MART method (4-50 times faster). Both DRT-MART and MART approaches were applied to measure the velocity field of a low speed jet flow which revealed that for the same computational cost, the DRT-MART method accurately resolves the jet velocity field with improved precision, especially for the velocity component along the depth direction.
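For context, here is a minimal sketch of the plain MART update that both methods share; it is not the authors' DRT-MART, which additionally uses inverse dense ray tracing to pre-select the voxels considered. The weight matrix, intensities and relaxation factor below are made up.

```python
import numpy as np

def mart(W, I, n_iter=20, mu=1.0):
    """Minimal multiplicative algebraic reconstruction technique (MART) sketch.

    W : (n_rays, n_voxels) weights (how much each voxel contributes to each ray)
    I : (n_rays,) measured pixel intensities
    Returns a non-negative voxel intensity vector.
    """
    v = np.ones(W.shape[1])               # positive initial guess
    for _ in range(n_iter):
        for i in range(W.shape[0]):       # one multiplicative update per ray
            proj = W[i] @ v
            if proj > 0 and I[i] > 0:
                v *= (I[i] / proj) ** (mu * W[i])
            elif I[i] == 0:
                v[W[i] > 0] = 0.0         # rays seeing nothing zero their voxels
    return v

# Tiny illustrative system: 3 rays, 4 voxels (weights and intensities are made up).
W = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])
I = np.array([2.0, 1.5, 1.0])
print(mart(W, I))
```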
Global Atmospheric Chemistry/Transport Modeling and Data-Analysis
NASA Technical Reports Server (NTRS)
Prinn, Ronald G.
1999-01-01
This grant supported a global atmospheric chemistry/transport modeling and data-analysis project devoted to: (a) development, testing, and refining of inverse methods for determining regional and global transient source and sink strengths for trace gases; (b) utilization of these inverse methods which use either the Model for Atmospheric Chemistry and Transport (MATCH) which is based on analyzed observed winds or back-trajectories calculated from these same winds for determining regional and global source and sink strengths for long-lived trace gases important in ozone depletion and the greenhouse effect; (c) determination of global (and perhaps regional) average hydroxyl radical concentrations using inverse methods with multiple "titrating" gases; and (d) computation of the lifetimes and spatially resolved destruction rates of trace gases using 3D models. Important ultimate goals included determination of regional source strengths of important biogenic/anthropogenic trace gases and also of halocarbons restricted by the Montreal Protocol and its follow-on agreements, and hydrohalocarbons now used as alternatives to the above restricted halocarbons.
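A toy sketch of the inverse-method idea, assuming the relation between regional source strengths and observed concentration anomalies has already been linearized by a transport model; the matrix, noise level and regularization below are made up for illustration and are not the project's MATCH-based inversion.

```python
import numpy as np

# Hypothetical linear relation between regional source strengths x and observed
# mixing-ratio anomalies y at monitoring stations: y = K x + noise. In practice
# K would come from a transport model or back-trajectory calculations.
rng = np.random.default_rng(1)
K = rng.uniform(0.0, 1.0, size=(12, 4))      # 12 observations, 4 source regions
x_true = np.array([3.0, 1.0, 0.5, 2.0])      # made-up "true" emissions
y = K @ x_true + rng.normal(0.0, 0.05, 12)

# Tikhonov-regularized least-squares estimate of the source strengths.
lam = 0.1
x_hat = np.linalg.solve(K.T @ K + lam * np.eye(4), K.T @ y)
print(x_hat)
```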
Lead theft--a study of the "uniqueness" of lead from church roofs.
Bond, John W; Hainsworth, Sarah V; Lau, Tien L
2013-07-01
In the United Kingdom, theft of lead is common, particularly from churches and other public buildings with lead roofs. To assess the potential to distinguish lead from different sources, 41 samples of lead from 24 church roofs in Northamptonshire, U.K., have been analyzed for relative abundance of trace elements and isotopes of lead using X-ray fluorescence (XRF) and inductively coupled plasma mass spectrometry, respectively. XRF revealed the overall presence of 12 trace elements with the four most abundant, calcium, phosphorus, silicon, and sulfur, showing a large weight percentage standard error of the mean of all samples suggesting variation in the weight percentage of these elements between different church roofs. Multiple samples from the same roofs, but different lead sheets, showed much lower weight percentage standard errors of the mean suggesting similar trace element concentrations. Lead isotope ratios were similar for all samples. Factors likely to affect the occurrence of these trace elements are discussed. © 2013 American Academy of Forensic Sciences.
ERIC Educational Resources Information Center
Kentab, Mohammad Yousef
2016-01-01
In this study, the researcher attempted to shed light on Saudi intermediate school EFL teachers' views of the multiple intelligences theory as an inclusive pedagogy. The purpose of this study was to investigate the impact of multiple intelligences on Saudi intermediate students' learning of EFL. The study also tried to illustrate the main…
Dark matter as a ghost free conformal extension of Einstein theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barvinsky, A.O., E-mail: barvin@td.lpi.ru
We discuss ghost free models of the recently suggested mimetic dark matter theory. This theory is shown to be a conformal extension of Einstein general relativity. Dark matter originates from gauging out its local Weyl invariance as an extra degree of freedom which describes a potential flow of the pressureless perfect fluid. For a positive energy density of this fluid the theory is free of ghost instabilities, which gives strong preference to stable configurations with a positive scalar curvature and trace of the matter stress tensor. Instabilities caused by caustics of the geodesic flow, inherent in this model, serve as a motivation for an alternative conformal extension of Einstein theory, based on the generalized Proca vector field. A potential part of this field modifies the inflationary stage in cosmology, whereas its rotational part at the post inflationary epoch might simulate rotating flows of dark matter.
NASA Astrophysics Data System (ADS)
Lam, Wai Sze Tiffany
Optical components made of anisotropic materials, such as crystal polarizers and crystal waveplates, are widely used in many complex optical systems, such as display systems, microlithography, biomedical imaging and many other optical systems, and induce more complex aberrations than optical components made of isotropic materials. The goal of this dissertation is to accurately simulate the performance of optical systems with anisotropic materials using polarization ray trace. This work extends the polarization ray tracing calculus to incorporate ray tracing through anisotropic materials, including uniaxial, biaxial and optically active materials. The 3D polarization ray tracing calculus is an invaluable tool for analyzing polarization properties of an optical system. The 3x3 polarization ray tracing P matrix developed for anisotropic ray trace assists in tracking the 3D polarization transformations along a ray path with a series of surfaces in an optical system. To better represent the anisotropic light-matter interactions, the definition of the P matrix is generalized to incorporate not only the polarization change at a refraction/reflection interface, but also the induced optical phase accumulation as light propagates through the anisotropic medium. This enables realistic modeling of crystalline polarization elements, such as crystal waveplates and crystal polarizers. The wavefront and polarization aberrations of these anisotropic components are more complex than those of isotropic optical components and can be evaluated from the resultant P matrix for each eigen-wavefront as well as for the overall image. One incident ray refracting or reflecting into an anisotropic medium produces two eigenpolarizations or eigenmodes propagating in different directions. The associated ray parameters of these modes necessary for the anisotropic ray trace are described in Chapter 2. The algorithms to calculate the P matrix from these ray parameters are described in Chapter 3 for anisotropic ray tracing. Chapter 4 presents the data reduction of the P matrix of a crystal waveplate. The diattenuation is embedded in the singular values of P. The retardance is divided into two parts: (A) The physical retardance induced by OPLs and surface interactions, and (B) the geometrical transformation induced by geometry of a ray path, which is calculated by the geometrical transform Q matrix. The Q matrix of an anisotropic intercept is derived from the generalization of s- and p-bases at the anisotropic intercept; the p basis is not confined to the plane of incidence due to the anisotropic refraction or reflection. Chapter 5 shows how the multiple P matrices associated with the eigenmodes resulting from propagation through multiple anisotropic surfaces can be combined into one P matrix when the multiple modes interfere in their overlapping regions. The resultant P matrix contains diattenuation induced at each surface interaction as well as the retardance due to ray propagation and total internal reflections. The polarization aberrations of crystal waveplates and crystal polarizers are studied in Chapter 6 and Chapter 7. A wavefront simulated by a grid of rays is traced through the anisotropic system and the resultant grid of rays is analyzed. The analysis is complicated by the ray doubling effects and the partially overlapping eigen-wavefronts propagating in various directions. The wavefront and polarization aberrations of each eigenmode can be evaluated from the electric field distributions.
The overall polarization at the plane of interest or the image quality at the image plane are affected by each of these eigen-wavefronts. Isotropic materials become anisotropic due to stress, strain, or applied electric or magnetic fields. In Chapter 8, the P matrix for anisotropic materials is extended to ray tracing in stress-birefringent materials, which are treated as spatially varying anisotropic materials. Such simulations can predict the spatial retardance variation throughout the stressed optical component and its effects on the point spread function and modulation transfer function for different incident polarizations. The anisotropic extension of the P matrix also applies to other anisotropic optical components, such as anisotropic diffractive optical elements and anisotropic thin films. It systematically keeps track of polarization transformation in 3D global Cartesian coordinates of a ray propagating through a series of anisotropic and isotropic optical components with arbitrary orientations. The polarization ray tracing calculus with this generalized P matrix provides a powerful tool for optical ray trace and allows comprehensive analysis of complex optical systems. (Abstract shortened by UMI.).
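As an illustration of extracting diattenuation from a P matrix via its singular values (the relation the abstract mentions), the sketch below assumes a ray propagating along z so that the polarization content sits in the transverse 2x2 block of the 3x3 matrix; the example matrix is made up and this is not the dissertation's code.

```python
import numpy as np

def diattenuation_2x2(J):
    """Diattenuation from the transverse 2x2 block of a P matrix.

    Assumes the ray propagates along z, so the polarization content of the
    3x3 P matrix sits in its upper-left 2x2 block; the maximum and minimum
    intensity transmittances are the squared singular values of that block.
    """
    s = np.linalg.svd(J, compute_uv=False)       # singular values, descending
    t_max, t_min = s[0] ** 2, s[-1] ** 2
    return (t_max - t_min) / (t_max + t_min)

# Illustrative P matrix for a ray along z (numbers made up): a weakly
# diattenuating element transmitting x- and y-polarized amplitudes unequally.
P = np.diag([0.95, 0.80, 1.00])
print(diattenuation_2x2(P[:2, :2]))   # ~0.17
```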
Reproductive social behavior: cooperative games to replace sexual selection.
Roughgarden, Joan; Oishi, Meeko; Akçay, Erol
2006-02-17
Theories about sexual selection can be traced back to Darwin in 1871. He proposed that males fertilize as many females as possible with inexpensive sperm, whereas females, with a limited supply of large eggs, select the genetically highest quality males to endow their offspring with superior capabilities. Since its proposal, problems with this narrative have continued to accumulate, and it is our view that sexual selection theory needs to be replaced. We suggest an approach that relies on the exchange of direct ecological benefits among cooperating animals without reference to genetic benefits. This approach can be expressed mathematically in a branch of game theory that pertains to bargaining and side payments.
Generalized second law of thermodynamics in f(R,T) theory of gravity
NASA Astrophysics Data System (ADS)
Momeni, D.; Moraes, P. H. R. S.; Myrzakulov, R.
2016-07-01
We present a study of the generalized second law of thermodynamics in the scope of the f(R,T) theory of gravity, with R and T representing the Ricci scalar and the trace of the energy-momentum tensor, respectively. From the energy-momentum tensor equation for the f(R,T)=R+f(T) case, we calculate the form of the geometric entropy in such a theory. Then, the generalized second law of thermodynamics is quantified and some relations for its obedience in f(R,T) gravity are presented. Those relations depend on cosmological quantities, such as the Hubble and deceleration parameters, and also on the form of f(T).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maldacena, Juan; Simmons-Duffin, David; Zhiboedov, Alexander
2017-01-03
Here, we consider Lorentzian correlators of local operators. In perturbation theory, singularities occur when we can draw a position-space Landau diagram with null lines. In theories with gravity duals, we can also draw Landau diagrams in the bulk. We also argue that certain singularities can arise only from bulk diagrams, not from boundary diagrams. As has been previously observed, these singularities are a clear diagnostic of bulk locality. We analyze some properties of these perturbative singularities and discuss their relation to the OPE and the dimensions of double-trace operators. In the exact nonperturbative theory, we expect no singularity at these locations. Finally, we prove this statement in 1+1 dimensions by CFT methods.
Nebular chemistry and theories of lunar origin
NASA Technical Reports Server (NTRS)
Larimer, John W.
1986-01-01
The cosmic history of planetary matter is traced from nucleosynthesis through accretion in an attempt to understand the origin of the moon. It is noted that nebular processes must be considered in any theory of lunar origin and that planetary differentiation and volcanism determine the final character of lunar rocks. The moon's unique blend of nebular components suggests that the earth and moon accreted from the same mix of components as the proto-moon orbited the proto-earth, with the earth winning, and the moon progressively losing, its solar complement of the components.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spence, R.D.; Godbee, H.W.; Tallent, O.K.
1989-01-01
The analysis of leaching data using analytical solutions based on mass transport theory and empiricism is presented. The waste forms leached to generate the data used in this analysis were prepared with a simulated radioactive waste slurry containing traces of potassium, manganese, carbonate, phosphate, and sulfate ions, solidified with several blends of cementitious materials. Diffusion coefficients were estimated from the results of ANS 16.1 tests. Data on fraction leached versus time are presented and discussed.
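As a hedged illustration of how a diffusion coefficient can be recovered from such leach data, the sketch below uses the cumulative-fraction-leached form of the semi-infinite-medium model that underlies ANS 16.1 style analyses; the data values, volume-to-surface ratio, and the simple regression are invented and stand in for the report's incremental-interval procedure.

```python
import numpy as np

def effective_diffusivity(cfl, t_seconds, v_over_s_cm):
    """Effective diffusion coefficient (cm^2/s) from cumulative fraction
    leached (CFL), assuming the semi-infinite-medium model:

        CFL = 2 * (S/V) * sqrt(De * t / pi)

    A linear fit of CFL against sqrt(t) gives the slope from which De is
    recovered.  Illustrative sketch only; the data below are invented.
    """
    slope = np.polyfit(np.sqrt(t_seconds), cfl, 1)[0]  # CFL per sqrt(second)
    return np.pi * (slope * v_over_s_cm / 2.0) ** 2

t = np.array([1, 3, 7, 14, 28]) * 86400.0              # leach intervals in seconds
cfl = np.array([0.0020, 0.0034, 0.0052, 0.0073, 0.0103])
print(effective_diffusivity(cfl, t, v_over_s_cm=1.0))  # ~3.5e-11 cm^2/s
```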
Hawking radiation from rotating black holes and gravitational anomalies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murata, Keiju; Soda, Jiro
2006-08-15
We study Hawking radiation from rotating black holes from the point of view of gravitational anomalies. First, we show that the scalar field theory near the Kerr black hole horizon can be reduced to a 2-dimensional effective theory. Then, following Robinson and Wilczek, we derive the Hawking flux by requiring the cancellation of gravitational anomalies. We also apply this method to Hawking radiation from higher dimensional Myers-Perry black holes. In the appendix, we present the trace anomaly derivation of Hawking radiation to argue for the validity of the boundary condition at the horizon.
Time-resolved non-sequential ray-tracing modelling of non-line-of-sight picosecond pulse LIDAR
NASA Astrophysics Data System (ADS)
Sroka, Adam; Chan, Susan; Warburton, Ryan; Gariepy, Genevieve; Henderson, Robert; Leach, Jonathan; Faccio, Daniele; Lee, Stephen T.
2016-05-01
The ability to detect motion and to track a moving object that is hidden around a corner or behind a wall provides a crucial advantage when physically going around the obstacle is impossible or dangerous. One recently demonstrated approach to achieving this goal makes use of non-line-of-sight picosecond pulse laser ranging. This approach has recently become interesting due to the availability of single-photon avalanche diode (SPAD) receivers with picosecond time resolution. We present a time-resolved non-sequential ray-tracing model and its application to indirect line-of-sight detection of moving targets. The model makes use of the Zemax optical design programme's capabilities in stray light analysis where it traces large numbers of rays through multiple random scattering events in a 3D non-sequential environment. Our model then reconstructs the generated multi-segment ray paths and adds temporal analysis. Validation of this model against experimental results is shown. We then exercise the model to explore the limits placed on system design by available laser sources and detectors. In particular we detail the requirements on the laser's pulse energy, duration and repetition rate, and on the receiver's temporal response and sensitivity. These are discussed in terms of the resulting implications for achievable range, resolution and measurement time while retaining eye-safety with this technique. Finally, the model is used to examine potential extensions to the experimental system that may allow for increased localisation of the position of the detected moving object, such as the inclusion of multiple detectors and/or multiple emitters.
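As a minimal sketch of the temporal analysis layered on top of the non-sequential ray trace, one can sum the segment lengths of a reconstructed multi-segment ray path and convert them to a photon arrival time. The geometry below is invented for illustration and ignores scattering losses, detector jitter, and the laser pulse shape.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def arrival_time(path_points):
    """Total time of flight along a multi-segment ray path given as 3D
    vertices (laser -> relay wall -> hidden object -> relay wall -> detector).
    """
    segments = np.diff(np.asarray(path_points, dtype=float), axis=0)
    return np.linalg.norm(segments, axis=1).sum() / C

path = [(0.0, 0.0, 0.0),   # picosecond laser
        (3.0, 0.0, 0.0),   # spot on the relay wall
        (3.0, 2.0, 1.0),   # hidden moving target around the corner
        (3.0, 0.0, 0.0),   # back to the relay wall
        (0.1, 0.0, 0.0)]   # SPAD receiver
print(f"photon arrival after {arrival_time(path) * 1e9:.2f} ns")
```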
ERIC Educational Resources Information Center
Banton, Cynthia L.
2014-01-01
The purpose of this qualitative grounded theory study was to explore and examine the factors that led to the creation of multiple record entries, and present a theory on the impact the problem has on the business performance of health care organizations. A sample of 59 health care professionals across the United States participated in an online…
NASA Astrophysics Data System (ADS)
Bernhardt, E. S.; Helton, A. M.; Morse, J. L.; Poole, G. C.
2013-12-01
Wetlands are the dominant natural source of methane to the global atmosphere and can be important sites of either N2O emission or consumption. Changes in the spatial extent or inundation frequency and duration may lead to substantial shifts in the contribution of wetland ecosystems to global CH4 and N2O emissions. Trace gases are produced at the scale of individual microbes, each of which respond dynamically to the local availability of electron donors and acceptors. Within landscape patches, substrate supply and redox conditions are strongly controlled by variation in water table elevation and vertical hydrologic exchange. At the landscape scale, lateral exchange between patches and the extent and duration of inundation. Accurate estimates of trace gas emissions from wetlands are hard to estimate given the dynamic patterns of redox potential within the soil column and across the landscape that redistribute electron donors and acceptors both vertically and laterally. In five years of trace gas flux measurement and modeling at TOWER, a 440 ha restored wetland in coastal NC, we have developed both simulation and statistical models to estimate landscape level trace gas fluxes. Yet, because trace gas emissions are highly variable in both time and space, our qualitative and quantitative attempts at upscaling trace gas emissions typically generate estimates with extremely high uncertainty. In this talk we will explore the challenges inherent to the estimation of landscape scale trace gas fluxes at the scale of our individual ecosystem as well as the difficulties in extrapolating across multiple ecosystem studies.
Improving Predictions of Multiple Binary Models in ILP
2014-01-01
Despite the success of ILP systems in learning first-order rules from a small number of examples and complexly structured data in various domains, they struggle to deal with multiclass problems. In most cases they boil a multiclass problem down into multiple black-box binary problems following the one-versus-one or one-versus-rest binarisation techniques and learn a theory for each one. When evaluating the learned theories of multiple-class problems in the one-versus-rest paradigm in particular, there is a bias caused by the default rule toward the negative classes, leading to unrealistically high performance, in addition to a lack of prediction integrity between the theories. Here we discuss the problem of using the one-versus-rest binarisation technique when it comes to evaluating multiclass data and propose several methods to remedy this problem. We also illustrate the methods and highlight their link to binary trees and Formal Concept Analysis (FCA). Our methods allow learning of a simple, consistent, and reliable multiclass theory by combining the rules of the multiple one-versus-rest theories into one rule-list or rule-set theory. Empirical evaluation over a number of data sets shows that our proposed methods produce coherent and accurate rule models from the rules learned by the ILP system Aleph. PMID:24696657
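A minimal sketch of the rule-combination idea follows, assuming each one-versus-rest theory is wrapped as a predicate that fires when any of its rules covers an example. This is a generic illustration, not the Aleph pipeline or the paper's exact methods.

```python
def decision_list_predict(example, binary_models, class_order, default_class):
    """Combine one-versus-rest binary rule models into a single decision list.

    Classes are tried in a fixed order and the first model whose learned
    rules cover the example determines the prediction; the default class is
    used only when no rule fires, removing the per-theory bias of separate
    default rules.
    """
    for cls in class_order:
        if binary_models[cls](example):
            return cls
    return default_class

# Hypothetical rule wrappers standing in for learned first-order theories
models = {
    "mammal": lambda ex: ex.get("has_fur", False),
    "bird":   lambda ex: ex.get("has_feathers", False),
}
print(decision_list_predict({"has_feathers": True}, models,
                            ["mammal", "bird"], default_class="unknown"))
```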
Ostrogradsky in theories with multiple fields
de Rham, Claudia; Matas, Andrew
2016-06-23
We review how the (absence of) Ostrogradsky instability manifests itself in theories with multiple fields. It has recently been appreciated that when multiple fields are present, the existence of higher derivatives may not automatically imply the existence of ghosts. We discuss the connection with gravitational theories like massive gravity and beyond Horndeski which manifest higher derivatives in some formulations and yet are free of Ostrogradsky ghost. We also examine an interesting new class of Extended Scalar-Tensor Theories of gravity which has been recently proposed. We show that for a subclass of these theories, the tensor modes are either not dynamical or are infinitely strongly coupled. Among the remaining theories for which the tensor modes are well-defined one counts one new model that is not field-redefinable to Horndeski via a conformal and disformal transformation but that does require the vacuum to break Lorentz invariance. We discuss the implications for the effective field theory of dark energy and the stability of the theory. In particular we find that if we restrict ourselves to the Extended Scalar-Tensor class of theories for which the tensors are well-behaved and the scalar is free from gradient or ghost instabilities on FLRW then we recover Horndeski up to field redefinitions.
Olu, Olushayo Oluseun; Lamunu, Margaret; Nanyunja, Miriam; Dafae, Foday; Samba, Thomas; Sempiira, Noah; Kuti-George, Fredson; Abebe, Fikru Zeleke; Sensasi, Benjamin; Chimbaru, Alexander; Ganda, Louisa; Gausi, Khoti; Gilroy, Sonia; Mugume, James
2016-01-01
Contact tracing is a critical strategy required for timely prevention and control of Ebola virus disease (EVD) outbreaks. Available evidence suggests that poor contact tracing was a driver of the EVD outbreak in West Africa, including Sierra Leone. In this article, we examined whether EVD contact tracing, as practiced in the Western Area (WA) districts of Sierra Leone from 2014 to 2015, was effective. The goal is to describe contact tracing and identify obstacles to its effective implementation. Mixed methods, comprising secondary analysis of the EVD case and contact tracing data sets collected from WA during the period from 2014 to 2015, key informant interviews of contact tracers and their supervisors, and a review of available reports on contact tracing, were used to obtain data for this study. During the study period, 3,838 confirmed cases and 32,706 contacts were listed in the viral hemorrhagic fever and contact databases for the district (a mean of 8.5 contacts per case). Only 22.1% (852) of the confirmed cases in the study area were listed as contacts at the onset of their illness, which indicates incomplete identification and tracing of contacts. Challenges associated with effective contact tracing included lack of community trust, concealment of exposure information, political interference with recruitment of tracers, inadequate training of contact tracers, and an incomplete EVD case and contact database. While the tracers noted the usefulness of community quarantine in facilitating their work, they also reported delayed or irregular supply of basic needs, such as food and water, which created resistance from the communities. Multiple gaps in contact tracing, attributed to a variety of factors associated with implementers and communities, were identified as obstacles that impeded timely control of the EVD outbreak in the WA of Sierra Leone. In future outbreaks, early community engagement and participation in contact tracing, establishment of appropriate mechanisms for the selection, adequate training, and supervision of qualified contact tracers, establishment of a well-managed and complete contact tracing database, and provision of basic needs to quarantined contacts are recommended as measures to enhance effective contact tracing.
Transitions theory: a trajectory of theoretical development in nursing.
Im, Eun-Ok
2011-01-01
There have been very few investigations into how any single nursing theory has actually evolved historically. In this paper, a trajectory of theoretical development in nursing is explored by reviewing the theoretical development of a single nursing theory: transitions theory. The literature related to transitions theory was searched and retrieved using multiple databases. Ninety-nine papers were analyzed according to type of theory, populations of interest, sources of theorizing, and theoretical methods. Transitions theory originated in research but was initially borrowed. It also arose in research with immigrants and from national and international collaborative research efforts. A product of mentoring, transitions theory is used widely in nursing education, research, and practice. Diverse thoughts related to transitions theory coexist. For future theoretical development in nursing, we need to remain open to new ideas and continue to engage in multiple collaborative efforts. Copyright © 2011 Elsevier Inc. All rights reserved.
Zou, Yan-e; Jiang, Ping-ping; Zhang, Qiang; Tang, Qing-jia; Kang, Zhi-qiang; Gong, Xiao-ping; Chen, Chang-jie; Yu, Jian-guo
2015-12-01
High-frequency sampling was conducted at the outlet of the Guangxi Bishuiyan karst subterranean river using an automatic sampler during rainfall events. The dynamic hydrochemical variation characteristics of trace metals (Cu, Pb, Zn, Cd) at the outlet of the Guangxi Bishuiyan karst subterranean river were analyzed, and the sources of the trace metals in the subterranean river as well as their response to rainfall were explored. The results showed that the rainfall provoked a sharp decrease in the major elements (Ca²⁺, Mg²⁺, HCO₃⁻, etc.) due to dilution and precipitation, while it also caused an increase in the concentrations of dissolved metals including Al, Mn, Cu, Zn and Cd, due to water-rock reaction, sediment remobilization, and soil erosion. The water-rock reaction was more sensitive to rainfall than the other processes, while sediment remobilization and soil erosion were mainly responsible for the chemical changes of the heavy metals. The curves of the heavy metal concentrations presented multiple peaks, with the maximum reached 9 hours after the largest precipitation. Different metal sources and the double-inlet structure of the subterranean river were presumed to be the reasons for the formation of the multiple peaks. During the monitoring period, the average speed of the solute in the river reached about 0.47 km · h⁻¹, indicating fast migration of the pollutants. Therefore, monitoring the chemical dynamics of the karst subterranean river and understanding the sources and migration characteristics of trace metal components are of great significance for the treatment of pollution in the subterranean river environment.
Huertas, Marco A; Schwettmann, Sarah E; Shouval, Harel Z
2016-01-01
The ability to maximize reward and avoid punishment is essential for animal survival. Reinforcement learning (RL) refers to the algorithms used by biological or artificial systems to learn how to maximize reward or avoid negative outcomes based on past experiences. While RL is also important in machine learning, the types of mechanistic constraints encountered by biological machinery might be different from those for artificial systems. Two major problems encountered by RL are how to relate a stimulus with a reinforcing signal that is delayed in time (temporal credit assignment), and how to stop learning once the target behaviors are attained (stopping rule). To address the first problem, synaptic eligibility traces were introduced, bridging the temporal gap between a stimulus and its reward. Although these were mere theoretical constructs, recent experiments have provided evidence of their existence. These experiments also reveal that the presence of specific neuromodulators converts the traces into changes in synaptic efficacy. A mechanistic implementation of the stopping rule usually assumes the inhibition of the reward nucleus; however, recent experimental results have shown that learning terminates at the appropriate network state even in setups where the reward nucleus cannot be inhibited. In an effort to describe a learning rule that solves the temporal credit assignment problem and implements a biologically plausible stopping rule, we proposed a model based on two separate synaptic eligibility traces, one for long-term potentiation (LTP) and one for long-term depression (LTD), each obeying different dynamics and having different effective magnitudes. The model has been shown to successfully generate stable learning in recurrent networks. Although the model assumes the presence of a single neuromodulator, evidence indicates that there are different neuromodulators for expressing the different traces. What could be the role of different neuromodulators in expressing the LTP and LTD traces? Here we expand on our previous model to include several neuromodulators, and illustrate through various examples how these differently contribute to learning reward timing within a wide set of training paradigms, and we propose further roles that multiple neuromodulators can play in encoding additional information about the rewarding signal.
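The two-trace learning rule can be sketched as follows. The parameter names, time constants, and the single scalar reinforcement factor are illustrative assumptions, not the published model's values; the sketch only shows how separate LTP and LTD eligibility traces with different dynamics can be converted into opposing weight changes by a delayed neuromodulatory signal.

```python
import numpy as np

def two_trace_update(w, pre, post, trace_ltp, trace_ltd, modulator,
                     tau_ltp=0.5, tau_ltd=2.0, eta_ltp=1.0, eta_ltd=0.4, dt=0.01):
    """One step of a two-eligibility-trace reinforcement learning rule.

    Hebbian coincidences feed two traces with different (assumed) time
    constants and effective magnitudes, one tagged for LTP and one for LTD;
    a delayed neuromodulatory signal converts both traces into opposing
    weight changes, so learning stalls once the two contributions cancel.
    """
    hebb = np.outer(post, pre)                          # pre/post coincidence
    trace_ltp += dt * (-trace_ltp / tau_ltp + hebb)     # faster LTP trace
    trace_ltd += dt * (-trace_ltd / tau_ltd + hebb)     # slower LTD trace
    w += modulator * (eta_ltp * trace_ltp - eta_ltd * trace_ltd) * dt
    return w, trace_ltp, trace_ltd
```

Distinct neuromodulators could gate the LTP and LTD terms separately by replacing the single `modulator` factor with one factor per trace.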
Are Prospective English Teachers Linguistically Intelligent?
ERIC Educational Resources Information Center
Tezel, Kadir Vefa
2017-01-01
Language is normally associated with linguistic capabilities of individuals. In the theory of multiple intelligences, language is considered to be related primarily to linguistic intelligence. Using the theory of Multiple Intelligences as its starting point, this descriptive survey study investigated to what extent prospective English teachers'…
Li, Siyue; Zhang, Quanfa
2011-06-15
Water samples were collected for the determination of dissolved trace metals at 56 sampling sites throughout the upper Han River, China. Multivariate statistical analyses, including correlation analysis, stepwise multiple linear regression models, and principal component and factor analysis (PCA/FA), were employed to examine land use influences on trace metals, and a receptor model of factor analysis-multiple linear regression (FA-MLR) was used for source identification/apportionment of anthropogenic heavy metals in the surface water of the river. Our results revealed that land use was an important factor influencing water metals in the snowmelt flow period and that land use in the riparian zone was not a better predictor of metals than land use away from the river. Urbanization in a watershed and vegetation along river networks could better explain metals, whereas agriculture, regardless of its relative location, only slightly explained metal variables in the upper Han River. FA-MLR analysis identified five source types of metals, with mining, fossil fuel combustion, and vehicle exhaust being the dominant pollution sources in the surface waters. The results demonstrate the great impact of human activities on metal concentrations in this subtropical river of China. Copyright © 2011 Elsevier B.V. All rights reserved.
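The FA-MLR receptor model can be sketched with standard tools: factors are extracted from the standardized metal concentrations and their scores are regressed against a target concentration, so the regression coefficients indicate each factor's contribution. The sketch below uses scikit-learn, synthetic data, and the abstract's five-factor choice; the study's actual implementation details may differ.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

def fa_mlr_apportionment(metal_concentrations, target, n_factors=5):
    """Generic FA-MLR sketch: standardize metal data, extract latent factors
    (candidate sources), regress the target on the factor scores."""
    z = StandardScaler().fit_transform(metal_concentrations)
    fa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(z)
    scores = fa.transform(z)
    reg = LinearRegression().fit(scores, target)
    return fa.components_, reg.coef_, reg.score(scores, target)

# Hypothetical data: 56 samples x 10 metals, with a bulk-metal target
rng = np.random.default_rng(0)
X = rng.lognormal(mean=0.0, sigma=0.5, size=(56, 10))
loadings, contributions, r2 = fa_mlr_apportionment(X, X.sum(axis=1))
```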
Airborne In-Situ Trace Gas Measurements of Multiple Wildfires in California (2013-2014)
NASA Astrophysics Data System (ADS)
Iraci, L. T.; Yates, E. L.; Tanaka, T.; Roby, M.; Gore, W.; Clements, C. B.; Lareau, N.; Ambrosia, V. G.; Quayle, B.; Schroeder, W.
2014-12-01
Biomass burning emissions are an important source of a wide range of trace gases and particles that can impact local, regional and global air quality, climate forcing, biogeochemical cycles and human health. In the western US, wildfires dominate over prescribed fires, contributing to atmospheric trace gas budgets and regional and local air pollution. Limited sampling of emissions from wildfires means western US emission estimates rely largely on data from prescribed fires, which may not be a suitable proxy for wildfire emissions. We report here in-situ measurements of carbon dioxide, methane, ozone and water vapor from the plumes of a variety of wildfires sampled in California in the fire seasons of 2013 and 2014. Included in the analysis are the Rim Fire (August - October 2013, near Yosemite National Park), the Morgan Fire (September 2013, near Clayton, CA), and the El Portal Fire (July - August 2014, in Yosemite National Park), among others. When possible, fires were sampled on multiple days. Emission ratios and estimated emission factors will be presented and discussed in the context of fuel composition, plume structure, and fire phase. Correlations of plume chemical composition to MODIS/VIIRS Fire Radiative Power (FRP) and other remote sensing information will be explored. Furthermore, the role of plumes in delivery of enhanced ozone concentrations to downwind municipalities will be discussed.
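Emission ratios of the kind reported for such plume sampling are typically computed from excess mixing ratios relative to background air. The snippet below is a hedged sketch of that approach with invented transect values and background levels, not the campaign's actual data or processing chain.

```python
import numpy as np

def emission_ratio(species, co2, species_background, co2_background):
    """Emission ratio of a trace gas relative to CO2 from in-plume samples:
    subtract background values from both species and take the slope of a
    least-squares fit of excess species against excess CO2."""
    dx = np.asarray(co2, dtype=float) - co2_background
    dy = np.asarray(species, dtype=float) - species_background
    slope, _intercept = np.polyfit(dx, dy, 1)
    return slope                                   # e.g. ppb CH4 per ppm CO2

ch4_ppb = [1900, 1950, 2025, 2100, 1980]           # hypothetical plume transect
co2_ppm = [400, 410, 425, 440, 415]
print(emission_ratio(ch4_ppb, co2_ppm, species_background=1890, co2_background=398))
```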
To analyse a trace or not? Evaluating the decision-making process in the criminal investigation.
Bitzer, Sonja; Ribaux, Olivier; Albertini, Nicola; Delémont, Olivier
2016-05-01
In order to broaden our knowledge and understanding of the decision steps in the criminal investigation process, we started by evaluating the decision to analyse a trace and the factors involved in this decision step. This decision step is embedded in the complete criminal investigation process, involving multiple decision and triaging steps. Considering robbery cases occurring in a geographic region during a 2-year period, we studied the factors influencing the decision to submit biological traces, directly sampled at the scene of the robbery or on collected objects, for analysis. The factors were categorised into five knowledge dimensions: strategic, immediate, physical, criminal and utility, and decision tree analysis was carried out. Factors in each category played a role in the decision to analyse a biological trace. Interestingly, factors involving information available prior to the analysis are of importance, such as the fact that a positive result (a profile suitable for comparison) is already available in the case, or that a suspect has been identified through traditional police work before analysis. One factor that was taken into account, but was not significant, is the matrix of the trace. Hence, the decision to analyse a trace is not influenced by this variable. The decision to analyse a trace first is very complex, and many of the tested variables were taken into account. The decisions are often made on a case-by-case basis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
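A decision tree analysis of such categorical case factors can be sketched with scikit-learn. The records and column names below are invented purely for illustration; they merely echo the kinds of factors listed in the abstract (a profile already available in the case, a suspect identified before analysis, a trace sampled on a collected object).

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical case records; values are invented, not the study's data.
cases = pd.DataFrame({
    "profile_already_in_case": [1, 0, 1, 0, 1, 0, 0, 1],
    "suspect_identified":      [0, 1, 1, 0, 0, 1, 0, 0],
    "trace_on_collected_item": [1, 1, 0, 0, 1, 0, 1, 0],
    "analysed":                [0, 1, 0, 1, 0, 1, 1, 1],
})
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(cases.drop(columns="analysed"), cases["analysed"])
print(export_text(tree, feature_names=list(cases.columns[:-1])))
```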
T\\overline{T} -deformations, AdS/CFT and correlation functions
NASA Astrophysics Data System (ADS)
Giribet, Gaston
2018-02-01
A solvable irrelevant deformation of the AdS3/CFT2 correspondence leading to a theory with a Hagedorn spectrum at high energy has been recently proposed. It consists of a single-trace deformation of the boundary theory, which is inspired by the recent work on solvable T\overline{T} deformations of two-dimensional CFTs. Thought of as a worldsheet σ-model, the deformed theory is interpreted from the bulk viewpoint as string theory on a background that interpolates between AdS3 in the IR and a linear dilaton vacuum of little string theory in the UV. The insertion of the operator that realizes the deformation in the correlation functions produces a logarithmic divergence, leading to the renormalization of the primary operators, which thus acquire an anomalous dimension. We compute this anomalous dimension explicitly, and this provides us with a direct way of determining the spectrum of the theory. We discuss this and other features of the correlation functions in the presence of the deformation.
Signal processing and neural network toolbox and its application to failure diagnosis and prognosis
NASA Astrophysics Data System (ADS)
Tu, Fang; Wen, Fang; Willett, Peter K.; Pattipati, Krishna R.; Jordan, Eric H.
2001-07-01
Many systems are composed of components equipped with self-testing capability; however, if the system is complex, involving feedback, and the self-testing itself may occasionally be faulty, tracing faults to single or multiple causes is difficult. Moreover, many sensors are incapable of reliable decision-making on their own. In such cases, a signal processing front-end that can match inference needs will be very helpful. This work is concerned with providing an object-oriented simulation environment for signal processing and neural network-based fault diagnosis and prognosis. In the toolbox, we implemented a wide range of spectral and statistical manipulation methods, such as filters, harmonic analyzers, transient detectors, and multi-resolution decomposition, to extract features for failure events from data collected by sensors. We then evaluated multiple learning paradigms for general classification, diagnosis and prognosis. The network models evaluated include Restricted Coulomb Energy (RCE) Neural Network, Learning Vector Quantization (LVQ), Decision Trees (C4.5), Fuzzy Adaptive Resonance Theory (FuzzyArtmap), Linear Discriminant Rule (LDR), Quadratic Discriminant Rule (QDR), Radial Basis Functions (RBF), Multiple Layer Perceptrons (MLP) and Single Layer Perceptrons (SLP). Validation techniques, such as N-fold cross-validation and bootstrap techniques, are employed to evaluate the robustness of the network models. The trained networks are evaluated for their performance using test data on the basis of percent error rates obtained via cross-validation, time efficiency, and generalization ability to unseen faults. Finally, the use of neural networks for the prediction of the residual life of turbine blades with thermal barrier coatings is described and the results are shown. The neural network toolbox has also been applied to fault diagnosis in mixed-signal circuits.
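The cross-validation based comparison of classifiers described above can be sketched as follows, using scikit-learn models and synthetic features as stand-ins; the toolbox's own implementations (RCE, LVQ, FuzzyArtmap and the rest) are not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                            QuadraticDiscriminantAnalysis)
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for extracted failure features
X, y = make_classification(n_samples=300, n_features=12, n_informative=6,
                           n_classes=3, random_state=0)
models = {
    "LDR": LinearDiscriminantAnalysis(),
    "QDR": QuadraticDiscriminantAnalysis(),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "MLP": MLPClassifier(max_iter=2000, random_state=0),
}
for name, model in models.items():
    accuracy = cross_val_score(model, X, y, cv=10)     # 10-fold cross-validation
    print(f"{name:14s} mean error rate = {1.0 - accuracy.mean():.3f}")
```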
A density functional theory for colloids with two multiple bonding associating sites.
Haghmoradi, Amin; Wang, Le; Chapman, Walter G
2016-06-22
Wertheim's multi-density formalism is extended for patchy colloidal fluids with two multiple bonding patches. The theory is developed as a density functional theory to predict the properties of an associating inhomogeneous fluid. The equation of state developed for this fluid depends on the size of the patch, and includes formation of cyclic, branched and linear clusters of associated species. The theory predicts the density profile and the fractions of colloids in different bonding states versus the distance from one wall as a function of bulk density and temperature. The predictions from our theory are compared with previous results for a confined fluid with four single bonding association sites. Also, comparison between the present theory and Monte Carlo simulation indicates a good agreement.
General theory of remote gaze estimation using the pupil center and corneal reflections.
Guestrin, Elias Daniel; Eizenman, Moshe
2006-06-01
This paper presents a general theory for the remote estimation of the point-of-gaze (POG) from the coordinates of the centers of the pupil and corneal reflections. Corneal reflections are produced by light sources that illuminate the eye and the centers of the pupil and corneal reflections are estimated in video images from one or more cameras. The general theory covers the full range of possible system configurations. Using one camera and one light source, the POG can be estimated only if the head is completely stationary. Using one camera and multiple light sources, the POG can be estimated with free head movements, following the completion of a multiple-point calibration procedure. When multiple cameras and multiple light sources are used, the POG can be estimated following a simple one-point calibration procedure. Experimental and simulation results suggest that the main sources of gaze estimation errors are the discrepancy between the shape of real corneas and the spherical corneal shape assumed in the general theory, and the noise in the estimation of the centers of the pupil and corneal reflections. A detailed example of a system that uses the general theory to estimate the POG on a computer screen is presented.
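As a deliberately simplified companion to the general theory, the sketch below fits a polynomial mapping from pupil-center minus corneal-reflection vectors to screen coordinates using a multiple-point calibration (at least six targets). This interpolation shortcut assumes an essentially stationary head and is not the paper's full 3D geometric model; all names and shapes here are illustrative assumptions.

```python
import numpy as np

def fit_gaze_mapping(pupil_minus_cr, screen_points):
    """Fit a second-order polynomial mapping from pupil-center minus
    corneal-reflection vectors (image coordinates) to on-screen
    point-of-gaze, given calibration samples."""
    v = np.asarray(pupil_minus_cr, dtype=float)
    x, y = v[:, 0], v[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, dtype=float), rcond=None)
    return coeffs                      # (6, 2) matrix mapping features to (X, Y)

def estimate_pog(coeffs, pupil_minus_cr):
    x, y = pupil_minus_cr
    features = np.array([1.0, x, y, x * y, x**2, y**2])
    return features @ coeffs           # estimated point-of-gaze on the screen
```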
Multiple Intelligences in the Schools.
ERIC Educational Resources Information Center
Quigley, Kathleen M.
Within the context of school improvement and school reform, it is important to examine Howard Gardner's theory of multiple intelligences (MI theory). His work has far-reaching implications for curriculum development and classroom implementation. Gardner believes that the culture defines intelligence too narrowly. He sought to broaden the scope of…
Multiple Intelligences: A Collection.
ERIC Educational Resources Information Center
Fogarty, Robin, Ed.; Bellanca, James, Ed.
As a concise resource for Howard Gardner's theory of multiple intelligences and its implications for schooling around the world, this collection is designed for educators, parents, and others interested in education. The first section discusses Gardner and his background, and the second section expounds his theory. The third section explores the…
ERIC Educational Resources Information Center
Gardner, Howard; Connell, Michael
2000-01-01
Replies to "The Theory of Multiple Intelligences: A Case of Missing Cognitive Matter," also in this issue. Disagrees about the role theory of knowledge plays in the context of justification of multiple intelligences. Specifically, asserts that the article's criticisms based on philosophy of science claims and work with artificial neural…
The Evolution of Experiential Learning Theory: Tracing Lines of Research in the "JEE"
ERIC Educational Resources Information Center
Seaman, Jayson; Brown, Mike; Quay, John
2017-01-01
This essay introduces a collection of past articles from the "Journal of Experiential Education" ("JEE") focused on the concept of experiential learning. It outlines the historical trajectory of the concept beginning with human relations training practices beginning in 1946, as it came to be understood as a naturally occurring…
Jack Mezirow's Conceptualisation of Adult Transformative Learning: A Review
ERIC Educational Resources Information Center
Calleja, Colin
2014-01-01
This paper traces the evolution of Jack Mezirow's transformative learning theory and its conceptualisation. It discusses the three major influences, namely Thomas Kuhn's philosophical conception of paradigm, Freire's conception of conscientisation and consciousness growth, and Habermas' domains of learning and the discussion of…
William Kessen and James Mark Baldwin: Lessons from the History of Developmental Psychology.
ERIC Educational Resources Information Center
Ferrari, Michel; Runions, Kevin; Fueser, Josephine J.
2003-01-01
Considers the work of developmental scholar William Kessen (1925-1999) in light of James Mark Baldwin, one of the founders and principal architects of developmental psychology. Traces Kessen's interest in Baldwin's thought and examines Baldwin's legacy for developmental psychologists. Asserts that Baldwin's theory sought to integrate the role of…
Reflections on Dead Theory in International Relations
ERIC Educational Resources Information Center
Thakur, Vineet
2016-01-01
In this short autobiographical essay, I trace my journey in the discipline of International Relations. While entering the discipline, I, along with a host of my classmates, were enamoured by the exciting possibilities of thinking theoretically. Almost a decade later, those promises look bleak. From the perspective of a student in the discipline, I…
Science and Society in the Eugenic Thought of H. J. Muller
ERIC Educational Resources Information Center
Allen, Garland E.
1970-01-01
Traces the growth of theories of eugenics during the twentieth century, focussing on the work of H. J. Muller. Concludes that "Muller's lasting contribution was to write the hereditarian attitudes associated with traditional eugenics and the environmentalist's viewpoint associated with modern sociology to obtain a humane and reasoned approach to…
The Oral History of Evaluation: The Professional Development of Marvin C. Alkin
ERIC Educational Resources Information Center
American Journal of Evaluation, 2010
2010-01-01
Over the past 7 years, the Oral History Project Team has conducted interviews with individuals who have made signal contributions to evaluation theory and practice, tracing their professional development and contextualizing their work within the social and political climates of the time. By capturing the professional evolution of those who have…
Enactivism and the Study of Collectivity
ERIC Educational Resources Information Center
Towers, Jo; Martin, Lyndon C.
2015-01-01
In this paper, we trace the development of our theorizing about students' mathematical understanding, showing how the adoption of an enactivist perspective has transformed our gaze in terms of the objects of our studies and occasioned for us new methods of data analysis. Drawing on elements of Pirie-Kieren (P-K) Theory for the Dynamical Growth of…
Economic Development in American Indian Reservations. Development Series No. 1.
ERIC Educational Resources Information Center
Ortiz, Roxanne Dunbar, Ed.
A collection of 13 scholarly articles and essays, this book makes available hard-to-find information and theories about American Indian economic development. Part I, "The Land and the People", emphasizes cultural traditions and beliefs of Indian people and traces the development of the concept of sovereignty and its applicability to…
The Antieconomy Hypothesis (Part 3): Toward a Solution
ERIC Educational Resources Information Center
Vanderburg, Willem H.
2009-01-01
Parts 1 and 2 explore the hypothesis that the application of mainstream economics has led to economies becoming uneconomic, which is as close as a social science can get to experimentally disproving its theories. One of the primary reasons for this failure is traced to the characteristics of the knowledge infrastructures of contemporary societies,…
Poverty Eradication: Lessons from China and South Korea in the 1950s and 1960s.
ERIC Educational Resources Information Center
Wignaraja, Ponna
1996-01-01
Traces the search for economic development alternatives that go beyond conventional neo-classical and Marxist theory and practice. Outlines case studies of social and economic transformation in South Korea and China and delineates the differences between similar attempts in Latin America and Eastern Europe. (MJP)
ERIC Educational Resources Information Center
Menchaca, Martha; Valencia, Richard R.
1990-01-01
Traces the development of Anglo-Saxon superiority theories from the nineteenth century onward and demonstrates their impact on social conditions in the southwest, specifically on school segregation in Santa Paula School District (California) from the 1920s to the present. (DM)
Thomas B. Greenfield: A Challenging Perspective of Organizations
ERIC Educational Resources Information Center
Bailey, Scott
2010-01-01
Organizations are not real; people are. Any science or theory of organizations must consider how the organization impinges, in a very real and tangible way, on the lives of its members. This article traces the development of one such theoretical branch of organizational science through the pioneering work of Thomas B. Greenfield. The author uses…
ERIC Educational Resources Information Center
Morgan, Jeff
2011-01-01
Cultural sensitivity theory is the study of how individuals relate to cultural difference. Using literature to help students prepare for study abroad, instructors could analyze character and trace behavior through a model of cultural sensitivity. Milton J. Bennett has developed such an instrument, The Developmental Model of Intercultural…
ERIC Educational Resources Information Center
Hillon, Yue Cai; Boje, David M.
2017-01-01
Purpose: Calls for dialectical learning process model development in learning organizations have largely gone unheeded, thereby limiting conceptual understanding and application in the field. This paper aims to unify learning organization theory with a new understanding of Hegelian dialectics to trace the development of the storytelling learning…