ERIC Educational Resources Information Center
Wood, Phil
2017-01-01
In this article, I begin by outlining some of the barriers which constrain sustainable organizational change in schools and universities. I then go on to introduce a theory which has already started to help explain complex change and innovation processes in health and care contexts, Normalization Process Theory. Finally, I consider what this…
ERIC Educational Resources Information Center
Stukuls, Henry I.
Eighteen retarded Ss (mean IQ 50 and mean age 14 years) and 18 normal Ss (mean IQ 100 and mean age 7 years) participated in a study to isolate variables that differentially control discrimination learning and retention processes, and to evaluate contrasting theories on discrimination learning and memory processes of retarded and normal children.…
1981-06-15
SACLANTCEN SR-50: A résumé of stochastic, time-varying, linear system theory with… Surviving fragments note that the order in which systems are concatenated is unimportant, results exactly analogous to those of time-invariant linear system theory, and cite: MEIER, L. A résumé of deterministic time-varying linear system theory with application to active sonar signal processing problems, SACLANTCEN.
Developing Visualization Support System for Teaching/Learning Database Normalization
ERIC Educational Resources Information Center
Folorunso, Olusegun; Akinwale, AdioTaofeek
2010-01-01
Purpose: In tertiary institutions, some students find it hard to learn database design theory, in particular database normalization. The purpose of this paper is to develop a visualization tool to give students an interactive, hands-on experience of the database normalization process. Design/methodology/approach: The model-view-controller architecture…
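To make the learning target concrete, here is a minimal sketch of the attribute-closure computation that underlies normalization decisions (a generic textbook algorithm, not the authors' tool; the relation and dependencies are hypothetical):

```python
# Attribute closure under functional dependencies (FDs): the core check used
# when normalizing a relation, e.g. to decide whether an FD violates BCNF.
def attribute_closure(attrs, fds):
    """attrs: set of attribute names; fds: list of (lhs, rhs) pairs of sets."""
    closure = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= closure and not rhs <= closure:
                closure |= rhs
                changed = True
    return closure

# Hypothetical relation R(student, course, instructor) with
# {student, course} -> {instructor} and {instructor} -> {course}.
fds = [({"student", "course"}, {"instructor"}),
       ({"instructor"}, {"course"})]
print(attribute_closure({"student", "course"}, fds))  # all attributes: a superkey
print(attribute_closure({"instructor"}, fds))         # not a superkey -> BCNF violation
```

A decomposition step would then split the relation on a dependency whose left-hand side fails this superkey test, which is the kind of step a visualization tool can animate.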
Drew, Sarah; Judge, Andrew; May, Carl; Farmer, Andrew; Cooper, Cyrus; Javaid, M Kassim; Gooberman-Hill, Rachael
2015-04-23
National and international guidance emphasizes the need for hospitals to have effective secondary fracture prevention services, to reduce the risk of future fractures in hip fracture patients. Variation exists in how hospitals organize these services, and there remain significant gaps in care. No research has systematically explored the reasons for this variation or how to implement these services successfully. The objective of this study was to use extended Normalization Process Theory to understand how secondary fracture prevention services can be successfully implemented. Forty-three semi-structured interviews were conducted with healthcare professionals involved in delivering secondary fracture prevention within 11 hospitals that receive patients with acute hip fracture in one region in England. These included orthogeriatricians, fracture prevention nurses and service managers. Extended Normalization Process Theory was used to inform study design and analysis. Extended Normalization Process Theory specifies four constructs relating to collective action in service implementation: capacity, potential, capability and contribution. The capacity of healthcare professionals to co-operate and co-ordinate their actions was achieved using dedicated fracture prevention co-ordinators to organize important processes of care. However, participants described effective communication with GPs as challenging. Individual potential and commitment to operationalize services was generally high. Shared commitments were promoted through multi-disciplinary team working, facilitated by fracture prevention co-ordinators. Healthcare professionals had capacity to deliver multiple components of services when co-ordinators 'freed up' time. As key agents in the intervention, fracture prevention co-ordinators were therefore indispensable to effective implementation. Aside from the difficulty of co-ordination with primary care, the intervention was highly workable and easily integrated into practice. Nevertheless, implementation was threatened by under-staffed and under-resourced services, a lack of capacity to administer scans and poor patient access. To ensure ongoing service delivery, the contributions of healthcare professionals were shaped by planning in multi-disciplinary team meetings, by the use of clinical databases to identify patients and define the composition of clinical work, and by monitoring to improve clinical practice. The findings identify and describe the elements needed to implement secondary fracture prevention services successfully. The study highlights the value of Normalization Process Theory in achieving a comprehensive understanding of healthcare professionals' experiences in enacting a complex intervention.
[The Application of Grief Theories to Bereaved Family Members].
Wu, Lee-Jen Suen; Chou, Chuan-Chiang; Lin, Yen-Chun
2017-12-01
Loss is an inevitable experience for humans, and grief is a natural response to it. Nurses must have an adequate understanding of grief and bereavement in order to be more sensitive to these painful emotions and to provide appropriate care to families who have lost someone they love deeply. This article introduces four important grief theories: Freud's grief theory, Bowlby's attachment theory, Stroebe and Schut's dual process model, and Neimeyer's meaning reconstruction model. Freud's grief theory holds that the process of grief adaptation involves a bereaved family adopting alternative ways to connect with the death of a loved one and to restore their self-ego. Attachment theory holds that individuals who undergo grieving that is caused by separation from significant others and that triggers the process of grief adaptation will fail to adapt if they resist change. The dual process model holds that bereaved families undergo grief adaptation not only as a way to face their loss but also to restore normality in their lives. Finally, the meaning reconstruction model holds that the grief-adaptation strength of bereaved families comes from their meaning reconstruction in response to encountered events. It is hoped that these theories offer nurses different perspectives on the grieving process and provide a practical framework for grief assessment and interventions. Additionally, specific interventions that are based on these four grief theories are recommended. Furthermore, theories of grief may help nurses gain insight into their own practice-related reactions and healing processes, which is an important part of caring for the grieving. Although the grieving process is time-consuming, nurses who better understand grief will be better able to help family members prepare in advance for the death of a loved one and, in doing so, help facilitate their healing, with a view to the future and to finally returning to normal daily life.
Role of environmental variability in the evolution of life history strategies.
Hastings, A; Caswell, H
1979-09-01
We reexamine the role of environmental variability in the evolution of life history strategies. We show that normally distributed deviations in the quality of the environment should lead to normally distributed deviations in the logarithm of year-to-year survival probabilities, which leads to interesting consequences for the evolution of annual and perennial strategies and reproductive effort. We also examine the effects of using differing criteria to determine the outcome of selection. Some predictions of previous theory are reversed, allowing distinctions between r and K theory and a theory based on variability. However, these distinctions require information about both the environment and the selection process not required by current theory.
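One way to make the first claim concrete (a schematic illustration under simple assumptions, not necessarily the authors' derivation): if environmental quality in year t is E_t ~ N(mu, sigma^2) and log survival depends linearly on it, then

$$ \ln s_t = \alpha + \beta E_t \;\Rightarrow\; \ln s_t \sim N\!\left(\alpha + \beta\mu,\ \beta^2\sigma^2\right), $$

so year-to-year survival s_t is lognormally distributed. The long-run (geometric-mean) survival \( \exp(\alpha + \beta\mu) \) is then strictly smaller than the arithmetic mean \( \exp(\alpha + \beta\mu + \beta^2\sigma^2/2) \) whenever \( \sigma^2 > 0 \), a variance penalty of the kind a variability-based theory emphasizes.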
Seth, Anil K
2014-01-01
Normal perception involves experiencing objects within perceptual scenes as real, as existing in the world. This property of "perceptual presence" has motivated "sensorimotor theories" which understand perception to involve the mastery of sensorimotor contingencies. However, the mechanistic basis of sensorimotor contingencies and their mastery has remained unclear. Sensorimotor theory also struggles to explain instances of perception, such as synesthesia, that appear to lack perceptual presence and for which relevant sensorimotor contingencies are difficult to identify. On alternative "predictive processing" theories, perceptual content emerges from probabilistic inference on the external causes of sensory signals, however, this view has addressed neither the problem of perceptual presence nor synesthesia. Here, I describe a theory of predictive perception of sensorimotor contingencies which (1) accounts for perceptual presence in normal perception, as well as its absence in synesthesia, and (2) operationalizes the notion of sensorimotor contingencies and their mastery. The core idea is that generative models underlying perception incorporate explicitly counterfactual elements related to how sensory inputs would change on the basis of a broad repertoire of possible actions, even if those actions are not performed. These "counterfactually-rich" generative models encode sensorimotor contingencies related to repertoires of sensorimotor dependencies, with counterfactual richness determining the degree of perceptual presence associated with a stimulus. While the generative models underlying normal perception are typically counterfactually rich (reflecting a large repertoire of possible sensorimotor dependencies), those underlying synesthetic concurrents are hypothesized to be counterfactually poor. In addition to accounting for the phenomenology of synesthesia, the theory naturally accommodates phenomenological differences between a range of experiential states including dreaming, hallucination, and the like. It may also lead to a new view of the (in)determinacy of normal perception.
Tensor products of process matrices with indefinite causal structure
NASA Astrophysics Data System (ADS)
Jia, Ding; Sakharwade, Nitica
2018-03-01
Theories with indefinite causal structure have been studied from both the fundamental perspective of quantum gravity and the practical perspective of information processing. In this paper we point out a restriction in forming tensor products of objects with indefinite causal structure in certain models: there exist both classical and quantum objects the tensor products of which violate the normalization condition of probabilities, if all local operations are allowed. We obtain a necessary and sufficient condition for when such unrestricted tensor products of multipartite objects are (in)valid. This poses a challenge to extending communication theory to indefinite causal structures, as the tensor product is the fundamental ingredient in the asymptotic setting of communication theory. We discuss a few options to evade this issue. In particular, we show that the sequential asymptotic setting does not suffer the violation of normalization.
Prediction and control of chaotic processes using nonlinear adaptive networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, R.D.; Barnes, C.W.; Flake, G.W.
1990-01-01
We present the theory of nonlinear adaptive networks and discuss a few applications. In particular, we review the theory of feedforward backpropagation networks. We then present the theory of the Connectionist Normalized Linear Spline network in both its feedforward and iterated modes. Also, we briefly discuss the theory of stochastic cellular automata. We then discuss applications to chaotic time series, tidal prediction in Venice lagoon, finite differencing, sonar transient detection, control of nonlinear processes, control of a negative ion source, balancing a double inverted pendulum and design advice for free electron lasers and laser fusion targets.
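As a point of reference for the feedforward backpropagation networks reviewed in these reports, here is a minimal sketch of a one-hidden-layer network trained by gradient descent on a chaotic time series (a generic illustration; it is not the Connectionist Normalized Linear Spline network or the authors' code):

```python
# One-hidden-layer feedforward network trained by backpropagation on a
# one-step-ahead prediction task for a chaotic logistic-map series.
import numpy as np

rng = np.random.default_rng(0)

def train(x, y, hidden=16, lr=0.1, epochs=2000):
    w1 = rng.normal(0, 0.5, (x.shape[1], hidden)); b1 = np.zeros(hidden)
    w2 = rng.normal(0, 0.5, (hidden, 1));          b2 = np.zeros(1)
    for _ in range(epochs):
        # forward pass
        h = np.tanh(x @ w1 + b1)
        yhat = h @ w2 + b2
        err = yhat - y
        # backward pass (squared-error loss, up to a constant factor)
        grad_w2 = h.T @ err / len(x);  grad_b2 = err.mean(0)
        dh = (err @ w2.T) * (1 - h**2)
        grad_w1 = x.T @ dh / len(x);   grad_b1 = dh.mean(0)
        w1 -= lr * grad_w1; b1 -= lr * grad_b1
        w2 -= lr * grad_w2; b2 -= lr * grad_b2
    return w1, b1, w2, b2

# Chaotic logistic map as the training series.
s = [0.3]
for _ in range(500):
    s.append(3.9 * s[-1] * (1 - s[-1]))
x = np.array(s[:-1]).reshape(-1, 1)
y = np.array(s[1:]).reshape(-1, 1)

w1, b1, w2, b2 = train(x, y)
pred = np.tanh(x @ w1 + b1) @ w2 + b2
print("train MSE:", float(((pred - y) ** 2).mean()))
```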
Thompson, G Brian; Fletcher-Flinn, Claire M; Wilson, Kathryn J; McKay, Michael F; Margrain, Valerie G
2015-03-01
Predictions from theories of the processes of word reading acquisition have rarely been tested against evidence from exceptionally early readers. The theories of Ehri, Share, and Byrne, and an alternative, Knowledge Sources theory, were so tested. The former three theories postulate that full development of context-free letter sounds and awareness of phonemes are required for normal acquisition, while the claim of the alternative is that with or without such, children can use sublexical information from their emerging reading vocabularies to acquire word reading. Results from two independent samples of children aged 3-5, and 5 years, with mean word reading levels of 7 and 9 years respectively, showed underdevelopment of their context-free letter sounds and phoneme awareness, relative to their word reading levels and normal comparison samples. Despite such underdevelopment, these exceptional readers engaged in a form of phonological recoding that enabled pseudoword reading, at the level of older-age normal controls matched on word reading level. Moreover, in the 5-year-old sample further experiments showed that, relative to normal controls, they had a bias toward use of sublexical information from their reading vocabularies for phonological recoding of heterophonic pseudowords with irregular consistent spelling, and were superior in accessing word meanings independently of phonology, although only if the readers were without exposure to explicit phonics. The three theories were less satisfactory than the alternative theory in accounting for the learning of the exceptionally early readers. Copyright © 2014 Elsevier B.V. All rights reserved.
Bamford, Claire; Poole, Marie; Brittain, Katie; Chew-Graham, Carolyn; Fox, Chris; Iliffe, Steve; Manthorpe, Jill; Robinson, Louise
2014-11-08
Case management has been suggested as a way of improving the quality and cost-effectiveness of support for people with dementia. In this study we adapted and implemented a successful United States' model of case management in primary care in England. The results are reported elsewhere, but a key finding was that little case management took place. This paper reports the findings of the process evaluation which used Normalization Process Theory to understand the barriers to implementation. Ethnographic methods were used to explore the views and experiences of case management. Interviews with 49 stakeholders (patients, carers, case managers, health and social care professionals) were supplemented with observation of case managers during meetings and initial assessments with patients. Transcripts and field notes were analysed initially using the constant comparative approach and emerging themes were then mapped onto the framework of Normalization Process Theory. The primary focus during implementation was on the case managers as isolated individuals, with little attention being paid to the social or organizational context within which they worked. Barriers relating to each of the four main constructs of Normalization Process Theory were identified, with a lack of clarity over the scope and boundaries of the intervention (coherence); variable investment in the intervention (cognitive participation); a lack of resources, skills and training to deliver case management (collective action); and limited reflection and feedback on the case manager role (reflexive monitoring). Despite the intuitive appeal of case management to all stakeholders, there were multiple barriers to implementation in primary care in England including: difficulties in embedding case managers within existing well-established community networks; the challenges of protecting time for case management; and case managers' inability to identify, and act on, emerging patient and carer needs (an essential, but previously unrecognised, training need). In the light of these barriers it is unclear whether primary care is the most appropriate setting for case management in England. The process evaluation highlights key aspects of implementation and training to be addressed in future studies of case management for dementia.
NASA Astrophysics Data System (ADS)
Hirsch, J. E.
2018-05-01
Since the discovery of the Meissner effect, the superconductor to normal (S-N) phase transition in the presence of a magnetic field is understood to be a first-order phase transformation that is reversible under ideal conditions and obeys the laws of thermodynamics. The reverse (N-S) transition is the Meissner effect. This implies in particular that the kinetic energy of the supercurrent is not dissipated as Joule heat in the process where the superconductor becomes normal and the supercurrent stops. In this paper, we analyze the entropy generation and the momentum transfer between the supercurrent and the body in the S-N transition and the N-S transition as described by the conventional theory of superconductivity. We find that it is not possible to explain the transition in a way that is consistent with the laws of thermodynamics unless the momentum transfer between the supercurrent and the body occurs with zero entropy generation, for which the conventional theory of superconductivity provides no mechanism. Instead, we point out that the alternative theory of hole superconductivity does not encounter such difficulties.
Nonlinear adaptive networks: A little theory, a few applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, R.D.; Qian, S.; Barnes, C.W.
1990-01-01
We present the theory of nonlinear adaptive networks and discuss a few applications. In particular, we review the theory of feedforward backpropagation networks. We then present the theory of the Connectionist Normalized Linear Spline network in both its feedforward and iterated modes. Also, we briefly discuss the theory of stochastic cellular automata. We then discuss applications to chaotic time series, tidal prediction in Venice Lagoon, sonar transient detection, control of nonlinear processes, balancing a double inverted pendulum and design advice for free electron lasers. 26 refs., 23 figs.
Kjær, Inger
2014-01-01
Human eruption is a unique developmental process in the organism. The aetiology or the mechanism behind eruption has never been fully understood and the scientific literature in the field is extremely sparse. Human and animal tissues provide different possibilities for eruption analyses, briefly discussed in the introduction. Human studies, mainly clinical and radiological, have focused on normal eruption and gender differences. Why a tooth begins eruption and what enables it to move eruptively and later to end these eruptive movements is not known. Pathological eruption courses contribute to insight into the aetiology behind eruption. A new theory on the eruption mechanism is presented. Accordingly, the mechanism of eruption depends on the correlation between space in the eruption course, created by the crown follicle, eruption pressure triggered by innervation in the apical root membrane, and the ability of the periodontal ligament to adapt to eruptive movements. Animal studies and studies on normal and pathological eruption in humans can support and explain different aspects in the new theory. The eruption mechanism still needs elucidation and the paper recommends that future research on eruption keeps this new theory in mind. Understanding the aetiology of the eruption process is necessary for treating deviant eruption courses. PMID:24688798
Left Hemisphere Regions Are Critical for Language in the Face of Early Left Focal Brain Injury
ERIC Educational Resources Information Center
Beharelle, Anjali Raja; Dick, Anthony Steven; Josse, Goulven; Solodkin, Ana; Huttenlocher, Peter R.; Levine, Susan C.; Small, Steven L.
2010-01-01
A predominant theory regarding early stroke and its effect on language development is that early left hemisphere lesions trigger compensatory processes that allow the right hemisphere to assume dominant language functions, and this is thought to underlie the near normal language development observed after early stroke. To test this theory, we…
Chaos Theory as a Model for Life Transitions Counseling: Nonlinear Dynamics and Life's Changes
ERIC Educational Resources Information Center
Bussolari, Cori J.; Goodell, Judith A.
2009-01-01
Chaos theory is presented for counselors working with clients experiencing life transitions. It is proposed as a model that considers disorder, unpredictability, and lack of control as normal parts of transition processes. Nonlinear constructs from physics are adapted for use in counseling. The model provides a method clients can use to…
Mrs. Malaprop's Neighborhood: Using Word Errors to Reveal Neighborhood Structure
ERIC Educational Resources Information Center
Goldrick, Matthew; Folk, Jocelyn R.; Rapp, Brenda
2010-01-01
Many theories of language production and perception assume that in the normal course of processing a word, additional non-target words (lexical neighbors) become active. The properties of these neighbors can provide insight into the structure of representations and processing mechanisms in the language processing system. To infer the properties of…
Interhemispheric and Intrahemispheric Control of Emotion: A Focus on Unilateral Brain Damage.
ERIC Educational Resources Information Center
Borod, Joan C.
1992-01-01
Discusses neocortical contributions to emotional processing. Examines parameters critical to neuropsychological study of emotion: interhemispheric and intrahemispheric factors, processing mode, and communication channel. Describes neuropsychological theories of emotion. Reviews studies of right-brain-damaged, left-brain-damaged, and normal adults,…
ERIC Educational Resources Information Center
CANTOR, GORDON N.; GIRARDEAU, FREDERIC L.
This inquiry investigated discrimination learning processes in trainable mongoloid children as compared with normal preschool children. Its purpose was to contribute to general behavior theory and to the knowledge of mental deficiency by seeing if such variables as transfer of training, acquired distinctiveness of cues, and acquired equivalence of…
Using a theory-driven conceptual framework in qualitative health research.
Macfarlane, Anne; O'Reilly-de Brún, Mary
2012-05-01
The role and merits of highly inductive research designs in qualitative health research are well established, and there has been a powerful proliferation of grounded theory method in the field. However, tight qualitative research designs informed by social theory can be useful to sensitize researchers to concepts and processes that they might not necessarily identify through inductive processes. In this article, we provide a reflexive account of our experience of using a theory-driven conceptual framework, the Normalization Process Model, in a qualitative evaluation of general practitioners' uptake of a free, pilot, language interpreting service in the Republic of Ireland. We reflect on our decisions about whether or not to use the Model, and describe our actual use of it to inform research questions, sampling, coding, and data analysis. We conclude with reflections on the added value that the Model and tight design brought to our research.
[Normal aging of frontal lobe functions].
Calso, Cristina; Besnard, Jérémy; Allain, Philippe
2016-03-01
Normal aging in individuals is often associated with morphological, metabolic and cognitive changes, which particularly concern the cerebral frontal regions. Starting from the "frontal lobe hypothesis of cognitive aging" (West, 1996), the present review is based on the neuroanatomical model developed by Stuss (2008), which introduces four categories of frontal lobe functions: executive control, behavioural and emotional self-regulation and decision-making, energization, and meta-cognitive functions. The selected studies address changes in at least one of these functions. The results suggest a deterioration of several frontal cognitive abilities in normal aging: flexibility, inhibition, planning, verbal fluency, implicit decision-making, and second-order and affective theory of mind. Normal aging also seems to be characterised by a general reduction in processing speed observed during neuropsychological assessment (Salthouse, 1996). Nevertheless, many cognitive functions remain preserved, such as automatic or non-conscious inhibition, specific capacities of flexibility, and first-order theory of mind. Therefore, normal aging does not seem to be associated with global cognitive decline but rather with selective changes in some frontal systems, a conclusion that should be taken into account when designing care programs for normal aging.
Bonifacci, Paola; Snowling, Margaret J
2008-06-01
English and Italian children with dyslexia were compared with children with reading difficulties associated with low-IQ on tests of simple and choice RT, and in number and symbol scanning tasks. On all four speed-of-processing tasks, children with low-IQ responded more slowly than children with dyslexia and age-controls. In the choice RT task, the performance of children with low-IQ was also less accurate than that of children of normal IQ, consistent with theories linking processing speed limitations with low-IQ. These findings support the hypothesis that dyslexia is a specific cognitive deficit that can arise in the context of normal IQ and normal speed of processing. The same cognitive phenotype was observed in readers of a deep (English) and a shallow (Italian) orthography.
ERIC Educational Resources Information Center
Wickelgren, Wayne A.
1979-01-01
The relationship between current information processing and prior associative theories of human and animal learning, memory, and amnesia are discussed. The paper focuses on the two components of the amnesic syndrome, retrograde amnesia and anterograde amnesia. A neural theory of chunking and consolidation is proposed. (Author/RD)
Application of data fusion technology based on D-S evidence theory in fire detection
NASA Astrophysics Data System (ADS)
Cai, Zhishan; Chen, Musheng
2015-12-01
Fire detection that judges and identifies fires from a single fire characteristic parameter is susceptible to environmental disturbances, so its detection performance is limited and its false positive and false negative rates increase. A compound fire detector uses information fusion to judge and identify multiple fire characteristic parameters, improving the reliability and accuracy of fire detection. Dempster-Shafer (D-S) evidence theory is applied to the multi-sensor data fusion: first, the data from all sensors are normalized to obtain basic probability assignments for fire occurrence; the evidence is then fused using the D-S combination rule; finally, a judgment is given. The results show that the method accurately identifies fire signals, increases the accuracy of fire alarms, and is simple and effective.
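A minimal sketch of the fusion step described above, assuming the standard Dempster combination rule over a two-hypothesis frame (the sensor names and mass values are illustrative, not taken from the paper):

```python
# Dempster's rule of combination for two sensors reporting basic probability
# assignments (BPAs) over the frame of discernment {fire, no_fire}; THETA is
# the whole frame (total uncertainty).
def dempster_combine(m1, m2):
    """m1, m2: dicts mapping frozenset hypotheses to masses summing to 1."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    # Normalize by (1 - K), where K is the total conflicting mass.
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

FIRE, NO_FIRE = frozenset({"fire"}), frozenset({"no_fire"})
THETA = FIRE | NO_FIRE
smoke_sensor = {FIRE: 0.6, NO_FIRE: 0.1, THETA: 0.3}  # normalized BPA
temp_sensor  = {FIRE: 0.7, NO_FIRE: 0.2, THETA: 0.1}
print(dempster_combine(smoke_sensor, temp_sensor))
```

With these illustrative masses the fused belief in "fire" (about 0.85) is higher than either sensor's alone, which is the effect a compound detector relies on.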
Siéroff, Eric; Piquard, Ambre
2004-12-01
Due to progress in the cognitive theories in the last twenty years, the description of attentional deficits associated with normal or pathological aging has substantially improved. In this article, attentional deficits are presented according to Posner theory, which describes three sub-systems in a global network of attention: vigilance, selective attention, command. This theory not only characterizes the functions of these subsystems, but gives precise indications about their anatomical and neurochemical substrates. Several clinical tests can be described for each of these different subsystems. The main attentional deficits are presented in the second part of this paper: if some decline of the attentional command occurs in normal aging, a real deficit in this subsystem is found in most degenerative processes (frontotemporal dementia, Alzheimer and Parkinson diseases). Alzheimer disease is also frequently associated with a deficit of selective spatial attention, early in the evolution of the disease.
Affect intensity and processing fluency of deterrents.
Holman, Andrei
2013-01-01
The theory of emotional intensity (Brehm, 1999) suggests that the intensity of affective states depends on the magnitude of their current deterrents. Our study investigated the role that fluency--the subjective experience of ease of information processing--plays in the emotional intensity modulations as reactions to deterrents. Following an induction phase of good mood, we manipulated both the magnitude of deterrents (using sets of photographs with pre-tested potential to instigate an emotion incompatible with the pre-existent affective state--pity) and their processing fluency (normal vs. enhanced through subliminal priming). Current affective state and perception of deterrents were then measured. In the normal processing conditions, the results revealed the cubic effect predicted by the emotional intensity theory, with the initial affective state being replaced by the one appropriate to the deterrent only in participants exposed to the high magnitude deterrence. In the enhanced fluency conditions the emotional intensity pattern was drastically altered; also, the replacement of the initial affective state occurred at a lower level of deterrence magnitude (moderate instead of high), suggesting the strengthening of deterrence emotional impact by enhanced fluency.
[Mourning and depression, from the attachment theory perspective].
Wolfberg, Elsa; Ekboir, Alberto; Faiman, Graciela; Finzi, Josefina; Freedman, Margarita; Heath, Adela; Martínez de Cipolatti, María C
2011-01-01
Since depression, according to the WHO, is such a widespread condition worldwide, it is necessary to be able to distinguish normal mourning from pathological mourning and from depression, so that patients and health professionals are able to support a normal mourning without medicating or hurrying it, and so that depression can be treated adequately when it appears as a complication. Attachment theory approaches mourning after loss with notions such as: (1) that searching for the lost person is accepted as normal; (2) that mourning in children may have non-pathological outcomes; (3) that unprocessed mourning may be transmitted intergenerationally; and (4) a definition of which elements may determine a pathological mourning or a depression. A clinical case is presented with an analysis of these notions.
Influence of phase inversion on the formation and stability of one-step multiple emulsions.
Morais, Jacqueline M; Rocha-Filho, Pedro A; Burgess, Diane J
2009-07-21
A novel method of preparation of water-in-oil-in-micelle-containing water (W/O/W(m)) multiple emulsions using the one-step emulsification method is reported. These multiple emulsions were normal (not temporary) and stable over a 60 day test period. Previously reported multiple emulsions prepared by the one-step method were abnormal systems that formed at the inversion point of simple emulsions (where there is an incompatibility between the Ostwald and Bancroft theories, and typically these are O/W/O systems). Pseudoternary phase diagrams and bidimensional process-composition (phase inversion) maps were constructed to assist in process and composition optimization. The surfactants used were PEG40 hydrogenated castor oil and sorbitan oleate, and mineral and vegetable oils were investigated. Physicochemical characterization studies showed experimentally, for the first time, the significance of the ultralow surface tension point on multiple emulsion formation by the one-step method via phase inversion processes. Although the significance of ultralow surface tension has been speculated upon previously, to the best of our knowledge this is the first experimental confirmation. The multiple emulsion system reported here was dependent not only upon the emulsification temperature, but also upon the component ratios, therefore both the emulsion phase inversion and the phase inversion temperature were considered to fully explain their formation. Accordingly, it is hypothesized that the formation of these normal multiple emulsions is not a result of a temporary incompatibility (at the inversion point) during simple emulsion preparation, as previously reported. Rather, these normal W/O/W(m) emulsions are a result of the simultaneous occurrence of catastrophic and transitional phase inversion processes. The formation of the primary emulsions (W/O) is in accordance with the Ostwald theory, and the formation of the multiple emulsions (W/O/W(m)) is in agreement with the Bancroft theory.
Towards a general theory of implementation.
May, Carl
2013-02-13
Understanding and evaluating the implementation of complex interventions in practice is an important problem for healthcare managers and policy makers, and for patients and others who must operationalize them beyond formal clinical settings. It has been argued that this work should be founded on theory that provides a foundation for understanding, designing, predicting, and evaluating dynamic implementation processes. This paper sets out core constituents of a general theory of implementation, building on Normalization Process Theory and linking it to key constructs from recent work in sociology and psychology. These are informed by ideas about agency and its expression within social systems and fields, social and cognitive mechanisms, and collective action. This approach unites a number of contending perspectives in a way that makes possible a more comprehensive explanation of the implementation and embedding of new ways of thinking, enacting and organizing practice.
On Nonequivalence of Several Procedures of Structural Equation Modeling
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Chan, Wai
2005-01-01
The normal theory based maximum likelihood procedure is widely used in structural equation modeling. Three alternatives are: the normal theory based generalized least squares, the normal theory based iteratively reweighted least squares, and the asymptotically distribution-free procedure. When data are normally distributed and the model structure…
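For reference, the discrepancy functions behind the first two procedures take the following standard textbook forms (S is the sample covariance matrix of p observed variables and Σ(θ) the model-implied covariance matrix; this is generic notation, not drawn from the paper):

$$ F_{\mathrm{ML}}(\theta) = \ln|\Sigma(\theta)| + \operatorname{tr}\!\bigl[S\,\Sigma(\theta)^{-1}\bigr] - \ln|S| - p, \qquad F_{\mathrm{GLS}}(\theta) = \tfrac{1}{2}\operatorname{tr}\!\Bigl\{\bigl[(S - \Sigma(\theta))\,S^{-1}\bigr]^{2}\Bigr\}. $$

The asymptotically distribution-free procedure instead minimizes a quadratic form in the residual moments s − σ(θ), weighted by an estimate of the asymptotic covariance matrix of the sample moments, which is what frees it from the normality assumption.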
ERIC Educational Resources Information Center
Gibbons, Robert D.; And Others
In the process of developing a conditionally-dependent item response theory (IRT) model, the problem arose of modeling an underlying multivariate normal (MVN) response process with general correlation among the items. Without the assumption of conditional independence, for which the underlying MVN cdf takes on comparatively simple forms and can be…
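For orientation, the usual normal-ogive IRT model assumes conditional independence, so each item response probability is a univariate normal cdf; once general correlation among the items is allowed, as in the record above, the probability of a whole response pattern becomes an orthant probability of a multivariate normal (a schematic textbook form, not the authors' exact parameterization):

$$ P(U_j = 1 \mid \theta) = \Phi(a_j\theta + c_j), \qquad P(u_1,\dots,u_p) = \int_{R_1(u_1)}\!\!\cdots\!\int_{R_p(u_p)} \phi_p(\mathbf{z};\,\mathbf{0},\,\mathbf{R})\, d\mathbf{z}, $$

where R_j(u_j) is (τ_j, ∞) if u_j = 1 and (−∞, τ_j] otherwise, and φ_p(·; 0, R) is the p-variate normal density with correlation matrix R. It is the absence of comparatively simple forms for this MVN integral when R is unrestricted that the record refers to.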
Personality dimensions of people who suffer from visual stress.
Hollis, J; Allen, P M; Fleischmann, D; Aulak, R
2007-11-01
Personality dimensions of participants who suffer from visual stress were compared with those of normal participants using the Eysenck Personality Inventory. Extraversion-Introversion scores showed no significant differences between the participants who suffered visual stress and those who were classified as normal. By contrast, significant differences were found between the normal participants and those with visual stress in respect of Neuroticism-Stability. These differences accord with Eysenck's personality theory which states that those who score highly on the neuroticism scale do so because they have a neurological system with a low threshold such that their neurological system is easily activated by external stimuli. The findings also relate directly to the theory of visual stress proposed by Wilkins which postulates that visual stress results from an excess of neural activity. The data may indicate that the excess activity is likely to be localised at particular neurological regions or neural processes.
Measurement theory in local quantum physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okamura, Kazuya, E-mail: okamura@math.cm.is.nagoya-u.ac.jp; Ozawa, Masanao, E-mail: ozawa@is.nagoya-u.ac.jp
In this paper, we aim to establish foundations of measurement theory in local quantum physics. For this purpose, we discuss a representation theory of completely positive (CP) instruments on arbitrary von Neumann algebras. We introduce a condition called the normal extension property (NEP) and establish a one-to-one correspondence between CP instruments with the NEP and statistical equivalence classes of measuring processes. We show that every CP instrument on an atomic von Neumann algebra has the NEP, extending the well-known result for type I factors. Moreover, we show that every CP instrument on an injective von Neumann algebra is approximated by CP instruments with the NEP. The concept of posterior states is also discussed to show that the NEP is equivalent to the existence of a strongly measurable family of posterior states for every normal state. Two examples of CP instruments without the NEP are obtained from this result. It is thus concluded that in local quantum physics not every CP instrument represents a measuring process, but in most of physically relevant cases every CP instrument can be realized by a measuring process within arbitrary error limits, as every approximately finite dimensional von Neumann algebra on a separable Hilbert space is injective. To conclude the paper, the concept of local measurement in algebraic quantum field theory is examined in our framework. In the setting of the Doplicher-Haag-Roberts and Doplicher-Roberts theory describing local excitations, we show that an instrument on a local algebra can be extended to a local instrument on the global algebra if and only if it is a CP instrument with the NEP, provided that the split property holds for the net of local algebras.
Dimensional modeling: beyond data processing constraints.
Bunardzic, A
1995-01-01
The focus of information processing requirements is shifting from the on-line transaction processing (OLTP) issues to the on-line analytical processing (OLAP) issues. While the former serves to ensure the feasibility of the real-time on-line transaction processing (which has already exceeded a level of up to 1,000 transactions per second under normal conditions), the latter aims at enabling more sophisticated analytical manipulation of data. The OLTP requirements, or how to efficiently get data into the system, have been solved by applying the Relational theory in the form of Entity-Relation model. There is presently no theory related to OLAP that would resolve the analytical processing requirements as efficiently as Relational theory provided for the transaction processing. The "relational dogma" also provides the mathematical foundation for the Centralized Data Processing paradigm in which mission-critical information is incorporated as 'one and only one instance' of data, thus ensuring data integrity. In such surroundings, the information that supports business analysis and decision support activities is obtained by running predefined reports and queries that are provided by the IS department. In today's intensified competitive climate, businesses are finding that this traditional approach is not good enough. The only way to stay on top of things, and to survive and prosper, is to decentralize the IS services. The newly emerging Distributed Data Processing, with its increased emphasis on empowering the end user, does not seem to find enough merit in the relational database model to justify relying upon it. Relational theory proved too rigid and complex to accommodate the analytical processing needs. In order to satisfy the OLAP requirements, or how to efficiently get the data out of the system, different models, metaphors, and theories have been devised. All of them are pointing to the need for simplifying the highly non-intuitive mathematical constraints found in the relational databases normalized to their 3rd normal form. Object-oriented approach insists on the importance of the common sense component of the data processing activities. But, particularly interesting, is the approach that advocates the necessity of 'flattening' the structure of the business models as we know them today. This discipline is called Dimensional Modeling and it enables users to form multidimensional views of the relevant facts which are stored in a 'flat' (non-structured), easy-to-comprehend and easy-to-access database. When using dimensional modeling, we relax many of the axioms inherent in a relational model. We focus on the knowledge of the relevant facts which are reflecting the business operations and are the real basis for the decision support and business analysis. At the core of the dimensional modeling are fact tables that contain the non-discrete, additive data. To determine the level of aggregation of these facts, we use granularity tables that specify the resolution, or the level/detail, that the user is allowed to entertain. The third component is dimension tables that embody the knowledge of the constraints to be used to form the views.
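To make the fact/dimension/granularity vocabulary concrete, here is a minimal star-schema sketch (illustrative only; the table and column names are hypothetical and not taken from the article): additive facts are stored at a declared grain, and dimension tables supply the attributes users constrain when forming multidimensional views.

```python
# Minimal star-schema sketch: one additive fact table at a declared grain,
# plus dimension tables used to slice and aggregate it.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, day TEXT, month TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_store   (store_key INTEGER PRIMARY KEY, city TEXT, region TEXT);

-- Grain (the granularity decision): one row per product, per store, per day.
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    store_key   INTEGER REFERENCES dim_store(store_key),
    units_sold  INTEGER,   -- additive, non-discrete facts
    revenue     REAL
);

INSERT INTO dim_date    VALUES (1, '01', 'Jan', 1995);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware');
INSERT INTO dim_store   VALUES (1, 'Boston', 'East');
INSERT INTO fact_sales  VALUES (1, 1, 1, 10, 250.0);
""")

# A typical analytical (OLAP-style) query: constrain by dimension attributes,
# aggregate the additive facts.
query = """
SELECT d.year, p.category, SUM(f.revenue) AS total_revenue
FROM fact_sales f
JOIN dim_date d    ON f.date_key = d.date_key
JOIN dim_product p ON f.product_key = p.product_key
GROUP BY d.year, p.category;
"""
for row in conn.execute(query):
    print(row)   # (1995, 'Hardware', 250.0)
```

Compared with a fully normalized (3rd normal form) design, the dimensions are deliberately flattened: the query needs only the joins from the fact table outward, which is the simplification the article argues analytical users need.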
Normal theory procedures for calculating upper confidence limits (UCL) on the risk function for continuous responses work well when the data come from a normal distribution. However, if the data come from an alternative distribution, the application of the normal theory procedure...
Tierney, Edel; McEvoy, Rachel; O'Reilly-de Brún, Mary; de Brún, Tomas; Okonkwo, Ekaterina; Rooney, Michelle; Dowrick, Chris; Rogers, Anne; MacFarlane, Anne
2016-06-01
There have been recent important advances in conceptualizing and operationalizing involvement in health research and health-care service development. However, problems persist in the field that impact on the scope for meaningful involvement to become a routine - normalized - way of working in primary care. In this review, we focus on current practice to critically interrogate factors known to be relevant for normalization - definition, enrolment, enactment and appraisal. Ours was a multidisciplinary, interagency team, with community representation. We searched EBSCO host for papers from 2007 to 2011 and engaged in an iterative, reflexive approach to sampling, appraising and analysing the literature following the principles of a critical interpretive synthesis approach and using Normalization Process Theory. Twenty-six papers were chosen from 289 papers, as a purposeful sample of work that is reported as service user involvement in the field. Few papers provided a clear working definition of service user involvement. The dominant identified rationale for enrolling service users in primary care projects was linked with policy imperatives for co-governance and emancipatory ideals. The majority of methodologies employed were standard health services research methods that do not qualify as research with service users. This indicates a lack of congruence between the stated aims and methods. Most studies only reported positive outcomes, raising questions about the balance or completeness of the published appraisals. To improve normalization of meaningful involvement in primary care, it is necessary to encourage explicit reporting of definitions, methodological innovation to enhance co-governance and dissemination of research processes and findings. © 2014 The Authors Health Expectations Published by John Wiley & Sons Ltd.
Modeling of active transmembrane transport in a mixture theory framework.
Ateshian, Gerard A; Morrison, Barclay; Hung, Clark T
2010-05-01
This study formulates governing equations for active transport across semi-permeable membranes within the framework of the theory of mixtures. In mixture theory, which models the interactions of any number of fluid and solid constituents, a supply term appears in the conservation of linear momentum to describe momentum exchanges among the constituents. In past applications, this momentum supply was used to model frictional interactions only, thereby describing passive transport processes. In this study, it is shown that active transport processes, which impart momentum to solutes or solvent, may also be incorporated in this term. By projecting the equation of conservation of linear momentum along the normal to the membrane, a jump condition is formulated for the mechano-electrochemical potential of fluid constituents which is generally applicable to nonequilibrium processes involving active transport. The resulting relations are simple and easy to use, and address an important need in the membrane transport literature.
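Schematically, and in a simplified quasi-static form that ignores body forces and inertia (generic mixture-theory notation; the paper's full equations are more detailed), the momentum balance for constituent α reads

$$ \nabla\cdot\boldsymbol{\sigma}^{\alpha} + \mathbf{p}^{\alpha} = \mathbf{0}, \qquad \sum_{\alpha}\mathbf{p}^{\alpha} = \mathbf{0}, $$

where σ^α is the partial stress of constituent α and p^α is the momentum supply it receives from the other constituents. In purely passive transport, p^α contains only frictional (drag) terms proportional to relative velocities; the study's point is that an additional, actively generated contribution to p^α can represent pumps that impart momentum to solutes or solvent across the membrane.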
Personal Pronouns and the Autistic Child.
ERIC Educational Resources Information Center
Fay, Warren H.
1979-01-01
Current theory and research in development of self and of language in autistic children is considered, with emphasis on studies of normal development of personal pronouns and the roles played in that process by listening, echoic memory, mitigated echolalia (recording), and person deixis. (Author)
MacFarlane, Anne; O'Donnell, Catherine; Mair, Frances; O'Reilly-de Brún, Mary; de Brún, Tomas; Spiegel, Wolfgang; van den Muijsenbergh, Maria; van Weel-Baumgarten, Evelyn; Lionis, Christos; Burns, Nicola; Gravenhorst, Katja; Princz, Christine; Teunissen, Erik; van den Driessen Mareeuw, Francine; Saridaki, Aristoula; Papadakaki, Maria; Vlahadi, Maria; Dowrick, Christopher
2012-11-20
The implementation of guidelines and training initiatives to support communication in cross-cultural primary care consultations is ad hoc across a range of international settings with negative consequences particularly for migrants. This situation reflects a well-documented translational gap between evidence and practice and is part of the wider problem of implementing guidelines and the broader range of professional educational and quality interventions in routine practice. In this paper, we describe our use of a contemporary social theory, Normalization Process Theory and participatory research methodology--Participatory Learning and Action--to investigate and support implementation of such guidelines and training initiatives in routine practice. This is a qualitative case study, using multiple primary care sites across Europe. Purposive and maximum variation sampling approaches will be used to identify and recruit stakeholders-migrant service users, general practitioners, primary care nurses, practice managers and administrative staff, interpreters, cultural mediators, service planners, and policy makers. We are conducting a mapping exercise to identify relevant guidelines and training initiatives. We will then initiate a PLA-brokered dialogue with stakeholders around Normalization Process Theory's four constructs--coherence, cognitive participation, collective action, and reflexive monitoring. Through this, we will enable stakeholders in each setting to select a single guideline or training initiative for implementation in their local setting. We will prospectively investigate and support the implementation journeys for the five selected interventions. Data will be generated using a Participatory Learning and Action approach to interviews and focus groups. Data analysis will follow the principles of thematic analysis, will occur in iterative cycles throughout the project and will involve participatory co-analysis with key stakeholders to enhance the authenticity and veracity of findings. This research employs a unique combination of Normalization Process Theory and Participatory Learning and Action, which will provide a novel approach to the analysis of implementation journeys. The findings will advance knowledge in the field of implementation science because we are using and testing theoretical and methodological approaches so that we can critically appraise their scope to mediate barriers and improve the implementation processes.
NASA Technical Reports Server (NTRS)
Xu, Jian-Jun
1989-01-01
The complicated dendritic structure of a growing needle crystal is studied on the basis of global interfacial wave theory. The local dispersion relation for normal modes is derived in a paraboloidal coordinate system using the multiple-variable-expansion method. It is shown that the global solution in a dendrite growth process incorporates the morphological instability factor and the traveling wave factor.
Integrative mental health care: from theory to practice, Part 2.
Lake, James
2008-01-01
Integrative approaches will lead to more accurate and different understandings of mental illness. Beneficial responses to complementary and alternative therapies provide important clues about the phenomenal nature of the human body in space-time and disparate biological, informational, and energetic factors associated with normal and abnormal psychological functioning. The conceptual framework of contemporary Western psychiatry includes multiple theoretical viewpoints, and there is no single best explanatory model of mental illness. Future theories of mental illness causation will not depend exclusively on empirical verification of strictly biological processes but will take into account both classically described biological processes and non-classical models, including complexity theory, resulting in more complete explanations of the characteristics and causes of symptoms and mechanisms of action that result in beneficial responses to treatments. Part 1 of this article examined the limitations of the theory and contemporary clinical methods employed in Western psychiatry and discussed implications of emerging paradigms in physics and the biological sciences for the future of psychiatry. In part 2, a practical methodology, for planning integrative assessment and treatment strategies in mental health care is proposed. Using this methodology the integrative management of moderate and severe psychiatric symptoms is reviewed in detail. As the conceptual framework of Western medicine evolves toward an increasingly integrative perspective, novel understanding of complex relationships between biological, informational, and energetic processes associated with normal psychological functioning and mental illness will lead to more effective integrative assessment and treatment strategies addressing the causes or meanings of symptoms at multiple hierarchic levels of body-brain-mind.
Integrative mental health care: from theory to practice, part 1.
Lake, James
2007-01-01
Integrative approaches will lead to more accurate and different understandings of mental illness. Beneficial responses to complementary and alternative therapies provide important clues about the phenomenal nature of the human body in space-time and disparate biological, informational, and energetic factors associated with normal and abnormal psychological functioning. The conceptual framework of contemporary Western psychiatry includes multiple theoretical viewpoints, and there is no single best explanatory model of mental illness. Future theories of mental illness causation will not depend exclusively on empirical verification of strictly biological processes but will take into account both classically described biological processes and non-classical models, including complexity theory, resulting in more complete explanations of the characteristics and causes of symptoms and mechanisms of action that result in beneficial responses to treatments. Part 1 of this article examines the limitations of the theory and contemporary clinical methods employed in Western psychiatry and discusses implications of emerging paradigms in physics and the biological sciences for the future of psychiatry. In part 2, a practical methodology for planning integrative assessment and treatment strategies in mental health care is proposed. Using this methodology the integrative management of moderate and severe psychiatric symptoms is reviewed in detail. As the conceptual framework of Western medicine evolves toward an increasingly integrative perspective, novel understandings of complex relationships between biological, informational, and energetic processes associated with normal psychological functioning and mental illness will lead to more effective integrative assessment and treatment strategies addressing the causes or meanings of symptoms at multiple hierarchic levels of body-brain-mind.
Extracting features of Gaussian self-similar stochastic processes via the Bandt-Pompe approach.
Rosso, O A; Zunino, L; Pérez, D G; Figliola, A; Larrondo, H A; Garavaglia, M; Martín, M T; Plastino, A
2007-12-01
By recourse to appropriate information theory quantifiers (normalized Shannon entropy and the Martín-Plastino-Rosso intensive statistical complexity measure), we revisit the characterization of Gaussian self-similar stochastic processes from a Bandt-Pompe viewpoint. We show that the ensuing approach exhibits considerable advantages with respect to other treatments. In particular, clear gaps in the quantifiers are found in the transition between the continuous processes and their associated noises.
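A minimal sketch of the Bandt-Pompe construction on which these quantifiers are built (a generic implementation under the usual choices of embedding dimension and delay; it is not the authors' code): map the series to ordinal patterns, estimate their probabilities, and compute the normalized Shannon entropy.

```python
# Bandt-Pompe ordinal patterns and normalized permutation entropy.
from itertools import permutations
from math import log
import random

def permutation_entropy(x, d=4, tau=1):
    """Normalized Shannon entropy of ordinal patterns of embedding dimension d."""
    patterns = {p: 0 for p in permutations(range(d))}
    n = 0
    for i in range(0, len(x) - (d - 1) * tau):
        window = [x[i + j * tau] for j in range(d)]
        pattern = tuple(sorted(range(d), key=lambda k: window[k]))  # argsort = ordinal pattern
        patterns[pattern] += 1
        n += 1
    probs = [c / n for c in patterns.values() if c > 0]
    h = -sum(p * log(p) for p in probs)
    return h / log(len(patterns))   # divide by log(d!) to normalize to [0, 1]

random.seed(0)
white_noise = [random.gauss(0, 1) for _ in range(10_000)]
print(permutation_entropy(white_noise))  # close to 1 for uncorrelated noise
```

The intensive statistical complexity measure is then formed from this entropy together with a disequilibrium term, the Jensen-Shannon divergence between the estimated pattern distribution and the uniform one.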
Shankle, William R; Pooley, James P; Steyvers, Mark; Hara, Junko; Mangrola, Tushar; Reisberg, Barry; Lee, Michael D
2013-01-01
Determining how cognition affects functional abilities is important in Alzheimer disease and related disorders. A total of 280 patients (normal or Alzheimer disease and related disorders) received a total of 1514 assessments using the functional assessment staging test (FAST) procedure and the MCI Screen. A hierarchical Bayesian cognitive processing model was created by embedding a signal detection theory model of the MCI Screen-delayed recognition memory task into a hierarchical Bayesian framework. The signal detection theory model used latent parameters of discriminability (memory process) and response bias (executive function) to predict, simultaneously, recognition memory performance for each patient and each FAST severity group. The observed recognition memory data did not distinguish the 6 FAST severity stages, but the latent parameters completely separated them. The latent parameters were also used successfully to transform the ordinal FAST measure into a continuous measure reflecting the underlying continuum of functional severity. Hierarchical Bayesian cognitive processing models applied to recognition memory data from clinical practice settings accurately translated a latent measure of cognition into a continuous measure of functional severity for both individuals and FAST groups. Such a translation links 2 levels of brain information processing and may enable more accurate correlations with other levels, such as those characterized by biomarkers.
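For orientation, under the standard equal-variance signal detection model the two latent parameters can be read off from a hit rate H and false-alarm rate F as

$$ d' = \Phi^{-1}(H) - \Phi^{-1}(F), \qquad c = -\tfrac{1}{2}\bigl[\Phi^{-1}(H) + \Phi^{-1}(F)\bigr], $$

with Φ^{-1} the inverse standard normal cdf; d' plays the role of discriminability (the memory process) and c of response bias (executive function). These are only the textbook point estimates: the study instead places the parameters in a hierarchical Bayesian model so that individual patients and FAST severity groups are estimated jointly.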
Myers-Briggs typology and Jungian individuation.
Myers, Steve
2016-06-01
Myers-Briggs typology is widely seen as equivalent to and representative of Jungian theory by the users of the Myers-Briggs Type Indicator (MBTI) and similar questionnaires. However, the omission of the transcendent function from the theory, and the use of typological functions as its foundation, has resulted in an inadvertent reframing of the process of individuation. This is despite some attempts to integrate individuation and typology, and reintroduce the transcendent function into Myers-Briggs theory. This paper examines the differing views of individuation in Myers-Briggs and Jungian theory, and some of the challenges of reconciling those differences, particularly in the context of normality. It proposes eight principles, drawn mainly from Jungian and classical post-Jungian work, that show how individuation as a process can be integrated with contemporary Myers-Briggs typology. These principles show individuation as being a natural process that can be encouraged outside of the analytic process. They make use of a wide range of opposites as well as typological functions, whilst being centred on the transcendent function. Central to the process is the alchemical image of the caduceus and a practical interpretation of the axiom of Maria, both of which Jung used to illustrate the process of individuation. © 2016, The Society of Analytical Psychology.
Nordmark, Sofi; Zingmark, Karin; Lindberg, Inger
2016-04-27
Discharge planning is a care process that aims to secure the transfer of care for the patient at the transition from home to hospital and back home. Information exchange and collaboration between care providers are essential, but deficits are common. A wide range of initiatives to improve the discharge planning process (DPP) have been developed and implemented over the past three decades. However, there are still high rates of reported medical errors and adverse events related to failures in discharge planning. Theoretical frameworks such as Normalization Process Theory (NPT) can support evaluations of complex interventions and processes in healthcare. The aim of this study was to explore the embedding and integration of the DPP from the perspective of registered nurses, district nurses and homecare organizers. The study design was exploratory, using NPT as a framework to explore the embedding and integration of the DPP. Data consisted of written documentation from workshops with staff, registered adverse events and system failures, a web-based survey, and individual interviews with staff. Using NPT as a framework to explore the embedding and integration of discharge planning after 10 years in use showed that the staff had reached a consensus on what the process was (coherence) and how they evaluated the process (reflexive monitoring). However, they had not reached a consensus on who performed the process (cognitive participation) and how it was performed (collective action). This could be interpreted as meaning that the process had not become normalized in daily practice. The results show the necessity of observing how old practices have been implemented, in order to better understand the needs of new ones, before developing and implementing new practices or supportive tools within healthcare; this is needed to reach development aims and to accomplish sustainable implementation. NPT offers a generalizable framework for analysis, which can explain and shape the implementation process of old practices before further development of new practices or supportive tools.
On simple aerodynamic sensitivity derivatives for use in interdisciplinary optimization
NASA Technical Reports Server (NTRS)
Doggett, Robert V., Jr.
1991-01-01
Low-aspect-ratio and piston aerodynamic theories are reviewed as to their use in developing aerodynamic sensitivity derivatives for use in multidisciplinary optimization applications. The basic equations relating surface pressure (or lift and moment) to normal wash are given and discussed briefly for each theory. The general means for determining selected sensitivity derivatives are pointed out. In addition, some suggestions in very general terms are included as to sample problems for use in studying the process of using aerodynamic sensitivity derivatives in optimization studies.
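For readers unfamiliar with the quantities involved, the sketch below illustrates the flavor of such a sensitivity derivative using first-order piston theory, which relates the local pressure coefficient to the normal wash w as Cp ≈ 2w/(M·U∞); for a flat plate at incidence w = ±U∞·α, giving CL = 4α/M. This is only an assumption-laden illustration, not the report's own derivation.

```python
import numpy as np

def piston_theory_cl(alpha, mach):
    """Flat-plate lift coefficient from first-order piston theory.
    Each surface contributes Cp = 2*w/(M*U) with w = +/- U*alpha, so CL = 4*alpha/M."""
    return 4.0 * alpha / mach

def dcl_dalpha(alpha, mach, h=1e-6):
    """Sensitivity derivative dCL/dalpha by central finite difference (analytically 4/M)."""
    return (piston_theory_cl(alpha + h, mach) - piston_theory_cl(alpha - h, mach)) / (2.0 * h)

alpha, mach = np.deg2rad(2.0), 5.0
print(piston_theory_cl(alpha, mach))          # lift coefficient at 2 degrees incidence, Mach 5
print(dcl_dalpha(alpha, mach), 4.0 / mach)    # finite-difference vs analytic sensitivity
```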
Theories of Impaired Consciousness in Epilepsy
Yu, Lissa; Blumenfeld, Hal
2015-01-01
Although the precise mechanisms for control of consciousness are not fully understood, emerging data show that conscious information processing depends on the activation of certain networks in the brain and that the impairment of consciousness is related to abnormal activity in these systems. Epilepsy can lead to transient impairment of consciousness, providing a window into the mechanisms necessary for normal consciousness. Thus, despite differences in behavioral manifestations, cause, and electrophysiology, generalized tonic–clonic, absence, and partial seizures engage similar anatomical structures and pathways. We review prior concepts of impaired consciousness in epilepsy, focusing especially on temporal lobe complex partial seizures, which are a common and debilitating form of epileptic unconsciousness. We discuss a “network inhibition hypothesis” in which focal temporal lobe seizure activity disrupts normal cortical–subcortical interactions, leading to depressed neocortical function and impaired consciousness. This review of the major prior theories of impaired consciousness in epilepsy allows us to put more recent data into context and to reach a better understanding of the mechanisms important for normal consciousness. PMID:19351355
Increased heart rate after exercise facilitates the processing of fearful but not disgusted faces.
Pezzulo, G; Iodice, P; Barca, L; Chausse, P; Monceau, S; Mermillod, M
2018-01-10
Embodied theories of emotion assume that emotional processing is grounded in bodily and affective processes. Accordingly, the perception of an emotion re-enacts congruent sensory and affective states; and conversely, bodily states congruent with a specific emotion facilitate emotional processing. This study tests whether the ability to process facial expressions (faces having a neutral expression, expressing fear, or disgust) can be influenced by making the participants' body state congruent with the expressed emotion (e.g., high heart rate in the case of faces expressing fear). We designed a task requiring participants to categorize pictures of male and female faces that either had a neutral expression (neutral), or expressed emotions whose linkage with high heart rate is strong (fear) or significantly weaker or absent (disgust). Critically, participants were tested in two conditions: with experimentally induced high heart rate (Exercise) and with normal heart rate (Normal). Participants processed fearful faces (but not disgusted or neutral faces) faster when they were in the Exercise condition than in the Normal condition. These results support the idea that an emotionally congruent body state facilitates the automatic processing of emotionally-charged stimuli and this effect is emotion-specific rather than due to generic factors such as arousal.
1983-08-01
Tailored Testing Theory and Practice: A Basic Model, Normal Ogive Submodels, and Tailored Testing Algorithms (NPRDC TR 83-32, August 1983). From a single common-factor model, the author derives the two- and three-parameter normal ogive functions as submodels. For both of these …
Impaired face detection may explain some but not all cases of developmental prosopagnosia.
Dalrymple, Kirsten A; Duchaine, Brad
2016-05-01
Developmental prosopagnosia (DP) is defined by severe face recognition difficulties due to the failure to develop the visual mechanisms for processing faces. The two-process theory of face recognition (Morton & Johnson, 1991) implies that DP could result from a failure of an innate face detection system; this failure could prevent an individual from then tuning higher-level processes for face recognition (Johnson, 2005). Work with adults indicates that some individuals with DP have normal face detection whereas others are impaired. However, face detection has not been addressed in children with DP, even though their results may be especially informative because they have had less opportunity to develop strategies that could mask detection deficits. We tested the face detection abilities of seven children with DP. Four were impaired at face detection to some degree (i.e. abnormally slow, or failed to find faces) while the remaining three children had normal face detection. Hence, the cases with impaired detection are consistent with the two-process account suggesting that DP could result from a failure of face detection. However, the cases with normal detection implicate a higher-level origin. The dissociation between normal face detection and impaired identity perception also indicates that these abilities depend on different neurocognitive processes. © 2015 John Wiley & Sons Ltd.
Qualitative and Quantitative Distinctions in Personality Disorder
Wright, Aidan G. C.
2011-01-01
The “categorical-dimensional debate” has catalyzed a wealth of empirical advances in the study of personality pathology. However, this debate is merely one articulation of a broader conceptual question regarding whether to define and describe psychopathology as a quantitatively extreme expression of normal functioning or as qualitatively distinct in its process. In this paper I argue that dynamic models of personality (e.g., object-relations, cognitive-affective processing system) offer the conceptual scaffolding to reconcile these seemingly incompatible approaches to characterizing the relationship between normal and pathological personality. I propose that advances in personality assessment that sample behavior and experiences intensively provide the empirical techniques, whereas interpersonal theory offers an integrative theoretical framework, for accomplishing this goal. PMID:22804676
On the reversibility of the Meissner effect and the angular momentum puzzle
NASA Astrophysics Data System (ADS)
Hirsch, J. E.
2016-10-01
It is generally believed that the laws of thermodynamics govern superconductivity as an equilibrium state of matter, and hence that the normal-superconductor transition in a magnetic field is reversible under ideal conditions. Because eddy currents are generated during the transition as the magnetic flux changes, the transition has to proceed infinitely slowly to generate no entropy. Experiments showed that to a high degree of accuracy no entropy was generated in these transitions. However, in this paper we point out that for the length of times over which these experiments extended, a much higher degree of irreversibility due to decay of eddy currents should have been detected than was actually observed. We also point out that within the conventional theory of superconductivity no explanation exists for why no Joule heat is generated in the superconductor to normal transition when the supercurrent stops. In addition we point out that within the conventional theory of superconductivity no mechanism exists for the transfer of momentum between the supercurrent and the body as a whole, which is necessary to ensure that the transition in the presence of a magnetic field respects momentum conservation. We propose a solution to all these questions based on the alternative theory of hole superconductivity. The theory proposes that in the normal-superconductor transition there is a flow and backflow of charge in direction perpendicular to the phase boundary when the phase boundary moves. We show that this flow and backflow explains the absence of Joule heat generated by Faraday eddy currents, the absence of Joule heat generated in the process of the supercurrent stopping, and the reversible transfer of momentum between the supercurrent and the body, provided the current carriers in the normal state are holes.
Left hemisphere regions are critical for language in the face of early left focal brain injury.
Raja Beharelle, Anjali; Dick, Anthony Steven; Josse, Goulven; Solodkin, Ana; Huttenlocher, Peter R; Levine, Susan C; Small, Steven L
2010-06-01
A predominant theory regarding early stroke and its effect on language development is that early left hemisphere lesions trigger compensatory processes that allow the right hemisphere to assume dominant language functions, and this is thought to underlie the near-normal language development observed after early stroke. To test this theory, we used functional magnetic resonance imaging to examine brain activity during category fluency in participants who had sustained pre- or perinatal left hemisphere stroke (n = 25) and in neurologically normal siblings (n = 27). In typically developing children, performance of a category fluency task elicits strong involvement of left frontal and lateral temporal regions and a lesser involvement of right hemisphere structures. In our cohort of atypically developing participants with early stroke, expressive and receptive language skills correlated with activity in the same left inferior frontal regions that support language processing in neurologically normal children. This was true independent of either the amount of brain injury or the extent to which the injury was located in classical cortical language processing areas. Participants with bilateral activation in left and right superior temporal-inferior parietal regions had better language function than those with either predominantly left- or right-sided unilateral activation. The advantage conferred by left inferior frontal and bilateral temporal involvement demonstrated in our study supports a strong predisposition for typical neural language organization, despite an intervening injury, and argues against models suggesting that the right hemisphere fully accommodates language function following early injury.
Nakhaei, Maryam; Khankeh, Hamid Reza; Masoumi, Gholam Reza; Hosseini, Mohammad Ali; Parsa-Yekta, Zohreh
2016-01-01
Background: Since life recovery after disasters is a subjective and multifaceted construct influenced by different factors, and survivors' main concerns and experiences are not clear, the researchers intended to explore this process. Materials and Methods: This study was conducted in 2011-2014 based on the grounded theory approach. Participants were selected by purposeful sampling followed by theoretical sampling to achieve conceptual and theoretical saturation. Data were collected through interviews, observation, focus group discussion, and document reviews. Data were analyzed by Strauss and Corbin's (2008) recommended approach. Results: Transcribed data from 26 interviews (managers, health care providers, and receivers), field notes, and other documents were analyzed, and 1,652 open codes were identified. The codes were categorized, using constant comparative analysis, into five main categories including reactive exposure, subsiding emotions, need for comprehensive health recovery, improvement of normalization (new normality achievement), and contextual factors. The process of life recovery after disaster was also explored. Conclusions: The results clarified a deep perception of participants' experiences after disaster. The path of life recovery after disasters involves participants' striving to achieve a comprehensive health recovery, which starts with the need for all-inclusive health recovery as a main concern; this is the motivator for a responding strategy. This strategy is participatory, and the process is progressive; achievement of a new normality is the final goal, with new development and levels of empowerment. PMID:27703797
On the reversibility of the Meissner effect and the angular momentum puzzle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirsch, J.E., E-mail: jhirsch@ucsd.edu
It is generally believed that the laws of thermodynamics govern superconductivity as an equilibrium state of matter, and hence that the normal-superconductor transition in a magnetic field is reversible under ideal conditions. Because eddy currents are generated during the transition as the magnetic flux changes, the transition has to proceed infinitely slowly to generate no entropy. Experiments showed that to a high degree of accuracy no entropy was generated in these transitions. However, in this paper we point out that for the length of times over which these experiments extended, a much higher degree of irreversibility due to decay of eddy currents should have been detected than was actually observed. We also point out that within the conventional theory of superconductivity no explanation exists for why no Joule heat is generated in the superconductor to normal transition when the supercurrent stops. In addition we point out that within the conventional theory of superconductivity no mechanism exists for the transfer of momentum between the supercurrent and the body as a whole, which is necessary to ensure that the transition in the presence of a magnetic field respects momentum conservation. We propose a solution to all these questions based on the alternative theory of hole superconductivity. The theory proposes that in the normal-superconductor transition there is a flow and backflow of charge in direction perpendicular to the phase boundary when the phase boundary moves. We show that this flow and backflow explains the absence of Joule heat generated by Faraday eddy currents, the absence of Joule heat generated in the process of the supercurrent stopping, and the reversible transfer of momentum between the supercurrent and the body, provided the current carriers in the normal state are holes. Highlights: • The normal-superconductor phase transition is reversible. • Within the conventional theory, Foucault currents give rise to irreversibility. • To suppress Foucault currents, charge has to flow in direction perpendicular to the phase boundary. • The charge carriers have to be holes. • This also solves the angular momentum puzzle associated with the Meissner effect.
Toppi, J; Petti, M; Vecchiato, G; Cincotti, F; Salinari, S; Mattia, D; Babiloni, F; Astolfi, L
2013-01-01
Partial Directed Coherence (PDC) is a spectral multivariate estimator of effective connectivity relying on the concept of Granger causality. Even though its original definition derived directly from information theory, two modifications were introduced in order to provide better physiological interpretations of the estimated networks: (i) normalization of the estimator according to rows, and (ii) a squared transformation. In the present paper we investigated the effect of PDC normalization on the performance achieved when applying the statistical validation process to the investigated connectivity patterns under different conditions of signal-to-noise ratio (SNR) and amount of data available for the analysis. Results of the statistical analysis revealed an effect of PDC normalization only on the percentages of type I and type II errors obtained when the shuffling procedure was used to assess the connectivity patterns. The PDC formulation had no effect on the performance achieved when validation was instead executed by means of the asymptotic statistic approach. Moreover, the percentages of both false positives and false negatives committed by the asymptotic statistic approach were always lower than those achieved by the shuffling procedure, for each type of normalization.
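For concreteness, here is a hedged sketch (not the authors' pipeline) of how PDC can be computed from the coefficient matrices of a fitted MVAR model, with both the original column-wise normalization and the row-wise variant mentioned above; the toy two-channel MVAR(1) coefficients are made up.

```python
import numpy as np

def pdc(A, freqs, normalization="column"):
    """Partial Directed Coherence from MVAR coefficients.

    A     : array of shape (p, N, N) with the MVAR coefficient matrices A_1..A_p
    freqs : normalized frequencies in [0, 0.5]
    Returns |PDC| of shape (len(freqs), N, N); entry (f, i, j) is the influence j -> i.
    """
    p, N, _ = A.shape
    out = np.empty((len(freqs), N, N))
    for k, f in enumerate(freqs):
        Abar = np.eye(N, dtype=complex)
        for r in range(p):
            Abar -= A[r] * np.exp(-2j * np.pi * f * (r + 1))
        mag2 = np.abs(Abar) ** 2
        if normalization == "column":          # original column-wise definition
            denom = np.sqrt(mag2.sum(axis=0, keepdims=True))
        else:                                  # row-wise variant discussed in the paper
            denom = np.sqrt(mag2.sum(axis=1, keepdims=True))
        out[k] = np.abs(Abar) / denom
    return out

# toy 2-channel MVAR(1): channel 0 drives channel 1
A = np.array([[[0.5, 0.0],
               [0.4, 0.5]]])
print(pdc(A, freqs=[0.1, 0.25]).round(3))
```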
The experience of weight management in normal weight adults.
Hernandez, Cheri Ann; Hernandez, David A; Wellington, Christine M; Kidd, Art
2016-11-01
No prior research has been done with normal weight persons specific to their experience of weight management. The purpose of this research was to discover the experience of weight management in normal weight individuals. Glaserian grounded theory was used. Qualitative data (focus group) and quantitative data (food diary, study questionnaire, and anthropometric measures) were collected. Weight management was an ongoing process of trying to focus on living (family, work, and social), while maintaining their normal weight targets through five consciously and unconsciously used strategies. Despite maintaining normal weights, the nutritional composition of foods eaten was grossly inadequate. These five strategies can be used to develop new weight management strategies that could be integrated into existing weight management programs, or could be developed into novel weight management interventions. Surprisingly, normal weight individuals require dietary assessment and nutrition education to prevent future negative health consequences. Copyright © 2016 Elsevier Inc. All rights reserved.
Drug Use Normalization: A Systematic and Critical Mixed-Methods Review.
Sznitman, Sharon R; Taubman, Danielle S
2016-09-01
Drug use normalization, which is a process whereby drug use becomes less stigmatized and more accepted as normative behavior, provides a conceptual framework for understanding contemporary drug issues and changes in drug use trends. Through a mixed-methods systematic review of the normalization literature, this article seeks to (a) critically examine how the normalization framework has been applied in empirical research and (b) make recommendations for future research in this area. Twenty quantitative, 26 qualitative, and 4 mixed-methods studies were identified through five electronic databases and reference lists of published studies. Studies were assessed for relevance, study characteristics, quality, and aspects of normalization examined. None of the studies applied the most rigorous research design (experiments) or examined all of the originally proposed normalization dimensions. The most commonly assessed dimension of drug use normalization was "experimentation." In addition to the original dimensions, the review identified the following new normalization dimensions in the literature: (a) breakdown of demographic boundaries and other risk factors in relation to drug use; (b) de-normalization; (c) drug use as a means to achieve normal goals; and (d) two broad forms of micro-politics associated with managing the stigma of illicit drug use: assimilative and transformational normalization. Further development in normalization theory and methodology promises to provide researchers with a novel framework for improving our understanding of drug use in contemporary society. Specifically, quasi-experimental designs that are currently being made feasible by swift changes in cannabis policy provide researchers with new and improved opportunities to examine normalization processes.
An account of the Speech-to-Song Illusion using Node Structure Theory.
Castro, Nichol; Mendoza, Joshua M; Tampke, Elizabeth C; Vitevitch, Michael S
2018-01-01
In the Speech-to-Song Illusion, repetition of a spoken phrase results in it being perceived as if it were sung. Although a number of previous studies have examined which characteristics of the stimulus will produce the illusion, there is, until now, no description of the cognitive mechanism that underlies the illusion. We suggest that the processes found in Node Structure Theory that are used to explain normal language processing as well as other auditory illusions might also account for the Speech-to-Song Illusion. In six experiments we tested whether the satiation of lexical nodes, but continued priming of syllable nodes may lead to the Speech-to-Song Illusion. The results of these experiments provide evidence for the role of priming, activation, and satiation as described in Node Structure Theory as an explanation of the Speech-to-Song Illusion.
Traffic Flow Density Distribution Based on FEM
NASA Astrophysics Data System (ADS)
Ma, Jing; Cui, Jianming
In the analysis of normal traffic flow, static or dynamic models based on fluid mechanics are usually used for numerical analysis. However, this approach involves extensive modeling and data handling, and its accuracy is not high. The Finite Element Method (FEM) is a product of the combination of modern mechanics, mathematics and computer technology, and it has been widely applied in various domains such as engineering. Based on the existing theory of traffic flow, ITS and the development of FEM, a simulation theory of FEM that solves the problems existing in traffic flow analysis is put forward. Based on this theory, and using existing Finite Element Analysis (FEA) software, traffic flow is simulated and analyzed with fluid mechanics and dynamics. The problem of massive data processing in manual modeling and numerical analysis is solved, and the fidelity of the simulation is enhanced.
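The abstract does not give the underlying equations, so the following is only a stand-in sketch: the classical LWR continuity model of traffic density with a Greenshields speed-density relation, advanced with a simple Lax-Friedrichs finite-difference scheme rather than the ANSYS finite-element workflow the authors describe. All parameter values are illustrative.

```python
import numpy as np

# LWR model: d(rho)/dt + d(q(rho))/dx = 0, Greenshields flux q = rho*v_max*(1 - rho/rho_max)
v_max, rho_max = 30.0, 0.2                 # free-flow speed (m/s), jam density (veh/m) -- illustrative
L, nx, dt, steps = 2000.0, 200, 0.1, 500
dx = L / nx
x = np.linspace(0.0, L, nx)
rho = 0.05 + 0.1 * np.exp(-((x - L / 2) ** 2) / 2e4)   # initial platoon of vehicles

def flux(r):
    return r * v_max * (1.0 - r / rho_max)

for _ in range(steps):
    q = flux(rho)
    # Lax-Friedrichs update with periodic boundaries
    rho = 0.5 * (np.roll(rho, -1) + np.roll(rho, 1)) \
          - dt / (2.0 * dx) * (np.roll(q, -1) - np.roll(q, 1))

print(rho.max(), rho.min())   # density stays bounded and the platoon disperses downstream
```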
A Bayesian Semiparametric Item Response Model with Dirichlet Process Priors
ERIC Educational Resources Information Center
Miyazaki, Kei; Hoshino, Takahiro
2009-01-01
In Item Response Theory (IRT), item characteristic curves (ICCs) are illustrated through logistic models or normal ogive models, and the probability that examinees give the correct answer is usually a monotonically increasing function of their ability parameters. However, since only limited patterns of shapes can be obtained from logistic models…
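As a point of reference for the two ICC families named in the excerpt above (and not the authors' semiparametric Dirichlet-process model), here is a small sketch comparing the two-parameter logistic and normal ogive item characteristic curves; with the usual scaling constant D ≈ 1.7 the two nearly coincide.

```python
import numpy as np
from scipy.stats import norm

def icc_logistic(theta, a, b):
    """Two-parameter logistic ICC: P(correct | theta)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def icc_normal_ogive(theta, a, b):
    """Two-parameter normal ogive ICC: P(correct | theta)."""
    return norm.cdf(a * (theta - b))

theta = np.linspace(-3, 3, 7)
a, b = 1.0, 0.0
# With the scaling constant D = 1.7 the logistic curve closely approximates the normal ogive
print(np.max(np.abs(icc_logistic(theta, 1.7 * a, b) - icc_normal_ogive(theta, a, b))))  # about 0.01
```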
Midlife Divorce and Archetypes for Women.
ERIC Educational Resources Information Center
Bobo, Terry Skinner
Midlife divorce for women can be a time for creative growth or divorce can lead to loneliness, bitterness, and depression. Middle-aged women appear to experience an inordinate amount of stress from divorce because of loss of roles and lack of new role models. Based upon role theory and divorce as a normal developmental process, a feminist…
Preliminary Characterization of Erythrocytes Deformability on the Entropy-Complexity Plane
Korol, Ana M; D’Arrigo, Mabel; Foresto, Patricia; Pérez, Susana; Martín, Maria T; Rosso, Osvaldo A
2010-01-01
We present an application of wavelet-based Information Theory quantifiers (Normalized Total Shannon Entropy, MPR-Statistical Complexity and the Entropy-Complexity plane) to the characterization of red blood cell membrane viscoelasticity. These quantifiers exhibit important localization advantages provided by Wavelet Theory. The present approach produces a clear characterization of this dynamical system, revealing an evident manifestation of a random process in the red cell samples of healthy individuals, and a sharp reduction of that randomness when analyzing a human haematological disease such as β-thalassaemia minor. PMID:21611139
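A minimal sketch of the two quantifiers named above, assuming a probability distribution P has already been obtained (e.g., from wavelet energies or ordinal patterns): the normalized Shannon entropy H and the MPR statistical complexity C = Q_J·H, where Q_J is the Jensen-Shannon divergence between P and the uniform distribution, normalized by its maximum (attained at a delta distribution). This is an illustration, not the authors' code.

```python
import numpy as np

def shannon(p):
    nz = p[p > 0]
    return -(nz * np.log(nz)).sum()

def mpr_complexity(p):
    """Return (H, C): normalized Shannon entropy and MPR statistical complexity."""
    n = len(p)
    uniform = np.full(n, 1.0 / n)
    H = shannon(p) / np.log(n)

    def jsd(a, b):
        return shannon(0.5 * (a + b)) - 0.5 * shannon(a) - 0.5 * shannon(b)

    delta = np.zeros(n); delta[0] = 1.0
    Q = jsd(p, uniform) / jsd(delta, uniform)   # disequilibrium, normalized to [0, 1]
    return H, Q * H

# Endpoints of the entropy-complexity plane: C = 0 both for the uniform distribution
# (pure randomness) and for a delta distribution (perfect order)
print(mpr_complexity(np.full(24, 1.0 / 24)))
print(mpr_complexity(np.r_[1.0, np.zeros(23)]))
```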
Origin and evolution of the free radical theory of aging: a brief personal history, 1954–2009.
Harman, Denham
2009-12-01
Aging is the progressive accumulation in an organism of diverse, deleterious changes with time that increase the chance of disease and death. The basic chemical process underlying aging was first advanced by the free radical theory of aging (FRTA) in 1954: the reaction of active free radicals, normally produced in organisms, with cellular constituents initiates the changes associated with aging. The involvement of free radicals in aging is related to their key role in the origin and evolution of life. The initially low acceptance of the FRTA by the scientific community, and its slow growth in acceptance, manifested by meetings and occasional papers based on the theory, prompted this account of the intermittent growth of acceptance of the theory over the past nearly 55 years.
Circuitry to explain how the relative number of L and M cones shapes color experience
Schmidt, Brian P.; Touch, Phanith; Neitz, Maureen; Neitz, Jay
2016-01-01
The wavelength of light that appears unique yellow is surprisingly consistent across people even though the ratio of middle (M) to long (L) wavelength sensitive cones is strikingly variable. This observation has been explained by normalization to the mean spectral distribution of our shared environment. Our purpose was to reconcile the nearly perfect alignment of everyone's unique yellow through a normalization process with the striking variability in unique green, which varies by as much as 60 nm between individuals. The spectral location of unique green was measured in a group of volunteers whose cone ratios were estimated with a technique that combined genetics and flicker photometric electroretinograms. In contrast to unique yellow, unique green was highly dependent upon relative cone numerosity. We hypothesized that the difference in neural architecture of the blue-yellow and red-green opponent systems in the presence of a normalization process creates the surprising dependence of unique green on cone ratio. We then compared the predictions of different theories of color vision processing that incorporate L and M cone ratio and a normalization process. The results of this analysis reveal that, contrary to prevailing notions, postretinal contributions may not be required to explain the phenomena of unique hues. PMID:27366885
Computer-aided analysis of cutting processes for brittle materials
NASA Astrophysics Data System (ADS)
Ogorodnikov, A. I.; Tikhonov, I. N.
2017-12-01
This paper is focused on 3D computer simulation of cutting processes for brittle materials and silicon wafers. Computer-aided analysis of wafer scribing and dicing is carried out with the ANSYS CAE (computer-aided engineering) software, and a parametric model of the processes is created by means of the internal ANSYS APDL programming language. Different tool tip geometries are analyzed to obtain internal stresses, such as a four-sided pyramid with an included angle of 120° and a tool inclination of 15° to the normal axis. The quality of the workpieces after cutting is studied by optical microscopy to verify the FE (finite-element) model. The disruption of the material structure during scribing occurs near the scratch and propagates into the wafer, or over its surface, for a short range. The deformation area along the scratch looks like a ragged band, but its stress width is rather low. The theory of cutting brittle semiconductor and optical materials is developed on the basis of the advanced theory of metal turning. The decrease of stress intensity along the normal, from the tip point toward the scribe line, can be predicted using the developed theory together with the verified FE model. The crystal quality and the dimensions of defects are determined by the mechanics of scratching, which depends on the shape of the diamond tip, the scratching direction, the velocity of the cutting tool and the applied force loads. The disruption is a rate-sensitive process, and it depends on the cutting thickness. The application of numerical techniques, such as FE analysis, to cutting problems enhances understanding and promotes the further development of existing machining technologies.
Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R
2012-05-17
Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument, the Technology Adoption Readiness Scale (TARS), for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. The developed instrument was pre-tested in two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and the extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and the collaborative nature of the work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.
Trietsch, Jasper; van Steenkiste, Ben; Hobma, Sjoerd; Frericks, Arnoud; Grol, Richard; Metsemakers, Job; van der Weijden, Trudy
2014-12-01
A quality improvement strategy consisting of comparative feedback and peer review embedded in available local quality improvement collaboratives proved to be effective in changing the test-ordering behaviour of general practitioners. However, implementing this strategy was problematic. We aimed for large-scale implementation of an adapted strategy covering both test ordering and prescribing performance. Because we failed to achieve large-scale implementation, the aim of this study was to describe and analyse the challenges of the transferring process. In a qualitative study 19 regional health officers, pharmacists, laboratory specialists and general practitioners were interviewed within 6 months after the transfer period. The interviews were audiotaped, transcribed and independently coded by two of the authors. The codes were matched to the dimensions of the normalization process theory. The general idea of the strategy was widely supported, but generating the feedback was more complex than expected and the need for external support after transfer of the strategy remained high because participants did not assume responsibility for the work and the distribution of resources that came with it. Evidence on effectiveness, a national infrastructure for these collaboratives and a general positive attitude were not sufficient for normalization. Thinking about managing large databases, responsibility for tasks and distribution of resources should start as early as possible when planning complex quality improvement strategies. Merely exploring the barriers and facilitators experienced in a preceding trial is not sufficient. Although multifaceted implementation strategies to change professional behaviour are attractive, their inherent complexity is also a pitfall for large-scale implementation. © 2014 John Wiley & Sons, Ltd.
Replication of Cancellation Orders Using First-Passage Time Theory in Foreign Currency Market
NASA Astrophysics Data System (ADS)
Boilard, Jean-François; Kanazawa, Kiyoshi; Takayasu, Hideki; Takayasu, Misako
Our research focuses on the annihilation dynamics of limit orders in a spot foreign currency market for various currency pairs. We analyze the cancellation order distribution conditioned on the normalized distance from the mid-price, where the normalized distance is defined as the final distance divided by the initial distance. To reproduce the real data, we introduce two simple models that assume the market price moves randomly and cancellation occurs either after a fixed time t or according to a Poisson process. Our models qualitatively reproduce the basic statistical properties of cancellation orders in the data when limit orders are cancelled according to the Poisson process. We briefly discuss the implications of our findings for the construction of more detailed microscopic models.
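A hedged sketch of the simpler of the two models described: the mid-price performs an unbiased Gaussian random walk and each limit order, placed at an initial distance d0 from the mid-price, is cancelled after an exponentially distributed lifetime (the Poisson-cancellation case); the normalized distance is the distance at cancellation divided by d0. Parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def normalized_distance_at_cancellation(d0=10.0, sigma=1.0, rate=0.05, n_orders=20_000):
    """Simulate limit orders cancelled according to a Poisson process while the
    mid-price performs a Gaussian random walk; return final/initial distance."""
    lifetimes = rng.exponential(1.0 / rate, size=n_orders)      # Poisson cancellation times
    steps = np.maximum(lifetimes.astype(int), 1)
    out = np.empty(n_orders)
    for k, n in enumerate(steps):
        mid = np.cumsum(rng.normal(0.0, sigma, size=n))         # mid-price displacement path
        out[k] = abs(d0 - mid[-1]) / d0                         # normalized distance at cancellation
    return out

nd = normalized_distance_at_cancellation()
hist, edges = np.histogram(nd, bins=50, range=(0.0, 3.0), density=True)
print(hist[:5])   # empirical density of the normalized distance near zero
```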
Genetic and Diagnostic Biomarker Development in ASD Toddlers Using Resting State Functional MRI
2017-11-01
…Integration Theory of intelligence (Jung and Haier, Behav Brain Sci, 2007) … predicting a number of age-related phenotypes. Measures of white matter integrity in the brain are heritable and highly sensitive to both normal and … pathological aging processes. We consider the phenotypic and genetic interrelationships between epigenetic age acceleration and white matter integrity.
Bringing the Meaning Back In: Exploring Existentially Motivated Terrorism
2016-06-01
… approaches, radicalization theory examines terrorists with a multidisciplinary integration of psychology and rational choice along pathways and processes … Individual psychological traits or abnormalities combine with upbringing and environment to lead an individual down a path toward violence. It is … profiles of individual terrorists finds but one commonality: their essential normalness. Explaining terrorism through abnormal psychology is
ERIC Educational Resources Information Center
Landi, Nicole; Perfetti, Charles A.
2007-01-01
The most prominent theories of reading consider reading comprehension ability to be a direct consequence of lower-level reading skills. Recently however, research has shown that some children with poor comprehension ability perform normally on tests of lower-level skills (e.g., decoding). One promising line of behavioral research has found…
ERIC Educational Resources Information Center
Sass, D. A.; Schmitt, T. A.; Walker, C. M.
2008-01-01
Item response theory (IRT) procedures have been used extensively to study normal latent trait distributions and have been shown to perform well; however, less is known concerning the performance of IRT with non-normal latent trait distributions. This study investigated the degree of latent trait estimation error under normal and non-normal…
Adolescent Perspectives Following Ostomy Surgery: A Grounded Theory Study.
Mohr, Lynn D; Hamilton, Rebekah J
2016-01-01
The purpose of this study was to provide a theoretical account of how adolescents aged 13 to 18 years process the experience of having an ostomy. Qualitative study using a grounded theory design. The sample comprised 12 English-speaking adolescents aged 13-18 years: 10 with an ostomy and 2 with medical management of their disease. Respondents completed audio-recorded interviews that were transcribed verbatim. Data were analyzed using the constant comparative method until data saturation occurred. Dedoose, a Web-based qualitative methods management tool, was used to capture major themes arising from the data. Study results indicate that for adolescents between 13 and 18 years of age, processing the experience of having an ostomy includes concepts of the "physical self" and "social self" with the goal of "normalizing." Subcategories of physical self include (a) changing reality, (b) learning, and (c) adapting. Subcategories of social self include (a) reentering and (b) disclosing. This study sheds light on how adolescents process the experience of having an ostomy and how health care providers can assist adolescents to move through the process to get back to their desired "normal" state. Health care providers can facilitate the adolescent through the ostomy experience by being proactive in conversations not only about care issues but also about school and family concerns and spirituality. Further research is needed to understand how parents process their adolescents' ostomy surgery experience and how spirituality assists adolescents in coping and adjustment with body-altering events.
Sell, Stewart; Nicolini, Andrea; Ferrari, Paola; Biava, Pier M
2016-01-01
Current medical literature acknowledges that the embryonic micro-environment is able to suppress tumor development. Administering carcinogenic substances during organogenesis in fact leads to embryonic malformations, but not to offspring tumor growth. Once organogenesis has ended, administration of carcinogenic substances causes a rise in offspring tumor development. These data indicate that cancer can be considered a deviation from normal development, which can be regulated by factors of the embryonic microenvironment. Furthermore, it has been demonstrated that teratoma differentiates into normal tissues once it is implanted in the embryo. Recently, it has been shown that implanting a melanoma in a zebrafish embryo did not result in tumor development; however, it did in the adult specimen. This demonstrates that cancer cells can differentiate into normal tissues when implanted in the embryo. In addition, it was demonstrated that other tumors can revert to a normal phenotype and/or differentiate into normal tissue when implanted in the embryo. These studies led some authors to define cancer as a problem of developmental biology and to anticipate the present concept of the "cancer stem cell theory". In this review, we survey the most important research on the reprogramming and differentiation treatments of cancer cells, to better clarify how substances taken from the developing embryo or other biological substances can induce differentiation of malignant cells. Lastly, a model of cancer conceived by one of us is proposed here, which is consistent with reality as demonstrated by a large number of studies. This model integrates the theory of the "maturation arrest" of cancer cells, as conceived by B. Pierce, with the theory which describes cancer as a process of deterministic chaos, determined by genetic and/or epigenetic alterations in differentiated cells, that leads a normal cell to become cancerous. All the research described here demonstrates that cancer can be considered a problem of developmental biology and that one of the most important hallmarks of cancer is the loss of differentiation, as already described by us in other articles.
Process service quality evaluation based on Dempster-Shafer theory and support vector machine.
Pei, Feng-Que; Li, Dong-Bo; Tong, Yi-Fei; He, Fei
2017-01-01
Human involvement in traditional service quality evaluations leads to low accuracy, poor reliability and weak predictability. This paper proposes a method, called SVMs-DS, that employs support vector machines (SVM) and Dempster-Shafer evidence theory to evaluate the service quality of a production process, handling a large number of input features with a small sample data set. Features that can affect production quality are extracted by a large number of sensors. Preprocessing steps such as feature simplification and normalization are reduced. Based on three individual SVM models, the basic probability assignments (BPAs) are constructed, which support the evaluation in a qualitative and quantitative way. The process service quality evaluation results are validated with Dempster's rules; the decision threshold used to resolve conflicting results is generated from the three SVM models. A case study is presented to demonstrate the effectiveness of the SVMs-DS method.
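To make the evidence-combination step concrete, here is a hedged sketch of Dempster's rule of combination for basic probability assignments over a two-element frame of discernment; in the SVMs-DS method the BPAs would come from the three SVM models, which are not reproduced here, so the masses below are hypothetical.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts mapping frozenset -> mass)
    with Dempster's rule; conflicting mass is renormalized away."""
    combined, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + a * b
        else:
            conflict += a * b
    if conflict >= 1.0:
        raise ValueError("Totally conflicting evidence; Dempster's rule is undefined.")
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

# Frame of discernment: process service quality is 'good' or 'poor'
GOOD, POOR, EITHER = frozenset({"good"}), frozenset({"poor"}), frozenset({"good", "poor"})
m_svm1 = {GOOD: 0.7, POOR: 0.1, EITHER: 0.2}   # hypothetical BPA from one SVM model
m_svm2 = {GOOD: 0.6, POOR: 0.3, EITHER: 0.1}   # hypothetical BPA from another SVM model
print(dempster_combine(m_svm1, m_svm2))
```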
Bechara, Antoine
2004-06-01
Most theories of choice assume that decisions derive from an assessment of the future outcomes of various options and alternatives through some type of cost-benefit analyses. The influence of emotions on decision-making is largely ignored. The studies of decision-making in neurological patients who can no longer process emotional information normally suggest that people make judgments not only by evaluating the consequences and their probability of occurring, but also and even sometimes primarily at a gut or emotional level. Lesions of the ventromedial (which includes the orbitofrontal) sector of the prefrontal cortex interfere with the normal processing of "somatic" or emotional signals, while sparing most basic cognitive functions. Such damage leads to impairments in the decision-making process, which seriously compromise the quality of decisions in daily life. The aim of this paper is to review evidence in support of "The Somatic Marker Hypothesis," which provides a systems-level neuroanatomical and cognitive framework for decision-making and suggests that the process of decision-making depends in many important ways on neural substrates that regulate homeostasis, emotion, and feeling. The implications of this theoretical framework for the normal and abnormal development of the orbitofrontal cortex are also discussed.
When speaker identity is unavoidable: Neural processing of speaker identity cues in natural speech.
Tuninetti, Alba; Chládková, Kateřina; Peter, Varghese; Schiller, Niels O; Escudero, Paola
2017-11-01
Speech sound acoustic properties vary largely across speakers and accents. When perceiving speech, adult listeners normally disregard non-linguistic variation caused by speaker or accent differences, in order to comprehend the linguistic message, e.g. to correctly identify a speech sound or a word. Here we tested whether the process of normalizing speaker and accent differences, facilitating the recognition of linguistic information, is found at the level of neural processing, and whether it is modulated by the listeners' native language. In a multi-deviant oddball paradigm, native and nonnative speakers of Dutch were exposed to naturally-produced Dutch vowels varying in speaker, sex, accent, and phoneme identity. Unexpectedly, the analysis of mismatch negativity (MMN) amplitudes elicited by each type of change shows a large degree of early perceptual sensitivity to non-linguistic cues. This finding on perception of naturally-produced stimuli contrasts with previous studies examining the perception of synthetic stimuli wherein adult listeners automatically disregard acoustic cues to speaker identity. The present finding bears relevance to speech normalization theories, suggesting that at an unattended level of processing, listeners are indeed sensitive to changes in fundamental frequency in natural speech tokens. Copyright © 2017 Elsevier Inc. All rights reserved.
The Role of Visual Processing Speed in Reading Speed Development
Lobier, Muriel; Dubois, Matthieu; Valdois, Sylviane
2013-01-01
A steady increase in reading speed is the hallmark of normal reading acquisition. However, little is known of the influence of visual attention capacity on children's reading speed. The number of distinct visual elements that can be simultaneously processed at a glance (dubbed the visual attention span), predicts single-word reading speed in both normal reading and dyslexic children. However, the exact processes that account for the relationship between the visual attention span and reading speed remain to be specified. We used the Theory of Visual Attention to estimate visual processing speed and visual short-term memory capacity from a multiple letter report task in eight and nine year old children. The visual attention span and text reading speed were also assessed. Results showed that visual processing speed and visual short term memory capacity predicted the visual attention span. Furthermore, visual processing speed predicted reading speed, but visual short term memory capacity did not. Finally, the visual attention span mediated the effect of visual processing speed on reading speed. These results suggest that visual attention capacity could constrain reading speed in elementary school children. PMID:23593117
ERIC Educational Resources Information Center
Perry, Conrad; Ziegler, Johannes C.; Zorzi, Marco
2007-01-01
At least 3 different types of computational model have been shown to account for various facets of both normal and impaired single word reading: (a) the connectionist triangle model, (b) the dual-route cascaded model, and (c) the connectionist dual process model. Major strengths and weaknesses of these models are identified. In the spirit of…
Indirect techniques in nuclear astrophysics: a review.
Tribble, R E; Bertulani, C A; Cognata, M La; Mukhamedzhanov, A M; Spitaleri, C
2014-10-01
In this review, we discuss the present status of three indirect techniques that are used to determine reaction rates for stellar burning processes, asymptotic normalization coefficients, the Trojan Horse method and Coulomb dissociation. A comprehensive review of the theory behind each of these techniques is presented. This is followed by an overview of the experiments that have been carried out using these indirect approaches.
Gotlib Conn, Lesley; McKenzie, Marg; Pearsall, Emily A; McLeod, Robin S
2015-07-17
Enhanced recovery after surgery (ERAS) is a multimodal evidence-based approach to patient care that has become the standard in elective colorectal surgery. Implemented globally, ERAS programmes represent a considerable change in practice for many surgical care providers. Our current understanding of specific implementation and sustainability challenges is limited. In January 2013, we began a 2-year ERAS implementation for elective colorectal surgery in 15 academic hospitals in Ontario. The purpose of this study was to understand the process enablers and barriers that influenced the success of ERAS implementation in these centres with a view towards supporting sustainable change. A qualitative process evaluation was conducted from June to September 2014. Semi-structured interviews with implementation champions were completed, and an iterative inductive thematic analysis was conducted. Following a data-driven analysis, the Normalization Process Theory (NPT) was used as an analytic framework to understand the impact of various implementation processes. The NPT constructs were used as sensitizing concepts, reviewed against existing data categories for alignment and fit. Fifty-eight participants were included: 15 surgeons, 14 anaesthesiologists, 15 nurses, and 14 project coordinators. A number of process-related implementation enablers were identified: champions' belief in the value of the programme, the fit and cohesion of champions and their teams locally and provincially, a bottom-up approach to stakeholder engagement targeting organizational relationship-building, receptivity and support of division leaders, and the normalization of ERAS as everyday practice. Technical enablers identified included effective integration with existing clinical systems and using audit and feedback to report to hospital stakeholders. There was an overall optimism that ERAS implementation would be sustained, accompanied by concern about long-term organizational support. Successful ERAS implementation is achieved by a complex series of cognitive and social processes which previously have not been well described. Using the Normalization Process Theory as a framework, this analysis demonstrates the importance of champion coherence, external and internal relationship building, and the strategic management of a project's organization-level visibility as important to ERAS uptake and sustainability.
Lloyd, Amy; Joseph-Williams, Natalie; Edwards, Adrian; Rix, Andrew; Elwyn, Glyn
2013-09-05
Implementing shared decision making into routine practice is proving difficult, despite considerable interest from policy-makers, and is far more complex than merely making decision support interventions available to patients. Few have reported successful implementation beyond research studies. MAking Good Decisions In Collaboration (MAGIC) is a multi-faceted implementation program, commissioned by The Health Foundation (UK), to examine how best to put shared decision making into routine practice. In this paper, we investigate healthcare professionals' perspectives on implementing shared decision making during the MAGIC program, to examine the work required to implement shared decision making and to inform future efforts. The MAGIC program approached implementation of shared decision making by initiating a range of interventions including: providing workshops; facilitating development of brief decision support tools (Option Grids); initiating a patient activation campaign ('Ask 3 Questions'); gathering feedback using Decision Quality Measures; providing clinical leads meetings, learning events, and feedback sessions; and obtaining executive board level support. At 9 and 15 months (May and November 2011), two rounds of semi-structured interviews were conducted with healthcare professionals in three secondary care teams to explore views on the impact of these interventions. Interview data were coded by two reviewers using a framework derived from the Normalization Process Theory. A total of 54 interviews were completed with 31 healthcare professionals. Partial implementation of shared decision making could be explained using the four components of the Normalization Process Theory: 'coherence,' 'cognitive participation,' 'collective action,' and 'reflexive monitoring.' Shared decision making was integrated into routine practice when clinical teams shared coherent views of role and purpose ('coherence'). Shared decision making was facilitated when teams engaged in developing and delivering interventions ('cognitive participation'), and when those interventions fit with existing skill sets and organizational priorities ('collective action') resulting in demonstrable improvements to practice ('reflexive monitoring'). The implementation process uncovered diverse and conflicting attitudes toward shared decision making; 'coherence' was often missing. The study showed that implementation of shared decision making is more complex than the delivery of patient decision support interventions to patients, a portrayal that often goes unquestioned. Normalizing shared decision making requires intensive work to ensure teams have a shared understanding of the purpose of involving patients in decisions, and undergo the attitudinal shifts that many health professionals feel are required when comprehension goes beyond initial interpretations. Divergent views on the value of engaging patients in decisions remain a significant barrier to implementation.
Theory of psychological adaptive modes.
Lehti, Juha
2016-05-01
When an individual faces a stressor and the normal stress-response mechanism cannot guarantee sufficient adaptation, special emotional states, adaptive modes, are activated (for example, a depressive reaction). Adaptive modes are involuntary states of mind; they are comprehensive in nature, they interfere with normal functioning, and they cannot be repressed or controlled in the same way as many emotions. Their transformational nature differentiates them from other emotional states. The object of the adaptive mode is to optimize problem-solving abilities according to the situation that has provoked the mode. Cognitions and emotions during the adaptive mode differ from those in a normal mental state. These altered cognitions and emotional reactions guide the individual to use the correct coping skills in order to deal with the stressor. Successful adaptation causes the adaptive mode to fade away since it is no longer necessary, and the process as a whole leads to raised well-being. However, if the adaptation process is inadequate, the transformation period is prolonged and the adaptive mode turns into a dysfunctional state. Many psychiatric disorders are such maladaptive processes. Maladaptive processes can be turned into functional ones by using the adaptive skills that operate in functional adaptive processes. Copyright © 2016 Elsevier Ltd. All rights reserved.
Small bending and stretching of sandwich-type shells
NASA Technical Reports Server (NTRS)
Reissner, Eric
1950-01-01
A theory has been developed for small bending and stretching of sandwich-type shells. This theory is an extension of the known theory of homogeneous thin elastic shells. It was found that two effects are important in the present problem, which are not normally of importance in the theory of curved shells: (1) the effect of transverse shear deformation and (2) the effect of transverse normal stress deformation. The first of these two effects has been known to be of importance in the theory of plates and beams. The second effect was found to occur in a manner which is typical for shells and has no counterpart in flat-plate theory. The general results of this report have been applied to the solution of problems concerning flat plates, circular rings, circular cylindrical shells, and spherical shells. In each case numerical examples have been given, illustrating the magnitude of the effects of transverse shear and normal stress deformation.
Fuzzy-Trace Theory and Lifespan Cognitive Development
Brainerd, C J.; Reyna, Valerie F.
2015-01-01
Fuzzy-trace theory (FTT) emphasizes the use of core theoretical principles, such as the verbatim-gist distinction, to predict new findings about cognitive development that are counterintuitive from the perspective of other theories or of common sense. To the extent that such predictions are confirmed, the range of phenomena that are explained expands without increasing the complexity of the theory's assumptions. We examine research on recent examples of such predictions during four epochs of cognitive development: childhood, adolescence, young adulthood, and late adulthood. During the first two, the featured predictions are surprising developmental reversals in false memory (childhood) and in risky decision making (adolescence). During young adulthood, FTT predicts that a retrieval operation that figures centrally in dual-process theories of memory, recollection, is bivariate rather than univariate. During late adulthood, FTT identifies a retrieval operation, reconstruction, that has been omitted from current theories of normal memory declines in aging and pathological declines in dementia. The theory predicts that reconstruction is a major factor in such declines and that it is able to forecast future dementia. PMID:26644632
Holtrop, Jodi Summers; Potworowski, Georges; Fitzpatrick, Laurie; Kowalk, Amy; Green, Lee A
2016-08-15
Care management in primary care can be effective in helping patients with chronic disease improve their health status; however, primary care practices are often challenged with implementation. Further, there are different ways to structure care management that may make implementation more or less successful. Normalization process theory (NPT) provides a means of understanding how a new complex intervention can become routine (normalized) in practice. In this study, we used NPT to understand how care management structure affected how well care management became routine in practice. Data collection involved semi-structured interviews and observations conducted at 25 practices in five physician organizations in Michigan, USA. Practices were selected to reflect variation in physician organizations, type of care management program, and degree of normalization. Data were transcribed, qualitatively coded and analyzed, initially using an editing approach and then a template approach with NPT as a guiding framework. Seventy interviews and 25 observations were completed. Two key structures for care management organization emerged: practice-based care management, where the care managers were embedded in the practice as part of the practice team; and centralized care management, where the care managers worked independently of the practice work flow and were located outside the practice. There were differences in normalization of care management across practices. Practice-based care management was generally better normalized than centralized care management. Differences in normalization were well explained by the NPT, and in particular the collective action construct. When care managers had multiple and flexible opportunities for communication (interactional workability), had the requisite knowledge, skills, and personal characteristics (skill set workability), and had the organizational support and resources (contextual integration), a trusting professional relationship (relational integration) developed between practice providers and staff and the care manager. When any of these elements was missing, care management implementation appeared to be affected negatively. Although care management can introduce many new changes into the delivery of clinical practice, implementing it successfully as a new complex intervention is possible. NPT can be helpful in explaining differences in implementing a new care management program with a view to addressing them during implementation planning.
Enhanced activation of the left hemisphere promotes normative decision making.
Corser, Ryan; Jasper, John D
2014-01-01
Previous studies have reported that enhanced activation of the left cerebral hemisphere reduces risky-choice, attribute, and goal-framing effects relative to enhanced activation of the right cerebral hemisphere. The present study sought to extend these findings and show that enhanced activation of the left hemisphere also reduces violations of other normative principles, besides the invariance principle. Participants completed ratio bias (Experiment 1, N = 296) and base rate neglect problems (Experiment 2, N = 145) under normal (control) viewing or with the right or left hemisphere primarily activated by imposing a unidirectional gaze. In Experiment 1 we found that enhanced left hemispheric activation reduced the ratio bias relative to normal viewing and a group experiencing enhanced right hemispheric activation. In Experiment 2 enhanced left hemispheric activation resulted in using base rates more than normal viewing, but not significantly more than enhanced right hemispheric activation. Results suggest that hemispheric asymmetries can affect higher-order cognitive processes, such as decision-making biases. Possible theoretical accounts are discussed as well as implications for dual-process theories.
The use of normal forms for analysing nonlinear mechanical vibrations
Neild, Simon A.; Champneys, Alan R.; Wagg, David J.; Hill, Thomas L.; Cammarano, Andrea
2015-01-01
A historical introduction is given of the theory of normal forms for simplifying nonlinear dynamical systems close to resonances or bifurcation points. The specific focus is on mechanical vibration problems, described by finite degree-of-freedom second-order-in-time differential equations. A recent variant of the normal form method, that respects the specific structure of such models, is recalled. It is shown how this method can be placed within the context of the general theory of normal forms provided the damping and forcing terms are treated as unfolding parameters. The approach is contrasted to the alternative theory of nonlinear normal modes (NNMs) which is argued to be problematic in the presence of damping. The efficacy of the normal form method is illustrated on a model of the vibration of a taut cable, which is geometrically nonlinear. It is shown how the method is able to accurately predict NNM shapes and their bifurcations. PMID:26303917
Atypical incus necrosis: a case report and literature review.
Choudhury, N; Kumar, G; Krishnan, M; Gatland, D J
2008-10-01
We report an atypical case of ossicular necrosis affecting the incus, in the absence of any history of chronic serous otitis media. We also discuss the current theories of incus necrosis. A male patient presented with a history of right unilateral hearing loss and tinnitus. Audiometry confirmed right conductive deafness; tympanometry was normal bilaterally. He underwent a right exploratory tympanotomy, which revealed atypical erosion of the proximal long process of the incus. Middle-ear examination was otherwise normal, with a mobile stapes footplate. The redundant long process of the incus was excised and a partial ossicular replacement prosthesis was inserted, resulting in improved hearing. Ossicular pathologies most commonly affect the incus. The commonest defect is an absent lenticular and distal long process of the incus, which is most commonly associated with chronic otitis media. This is the first reported case of ossicular necrosis, particularly of the proximal long process of the incus, in the absence of chronic middle-ear pathology.
Sherman, Deborah Witt; Rosedale, Mary; Haber, Judith
2012-05-01
To develop a substantive theory of the process of breast cancer survivorship. Grounded theory. A LISTSERV announcement posted on the SHARE Web site and purposeful recruitment of women known to be diagnosed and treated for breast cancer. 15 women diagnosed with early-stage breast cancer. Constant comparative analysis. Breast cancer survivorship. The core variable identified was Reclaiming Life on One's Own Terms. The perceptions and experiences of the participants revealed overall that the diagnosis of breast cancer was a turning point in life and the stimulus for change. That was followed by the recognition of breast cancer as now being a part of life, leading to the necessity of learning to live with breast cancer, and finally, creating a new life after breast cancer. Participants revealed that breast cancer survivorship is a process marked and shaped by time, the perception of support, and coming to terms with the trauma of a cancer diagnosis and the aftermath of treatment. The process of survivorship continues by assuming an active role in self-healing, gaining a new perspective and reconciling paradoxes, creating a new mindset and moving to a new normal, developing a new way of being in the world on one's own terms, and experiencing growth through adversity beyond survivorship. The process of survivorship for women with breast cancer is an evolutionary journey with short- and long-term challenges. This study shows the development of an empirically testable theory of survivorship that describes and predicts women's experiences following breast cancer treatment from the initial phase of recovery and beyond. The theory also informs interventions that not only reduce negative outcomes, but promote ongoing healing, adjustment, and resilience over time.
Tamuz, Michal; Harrison, Michael I
2006-01-01
Objective To identify the distinctive contributions of high-reliability theory (HRT) and normal accident theory (NAT) as frameworks for examining five patient safety practices. Data Sources/Study Setting We reviewed and drew examples from studies of organization theory and health services research. Study Design After highlighting key differences between HRT and NAT, we applied the frames to five popular safety practices: double-checking medications, crew resource management (CRM), computerized physician order entry (CPOE), incident reporting, and root cause analysis (RCA). Principal Findings HRT highlights how double checking, which is designed to prevent errors, can undermine mindfulness of risk. NAT emphasizes that social redundancy can diffuse and reduce responsibility for locating mistakes. CRM promotes high reliability organizations by fostering deference to expertise, rather than rank. However, HRT also suggests that effective CRM depends on fundamental changes in organizational culture. NAT directs attention to an underinvestigated feature of CPOE: it tightens the coupling of the medication ordering process, and tight coupling increases the chances of a rapid and hard-to-contain spread of infrequent, but harmful errors. Conclusions Each frame can make a valuable contribution to improving patient safety. By applying the HRT and NAT frames, health care researchers and administrators can identify health care settings in which new and existing patient safety interventions are likely to be effective. Furthermore, they can learn how to improve patient safety, not only from analyzing mishaps, but also by studying the organizational consequences of implementing safety measures. PMID:16898984
Transition theory and its relevance to patients with chronic wounds.
Neil, J A; Barrell, L M
1998-01-01
A wound, in the broadest sense, is a disruption of normal anatomic structure and function. Acute wounds progress through a timely and orderly sequence of repair that leads to the restoration of functional integrity. In chronic wounds, this timely and orderly sequence goes awry. As a result, people with chronic wounds often face not only physiological difficulties but emotional ones as well. The study of body image and its damage as a result of a chronic wound fits well with Selder's transition theory. This article describes interviews with seven patients with chronic wounds. The themes that emerged from those interviews were compared with Selder's theory to describe patients' experience with chronic wounds as a transition process that can be identified and better understood by healthcare providers.
ERIC Educational Resources Information Center
Hufano, Linda D.
The study examined emotional-motivational personality characteristics of 15 learning disabled, 15 normal achieving, and 15 high achieving students (grades 3-5). The study tested the hypothesis derived from the A-R-D (attitude-reinforcer-discriminative) theory of motivation that learning disabled (LD) children differ from normal and high achieving…
NASA Astrophysics Data System (ADS)
Saikia, Banashree
2017-03-01
An overview of predominant theoretical models used for predicting the thermal conductivities of dielectric materials is given, and the criteria used for the different theoretical models are explained. The overview highlights a unified theory based on temperature-dependent thermal-conductivity theories, in which a drift of the equilibrium phonon distribution function caused by normal three-phonon scattering processes transfers phonon momentum (a) within the same phonon modes (KK-S model) and (b) across phonon modes (KK-H model). Estimates of the lattice thermal conductivities of LiF and Mg2Sn for the KK-H model are presented graphically.
Urban American Indian Adolescent Girls: Framing Sexual Risk Behavior
Martyn, Kristy K.; Momper, Sandra L.; Loveland-Cherry, Carol J.; Low, Lisa Kane
2014-01-01
Purpose American Indian (AI) adolescent girls have higher rates of sexual activity, births and STIs compared to the national average. The purpose of this study was to explore factors that influence urban adolescent AI girls' sexual risk behavior (SRB). Design A qualitative study was conducted using grounded theory methodology to reveal factors and processes that influence SRB. Methods Talking circles, individual interviews, and event history calendars were used with 20 urban AI 15-19 year old girls to explore influences on their sexual behavior. Findings The generated theory, Framing Sexual Risk Behavior, describes both social and structural factors and processes that influenced the girls' sexual behaviors. The theory extends Bronfenbrenner's ecological model by identifying microsystem, mesosystem, and macrosystem influences on sexual behavior, including: Microsystem: Being “Normal,” Native, and Having Goals; Mesosystem: Networks of Family and Friends, Environmental Influences, and Sex Education; and Macrosystem: Tribal Traditions/History and Federal Policy. Discussion Urban AI girls reported similar social and structural influences on SRB as urban adolescents from other racial and ethnic groups. However, differences were noted in the family structure, cultural heritage, and unique history of AIs. Implications for Practice This theory can be used in culturally responsive practice with urban AI girls. PMID:24803532
NASA Technical Reports Server (NTRS)
Slemp, Wesley C. H.; Kapania, Rakesh K.; Tessler, Alexander
2010-01-01
Computation of interlaminar stresses from the higher-order shear and normal deformable beam theory and the refined zigzag theory was performed using the Sinc method based on Interpolation of Highest Derivative. The Sinc method based on Interpolation of Highest Derivative was proposed as an efficient method for determining through-the-thickness variations of interlaminar stresses from one- and two-dimensional analysis by integration of the equilibrium equations of three-dimensional elasticity. However, the use of traditional equivalent single layer theories often results in inaccuracies near the boundaries and when the lamina have extremely large differences in material properties. Interlaminar stresses in symmetric cross-ply laminated beams were obtained by solving the higher-order shear and normal deformable beam theory and the refined zigzag theory with the Sinc method based on Interpolation of Highest Derivative. Interlaminar stresses and bending stresses from the present approach were compared with a detailed finite element solution obtained by ABAQUS/Standard. The results illustrate the ease with which the Sinc method based on Interpolation of Highest Derivative can be used to obtain the through-the-thickness distributions of interlaminar stresses from the beam theories. Moreover, the results indicate that the refined zigzag theory is a substantial improvement over the Timoshenko beam theory due to the piecewise continuous displacement field which more accurately represents interlaminar discontinuities in the strain field. The higher-order shear and normal deformable beam theory more accurately captures the interlaminar stresses at the ends of the beam because it allows transverse normal strain. However, the continuous nature of the displacement field requires a large number of monomial terms before the interlaminar stresses are computed as accurately as the refined zigzag theory.
Visual feature integration with an attention deficit.
Arguin, M; Cavanagh, P; Joanette, Y
1994-01-01
Treisman's feature integration theory proposes that the perception of illusory conjunctions of correctly encoded visual features is due to the failure of an attentional process. This hypothesis was examined by studying brain-damaged subjects who had previously been shown to have difficulty in attending to contralesional stimulation. These subjects exhibited a massive feature integration deficit for contralesional stimulation relative to ipsilesional displays. In contrast, both normal age-matched controls and brain-damaged subjects who did not exhibit any evidence of an attention deficit showed comparable feature integration performance with left- and right-hemifield stimulation. These observations indicate the crucial function of attention for visual feature integration in normal perception.
Field migration rates of tidal meanders recapitulate fluvial morphodynamics
NASA Astrophysics Data System (ADS)
Finotello, Alvise; Lanzoni, Stefano; Ghinassi, Massimiliano; Marani, Marco; Rinaldo, Andrea; D'Alpaos, Andrea
2018-02-01
The majority of tidal channels display marked meandering features. Despite their importance in oil-reservoir formation and tidal landscape morphology, questions remain on whether tidal-meander dynamics could be understood in terms of fluvial processes and theory. Key differences suggest otherwise, like the periodic reversal of landscape-forming tidal flows and the widely accepted empirical notion that tidal meanders are stable landscape features, in stark contrast with their migrating fluvial counterparts. On the contrary, here we show that, once properly normalized, observed migration rates of tidal and fluvial meanders are remarkably similar. Key to normalization is the role of tidal channel width that responds to the strong spatial gradients of landscape-forming flow rates and tidal prisms. We find that migration dynamics of tidal meanders agree with nonlinear theories for river meander evolution. Our results challenge the conventional view of tidal channels as stable landscape features and suggest that meandering tidal channels recapitulate many fluvial counterparts owing to large gradients of tidal prisms across meander wavelengths.
An on-line modified least-mean-square algorithm for training neurofuzzy controllers.
Tan, Woei Wan
2007-04-01
The problem hindering the use of data-driven modelling methods for training controllers on-line is the lack of control over the amount by which the plant is excited. As the operating schedule determines the information available on-line, the knowledge of the process may degrade if the setpoint remains constant for an extended period. This paper proposes an identification algorithm that alleviates "learning interference" by incorporating fuzzy theory into the normalized least-mean-square update rule. The ability of the proposed methodology to achieve faster learning is examined by employing the algorithm to train a neurofuzzy feedforward controller for controlling a liquid level process. Since the proposed identification strategy has similarities with the normalized least-mean-square update rule and the recursive least-square estimator, the on-line learning rates of these algorithms are also compared.
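To make the update rule concrete: the normalized least-mean-square step referred to above can be sketched in a few lines. This is a generic NLMS identification sketch, not the paper's fuzzy-modified algorithm; the filter length, step size `mu`, and regularizer `eps` are illustrative choices.

```python
import numpy as np

def nlms_update(w, x, d, mu=0.5, eps=1e-6):
    """One normalized LMS step: adapt weights w toward the desired output d
    for input vector x; eps guards against division by a small input energy."""
    y = np.dot(w, x)                                   # current model output
    e = d - y                                          # prediction error
    w = w + (mu / (eps + np.dot(x, x))) * e * x        # normalized gradient step
    return w, e

# Toy identification of a hypothetical 3-tap FIR "plant"
rng = np.random.default_rng(0)
true_w = np.array([0.5, -0.3, 0.1])
w = np.zeros(3)
for _ in range(2000):
    x = rng.standard_normal(3)
    d = np.dot(true_w, x)
    w, e = nlms_update(w, x, d)
print(np.round(w, 3))   # estimates should approach true_w
```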
An on-line analysis of syntactic processing in Broca's and Wernicke's aphasia.
Zurif, E; Swinney, D; Prather, P; Solomon, J; Bushell, C
1993-10-01
This paper is about syntactic processing in aphasia. Specifically, we present data concerning the ability of Broca's and Wernicke's aphasic patients to link moved constituents and empty elements in real time. We show that Wernicke's aphasic patients carry out this syntactic analysis in a normal fashion, but that Broca's aphasic patients do not. We discuss these data in the context of some current grammar-based theories of comprehension limitations in aphasia and in terms of the different functional commitments of the brain regions implicated in Broca's and Wernicke's aphasia, respectively.
McEvoy, Rachel; Ballini, Luciana; Maltoni, Susanna; O'Donnell, Catherine A; Mair, Frances S; Macfarlane, Anne
2014-01-02
There is a well-recognized need for greater use of theory to address research translational gaps. Normalization Process Theory (NPT) provides a set of sociological tools to understand and explain the social processes through which new or modified practices of thinking, enacting, and organizing work are implemented, embedded, and integrated in healthcare and other organizational settings. This review of NPT offers readers the opportunity to observe how, and in what areas, a particular theoretical approach to implementation is being used. In this article we review the literature on NPT in order to understand what interventions NPT is being used to analyze, how NPT is being operationalized, and the reported benefits, if any, of using NPT. Using a framework analysis approach, we conducted a qualitative systematic review of peer-reviewed literature using NPT. We searched 12 electronic databases and all citations linked to six key NPT development papers. Grey literature/unpublished studies were not sought. Limitations of English language, healthcare setting and year of publication 2006 to June 2012 were set. Twenty-nine articles met the inclusion criteria; in the main, NPT is being applied to qualitatively analyze a diverse range of complex interventions, many beyond its original field of e-health and telehealth. The NPT constructs have high stability across settings and, notwithstanding challenges in applying NPT in terms of managing overlaps between constructs, there is evidence that it is a beneficial heuristic device to explain and guide implementation processes. NPT offers a generalizable framework that can be applied across contexts with opportunities for incremental knowledge gain over time and an explicit framework for analysis, which can explain and potentially shape implementation processes. This is the first review of NPT in use and it generates an impetus for further and extended use of NPT. We recommend that in future NPT research, authors should explicate their rationale for choosing NPT as their theoretical framework and, where possible, involve multiple stakeholders including service users to enable analysis of implementation from a range of perspectives.
Cancer biomarker discovery: the entropic hallmark.
Berretta, Regina; Moscato, Pablo
2010-08-18
It is a commonly accepted belief that cancer cells modify their transcriptional state during the progression of the disease. We propose that the progression of cancer cells towards malignant phenotypes can be efficiently tracked using high-throughput technologies that follow the gradual changes observed in the gene expression profiles by employing Shannon's mathematical theory of communication. Methods based on Information Theory can then quantify the divergence of cancer cells' transcriptional profiles from those of normally appearing cells of the originating tissues. The relevance of the proposed methods can be evaluated using microarray datasets available in the public domain, but the method is in principle applicable to other high-throughput methods. Using melanoma and prostate cancer datasets we illustrate how it is possible to employ Shannon Entropy and the Jensen-Shannon divergence to trace the progression of transcriptional changes in the disease. We establish how the variations of these two measures correlate with established biomarkers of cancer progression. The Information Theory measures allow us to identify novel biomarkers for both progressive and relatively more sudden transcriptional changes leading to malignant phenotypes. At the same time, the methodology was able to validate a large number of genes and processes that seem to be implicated in the progression of melanoma and prostate cancer. We thus present a quantitative guiding rule, a new unifying hallmark of cancer: the cancer cell's transcriptome changes lead to measurable observed transitions of Normalized Shannon Entropy values (as measured by high-throughput technologies). At the same time, tumor cells increase their divergence from the normal tissue profile, increasing their disorder via creation of states that we might not directly measure. This unifying hallmark allows, via the Jensen-Shannon divergence, identification of the arrow of time of the processes from the gene expression profiles, and helps to map the phenotypical and molecular hallmarks of specific cancer subtypes. The deep mathematical basis of the approach allows us to suggest that this principle is, hopefully, of general applicability for other diseases.
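As a sketch of the two quantities the abstract relies on, the snippet below computes a normalized Shannon entropy and the Jensen-Shannon divergence for expression-like vectors treated as probability distributions; the two "profiles" are invented for illustration and are not drawn from the study's datasets.

```python
import numpy as np

def normalized_entropy(p):
    """Shannon entropy (base 2) of distribution p, normalized to [0, 1] by log2(n)."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    nz = p[p > 0]
    return -(nz * np.log2(nz)).sum() / np.log2(p.size)

def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2) between distributions p and q."""
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical expression profiles for a "normal" and a "tumour" sample
normal = np.array([10, 8, 7, 1, 1, 1], dtype=float)
tumour = np.array([4, 4, 4, 4, 4, 4], dtype=float)
print(normalized_entropy(normal), normalized_entropy(tumour))  # entropy rises in the flatter profile
print(js_divergence(normal, tumour))                           # divergence between the two profiles
```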
Gould, D J; Hale, R; Waters, E; Allen, D
2016-12-01
All health workers should take responsibility for infection prevention and control (IPC). Recent reduction in key reported healthcare-associated infections in the UK is impressive, but the determinants of success are unknown. It is imperative to understand how IPC strategies operate as new challenges arise and threats of antimicrobial resistance increase. The authors undertook a retrospective, independent evaluation of an action plan to enhance IPC and 'ownership' (individual accountability) for IPC introduced throughout a healthcare organization. Twenty purposively selected informants were interviewed. Data were analysed inductively. Normalization Process Theory (NPT) was applied to interpret the findings and explain how the action plan was operating. Six themes emerged through inductive analysis. Theme 1: 'Ability to make sense of ownership' provided evidence of the first element of NPT (coherence). Regardless of occupational group or seniority, informants understood the importance of IPC ownership and described what it entailed. They identified three prerequisites: 'Always being vigilant' (Theme 2), 'Importance of access to information' (Theme 3) and 'Being able to learn together in a no-blame culture' (Theme 4). Data relating to each theme provided evidence of the other elements of NPT that are required to embed change: planning implementation (cognitive participation), undertaking the work necessary to achieve change (collective action), and reflection on what else is needed to promote change as part of continuous quality improvement (reflexive monitoring). Informants identified barriers (e.g. workload) and facilitators (clear lines of communication and expectations for IPC). Eighteen months after implementing the action plan incorporating IPC ownership, there was evidence of continuous service improvement and significant reduction in infection rates. Applying a theory that identifies factors that promote/inhibit routine incorporation ('normalization') of IPC into everyday health care can help explain the success of IPC initiatives and inform implementation. Copyright © 2016 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
Positive experiences for participants in suicide bereavement groups: a grounded theory model.
Groos, Anita D; Shakespeare-Finch, Jane
2013-01-01
Grounded Theory was used to examine the experiences of 13 participants who had attended psycho-educational support groups for those bereaved by suicide. Results demonstrated core and central categories that fit well with group therapeutic factors developed by I. D. Yalom (1995) and emphasized the importance of universality, imparting information and instilling hope, catharsis and self-disclosure, and broader meaning-making processes surrounding acceptance or adjustment. Participants were commonly engaged in a lengthy process of oscillating between loss-oriented and restoration-focused reappraisals. The functional experience of the group comprised feeling normal within the group, which provided a sense of permission to feel, to express emotions and thoughts, and to bestow meaning. Structural variables of information and guidance and different perspectives on the suicide and bereavement were gained from other participants, the facilitators, group content, and process. Personal changes, including in relationships and in their sense of self, assisted participants to develop an altered and more positive personal narrative.
Impaired "affective theory of mind" is associated with right ventromedial prefrontal damage.
Shamay-Tsoory, S G; Tomer, R; Berger, B D; Goldsher, D; Aharon-Peretz, J
2005-03-01
To examine the hypothesis that patients with ventromedial (VM) frontal lesions are impaired in the affective rather than cognitive facets of theory of mind (ToM). Prefrontal brain damage may result in impaired social behavior, especially when the damage involves the orbitofrontal/VM prefrontal cortex (PFC). It has been previously suggested that deficits in ToM may account for such aberrant behavior. However, inconsistent results have been reported, and different regions within the frontal cortex have been associated with ToM impairment. The performance of 26 patients with localized lesions in the PFC was compared with responses of 13 patients with posterior lesions and 13 normal control subjects. Three ToM tasks differing in the level of emotional processing involved were used: second-order false belief task, understanding ironic utterances, and identifying social faux pas. The results indicated that patients with VM (but not dorsolateral) prefrontal lesions were significantly impaired in irony and faux pas but not in second-order false belief as compared with patients with posterior lesions and normal control subjects. Lesions in the right VM area were associated with the most severe ToM deficit. These results are discussed in terms of the cognitive and affective facets of "mind-reading" processes mediated by the VM cortex.
Accurate Thermal Stresses for Beams: Normal Stress
NASA Technical Reports Server (NTRS)
Johnson, Theodore F.; Pilkey, Walter D.
2002-01-01
Formulations for a general theory of thermoelasticity to generate accurate thermal stresses for structural members of aeronautical vehicles were developed in 1954 by Boley. The formulation also provides three normal stresses and a shear stress along the entire length of the beam. The Poisson effect of the lateral and transverse normal stresses on a thermally loaded beam is taken into account in this theory by employing an Airy stress function. The Airy stress function enables the reduction of the three-dimensional thermal stress problem to a two-dimensional one. Numerical results from the general theory of thermoelasticity are compared to those obtained from strength of materials. It is concluded that the theory of thermoelasticity for prismatic beams proposed in this paper can be used instead of strength of materials when precise stress results are desired.
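As background for the device mentioned above, the textbook plane-stress Airy-stress-function relations illustrate how a stress function reduces the dimensionality of a thermal stress problem; this is the general form, not necessarily the exact formulation used in the report.

```latex
\sigma_{xx} = \frac{\partial^{2}\Phi}{\partial y^{2}}, \qquad
\sigma_{yy} = \frac{\partial^{2}\Phi}{\partial x^{2}}, \qquad
\sigma_{xy} = -\frac{\partial^{2}\Phi}{\partial x\,\partial y}, \qquad
\nabla^{4}\Phi + E\,\alpha\,\nabla^{2}T = 0,
```

where Phi is the Airy stress function, E is Young's modulus, alpha the coefficient of thermal expansion, and T the temperature field; the biharmonic-type compatibility equation couples the stress function to the thermal load.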
Metafusion: A breakthrough in metallurgy
NASA Technical Reports Server (NTRS)
Joseph, Adrian A.
1994-01-01
The Metafuse Process is a patented development in the field of thin film coatings utilizing cold fusion which results in a true inter-dispersion of dissimilar materials along a gradual transition gradient through a boundary of several hundred atomic layers. The process is performed at ambient temperatures and pressures requiring relatively little energy and creating little or no heat. The process permits a remarkable range of material combinations and joining of materials which are normally incompatible. Initial applications include titanium carbide into and onto the copper resistance welding electrodes and tungsten carbide onto the cutting edges of tool steel blades. The process is achieved through application of an RF signal of low power and is based on the theory of vacancy fusion.
The interactive processes of accommodation and vergence.
Semmlow, J L; Bérard, P V; Vercher, J L; Putteman, A; Gauthier, G M
1994-01-01
A near target generates two different, though related stimuli: image disparity and image blur. Fixation of that near target evokes three motor responses: the so-called oculomotor "near triad". It has long been known that both disparity and blur stimuli are each capable of independently generating all three responses, and a recent theory of near triad control (the Dual Interactive Theory) describes how these stimulus components normally work together in the aid of near vision. However, this theory also indicates that when the system becomes unbalanced, as in high AC/A ratios of some accommodative esotropes, the two components will become antagonistic. In this situation, the interaction between the blur and disparity driven components exaggerates the imbalance created in the vergence motor output. Conversely, there is enhanced restoration when the AC/A ratio is effectively reduced surgically.
Pigarev, Ivan N; Pigareva, Marina L
2017-01-01
It was noticed long ago that sleep disorders or interruptions to the normal sleep pattern were associated with various gastrointestinal disorders. We review the studies which established the causal link between these disorders and sleep impairment. However, the mechanism of interactions between the quality of sleep and gastrointestinal pathophysiology remained unclear. Recently, the visceral theory of sleep was formulated. This theory proposes that the same brain structures, and particularly the same cortical sensory areas, which in wakefulness are involved in processing of the exteroceptive information, switch during sleep to the processing of information coming from various visceral systems. We review the studies which demonstrated that neurons of the various cortical areas (occipital, parietal, frontal) during sleep began to fire in response to activation coming from the stomach and small intestine. These data demonstrate that, during sleep, the computational power of the central nervous system, including all cortical areas, is engaged in restoration of visceral systems. Thus, the general mechanism of the interaction between quality of sleep and health became clear.
Emotion effects on implicit and explicit musical memory in normal aging.
Narme, Pauline; Peretz, Isabelle; Strub, Marie-Laure; Ergis, Anne-Marie
2016-12-01
Normal aging affects explicit memory while leaving implicit memory relatively spared. Normal aging also modifies how emotions are processed and experienced, with increasing evidence that older adults (OAs) focus more on positive information than younger adults (YAs). The aim of the present study was to investigate how age-related changes in emotion processing influence explicit and implicit memory. We used emotional melodies that differed in terms of valence (positive or negative) and arousal (high or low). Implicit memory was assessed with a preference task exploiting exposure effects, and explicit memory with a recognition task. Results indicated that effects of valence and arousal interacted to modulate both implicit and explicit memory in YAs. In OAs, recognition was poorer than in YAs; however, recognition of positive and high-arousal (happy) studied melodies was comparable. Insofar as socioemotional selectivity theory (SST) predicts a preservation of the recognition of positive information, our findings are not fully consistent with the extension of this theory to positive melodies since recognition of low-arousal (peaceful) studied melodies was poorer in OAs. In the preference task, YAs showed stronger exposure effects than OAs, suggesting an age-related decline of implicit memory. This impairment is smaller than the one observed for explicit memory (recognition), extending to the musical domain the dissociation between explicit memory decline and implicit memory relative preservation in aging. Finally, the disproportionate preference for positive material seen in OAs did not translate into stronger exposure effects for positive material suggesting no age-related emotional bias in implicit memory. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Gundersen, H J; Seefeldt, T; Osterby, R
1980-01-01
The width of individual glomerular epithelial foot processes appears very different on electron micrographs. A method for obtaining distributions of the true width of foot processes from that of their apparent width on electron micrographs has been developed, based on geometric probability theory pertaining to a specific geometric model. Analyses of foot process width in humans and rats show a remarkable inter-individual invariance, implying rigid control and therefore great biological significance of foot process width or a derivative thereof. The very low inter-individual variation of the true width, shown in the present paper, makes it possible to demonstrate slight changes in rather small groups of patients or experimental animals.
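The forward problem behind such a method — how a single true width becomes a spread of apparent widths on random sections — can be illustrated with a toy Monte Carlo. The orientation model below (apparent width equal to true width divided by the sine of a random sectioning angle, with the angle bounded away from zero) is an assumption chosen only for illustration and is not the specific geometric model of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed "true" foot-process widths (arbitrary units), tightly controlled
true_width = rng.normal(loc=0.35, scale=0.05, size=10_000)

# Toy sectioning model: a process cut at angle theta to its axis appears
# wider by a factor 1/sin(theta); theta is drawn away from 0 to keep
# apparent widths bounded. Illustrative only.
theta = rng.uniform(np.pi / 12, np.pi / 2, size=true_width.size)
apparent_width = true_width / np.sin(theta)

print("true mean width    :", round(true_width.mean(), 3))
print("apparent mean width:", round(apparent_width.mean(), 3))  # biased upwards
```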
NASA Astrophysics Data System (ADS)
Zhang, Hui; Li, Zhifang; Li, Hui
2012-12-01
In order to study the scattering properties of normal and cancerous tissues from the human stomach, we collected images of human gastric specimens using a phase-contrast microscope. The images were processed using mathematical morphology, from which the equivalent particle size distribution of the tissues can be obtained. Combining this with Mie scattering theory, the scattering properties of the tissues can be calculated. Assuming that the scattering of light in biological tissue can be treated as separate scattering events by different particles, the total scattering properties can be taken as the sum of the scattering by particles of different diameters. The results suggest that the scattering coefficient of cancerous tissue is significantly higher than that of normal tissue, and that the scattering phase function differs, especially in the backscattering region. These findings offer significant clinical benefit for the diagnosis of cancerous tissue.
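A minimal sketch of the "sum over equivalent particles" idea is given below. It uses the van de Hulst anomalous diffraction approximation for the scattering efficiency as a lightweight stand-in for a full Mie calculation, and the wavelength, relative refractive index, radii, and number densities are invented values, not measurements from the study.

```python
import numpy as np

def q_sca_adt(radius_um, wavelength_um=0.6328, m_rel=1.04):
    """Scattering efficiency from the van de Hulst anomalous diffraction
    approximation (non-absorbing particles), used here instead of full Mie."""
    x = 2 * np.pi * radius_um / wavelength_um     # size parameter
    rho = 2 * x * (m_rel - 1)                     # phase-shift parameter
    return 2 - (4 / rho) * np.sin(rho) + (4 / rho**2) * (1 - np.cos(rho))

def scattering_coefficient(radii_um, number_density_per_um3):
    """mu_s = sum_i N_i * Q_i * pi * r_i^2, summed over the size distribution."""
    r = np.asarray(radii_um, dtype=float)
    n = np.asarray(number_density_per_um3, dtype=float)
    return np.sum(n * q_sca_adt(r) * np.pi * r**2)

# Hypothetical equivalent-particle size distribution (radii in um, counts per um^3)
radii = np.array([0.2, 0.5, 1.0, 2.0])
densities = np.array([0.05, 0.02, 0.005, 0.001])
print(scattering_coefficient(radii, densities))   # per-micrometre scattering coefficient (illustrative)
```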
Inter-individual cognitive variability in children with Asperger's syndrome
Gonzalez-Gadea, Maria Luz; Tripicchio, Paula; Rattazzi, Alexia; Baez, Sandra; Marino, Julian; Roca, Maria; Manes, Facundo; Ibanez, Agustin
2014-01-01
Multiple studies have tried to establish the distinctive profile of individuals with Asperger's syndrome (AS). However, recent reports suggest that adults with AS feature heterogeneous cognitive profiles. The present study explores inter-individual variability in children with AS through group comparison and multiple case series analysis. All participants completed an extended battery including measures of fluid and crystallized intelligence, executive functions, theory of mind, and classical neuropsychological tests. Significant group differences were found in theory of mind and other domains related to global information processing. However, the AS group showed high inter-individual variability (both sub- and supra-normal performance) on most cognitive tasks. Furthermore, high fluid intelligence correlated with less general cognitive impairment, high cognitive flexibility, and speed of motor processing. In light of these findings, we propose that children with AS are characterized by a distinct, uneven pattern of cognitive strengths and weaknesses. PMID:25132817
Revealing the Formation Mechanism of Ultra-Diffuse Galaxies
NASA Astrophysics Data System (ADS)
Garmire, Gordon
2017-09-01
Recently a population of large, very low optical surface brightness galaxies, so-called ultra-diffuse galaxies (UDGs), was discovered in the outskirts of the Coma cluster. Stellar line-of-sight velocity dispersions suggest large dark matter halo masses of 10^12 M_sun with very low baryon fractions (~1%). The outstanding question waiting to be answered is: how do UDGs form and evolve? One theory is that UDGs are related to bright galaxies but are prevented from building a normal stellar population through various violent processes, such as gas stripping. We propose to observe Dragonfly 44, the most massive UDG known, for 100 ks with ACIS-I to test some of these formation theories.
Striegel, Deborah A.; Hara, Manami; Periwal, Vipul
2015-01-01
Pancreatic islets of Langerhans consist of endocrine cells, primarily α, β and δ cells, which secrete glucagon, insulin, and somatostatin, respectively, to regulate plasma glucose. β cells form irregular locally connected clusters within islets that act in concert to secrete insulin upon glucose stimulation. Due to the central functional significance of this local connectivity in the placement of β cells in an islet, it is important to characterize it quantitatively. However, quantification of the seemingly stochastic cytoarchitecture of β cells in an islet requires mathematical methods that can capture topological connectivity in the entire β-cell population in an islet. Graph theory provides such a framework. Using large-scale imaging data for thousands of islets containing hundreds of thousands of cells in human organ donor pancreata, we show that quantitative graph characteristics differ between control and type 2 diabetic islets. Further insight into the processes that shape and maintain this architecture is obtained by formulating a stochastic theory of β-cell rearrangement in whole islets, just as the normal equilibrium distribution of the Ornstein-Uhlenbeck process can be viewed as the result of the interplay between a random walk and a linear restoring force. Requiring that rearrangements maintain the observed quantitative topological graph characteristics strongly constrained possible processes. Our results suggest that β-cell rearrangement is dependent on its connectivity in order to maintain an optimal cluster size in both normal and T2D islets. PMID:26266953
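The kind of graph characterization described above can be sketched with networkx: connect cells whose centroids lie within a contact distance and summarize the resulting cluster structure. The cell coordinates and the 12 um threshold below are invented for illustration and do not reproduce the study's imaging data or graph statistics.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)

# Hypothetical beta-cell centroids in a 100 x 100 um islet section
cells = rng.uniform(0, 100, size=(200, 2))
threshold = 12.0   # assumed cell-contact distance (um)

G = nx.Graph()
G.add_nodes_from(range(len(cells)))
for i in range(len(cells)):
    for j in range(i + 1, len(cells)):
        if np.linalg.norm(cells[i] - cells[j]) < threshold:
            G.add_edge(i, j)

# Simple topological characteristics of the cell-contact graph
components = list(nx.connected_components(G))
print("mean degree        :", 2 * G.number_of_edges() / G.number_of_nodes())
print("number of clusters :", len(components))
print("largest cluster    :", max(len(c) for c in components))
```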
Medication errors: definitions and classification
Aronson, Jeffrey K
2009-01-01
To understand medication errors and to identify preventive strategies, we need to classify them and define the terms that describe them. The four main approaches to defining technical terms consider etymology, usage, previous definitions, and the Ramsey–Lewis method (based on an understanding of theory and practice). A medication error is ‘a failure in the treatment process that leads to, or has the potential to lead to, harm to the patient’. Prescribing faults, a subset of medication errors, should be distinguished from prescription errors. A prescribing fault is ‘a failure in the prescribing [decision-making] process that leads to, or has the potential to lead to, harm to the patient’. The converse of this, ‘balanced prescribing’ is ‘the use of a medicine that is appropriate to the patient's condition and, within the limits created by the uncertainty that attends therapeutic decisions, in a dosage regimen that optimizes the balance of benefit to harm’. This excludes all forms of prescribing faults, such as irrational, inappropriate, and ineffective prescribing, underprescribing and overprescribing. A prescription error is ‘a failure in the prescription writing process that results in a wrong instruction about one or more of the normal features of a prescription’. The ‘normal features’ include the identity of the recipient, the identity of the drug, the formulation, dose, route, timing, frequency, and duration of administration. Medication errors can be classified, invoking psychological theory, as knowledge-based mistakes, rule-based mistakes, action-based slips, and memory-based lapses. This classification informs preventive strategies. PMID:19594526
The use of information theory for the evaluation of biomarkers of aging and physiological age.
Blokh, David; Stambler, Ilia
2017-04-01
The present work explores the application of information theoretical measures, such as entropy and normalized mutual information, for research of biomarkers of aging. The use of information theory affords unique methodological advantages for the study of aging processes, as it allows evaluating non-linear relations between biological parameters, providing the precise quantitative strength of those relations, both for individual and multiple parameters, showing cumulative or synergistic effect. Here we illustrate those capabilities utilizing a dataset on heart disease, including diagnostic parameters routinely available to physicians. The use of information-theoretical methods, utilizing normalized mutual information, revealed the exact amount of information that various diagnostic parameters or their combinations contained about the persons' age. Based on those exact informative values for the correlation of measured parameters with age, we constructed a diagnostic rule (a decision tree) to evaluate physiological age, as compared to chronological age. The present data illustrated that younger subjects suffering from heart disease showed characteristics of people of higher age (higher physiological age). Utilizing information-theoretical measures, with additional data, it may be possible to create further clinically applicable information-theory-based markers and models for the evaluation of physiological age, its relation to age-related diseases and its potential modifications by therapeutic interventions. Copyright © 2017 Elsevier B.V. All rights reserved.
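To illustrate the information-theoretical workflow described above, the snippet below computes normalized mutual information between a discretized diagnostic parameter and binned age, then fits a small decision tree as an age-class estimator. The data, binning, and the 0.8-per-year relationship are invented purely to show the calculation, not the study's heart-disease dataset.

```python
import numpy as np
from sklearn.metrics import normalized_mutual_info_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)

# Invented example: chronological age and a diagnostic parameter loosely tied to it
age = rng.integers(30, 80, size=500)
parameter = 100 + 0.8 * age + rng.normal(0, 10, size=500)

# Discretize both variables before computing normalized mutual information
age_bins = np.digitize(age, bins=[40, 50, 60, 70])
param_bins = np.digitize(parameter, bins=np.quantile(parameter, [0.25, 0.5, 0.75]))
print("NMI(parameter, age):", normalized_mutual_info_score(age_bins, param_bins))

# A shallow decision tree mapping the parameter to an age class ("physiological age")
tree = DecisionTreeClassifier(max_depth=3).fit(parameter.reshape(-1, 1), age_bins)
print("predicted age class for parameter = 150:", tree.predict([[150]]))
```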
Neuronal network models of epileptogenesis
Abdullahi, Aminu T.; Adamu, Lawan H.
2017-01-01
Epilepsy is a chronic neurological condition, following some trigger, transforming a normal brain to one that produces recurrent unprovoked seizures. In the search for the mechanisms that best explain the epileptogenic process, there is a growing body of evidence suggesting that the epilepsies are network level disorders. In this review, we briefly describe the concept of neuronal networks and highlight 2 methods used to analyse such networks. The first method, graph theory, is used to describe general characteristics of a network to facilitate comparison between normal and abnormal networks. The second, dynamic causal modelling, is useful in the analysis of the pathways of seizure spread. We concluded that the end results of the epileptogenic process are best understood as abnormalities of neuronal circuitry and not simply as molecular or cellular abnormalities. The network approach promises to generate new understanding and more targeted treatment of epilepsy. PMID:28416779
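As a sketch of the first method mentioned above (describing general network characteristics for comparison), the snippet below computes average clustering and characteristic path length for two synthetic networks with networkx. The random small-world graphs merely stand in for real "normal" and "abnormal" connectivity data.

```python
import networkx as nx

# Synthetic stand-ins for a "normal" and an "abnormal" functional network
normal_net = nx.connected_watts_strogatz_graph(n=60, k=6, p=0.1, seed=0)
abnormal_net = nx.connected_watts_strogatz_graph(n=60, k=6, p=0.9, seed=0)

for name, g in [("normal", normal_net), ("abnormal", abnormal_net)]:
    print(name,
          "| average clustering:", round(nx.average_clustering(g), 3),
          "| characteristic path length:", round(nx.average_shortest_path_length(g), 3))
```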
Restoring primacy in amnesic free recall: evidence for the recency theory of primacy.
Dewar, Michaela; Brown, Gordon D A; Della Sala, Sergio
2011-09-01
Primacy and recency effects at immediate recall are thought to reflect the independent functioning of a long-term memory store (primacy) and a short-term memory store (recency). Key evidence for this theory comes from amnesic patients who show severe long-term memory storage deficits, coupled with profoundly attenuated primacy. Here we challenge this dominant dual-store theory of immediate recall by demonstrating that attenuated primacy in amnesic patients can reflect abnormal working memory rehearsal processes. D.A., a patient with severe amnesia, presented with profoundly attenuated primacy when using her preferred atypical noncumulative rehearsal strategy. In contrast, despite her severe amnesia, she showed normal primacy when her rehearsal was matched with that of controls via an externalized cumulative rehearsal schedule. Our data are in keeping with the "recency theory of primacy" and suggest that primacy at immediate recall is dependent upon medial temporal lobe involvement in cumulative rehearsal rather than long-term memory storage.
Unifying Theories of Psychedelic Drug Effects
Swanson, Link R.
2018-01-01
How do psychedelic drugs produce their characteristic range of acute effects in perception, emotion, cognition, and sense of self? How do these effects relate to the clinical efficacy of psychedelic-assisted therapies? Efforts to understand psychedelic phenomena date back more than a century in Western science. In this article I review theories of psychedelic drug effects and highlight key concepts which have endured over the last 125 years of psychedelic science. First, I describe the subjective phenomenology of acute psychedelic effects using the best available data. Next, I review late 19th-century and early 20th-century theories—model psychoses theory, filtration theory, and psychoanalytic theory—and highlight their shared features. I then briefly review recent findings on the neuropharmacology and neurophysiology of psychedelic drugs in humans. Finally, I describe recent theories of psychedelic drug effects which leverage 21st-century cognitive neuroscience frameworks—entropic brain theory, integrated information theory, and predictive processing—and point out key shared features that link back to earlier theories. I identify an abstract principle which cuts across many theories past and present: psychedelic drugs perturb universal brain processes that normally serve to constrain neural systems central to perception, emotion, cognition, and sense of self. I conclude that making an explicit effort to investigate the principles and mechanisms of psychedelic drug effects is a uniquely powerful way to iteratively develop and test unifying theories of brain function. PMID:29568270
van Mierlo, Pieter; Lie, Octavian; Staljanssens, Willeke; Coito, Ana; Vulliémoz, Serge
2018-04-26
We investigated the influence of processing steps in the estimation of multivariate directed functional connectivity during seizures recorded with intracranial EEG (iEEG) on seizure-onset zone (SOZ) localization. We studied the effect of (i) the number of nodes, (ii) time-series normalization, (iii) the choice of multivariate time-varying connectivity measure: Adaptive Directed Transfer Function (ADTF) or Adaptive Partial Directed Coherence (APDC), and (iv) graph theory measure: outdegree or shortest path length. First, simulations were performed to quantify the influence of the various processing steps on the accuracy of SOZ localization. Afterwards, the SOZ was estimated from a 113-electrode iEEG seizure recording and compared with the resection that rendered the patient seizure-free. The simulations revealed that ADTF is preferred over APDC to localize the SOZ from ictal iEEG recordings. Normalizing the time series before analysis resulted in an increase of 25-35% in correctly localized SOZs, while adding more nodes to the connectivity analysis led to a moderate decrease of 10% when comparing 128 with 32 input nodes. The real-seizure connectivity estimates localized the SOZ inside the resection area using the ADTF coupled to outdegree or shortest path length. Our study showed that normalizing the time series is an important pre-processing step, while adding nodes to the analysis only marginally affected the SOZ localization. The study shows that directed multivariate Granger-based connectivity analysis is feasible with many input nodes (> 100) and that normalization of the time series before connectivity analysis is preferred.
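As a minimal sketch of the graph-theory step (weighted out-degree and shortest path length computed from a directed connectivity matrix), the following uses networkx on a synthetic matrix. The orientation convention conn[i, j] = "influence driven from node i onto node j" and the 1/weight "distance" used for shortest paths are assumptions, not the authors' exact definitions.

```python
# Minimal sketch: ranking candidate SOZ nodes from a directed connectivity
# matrix via weighted out-degree and shortest path length (networkx).
# The convention conn[i, j] = "influence driven from node i onto node j" and
# the 1/weight "distance" used for shortest paths are assumptions.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
conn = rng.random((16, 16))            # stand-in for an ADTF-style matrix
np.fill_diagonal(conn, 0.0)

G = nx.from_numpy_array(conn, create_using=nx.DiGraph)

# Strong drivers have large weighted out-degree (out-strength).
out_strength = dict(G.out_degree(weight="weight"))

# Convert coupling strength to a distance so short paths mean strong influence.
for u, v, d in G.edges(data=True):
    d["dist"] = 1.0 / max(d["weight"], 1e-12)
avg_path = {n: np.mean(list(nx.shortest_path_length(G, source=n, weight="dist").values()))
            for n in G.nodes}

ranked = sorted(G.nodes, key=lambda n: (-out_strength[n], avg_path[n]))
print("candidate SOZ nodes:", ranked[:3])
```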
Understanding the impact of accreditation on quality in healthcare: A grounded theory approach.
Desveaux, L; Mitchell, J I; Shaw, J; Ivers, N M
2017-11-01
To explore how organizations respond to and interact with the accreditation process and the actual and potential mechanisms through which accreditation may influence quality. Qualitative grounded theory study. Organizations that had participated in Accreditation Canada's Qmentum program during January 2014-June 2016. Individuals who had coordinated the accreditation process or were involved in managing or promoting quality. The accreditation process is largely viewed as a quality assurance process, which often feeds into quality improvement activities if the feedback aligns with organizational priorities. Three key stages are required for accreditation to impact quality: coherence, organizational buy-in and organizational action. These stages map to constructs outlined in Normalization Process Theory. Coherence is established when an organization and its staff perceive that accreditation aligns with the organization's beliefs, context and model of service delivery. Organizational buy-in is established when there is both a conceptual champion and an operational champion, and is influenced by both internal and external contextual factors. Quality improvement action occurs when organizations take purposeful action in response to observations, feedback or self-reflection resulting from the accreditation process. The accreditation process has the potential to influence quality through a series of three mechanisms: coherence, organizational buy-in and collective quality improvement action. Internal and external contextual factors, including individual characteristics, influence an organization's experience of accreditation. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
Trosko, James E; Tai, Mei-Hui
2006-01-01
Inflammation, induced by microbial agents, radiation, or endogenous or exogenous chemicals, has been associated with chronic diseases, including cancer. Since carcinogenesis has been characterized as consisting of the 'initiation', 'promotion' and 'progression' phases, the inflammatory process could affect any or all three phases. The stem cell theory of carcinogenesis has been revived, now that human adult stem cells have been isolated and shown to be 'targets' for neoplastic transformation. Oct4, a transcription factor, has been associated with adult stem cells, as well as their immortalized and tumorigenic derivatives, but not with the normal differentiated daughters. These data are consistent with the stem cell theory of carcinogenesis. In addition, Gap Junctional Intercellular Communication (GJIC) seems to play a major role in cell growth. Inhibition of GJIC by non-genotoxic chemicals or various oncogenes seems to be the mechanism for the tumor promotion and progression phases of carcinogenesis. Many of the toxins, synthetic non-genotoxicants, and endogenous inflammatory factors have been shown to inhibit GJIC and act as tumor promoters. The inhibition of GJIC might be the mechanism by which the inflammatory process affects cancer, and intervening during tumor promotion with anti-inflammatory factors might therefore be the most efficacious anti-cancer strategy.
Challenging the Ideology of Normal in Schools
ERIC Educational Resources Information Center
Annamma, Subini A.; Boelé, Amy L.; Moore, Brooke A.; Klingner, Janette
2013-01-01
In this article, we build on Brantlinger's work to critique the binary of normal and abnormal applied in US schools that create inequities in education. Operating from a critical perspective, we draw from Critical Race Theory, Disability Studies in Education, and Cultural/Historical Activity Theory to build a conceptual framework for…
Introduction to "Queering the Writing Center"
ERIC Educational Resources Information Center
Eodice, Michele
2010-01-01
Queer theory challenges what is "normal" and questions the mechanics behind individuals and their institutions' efforts to maintain "normal." Queer theory can help a person get over himself/herself, and, as a result, the words, bodies, spaces, and beliefs that he/she holds dear will be called upon to respond. Harry Denny's article instructs…
High resolution beamforming on large aperture vertical line arrays: Processing synthetic data
NASA Astrophysics Data System (ADS)
Tran, Jean-Marie Q.; Hodgkiss, William S.
1990-09-01
This technical memorandum studies the beamforming of large aperture line arrays deployed vertically in the water column. The work concentrates on the use of high resolution techniques. Two processing strategies are envisioned: (1) full aperture coherent processing, which offers in theory the best processing gain; and (2) subaperture processing, which consists of extracting subapertures from the array and recombining the angular spectra estimated from these subarrays. The conventional beamformer, the minimum variance distortionless response (MVDR) processor, the multiple signal classification (MUSIC) algorithm and the minimum norm method are used in this study. To validate the various processing techniques, the ATLAS normal mode program is used to generate synthetic data that constitute a realistic signal environment. A deep-water, range-independent sound velocity profile environment, characteristic of the North-East Pacific, is studied for two different 128-sensor arrays: a very long one cut for 30 Hz and operating at 20 Hz, and a shorter one cut for 107 Hz and operating at 100 Hz. The simulated sound source is 5 m deep. The full aperture and subaperture processing are implemented with curved and plane wavefront replica vectors. The beamforming results are examined and compared to the ray-theory results produced by the generic sonar model.
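For illustration, a minimal sketch of conventional and MVDR angular spectra for a vertical line array with plane-wave replica vectors is given below. The array geometry, frequency, and simulated snapshot covariance are illustrative stand-ins, not the ATLAS normal-mode synthetic data used in the memorandum.

```python
# Minimal sketch: conventional vs. MVDR angular spectra for a vertical line
# array using plane-wave replica vectors. Geometry, frequency and the simulated
# snapshot covariance are illustrative stand-ins, not the ATLAS synthetic data.
import numpy as np

c, f, N, d = 1500.0, 100.0, 128, 7.0          # sound speed (m/s), Hz, sensors, spacing (m)
k = 2 * np.pi * f / c
z = np.arange(N) * d                          # sensor depths along the vertical array

def replica(theta_deg):
    """Plane-wave steering vector for arrival angle theta measured from broadside."""
    return np.exp(1j * k * z * np.sin(np.radians(theta_deg))) / np.sqrt(N)

rng = np.random.default_rng(1)
a0 = replica(10.0)                            # one plane wave arriving at 10 degrees
snaps = (a0[:, None] * rng.standard_normal(200)
         + 0.1 * (rng.standard_normal((N, 200)) + 1j * rng.standard_normal((N, 200))))
R = snaps @ snaps.conj().T / 200
Rinv = np.linalg.inv(R + 1e-3 * np.eye(N))    # diagonal loading for numerical stability

angles = np.linspace(-30, 30, 241)
p_conv = [np.real(replica(t).conj() @ R @ replica(t)) for t in angles]
p_mvdr = [1.0 / np.real(replica(t).conj() @ Rinv @ replica(t)) for t in angles]
print("conventional peak:", angles[int(np.argmax(p_conv))], "deg")
print("MVDR peak:", angles[int(np.argmax(p_mvdr))], "deg")
```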
Hiding in plain sight: communication theory in implementation science.
Manojlovich, Milisa; Squires, Janet E; Davies, Barbara; Graham, Ian D
2015-04-23
Poor communication among healthcare professionals is a pressing problem, contributing to widespread barriers to patient safety. The word "communication" means to share or make common. In the literature, two communication paradigms dominate: (1) communication as a transactional process responsible for information exchange, and (2) communication as a transformational process responsible for causing change. Implementation science has focused on information exchange attributes while largely ignoring transformational attributes of communication. In this paper, we debate the merits of encompassing both paradigms. We conducted a two-staged literature review searching for the concept of communication in implementation science to understand how communication is conceptualized. Twenty-seven theories, models, or frameworks were identified; only Rogers' Diffusion of Innovations theory provides a definition of communication and includes both communication paradigms. Most models (notable exceptions include Diffusion of Innovations, The Ottawa Model of Research Use, and Normalization Process Theory) describe communication as a transactional process. But thinking of communication solely as information transfer or exchange misrepresents reality. We recommend that implementation science theories (1) propose and test the concept of shared understanding when describing communication, (2) acknowledge that communication is multi-layered, identify at least a few layers, and posit how identified layers might affect the development of shared understanding, (3) acknowledge that communication occurs in a social context, providing a frame of reference for both individuals and groups, (4) acknowledge the unpredictability of communication (and healthcare processes in general), and (5) engage with and draw on work done by communication theorists. Implementation science literature has conceptualized communication as a transactional process (when communication has been mentioned at all), thereby ignoring a key contributor to implementation intervention success. When conceptualized as a transformational process, the focus of communication moves to shared understanding and is grounded in human interactions and the way we go about constructing knowledge. Instead of hiding in plain sight, we suggest explicitly acknowledging the role that communication plays in our implementation efforts. By using both paradigms, we can investigate when communication facilitates implementation, when it does not, and how to improve it so that our implementation and clinical interventions are embraced by clinicians and patients alike.
Quantized mode of a leaky cavity
NASA Astrophysics Data System (ADS)
Dutra, S. M.; Nienhuis, G.
2000-12-01
We use Thomson's classical concept of mode of a leaky cavity to develop a quantum theory of cavity damping. This theory generalizes the conventional system-reservoir theory of high-Q cavity damping to arbitrary Q. The small system now consists of damped oscillators corresponding to the natural modes of the leaky cavity rather than undamped oscillators associated with the normal modes of a fictitious perfect cavity. The formalism unifies semiclassical Fox-Li modes and the normal modes traditionally used for quantization. It also lays the foundations for a full quantum description of excess noise. The connection with Siegman's semiclassical work is straightforward. In a wider context, this theory constitutes a radical departure from present models of dissipation in quantum mechanics: unlike conventional models, system and reservoir operators no longer commute with each other. This noncommutability is an unavoidable consequence of having to use natural cavity modes rather than normal modes of a fictitious perfect cavity.
Rastatter, M; Dell, C W; McGuire, R A; Loren, C
1987-03-01
Previous studies investigating hemispheric organization for processing concrete and abstract nouns have provided conflicting results. Using manual reaction time tasks some studies have shown that the right hemisphere is capable of analyzing concrete words but not abstract. Others, however, have inferred that the left hemisphere is the sole analyzer of both types of lexicon. The present study tested these issues further by measuring vocal reaction times of normal subjects to unilaterally presented concrete and abstract items. Results were consistent with a model of functional localization which suggests that the minor hemisphere is capable of differentially processing both types of lexicon in the presence of a dominant left hemisphere.
Bamford, Claire; Heaven, Ben; May, Carl; Moynihan, Paula
2012-10-30
Optimizing the dietary intake of older people can prevent nutritional deficiencies and diet-related diseases, thereby improving quality of life. However, there is evidence that the nutritional intake of older people living in care homes is suboptimal, with high levels of saturated fat, salt, and added sugars. The UK Food Standards Agency therefore developed nutrient- and food-based guidance for residential care homes. The acceptability of these guidelines and their feasibility in practice is unknown. This study used the Normalization Process Theory (NPT) to understand the barriers and facilitators to implementing the guidelines and inform future implementation. We conducted a process evaluation in five care homes in the north of England using qualitative methods (observation and interviews) to explore the views of managers, care staff, catering staff, and domestic staff. Data were analyzed thematically and discussed in data workshops; emerging themes were then mapped to the constructs of NPT. Many staff perceived the guidelines as unnecessarily restrictive and irrelevant to older people. In terms of NPT, the guidelines simply did not make sense (coherence), and as a result, relatively few staff invested in the guidelines (cognitive participation). Even where staff supported the guidelines, implementation was hampered by a lack of nutritional knowledge and institutional support (collective action). Finally, the absence of observable benefits to clients confirmed the negative preconceptions of many staff, with limited evidence of reappraisal following implementation (reflexive monitoring). The successful implementation of the nutrition guidelines requires that the fundamental issues relating to their perceived value and fit with other priorities and goals be addressed. Specialist support is needed to equip staff with the technical knowledge and skills required for menu analysis and development and to devise ways of evaluating the outcomes of modified menus. NPT proved useful in conceptualizing barriers to implementation; robust links with behavior-change theories would further increase the practical utility of NPT.
Cicchetti, Dante
2016-01-01
Developmental theories can be affirmed, challenged, and augmented by incorporating knowledge about atypical ontogenesis. Investigations of the biological, socioemotional, and personality development in individuals with high-risk conditions and psychopathological disorders can provide an entrée into the study of system organization, disorganization, and reorganization. This article examines child maltreatment to illustrate the benefit that can be derived from the study of individuals subjected to nonnormative caregiving experiences. Relative to an average expectable environment, which consists of a species-specific range of environmental conditions that support adaptive development among genetically normal individuals, maltreating families fail to provide many of the experiences that are required for normal development. Principles gleaned from the field of developmental psychopathology provide a framework for understanding multilevel functioning in normality and pathology. Knowledge of normative developmental processes provides the impetus to design and implement randomized control trial (RCT) interventions that can promote resilient functioning in maltreated children.
Intra- and interpattern relations in letter recognition.
Sanocki, T
1991-11-01
Strings of 4 unrelated letters were backward masked at varying durations to examine 3 major issues. (a) One issue concerned relational features. Letters with abnormal relations but normal elements were created by interchanging elements between large and small normal letters. Overall accuracy was higher for letters with normal relations, consistent with the idea that relational features are important in recognition. (b) Interpattern relations were examined by mixing large and small letters within strings. Relative to pure strings, accuracy was reduced, but only for small letters and only when in mixed strings. This effect can be attributed to attentional priority for larger forms over smaller forms, which also explains global precedence with hierarchical forms. (c) Forced-choice alternatives were manipulated in Experiments 2 and 3 to test feature integration theory. Relational information was found to be processed at least as early as feature presence or absence.
Sell, Stewart
2008-01-01
Identification of the cells in the liver that produce alpha-fetoprotein during development, in response to liver injury and during the early stages of chemical hepatocarcinogenesis led to the conclusion that maturation arrest of liver-determined tissue stem cells was the cellular process that gives rise to hepatocellular carcinomas. When the cellular changes in these processes were compared to those of the formation of teratocarcinomas, the hypothesis arose that all cancers arise from maturation arrest of tissue-determined stem cells. This was essentially a reinterpretation of the embryonal rest theory of cancer whereby tissue stem cells take the role of embryonal rests. A corollary of the stem cell theory of the origin of cancer is that cancers contain the same functional cell populations as normal tissues: stem cells, transit-amplifying cells and mature cells. Cancer stem cells retain the essential feature of normal stem cells: the ability to self-renew. Growth of cancers is due to continued proliferation of cancer transit-amplifying cells that do not differentiate to mature cells (maturation arrest). On the other hand, cancer stem cells generally divide very rarely and contribute little to tumor growth. However, the presence of cancer stem cells in tumors is believed to be responsible for the properties of immortalization, transplantability and resistance to therapy characteristic of cancers. Current therapies for cancer (chemotherapy, radiotherapy, antiangiogenesis and differentiation therapy) are directed against the cancer transit-amplifying cells. When these therapies are discontinued, the cancer re-forms from the cancer stem cells. Therapy directed toward interruption of the cell signaling pathways that maintain cancer stem cells could lead to new modalities for the prevention of regrowth of the cancer. Copyright 2008 S. Karger AG, Basel.
McCarthy, R A
2001-02-01
Clinical and normal psychology have had a long tradition of close interaction in British psychology. The roots of this interplay may predate the development of the British Psychological Society, but the Society has encouraged and supported this line of research since its inception. One fundamental British insight has been to consider the evidence from pathology as a potential constraint on theories of normal function. In turn, theories of normal function have been used to understand and illuminate cognitive pathology. This review discusses some of the areas in which clinical contributions to cognitive theory have been most substantial. As with other contributions to this volume, attempts are also made to read the runes and anticipate future developments.
Cancer patients' experiences with nature: Normalizing dichotomous realities.
Blaschke, Sarah; O'Callaghan, Clare C; Schofield, Penelope; Salander, Pär
2017-01-01
To explore cancer patients' subjective experiences with nature in order to examine the relevance of nature-based care opportunities in cancer care contexts. The rationale was to describe the underlying mechanisms of this interaction and produce translatable knowledge. Qualitative research design informed by grounded theory. Sampling was initially convenience and then theoretical. Competent adults with any cancer diagnosis were eligible to participate in a semi-structured interview exploring views about the role of nature in their lives. Audio-recorded and transcribed interviews were analyzed using inductive, cyclic, and constant comparative analysis. Twenty cancer patients (9 female) reported detailed descriptions of their experiences with nature, from which a typology of five common nature interactions emerged. A theory model was generated constituting a core category and two inter-related themes explaining a normalization process in which patients negotiate their shifting realities (Core Category). Nature functioned as a support structure and nurtured patients' inner and outer capacities to respond and connect more effectively (Theme A). Once enabled and comforted, patients could engage survival and reconstructive maneuvers and explore the consequences of cancer (Theme B). A dynamic relationship was evident between moving away from and, simultaneously, advancing towards the cancer reality in order to accept a shifting normality. From a place of comfort and safety, patients felt supported to deal differently and more creatively with the threat and demands of cancer diagnosis, treatment and outlook. New understanding about nature's role in cancer patients' lives calls attention to recognizing additional forms of psychosocial care that encourage patients' own coping and creative processes to deal with their strain and, in some cases, reconstruct everyday lives. Further research is required to determine how nature opportunities can be feasibly delivered in the cancer care setting. Copyright © 2016 Elsevier Ltd. All rights reserved.
Adaptive neural coding: from biological to behavioral decision-making
Louie, Kenway; Glimcher, Paul W.; Webb, Ryan
2015-01-01
Empirical decision-making in diverse species deviates from the predictions of normative choice theory, but why such suboptimal behavior occurs is unknown. Here, we propose that deviations from optimality arise from biological decision mechanisms that have evolved to maximize choice performance within intrinsic biophysical constraints. Sensory processing utilizes specific computations such as divisive normalization to maximize information coding in constrained neural circuits, and recent evidence suggests that analogous computations operate in decision-related brain areas. These adaptive computations implement a relative value code that may explain the characteristic context-dependent nature of behavioral violations of classical normative theory. Examining decision-making at the computational level thus provides a crucial link between the architecture of biological decision circuits and the form of empirical choice behavior. PMID:26722666
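The divisive normalization computation referred to above can be sketched in a few lines; the semi-saturation constant and pool weight are illustrative parameters, not fitted values.

```python
# Minimal sketch: divisive normalization of a set of inputs (e.g., option values).
# The semi-saturation constant sigma and pool weight w are illustrative.
import numpy as np

def divisive_normalization(values, sigma=1.0, w=1.0):
    values = np.asarray(values, dtype=float)
    return values / (sigma + w * values.sum())

print(divisive_normalization([10.0, 5.0, 1.0]))        # values coded relative to the pool
print(divisive_normalization([10.0, 5.0, 1.0, 8.0]))   # adding an option rescales the rest
```

Because every input is scaled by the summed activity of the pool, each value is coded relative to its context, which is the relative value code described in the abstract.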
Local vibrational modes of the water dimer - Comparison of theory and experiment
NASA Astrophysics Data System (ADS)
Kalescky, R.; Zou, W.; Kraka, E.; Cremer, D.
2012-12-01
Local and normal vibrational modes of the water dimer are calculated at the CCSD(T)/CBS level of theory. The local H-bond stretching frequency is 528 cm⁻¹ compared to a normal mode stretching frequency of just 143 cm⁻¹. The adiabatic connection scheme between local and normal vibrational modes reveals that the lowering is due to mass coupling, a change in the anharmonicity, and coupling with the local HOH bending modes. The local mode stretching force constant is related to the strength of the H-bond whereas the normal mode stretching force constant and frequency lead to an erroneous underestimation of the H-bond strength.
Colarusso, Calvin A
2012-10-01
Moses Laufer described the central masturbation fantasy as an essentially adolescent phenomenon that leads to the final sexual organization. In this paper the central masturbation fantasy, formulated during the preoedipal and oedipal years, consolidated in adolescence, and in a process of continuous evolution across the life cycle, is considered an essential concept for understanding sexuality in heterosexual males. Sex and aggression, as posited in Freud's dual theory of the drives, are core components of all masturbation fantasies, across the diagnostic spectrum, from the most normal/neurotic to the most criminally bizarre. Clinical examples illustrate both points. The tendency among clinicians, particularly clinical associates, to avoid analyzing all aspects of masturbation is discussed.
Long-wave theory for a new convective instability with exponential growth normal to the wall.
Healey, J J
2005-05-15
A linear stability theory is presented for the boundary-layer flow produced by an infinite disc rotating at constant angular velocity in otherwise undisturbed fluid. The theory is developed in the limit of long waves and when the effects of viscosity on the waves can be neglected. This is the parameter regime recently identified by the author in a numerical stability investigation where a curious new type of instability was found in which disturbances propagate and grow exponentially in the direction normal to the disc (i.e. the growth takes place in a region of zero mean shear). The theory describes the mechanisms controlling the instability, the role and location of critical points, and presents a saddle-point analysis describing the large-time evolution of a wave packet in frames of reference moving normal to the disc. The theory also shows that the previously obtained numerical solutions for numerically large wavelengths do indeed lie in the asymptotic long-wave regime, and so the behaviour and mechanisms described here may apply to a number of cross-flow instability problems.
Normalized value coding explains dynamic adaptation in the human valuation process.
Khaw, Mel W; Glimcher, Paul W; Louie, Kenway
2017-11-28
The notion of subjective value is central to choice theories in ecology, economics, and psychology, serving as an integrated decision variable by which options are compared. Subjective value is often assumed to be an absolute quantity, determined in a static manner by the properties of an individual option. Recent neurobiological studies, however, have shown that neural value coding dynamically adapts to the statistics of the recent reward environment, introducing an intrinsic temporal context dependence into the neural representation of value. Whether valuation exhibits this kind of dynamic adaptation at the behavioral level is unknown. Here, we show that the valuation process in human subjects adapts to the history of previous values, with current valuations varying inversely with the average value of recently observed items. The dynamics of this adaptive valuation are captured by divisive normalization, linking these temporal context effects to spatial context effects in decision making as well as spatial and temporal context effects in perception. These findings suggest that adaptation is a universal feature of neural information processing and offer a unifying explanation for contextual phenomena in fields ranging from visual psychophysics to economic choice.
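A minimal sketch of temporally adaptive valuation is given below: the normalization pool is a recency-weighted average of previously observed values, so the same item is valued lower after a run of high-value items. The exponential-average form and the parameter values are assumptions for illustration, not the authors' fitted model.

```python
# Minimal sketch: adaptive valuation in which the normalization pool is a
# recency-weighted average of past values. The exponential-average form and
# the parameters sigma and alpha are illustrative assumptions.
import numpy as np

def adaptive_valuations(values, sigma=1.0, alpha=0.3):
    history = 0.0
    normalized = []
    for v in values:
        normalized.append(v / (sigma + history))       # value relative to recent context
        history = (1 - alpha) * history + alpha * v    # update the recency-weighted average
    return np.array(normalized)

sequence = [5, 5, 5, 20, 20, 20, 5]      # the final 5 follows a run of high values
print(np.round(adaptive_valuations(sequence), 3))
```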
The analysis of harmonic generation coefficients in the ablative Rayleigh-Taylor instability
NASA Astrophysics Data System (ADS)
Lu, Yan; Fan, Zhengfeng; Lu, Xinpei; Ye, Wenhua; Zou, Changlin; Zhang, Ziyun; Zhang, Wen
2017-10-01
In this research, we use numerical simulation to investigate the generation coefficients of the first three harmonics and the zeroth harmonic in the Ablative Rayleigh-Taylor Instability. It is shown that the interface shifts to the low temperature side during the ablation process. Within third-order perturbation theory, the first three harmonic amplitudes of the weakly nonlinear regime are calculated and the harmonic generation coefficients are then obtained by curve fitting. The simulation results show that the harmonic generation coefficients change with time and wavelength. Using the higher-order perturbation theory, we find that more and more harmonics are generated in the later weakly nonlinear stage, which is caused by the negative feedback of the later higher harmonics. Furthermore, extending the third-order theory to the fifth-order theory, we find that the second- and third-harmonic coefficients depend linearly on the wavelength, while the feedback coefficients are almost constant. Further analysis also shows that when the fifth-order theory is considered, the normalized effective amplitudes of the second and third harmonics can reach about 25%-40%, compared with only 15%-25% in the previous third-order theory. Therefore, the third-order perturbation theory needs to be modified by the higher-order theory when ηL reaches about 20% of the perturbation wavelength.
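The curve-fitting step can be sketched as follows, assuming, purely for illustration, that the second-harmonic amplitude scales with the square of the fundamental amplitude (a generic weakly nonlinear ansatz); the data are synthetic and the ansatz is not taken from the paper's derivation.

```python
# Minimal sketch of the curve-fitting step. The quadratic ansatz
# eta2 = c2 * k * eta1**2 is a generic weakly nonlinear assumption used only
# for illustration; it is not taken from the paper, and the data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

k = 2 * np.pi / 50e-6                          # illustrative perturbation wavenumber (1/m)
eta1 = np.linspace(0.2e-6, 3e-6, 30)           # synthetic fundamental amplitudes (m)
noise = 1 + 0.05 * np.random.default_rng(2).standard_normal(30)
eta2_obs = 0.4 * k * eta1**2 * noise           # synthetic "measured" second harmonic

def model(eta1, c2):
    return c2 * k * eta1**2

(c2_fit,), _ = curve_fit(model, eta1, eta2_obs)
print(f"fitted second-harmonic generation coefficient: {c2_fit:.3f}")
```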
Aerodynamic characteristics of horizontal tail surfaces
NASA Technical Reports Server (NTRS)
Silverstein, Abe; Katzoff, S
1940-01-01
Collected data are presented on the aerodynamic characteristics of 17 horizontal tail surfaces including several with balanced elevators and two with end plates. Curves are given for coefficients of normal force, drag, and elevator hinge moment. A limited analysis of the results has been made. The normal-force coefficients are in better agreement with the lifting-surface theory of Prandtl and Blenk for airfoils of low aspect ratio than with the usual lifting-line theory. Only partial agreement exists between the elevator hinge-moment coefficients and those predicted by Glauert's thin-airfoil theory.
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Lu, Laura
2008-01-01
This article provides the theory and application of the 2-stage maximum likelihood (ML) procedure for structural equation modeling (SEM) with missing data. The validity of this procedure does not require the assumption of a normally distributed population. When the population is normally distributed and all missing data are missing at random…
The Distribution of the Product Explains Normal Theory Mediation Confidence Interval Estimation.
Kisbu-Sakarya, Yasemin; MacKinnon, David P; Miočević, Milica
2014-05-01
The distribution of the product has several useful applications. One of these applications is its use to form confidence intervals for the indirect effect as the product of 2 regression coefficients. The purpose of this article is to investigate how the moments of the distribution of the product explain normal theory mediation confidence interval coverage and imbalance. Values of the critical ratio for each random variable are used to demonstrate how the moments of the distribution of the product change across values of the critical ratio observed in research studies. Results of the simulation study showed that as skewness in absolute value increases, coverage decreases. And as skewness in absolute value and kurtosis increases, imbalance increases. The difference between testing the significance of the indirect effect using the normal theory versus the asymmetric distribution of the product is further illustrated with a real data example. This article is the first study to show the direct link between the distribution of the product and indirect effect confidence intervals and clarifies the results of previous simulation studies by showing why normal theory confidence intervals for indirect effects are often less accurate than those obtained from the asymmetric distribution of the product or from resampling methods.
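The contrast drawn above can be made concrete with a short sketch: a normal-theory (Sobel-type) interval for the indirect effect ab next to a Monte Carlo approximation of the asymmetric distribution of the product. The coefficient estimates and standard errors are illustrative.

```python
# Minimal sketch: normal-theory (Sobel-type) vs. distribution-of-the-product
# confidence intervals for an indirect effect a*b. Estimates are illustrative.
import numpy as np
from scipy import stats

a, se_a = 0.40, 0.12          # X -> M path and its standard error
b, se_b = 0.35, 0.15          # M -> Y path (adjusted for X) and its standard error
z = stats.norm.ppf(0.975)

# Normal-theory interval: ab +/- z * first-order standard error.
se_ab = np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)
print("normal theory CI:", (a * b - z * se_ab, a * b + z * se_ab))

# Monte Carlo approximation to the (asymmetric) distribution of the product.
rng = np.random.default_rng(0)
product = rng.normal(a, se_a, 100_000) * rng.normal(b, se_b, 100_000)
print("distribution-of-product CI:", tuple(np.percentile(product, [2.5, 97.5])))
```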
Heim, Stefan; Grande, Marion; Meffert, Elisabeth; Eickhoff, Simon B; Schreiber, Helen; Kukolja, Juraj; Shah, Nadim Jon; Huber, Walter; Amunts, Katrin
2010-12-01
Recent theories of developmental dyslexia explain reading deficits in terms of deficient phonological awareness, attention, visual and auditory processing, or automaticity. Since dyslexia has a neurobiological basis, the question arises how the reader's proficiency in these cognitive variables affects the brain regions involved in visual word recognition. This question was addressed in two fMRI experiments with 19 normally reading children (Experiment 1) and 19 children with dyslexia (Experiment 2). First, reading-specific brain activation was assessed by contrasting the BOLD signal for reading aloud words vs. overtly naming pictures of real objects. Next, ANCOVAs on brain activation during reading were performed, with the individuals' scores for all five cognitive variables, assessed outside the scanner, as covariates. Whereas the normal readers' brain activation during reading showed co-variation effects predominantly in the right hemisphere, the reverse pattern was observed for the dyslexics. In particular, middle frontal gyrus, inferior parietal cortex, and precuneus showed contralateral effects for controls as compared to dyslexics. In line with earlier findings in the literature, these data hint at a global change in hemispheric asymmetry during cognitive processing in dyslexic readers, which, in turn, might affect reading proficiency. Copyright © 2010 Elsevier Inc. All rights reserved.
Li, Jingrui; Kondov, Ivan; Wang, Haobin; Thoss, Michael
2015-04-10
A recently developed methodology to simulate photoinduced electron transfer processes at dye-semiconductor interfaces is outlined. The methodology employs a first-principles-based model Hamiltonian and accurate quantum dynamics simulations using the multilayer multiconfiguration time-dependent Hartree approach. This method is applied to study electron injection in the dye-semiconductor system coumarin 343-TiO2. Specifically, the influence of electronic-vibrational coupling is analyzed. Extending previous work, we consider the influence of Dushinsky rotation of the normal modes as well as anharmonicities of the potential energy surfaces on the electron transfer dynamics.
Matheson, Lauren; Boulton, Mary; Lavender, Verna; Protheroe, Andrew; Brand, Sue; Wanat, Marta; Watson, Eila
2016-02-01
Testicular cancer commonly affects men in the prime of their lives. While survival rates are excellent, little previous research has examined men's experiences of adjustment to survivorship. We aimed to explore this issue in younger testicular cancer survivors. In-depth qualitative interviews were conducted with testicular cancer survivors over two time points approximately 6 months apart in the year following treatment completion. Interviews were analysed using a grounded theory approach. The sample included 18 testicular cancer survivors between 22 and 44 years (mean age 34). A grounded theory was developed, which explained the process of positive adjustment over the first year following the treatment completion in terms of men's ability to dismantle the present and future threats of cancer, involving the key transitions of gaining a sense of perspective and striving to get on with life and restore normality. These were facilitated by six key processes. The processes that explained a negative adjustment trajectory are also presented. These findings contribute to the understanding of the psychosocial impact of testicular cancer on younger men's lives and have implications for the provision of support to testicular cancer survivors. Further investigation into the feasibility of one-on-one peer support interventions is warranted, as well as informal support that respects men's desire for independence. Understanding the processes involved in adjustment highlights ways in which health professionals can offer support to those struggling to adjust through challenging illness beliefs, encouraging emotional disclosure and facilitating peer mentoring.
Prediction of static friction coefficient in rough contacts based on the junction growth theory
NASA Astrophysics Data System (ADS)
Spinu, S.; Cerlinca, D.
2017-08-01
The classic approach to the slip-stick contact is based on the framework advanced by Mindlin, in which localized slip occurs on the contact area when the local shear traction exceeds the product of the local pressure and the static friction coefficient. This assumption may be too conservative in the case of high tractions arising at the asperity tips in the contact of rough surfaces, because the shear traction may be allowed to exceed the shear strength of the softer material. Consequently, the classic frictional contact model is modified in this paper so that gross sliding occurs when the junctions formed between all contacting asperities are independently sheared. In this framework, when the contact tractions, normal and shear, exceed the hardness of the softer material on the entire contact area, the material of the asperities yields and the junction growth process ends in all contact regions, leading to gross sliding inception. This friction mechanism is implemented in a previously proposed numerical model for the Cattaneo-Mindlin slip-stick contact problem, which is modified to accommodate the junction growth theory. The frictionless normal contact problem is solved first; the tangential force is then gradually increased until gross sliding inception. The contact problems in the normal and in the tangential direction are successively solved, until one is stabilized in relation to the other. The maximum tangential force leading to a non-vanishing stick area is the static friction force that can be sustained by the rough contact. The static friction coefficient is eventually derived as the ratio between the latter friction force and the normal force.
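The iteration described above can be outlined as a control-flow skeleton; the two contact solvers are hypothetical placeholders (no contact mechanics is implemented here), so the function compiles but is not runnable end-to-end without supplying them.

```python
# Control-flow skeleton of the procedure described above. The two solvers are
# hypothetical placeholders (no contact mechanics is implemented here), so the
# function is a sketch of the iteration, not a runnable solver.
def static_friction_coefficient(surface, normal_load, load_step,
                                solve_normal_contact, solve_tangential_contact):
    # 1. Frictionless normal contact is solved first.
    pressure = solve_normal_contact(surface, normal_load, shear=None)

    tangential_force, last_sticking_force = 0.0, 0.0
    while True:
        tangential_force += load_step
        # 2. Normal and tangential problems are solved in turn until mutually consistent.
        for _ in range(50):  # fixed cap standing in for a convergence test
            shear, stick_area = solve_tangential_contact(surface, pressure, tangential_force)
            pressure = solve_normal_contact(surface, normal_load, shear=shear)
        # 3. Junction growth ends where the combined tractions reach the hardness of the
        #    softer material; gross sliding begins once no stick region remains anywhere.
        if stick_area == 0:
            break
        last_sticking_force = tangential_force

    # The static friction coefficient is the largest tangential force that still
    # sustains a non-vanishing stick area, divided by the normal force.
    return last_sticking_force / normal_load
```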
Normal Science Education and its Dangers: The Case of School Chemistry
NASA Astrophysics Data System (ADS)
Van Berkel, Berry; De Vos, Wobbe; Verdonk, Adri H.; Pilot, Albert
We started the Conceptual Structure of School Chemistry research project, a part of which is reported on here, with an attempt to solve the problem of the hidden structure in school chemistry. In order to solve that problem, and informed by previous research, we performed a content analysis of school chemistry textbooks and syllabi. This led us to the hypothesis that school chemistry curricula are based on an underlying, coherent structure of chemical concepts that students are supposed to learn for the purpose of explaining and predicting chemical phenomena. The elicited comments and criticisms of an International Forum of twenty-eight researchers of chemical education, though, refuted the central claims of this hypothesis. This led to a descriptive theory of the currently dominant school chemistry curriculum in terms of a rigid combination of a specific substantive structure, based on corpuscular theory, a specific philosophical structure, educational positivism, and a specific pedagogical structure, involving initiatory and preparatory training of future chemists. Secondly, it led to an explanatory theory of the structure of school chemistry - based on Kuhn's theory of normal science and scientific training - in which dominant school chemistry is interpreted as a form of normal science education. Since the former has almost all characteristics in common with the latter, dominant school chemistry must be regarded as normal chemistry education. Forum members also formulated a number of normative criticisms on dominant school chemistry, which we interpret as specific dangers of normal chemistry education, complementing Popper's discussion of the general dangers of normal science and its teaching. On the basis of these criticisms, it is argued that normal chemistry education is isolated from common sense, everyday life and society, history and philosophy of science, technology, school physics, and from chemical research.
Fonseca, A C; Yule, W
1995-12-01
Two studies were conducted to test the hypotheses derived from Eysenck's and Gray's theories of personality regarding antisocial behavior. For this purpose the Eysenck Personality Questionnaire (Junior) (EPQ-Junior) and a card task aimed at measuring sensitivity to reward were used in each of the studies. The first study compared a group of juvenile delinquents with a group of nondelinquents and the second study compared a group of severely conduct-disordered children with a group of normal children. The results did not support Eysenck's claim that delinquents score higher than their normal counterparts on extraversion, neuroticism, and psychoticism. Some support was found for the hypothesis derived from Gray's theory: Children and adolescents with severe antisocial behavior were more sensitive to rewards than their normal counterparts.
Kletenik-Edelman, Orly; Reichman, David R; Rabani, Eran
2011-01-28
A novel quantum mode coupling theory combined with a kinetic approach is developed for the description of collective density fluctuations in quantum liquids characterized by Boltzmann statistics. Three mode-coupling approximations are presented and applied to study the dynamic response of para-hydrogen near the triple point and normal liquid helium above the λ-transition. The theory is compared with experimental results and to the exact imaginary time data generated by path integral Monte Carlo simulations. While for liquid para-hydrogen the combination of kinetic and quantum mode-coupling theory provides semi-quantitative results for both short and long time dynamics, it fails for normal liquid helium. A discussion of this failure based on the ideal gas limit is presented.
Hypergame theory applied to cyber attack and defense
NASA Astrophysics Data System (ADS)
House, James Thomas; Cybenko, George
2010-04-01
This work concerns cyber attack and defense in the context of game theory--specifically hypergame theory. Hypergame theory extends classical game theory with the ability to deal with differences in players' expertise, differences in their understanding of game rules, misperceptions, and so forth. Each of these different sub-scenarios, or subgames, is associated with a probability--representing the likelihood that the given subgame is truly "in play" at a given moment. In order to form an optimal attack or defense policy, these probabilities must be learned if they're not known a-priori. We present hidden Markov model and maximum entropy approaches for accurately learning these probabilities through multiple iterations of both normal and modified game play. We also give a widely-applicable approach for the analysis of cases where an opponent is aware that he is being studied, and intentionally plays to spoil the process of learning and thereby obfuscate his attributes. These are considered in the context of a generic, abstract cyber attack example. We demonstrate that machine learning efficacy can be heavily dependent on the goals and styles of participant behavior. To this end detailed simulation results under various combinations of attacker and defender behaviors are presented and analyzed.
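A much-simplified stand-in for the probability-learning step is sketched below using a plain Bayesian update over subgames from observed opponent actions; this is not the hidden Markov model or maximum entropy approach of the paper, and the subgames, likelihoods, and observations are hypothetical.

```python
# Minimal sketch: a plain Bayesian update of subgame probabilities from observed
# opponent actions. This is a simplified stand-in for the hidden Markov model and
# maximum entropy learners in the paper; subgames, likelihoods and observations
# are hypothetical.
import numpy as np

subgames = ["opponent_naive", "opponent_expert", "opponent_deceptive"]
prior = np.array([1 / 3, 1 / 3, 1 / 3])

# P(observed action | subgame): rows = subgames, columns = actions (scan, exploit, decoy).
likelihood = np.array([
    [0.7, 0.2, 0.1],
    [0.2, 0.6, 0.2],
    [0.2, 0.2, 0.6],
])

observed_actions = [0, 2, 2, 1]          # indices into (scan, exploit, decoy)
posterior = prior.copy()
for action in observed_actions:
    posterior *= likelihood[:, action]
    posterior /= posterior.sum()

for name, p in zip(subgames, posterior):
    print(f"P({name} in play) = {p:.2f}")
```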
Hierarchical nonlinear dynamics of human attention.
Rabinovich, Mikhail I; Tristan, Irma; Varona, Pablo
2015-08-01
Attention is the process of focusing mental resources on a specific cognitive/behavioral task. Such brain dynamics involves different partially overlapping brain functional networks whose interconnections change in time according to the performance stage, and can be stimulus-driven or induced by an intrinsically generated goal. The corresponding activity can be described by different families of spatiotemporal discrete patterns or sequential dynamic modes. Since mental resources are finite, attention modalities compete with each other at all levels of the hierarchy, from perception to decision making and behavior. Cognitive activity is a dynamical process and attention possesses some universal dynamical characteristics. Thus, it is time to apply nonlinear dynamical theory for the description and prediction of hierarchical attentional tasks. Such theory has to include the analyses of attentional control stability, the time cost of attention switching, the finite capacity of informational resources in the brain, and the normal and pathological bifurcations of attention sequential dynamics. In this paper we have integrated today's knowledge, models and results in these directions. Copyright © 2015 Elsevier Ltd. All rights reserved.
Dugdale, Stephanie; Elison, Sarah; Davies, Glyn; Ward, Jonathan
2017-06-01
There is insufficient research examining the implementation of complex novel interventions within health care. This may be due to a lack of qualitative research providing subjective insights into these implementation processes. The authors investigate the advantages of applying behavior change theories to conceptualize qualitative data describing the processes of implementation of complex interventions. Breaking Free Online (BFO), a digital treatment intervention for substance misuse, is described as an example of a complex intervention. The authors review previous qualitative research which explored initial diffusion, or spread, of the BFO program, and its subsequent normalization as part of standard treatment for substance misuse within the health and social care charity, "Change, Grow, Live" (CGL). The use of behavior change models to structure qualitative interview findings enabled identification of facilitators and barriers to the use of BFO within CGL. These findings have implications for the development of implementation research in novel health care interventions.
Robert, Kylie A; Brunet-Rossinni, Anja; Bronikowski, Anne M
2007-06-01
We test the 'free radical theory of aging' using six species of colubrid snakes (numerous, widely distributed, non-venomous snakes of the family Colubridae) that exhibit long (> 15 years) or short (< 10 years) lifespans. Because the 'rate of living theory' predicts that metabolic rates correlate with rates of aging, and because oxidative damage results from normal metabolic processes, we sought to determine whether physiological parameters and locomotor performance (which is a good predictor of survival in juvenile snakes) mirror the evolution of lifespans in these colubrid snakes. We measured whole animal metabolic rate (oxygen consumption, VO2), locomotor performance, cellular metabolic rate (mitochondrial oxygen consumption), and oxidative stress potential (hydrogen peroxide production by mitochondria). Longer-lived colubrid snakes have greater locomotor performance and lower hydrogen peroxide production than short-lived species, while whole animal metabolic rates and mitochondrial efficiency did not differ with lifespan. We present the first measures testing the 'free radical theory of aging' using reptilian species as model organisms. Using reptiles with different lifespans as model organisms should provide greater insight into mechanisms of aging.
Li, Rui; Ye, Hongfei; Zhang, Weisheng; Ma, Guojun; Su, Yewang
2015-10-29
Spring constant calibration of the atomic force microscope (AFM) cantilever is of fundamental importance for quantifying the force between the AFM cantilever tip and the sample. The calibration within the framework of thin plate theory undoubtedly has a higher accuracy and broader scope than that within the well-established beam theory. However, thin plate theory-based accurate analytic determination of the constant has been perceived as an extremely difficult issue. In this paper, we implement the thin plate theory-based analytic modeling for the static behavior of rectangular AFM cantilevers, which reveals that the three-dimensional effect and Poisson effect play important roles in accurate determination of the spring constants. A quantitative scaling law is found that the normalized spring constant depends only on the Poisson's ratio, normalized dimension and normalized load coordinate. Both the literature and our refined finite element model validate the present results. The developed model is expected to serve as the benchmark for accurate calibration of rectangular AFM cantilevers.
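For reference, the well-established beam-theory baseline mentioned above is the textbook spring constant of a rectangular cantilever loaded at its free end, k = E*w*t^3/(4*L^3); the paper's thin-plate model adds Poisson and three-dimensional corrections to this. The dimensions below are illustrative.

```python
# Minimal sketch: the beam-theory baseline k = E*w*t^3 / (4*L^3) for a rectangular
# cantilever loaded at its free end. The paper's thin-plate model adds Poisson and
# three-dimensional corrections to this. Dimensions below are illustrative.
def beam_spring_constant(E, width, thickness, length):
    """Euler-Bernoulli spring constant of an end-loaded rectangular cantilever (N/m)."""
    return E * width * thickness**3 / (4.0 * length**3)

k = beam_spring_constant(E=169e9, width=30e-6, thickness=2e-6, length=450e-6)
print(f"beam-theory estimate: k = {k:.3f} N/m")
```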
NASA Astrophysics Data System (ADS)
Becker, Leif E.; Shelley, Michael J.
2000-11-01
First normal stress differences in shear flow are a fundamental property of Non-Newtonian fluids. Experiments involving dilute suspensions of slender fibers exhibit a sharp transition to non-zero normal stress differences beyond a critical shear rate, but existing continuum theories for rigid rods predict neither this transition nor the corresponding magnitude of this effect. We present the first conclusive evidence that elastic instabilities are predominantly responsible for observed deviations from the dilute suspension theory of rigid rods. Our analysis is based on slender body theory and the equilibrium equations of elastica. A straight slender body executing its Jeffery orbit in Couette flow is subject to axial fluid forcing, alternating between compression and tension. We present a stability analysis showing that elastic instabilities are possible for strong flows. Simulations give the fully non-linear evolution of this shape instability, and show that flexibility of the fibers alone is sufficient to cause both shear-thinning and significant first normal stress differences.
A stress-induced phase transition model for semi-crystallize shape memory polymer
NASA Astrophysics Data System (ADS)
Guo, Xiaogang; Zhou, Bo; Liu, Liwu; Liu, Yanju; Leng, Jinsong
2014-03-01
The development of constitutive models for shape memory polymer (SMP) has been motivated by its increasing applications. During cooling or heating, a phase transition, which is a continuous time-dependent process, takes place in semi-crystalline SMP, and the various individual phases form at different temperatures and in different configurations. Transformation between these phases then occurs and the shape memory effect emerges. In addition, the stress applied to the SMP is an important factor for crystal melting during the phase transition. In this theory, an ideal phase transition model that accounts for stress or pre-strain is the key to describing the shape memory behavior. A normally distributed model was therefore established in this research to characterize the volume fraction of each phase in the SMP during the phase transition. In general, the experimental results lag behind (in the heating process) or run ahead of (in the cooling process) the ideal situation because of a delay effect during the phase transition, so a correction to the normally distributed model is needed. Furthermore, a nonlinear relationship between stress and the phase transition temperature Tg is also taken into account to establish an accurate normally distributed phase transition model. Finally, a constitutive model that takes stress as an influence factor on the phase transition was also established. Compared with other formulations, this new model has fewer parameters and is more accurate. To verify the rationality and accuracy of the new phase transition and constitutive models, comparisons between simulated and experimental results were carried out.
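A minimal sketch of a normally distributed phase-fraction model is given below: the frozen-phase volume fraction follows a normal cumulative distribution in temperature, with an assumed (linearized, illustrative) stress-dependent shift of the transition temperature. None of the parameter values or the form of the stress dependence are taken from the paper.

```python
# Minimal sketch: frozen-phase volume fraction modelled as a normal CDF of
# temperature, with an assumed linearized stress shift of the transition
# temperature Tg. Parameter values and the form of the stress dependence are
# illustrative, not the paper's calibrated model.
from scipy.stats import norm

def frozen_fraction(T, stress=0.0, Tg0=330.0, width=8.0, dTg_dstress=0.02):
    """Volume fraction of the frozen phase at temperature T (K), stress in MPa."""
    Tg = Tg0 + dTg_dstress * stress        # assumed linear shift of Tg with stress
    return norm.cdf((Tg - T) / width)      # cooling further below Tg freezes more material

for T in (360.0, 330.0, 300.0):
    print(T, round(frozen_fraction(T, stress=10.0), 3))
```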
Selfishness, warfare, and economics; or integration, cooperation, and biology
Salvucci, Emiliano
2012-01-01
The acceptance of Darwin's theory of evolution by natural selection is not complete, and its limitations in explaining the complex processes that constitute the transformation of species have been pointed out. It is necessary to discuss the explanatory power of the dominant paradigm. It is common for new discoveries to bring about contradictions that are meant to be overcome by adjusting results to the dominant reductionist paradigm, using all sorts of gradations and combinations admitted for each case. In addition to the discussion on the validity of natural selection, modern findings challenge interpretations of the observations that take the Darwinian view of competition and struggle for life as their theoretical basis. New holistic interpretations are emerging related to the Net of Life, in which the interconnection of ecosystems constitutes a dynamic and self-regulating biosphere: viruses are recognized as a macroorganism with a huge collection of genes, most of them unknown, that constitute the planet's major gene pool. They play a fundamental role in evolution since their sequences are capable of integrating into genomes in an “infective” way and becoming an essential part of multicellular organisms. They have content with “biological sense”, i.e., they appear as part of normal life processes and have a serious role as carrier elements of complex genetic information. Antibiotics are cell signals with main effects on general metabolism and transcription in bacterial cells and communities. The hologenome theory considers an organism and all of its associated symbiotic microbes (parasites, mutualists, synergists, amensalists) as a result of symbiopoiesis. Microbes and helminths that are normally understood as parasites are in fact cohabitants; they have cohabited with their hosts and drive the evolution and existence of their partners. Each organism is the result of the integration of complex systems. The eukaryotic organism results from the combination of bacterial, viral, and eukaryotic DNA and from the interaction of its own genome with the genome of its microbiota, and their metabolisms are intertwined (as a “superorganism”) along evolution. The Darwinian paradigm had its origin in the free-market theories and concepts of Malthus and Spencer. Nature was then explained on the basis of market theories, moving away from an accurate explanation of natural phenomena. It is necessary to acknowledge the limitations of the dominant dogma. These new interpretations of biological processes, molecules, the roles of viruses in nature, and microbial interactions are remarkable points to be considered in order to construct a solid theory adjusted to the facts, with fewer speculations and tortuous semantic traps. PMID:22919645
Health as normal function: a weak link in Daniels's theory of just health distribution.
Krag, Erik
2014-10-01
Drawing on Christopher Boorse's Biostatistical Theory (BST), Norman Daniels contends that a genuine health need is one which is necessary to restore normal functioning - a supposedly objective notion which he believes can be read from the natural world without reference to potentially controversial normative categories. But despite his claims to the contrary, this conception of health harbors arbitrary evaluative judgments which make room for intractable disagreement as to which conditions should count as genuine health needs and therefore which needs should be met. I begin by offering a brief summary of Boorse's BST, the theory to which Daniels appeals for providing the conception of health as normal functioning upon which his overall distributive scheme rests. Next, I consider what I call practical objections to Daniels's use of Boorse's theory. Finally I recount Elseljin Kingma's theoretical objection to Boorse's BST and discuss its impact on Daniels's overall theory. Though I conclude that Boorse's view, so weakened, will no longer be able to sustain the judgments which Daniels's theory uses it to reach, in the end, I offer Daniels an olive branch by briefly sketching an alternative strategy for reaching suitably objective conclusions regarding the health and/or disease status of various conditions. © 2012 John Wiley & Sons Ltd.
Binding in agrammatic aphasia: Processing to comprehension
Janet Choy, Jungwon; Thompson, Cynthia K.
2010-01-01
Background Theories of comprehension deficits in Broca’s aphasia have largely been based on the pattern of deficit found with movement constructions. However, some studies have found comprehension deficits with binding constructions, which do not involve movement. Aims This study investigates online processing and offline comprehension of binding constructions, such as reflexive (e.g., himself) and pronoun (e.g., him) constructions in unimpaired and aphasic individuals in an attempt to evaluate theories of agrammatic comprehension. Methods & Procedures Participants were eight individuals with agrammatic Broca’s aphasia and eight age-matched unimpaired individuals. We used eyetracking to examine online processing of binding constructions while participants listened to stories. Offline comprehension was also tested. Outcomes & Results The eye movement data showed that individuals with Broca’s aphasia were able to automatically process the correct antecedent of reflexives and pronouns. In addition, their syntactic processing of binding was not delayed compared to normal controls. Nevertheless, offline comprehension of both pronouns and reflexives was significantly impaired compared to the control participants. This comprehension failure was reflected in the aphasic participants’ eye movements at sentence end, where fixations to the competitor increased. Conclusions These data suggest that comprehension difficulties with binding constructions seen in agrammatic aphasic patients are not due to a deficit in automatic syntactic processing or delayed processing. Rather, they point to a possible deficit in lexical integration. PMID:20535243
ERIC Educational Resources Information Center
Weiwei, Huang
2016-01-01
As a theory based on the hypothesis of "happy man" about human nature, happiness management plays a significant guiding role in the optimization of the training model of local Chinese normal university students during the transitional period. Under the guidance of this theory, China should adhere to the people-oriented principle,…
ERIC Educational Resources Information Center
O'Connor, Akira R.; Moulin, Christopher J. A.
2006-01-01
We report the case of a 25-year-old healthy, blind male, MT, who experiences normal patterns of deja vu. The optical pathway delay theory of deja vu formation assumes that neuronal input from the optical pathways is necessary for the formation of the experience. Surprisingly, although the sensation of deja vu is known to be experienced by blind…
Destination memory and cognitive theory of mind in normal ageing.
El Haj, Mohamad; Raffard, Stéphane; Gély-Nargeot, Marie-Christine
2016-01-01
Destination memory is the ability to remember the destination to which a piece of information has been addressed (e.g., "Did I tell you about the promotion?"). This ability is found to be impaired in normal ageing. Our work aimed to link this deterioration to the decline in theory of mind. Forty younger adults (M age = 23.13 years, SD = 4.00) and 36 older adults (M age = 69.53 years, SD = 8.93) performed a destination memory task. They also performed the False-belief test addressing cognitive theory of mind and the Reading the mind in the eyes test addressing affective theory of mind. Results showed significant deterioration in destination memory, cognitive theory of mind and affective theory of mind in the older adults. The older adults' performance on destination memory was significantly correlated with and predicted by their performance on cognitive theory of mind. Difficulties in the ability to interpret and predict others' mental states are related to destination memory decline in older adults.
Hooker, Leesa; Small, Rhonda; Humphreys, Cathy; Hegarty, Kelsey; Taft, Angela
2015-03-28
In Victoria, Australia, Maternal and Child Health (MCH) services deliver primary health care to families with children 0-6 years, focusing on health promotion, parenting support and early intervention. Family violence (FV) has been identified as a major public health concern, with increased prevalence in the child-bearing years. Victorian Government policy recommends routine FV screening of all women attending MCH services. Using Normalization Process Theory (NPT), we aimed to understand the barriers and facilitators of implementing an enhanced screening model into MCH nurse clinical practice. NPT informed the process evaluation of a pragmatic, cluster randomised controlled trial in eight MCH nurse teams in metropolitan Melbourne, Victoria, Australia. Using mixed methods (surveys and interviews), we explored the views of MCH nurses, MCH nurse team leaders, FV liaison workers and FV managers on implementation of the model. Quantitative data were analysed by comparing proportionate group differences and change within trial arm over time between interim and impact nurse surveys. Qualitative data were inductively coded, thematically analysed and mapped to NPT constructs (coherence, cognitive participation, collective action and reflexive monitoring) to enhance our understanding of the outcome evaluation. MCH nurse participation rates for interim and impact surveys were 79% (127/160) and 71% (114/160), respectively. Twenty-three key stakeholder interviews were completed. FV screening work was meaningful and valued by participants; however, the implementation coincided with a significant (government directed) change in clinical practice which impacted on full engagement with the model (coherence and cognitive participation). The use of MCH nurse-designed FV screening/management tools in focussed women's health consultations and links with FV services enhanced the participants' work (collective action). Monitoring of FV work (reflexive monitoring) was limited. The use of theory-based process evaluation helped identify both what inhibited and enhanced intervention effectiveness. Successful implementation of an enhanced FV screening model for MCH nurses occurred in the context of focussed women's health consultations, with the use of a maternal health and wellbeing checklist and greater collaboration with FV services. Improving links with these services and the ongoing appraisal of nurse work would overcome the barriers identified in this study.
Wang, Min; Fu, Jinfeng; Lv, Mengying; Tian, Yuan; Xu, Fengguo; Song, Rui; Zhang, Zunjian
2014-09-01
As a specific item mentioned in traditional Chinese medicine theory, processing can fulfill different requirements of therapies. Crude and wine-processed rhubarbs are used as drastic and mild laxatives, respectively. In this study, a practical method based on ultra-fast liquid chromatography coupled with diode-array detection and ion trap time-of-flight mass spectrometry was developed to screen and analyze multiple absorbed bioactive components and metabolites in the serum of both normal and acute blood stasis rats after oral administration of crude or wine-processed rhubarbs. A total of 16 compounds, mainly including phase II metabolites, were tentatively identified. Possible explanations for the processing-induced changes in pharmacological effects of traditional Chinese medicines were first explored at serum pharmacochemistry level. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
[Specificity hypothesis of a theory of mind deficit in early childhood autism].
Kissgen, R; Schleiffer, R
2002-02-01
In order to test the hypothesis that a theory of mind deficit is specific to autism, the present study presents the first replication of the Sally-Anne test (Baron-Cohen, Leslie & Frith, 1985) in the German-speaking countries. The Sally-Anne test was administered to 16 autistic probands, 24 probands with Down's syndrome and 20 normal preschool probands. The intelligence of the autistic group and of the group with Down's syndrome was measured by the CPM/SPM. In addition, the ADI-R was used with the principal caregivers of the autistic and Down's syndrome subjects. With regard to the clinical diagnosis, the theory of mind deficit turned out not to be specific to autism. Six of 16 (37.5%) autistic subjects passed the theory of mind tasks. Thus performance in the autistic group surpassed that of both control groups. Of the 16 autistic subjects, autism could be confirmed in only 8 on the basis of the ADI-R diagnostic criteria, only one of whom showed a theory of mind. The autistic individuals with a theory of mind differed significantly in their mean IQ from those without this ability. The spectrum and specificity of a theory of mind deficit in autism remain controversial. For further research it seems important to administer the ADI-R during the diagnostic process. The findings suggest that the clinical diagnosis of autism is not precise enough to distinguish between autism and nonautistic mental handicap.
Kemper, A. F.; Sentef, M. A.; Moritz, B.; ...
2017-07-13
Here, we review recent work on the theory for pump/probe photoemission spectroscopy of electron-phonon mediated superconductors in both the normal and the superconducting states. We describe the formal developments that allow one to solve the Migdal-Eliashberg theory in nonequilibrium for an ultrashort laser pumping field, and explore the solutions which illustrate the relaxation as energy is transferred from electrons to phonons. We also focus on exact results emanating from sum rules and approximate numerical results which describe rules of thumb for relaxation processes. Additionally, in the superconducting state, we describe how Anderson-Higgs oscillations can be excited due to the nonlinear coupling with the electric field and describe mechanisms where pumping the system enhances superconductivity.
Howard, Marc W.; Bessette-Symons, Brandy; Zhang, Yaofei; Hoyer, William J.
2006-01-01
Younger and older adults were tested on recognition memory for pictures. The Yonelinas high threshold (YHT) model, a formal implementation of two-process theory, fit the response distribution data of both younger and older adults significantly better than a normal unequal variance signal detection model. Consistent with this finding, non-linear zROC curves were obtained for both groups. Estimates of recollection from the YHT model were significantly higher for younger than older adults. This deficit was not a consequence of a general decline in memory; older adults showed comparable overall accuracy and in fact a non-significant increase in their familiarity scores. Implications of these results for theories of recognition memory and the mnemonic deficit associated with aging are discussed. PMID:16594795
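To make the contrast between model classes concrete, here is a minimal sketch of a dual-process (recollection-plus-familiarity) account in the spirit of the YHT model named above; the parameterization is one common textbook form, not necessarily the one fitted in the study, and the numbers are placeholders.

    from statistics import NormalDist

    def dual_process_rates(R, d_prime, criterion):
        # Predicted hit and false-alarm rates under a simple dual-process model:
        # recollection is an all-or-none threshold process with probability R for
        # studied items; familiarity is equal-variance signal detection with
        # old-item mean d_prime, new-item mean 0, and a response criterion.
        phi = NormalDist().cdf
        hit = R + (1.0 - R) * phi(d_prime - criterion)
        false_alarm = phi(-criterion)
        return hit, false_alarm

    # Lowering recollection (R) while holding familiarity (d_prime) constant,
    # roughly the pattern reported for older adults, mainly reduces the hit rate.
    print(dual_process_rates(R=0.45, d_prime=1.2, criterion=0.5))
    print(dual_process_rates(R=0.25, d_prime=1.2, criterion=0.5))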
An Overview of Non-pathological Geroneuropsychology: Implications for Nursing Practice and Research
Graham, Martha A.; Fazeli, Pariya L.; Heaton, Karen; Moneyham, Linda
2011-01-01
One aspect of successful aging is maintaining cognitive functioning; that includes both subjective cognitive functioning and objective cognitive functioning, even in the presence of the subtle cognitive deficits that occur with normal, non-pathological aging. Age-related cognitive deficits emerge across several domains, including attention, memory, language, speed of processing, executive function, and psychomotor speed, to name a few. A primary theory explaining such cognitive deficits is cognitive reserve theory; it posits that biological factors such as demyelination and oxidative stress interfere with neuronal communication, which eventually produces observable deficits in cognitive functioning. Therefore, it is important to maintain or improve cognitive reserve in order to augment cognitive functioning in later life. This article provides a general overview of the principles of geroneuropsychology along with implications for nursing practice and research. PMID:22210304
Thermal Theory of Combustion and Explosion. 3; Theory of Normal Flame Propagation
NASA Technical Reports Server (NTRS)
Semenov, N. N.
1942-01-01
The technical memorandum covers experimental data on flame propagation, the velocity of flame propagation, analysis of the old theoretical views of flame propagation, confirmation of the theory for simple reactions (theory of combustion of explosive substances and in particular nitroglycol), and check of the theory by example of a chain oxidizing reaction (theory of flame propagation in carbon monoxide, air and carbon monoxide - oxygen mixtures).
On the Stem Cell Origin of Cancer
Sell, Stewart
2010-01-01
In each major theory of the origin of cancer—field theory, chemical carcinogenesis, infection, mutation, or epigenetic change—the tissue stem cell is involved in the generation of cancer. Although the cancer type is identified by the more highly differentiated cells in the cancer cell lineage or hierarchy (transit-amplifying cells), the property of malignancy and the molecular lesion of the cancer exist in the cancer stem cell. In the case of teratocarcinomas, normal germinal stem cells have the potential to become cancers if placed in an environment that allows expression of the cancer phenotype (field theory). In cancers due to chemically induced mutations, viral infections, somatic and inherited mutations, or epigenetic changes, the molecular lesion or infection usually first occurs in the tissue stem cells. Cancer stem cells then give rise to transit-amplifying cells and terminally differentiated cells, similar to what happens in normal tissue renewal. However, the major difference between cancer growth and normal tissue renewal is that whereas normal transit amplifying cells usually differentiate and die, at various levels of differentiation, the cancer transit-amplifying cells fail to differentiate normally and instead accumulate (ie, they undergo maturation arrest), resulting in cancer growth. PMID:20431026
Order, topology and preference
NASA Technical Reports Server (NTRS)
Sertel, M. R.
1971-01-01
Some standard order-related and topological notions, facts, and methods are brought to bear on central topics in the theory of preference and the theory of optimization. Consequences of connectivity are considered, especially from the viewpoint of normally preordered spaces. Examples are given showing how the theory of preference, or utility theory, can be applied to social analysis.
The enhancement mechanism of wine-processed Radix Scutellaria on NTG-induced migraine rats.
Cui, Cheng-Long; He, Xin; Dong, Cui-Lan; Song, Zi-Jing; Ji, Jun; Wang, Xue; Wang, Ling; Wang, Jiao-Ying; Du, Wen-Juan; Wang, Chong-Zhi; Yuan, Chun-Su; Guo, Chang-Run; Zhang, Chun-Feng
2017-07-01
To elucidate, using fractal theory, the increased dissolution and the enhancement mechanism of wine-processed Radix Scutellaria (RS) in nitroglycerin (NTG)-induced migraine rats. We prepared three RS samples processed with 10% (S1), 15% (S2), and 20% (S3) (v/m) rice wine. Mercury intrusion porosimetry and scanning electron microscopy were employed to explore the internal structure of RS, and the dissolution of RS components was analyzed by HPLC. Rats were randomly allocated into the following groups and orally given the corresponding solutions for 10 days: normal group (NOR, normal saline), model group (MOD, normal saline), Tianshu capsule group (TSC, 0.425 mg/kg), ibuprofen group (IBU, 0.0821 mg/kg), crude RS group (CRU, 1.04 mg/kg) and wine-processed RS group (WP, 1.04 mg/kg), followed by bolus subcutaneous injection of NTG (10 mg/kg) to induce the migraine model in all groups except NOR. Biochemical indexes (nitric oxide, NO; calcitonin gene-related peptide, CGRP; and endothelin, ET) and c-fos-positive cells were measured with commercial kits and an immunohistochemical method, respectively. Total surface area significantly increased in wine-processed RS (p < 0.05) while fractal dimension markedly decreased (p < 0.05) compared with crude RS. Additionally, S3 showed the greatest increase in dissolution, including the percentage increase of total extract, total flavonoids and main compounds (all p < 0.05 vs S1 and S2). Pharmacodynamic data showed that c-fos-positive cells significantly decreased (p < 0.05) in WP compared with MOD, and the levels of NO, CGRP and ET in WP were better than those of CRU. Wine-processed RS could be a promising candidate medicine for migraine treatment due to its increased component dissolution. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
The presentation of "pro-anorexia" in online group interactions.
Gavin, Jeff; Rodham, Karen; Poyer, Helen
2008-03-01
Although pro-anorexia online support forums and the narratives that occur within them are increasingly the focus of research, none, to date, focuses closely on issues of identity within this online context. Our aim in conducting this study was to examine the presentation of pro-anorexia via an interpretive phenomenological analysis of postings to a pro-anorexia ("pro-ana") online discussion forum. Analysis indicates that pro-anorexic identities are normalized and strengthened through the normalization of participants' pro-ana thoughts and behaviors, and the group bond created through sharing a secret identity. This process renders participants less likely to reveal their pro-ana identity to friends and family in the real world. The implications of our findings are discussed in relation to the theory of identity demarginalization.
[Systems analysis of colour music corrective effect].
Gumeniuk, V A; Batova, N Ia; Mel'nikova, T S; Glazachev, O S; Golubeva, N K; Klimina, N V; Hubner, P
1998-01-01
In the context of P. K. Anokhin's theory of functional systems, the corrective effects of various combinations of medical therapeutical resonance music (MTRM) and dynamic colour exposure were analyzed. As compared to rehabilitative music programmes, MTRM was shown to have a more pronounced relaxing effect, as manifested both in the optimization of emotion and in the activity of autonomic regulation of cardiovascular functions. On combined MTRM and dynamic colour flow exposures, the relaxing effect is most marked. In the examinees, personality and situational anxieties diminish, mood improves, cardiovascular parameters become normal, the rate of metabolic processes and muscular rigidity decrease, and the spectral power of the alpha rhythm increases, predominantly in the anterior region of the brain. The findings suggest the high efficiency of the chosen way of normalizing the functional status of man.
Spin foam models for quantum gravity
NASA Astrophysics Data System (ADS)
Perez, Alejandro
The definition of a quantum theory of gravity is explored following Feynman's path-integral approach. The aim is to construct a well defined version of the Wheeler-Misner-Hawking "sum over four geometries" formulation of quantum general relativity (GR). This is done by means of exploiting the similarities between the formulation of GR in terms of tetrad-connection variables (Palatini formulation) and a simpler theory called BF theory. One can go from BF theory to GR by imposing certain constraints on the BF-theory configurations. BF theory contains only global degrees of freedom (topological theory) and it can be exactly quantized à la Feynman by introducing a discretization of the manifold. Using the path integral for BF theory we define a path integration for GR by imposing the BF-to-GR constraints on the BF measure. The infinite degrees of freedom of gravity are restored in the process, and the restriction to a single discretization introduces a cut-off in the summed-over configurations. In order to capture all the degrees of freedom, a sum over discretizations is implemented. Both the implementation of the BF-to-GR constraints and the sum over discretizations are obtained by means of the introduction of an auxiliary field theory (AFT). 4-geometries in the path integral for GR are given by the Feynman diagrams of the AFT, which is in this sense dual to GR. Feynman diagrams correspond to 2-complexes labeled by unitary irreducible representations of the internal gauge group (corresponding to tetrad rotations in the connection formulation of GR). A model for 4-dimensional Euclidean quantum gravity (QG) is defined which corresponds to a different normalization of the Barrett-Crane model. The model is perturbatively finite; divergences appearing in the Barrett-Crane model are cured by the new normalization. We extend our techniques to the Lorentzian sector, where we define two models for four-dimensional QG. The first one contains only time-like representations and is shown to be perturbatively finite. The second model contains both time-like and space-like representations. The spectrum of geometrical operators coincides with the predictions of the canonical approach of loop QG. At the moment, the convergence properties of the model are less understood and remain for future investigation.
Cognitive Decline in Patients with Chronic Hydrocephalus and Normal Aging: ‘Growing into Deficits’
de Beer, Marlijn H.; Scheltens, Philip
2016-01-01
Background/Aim To explore the theory of ‘growing into deficits’, a concept known from developmental neurology, in a series of cases with chronic hydrocephalus (CH). Methods Patients were selected from the Amsterdam Dementia Cohort and underwent extensive dementia screening. Results Twelve patients with CH were selected, in whom Alzheimer's disease was considered unlikely, based on biomarker information and follow-up. Mean Mini-Mental State Examination score was 24 (range 7-30). Most patients were functioning on a level of mild dementia [Clinical Dementia Rating score of 0.5 in 8/11 (66.7%) patients]. On neuropsychological examination, memory and executive functions, as well as processing speed were most frequently impaired. Conclusion In our opinion, the theory of ‘growing into deficits’ shows a parallel with the clinical course of CH and normal aging when Alzheimer's disease was considered very unlikely, because most of these patients were functioning well for a very large part of their lives. The altered cerebrospinal fluid dynamics might make the brain more vulnerable to aging-related changes, leading to a faster cognitive decline in CH patients compared to healthy subjects, especially in case of concomitant brain damage such as traumatic brain injury or meningitis. PMID:27920793
Development of the Thai healthy aging model: A grounded theory study.
Thiamwong, Ladda; McManus, Michael S; Suwanno, Jom
2013-06-01
To develop a model of healthy aging from the perspective of Thais, a grounded theory approach, including in-depth interviews and focus groups, was used. A purposive sample of 39 community-dwelling adults aged 40-85 years old was interviewed. The Thai healthy aging model composed of three themes: normality, nature, and dharma. In Thai, they are called tham-ma-da, tham-ma-chat, and tham-ma, or "Thai 3Ts". The theme of normality encompasses subthemes of staying physically active by being involved in plenty of physical activities, and being mentally active with creative and thoughtful hobbies and work. The theme of nature encompasses subthemes of living simply and being careful with money. The theme of dharma encompasses subthemes of enjoyment through helping family and participating in community activities, staying away from stress and worries by talking openly and honestly with someone, making merit, and helping other people without expecting anything in return. A greater understanding of healthy aging is a benefit for older adults and healthcare providers in an intervention-design process. Research can contribute valuable information to shape policy for healthy aging as well. © 2013 Wiley Publishing Asia Pty Ltd.
Zadran, Sohila; Remacle, Francoise; Levine, Raphael
2014-01-01
Glioblastoma multiforme (GBM) is the most fatal of all brain cancers in humans. Currently there are limited diagnostic tools for GBM detection. Here, we applied surprisal analysis, a theory grounded in thermodynamics, to unveil how biomolecule energetics, specifically a redistribution of free energy amongst microRNAs (miRNAs), results in a system deviating from a non-cancer state to the GBM-specific phenotypic state. Utilizing global miRNA microarray expression data from normal and GBM patient tumors, surprisal analysis characterizes a miRNA system response capable of distinguishing GBM samples from normal tissue biopsy samples. We indicate that the miRNAs contributing to this system behavior define a disease phenotypic state specific to GBM and therefore represent a unique GBM-specific thermodynamic signature. MiRNAs implicated in the regulation of stochastic signaling processes crucial to the hallmarks of human cancer dominate this GBM phenotypic state. With this theory, we were able to distinguish GBM patients with high fidelity solely by monitoring the dynamics of miRNAs present in patients' biopsy samples. We anticipate that the GBM-specific thermodynamic signature will provide a critical translational tool for better characterizing cancer types and for the development of future therapeutics for GBM.
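The following is a minimal, purely illustrative sketch of the general idea behind surprisal analysis as applied to expression data: the logarithm of each transcript's expression is decomposed into a dominant baseline pattern plus smaller constraint terms, here extracted with a singular value decomposition of the log-expression matrix. This is not the authors' code, and the toy data are random.

    import numpy as np

    def surprisal_decomposition(expression, n_constraints=1):
        # expression: positive (miRNAs x samples) matrix.
        ln_x = np.log(np.asarray(expression, dtype=float))
        u, s, vt = np.linalg.svd(ln_x, full_matrices=False)
        # The rank-1 reconstruction plays the role of the baseline
        # (free-energy-minimum) pattern; subsequent components play the role
        # of constraints, e.g. a disease-specific deviation.
        baseline = s[0] * np.outer(u[:, 0], vt[0, :])
        constraints = [s[k] * np.outer(u[:, k], vt[k, :])
                       for k in range(1, n_constraints + 1)]
        return baseline, constraints

    rng = np.random.default_rng(0)
    toy = rng.lognormal(mean=2.0, sigma=0.5, size=(6, 4))  # 6 miRNAs x 4 samples
    base, cons = surprisal_decomposition(toy)
    print(base.shape, cons[0].shape)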
A sharp interface model for void growth in irradiated materials
NASA Astrophysics Data System (ADS)
Hochrainer, Thomas; El-Azab, Anter
2015-03-01
A thermodynamic formalism for the interaction of point defects with free surfaces in single-component solids has been developed and applied to the problem of void growth by absorption of point defects in irradiated metals. This formalism consists of two parts, a detailed description of the dynamics of defects within the non-equilibrium thermodynamic frame, and the application of the second law of thermodynamics to provide closure relations for all kinetic equations. Enforcing the principle of non-negative entropy production showed that the description of the problem of void evolution under irradiation must include a relationship between the normal fluxes of defects into the void surface and the driving thermodynamic forces for the void surface motion; these thermodynamic forces are identified for both vacancies and interstitials and the relationships between these forces and the normal point defect fluxes are established using the concepts of transition state theory. The latter theory implies that the defect accommodation into the surface is a thermally activated process. Numerical examples are given to illustrate void growth dynamics in this new formalism and to investigate the effect of the surface energy barriers on void growth. Consequences for phase field models of void growth are discussed.
Effects of Stress on Judgment and Decision Making in Dynamic Tasks
1991-06-01
their normal working conditions, (2) to ascertain whether the results from lens model theory and research in static tasks generalize to these...' normal work environment. A further generalization from lens model theory is that those precursors (secondary cues) that are more conceptual in...potential microburst cases. Although this sample of cases is admittedly smaller than desirable, many hours of technical work were required to remove
A review of attachment theory in the context of adolescent parenting.
Flaherty, Serena Cherry; Sadler, Lois S
2011-01-01
The purpose of this article is to review attachment theory and relate the attachment perspective to adolescent mothers and their children. Attachment theory explains positive maternal-infant attachment as a dyadic relationship between the infant and mother that provides the infant with a secure base from which to explore the world. With respect to cognitive, social, and behavioral domains, securely attached infants tend to have more favorable long-term outcomes, while insecurely attached infants are more likely to have adverse outcomes. Adolescent parenthood can disrupt normal adolescent development, and this disruption influences development of the emotional and cognitive capacities necessary for maternal behaviors that foster secure attachment. However, it appears that if specialized supports are in place to facilitate the process of developing attachment, infants of adolescent mothers can obtain higher rates of secure attachment than normative samples in this population. Copyright © 2011 National Association of Pediatric Nurse Practitioners. Published by Mosby, Inc. All rights reserved.
Theoretical perspectives accounting for adolescent homosexuality.
Savin-Williams, R C
1988-03-01
Few topics in sexology elicit such a diversity of opinions and emotions as the question of etiology of homosexuality. Views frequently carry with them implicit or explicit messages concerning the psychologic health of this sexual orientation. Theories of sexual development usually portray adolescence as a critical time in the life course because of changes in: 1) anatomy and physiology; 2) psychologic functioning: the reawakening, renewal, and reliving of previously established sexual relations and drives; and/or 3) social conditions: an increased exposure and adherence to societal messages concerning appropriate and inappropriate social and sexual behaviors and relationships. This paper provides a brief overview of several major theories--evolutionary biology, psychoanalysis, and social processes--as they relate to the development of sexual orientation. In addition, an ethologic perspective that synthesizes various etiologic theories, as they relate to homosexuality during adolescence, is briefly reviewed. In these discussions, the issue of whether homosexuality is a normal or abnormal developmental state during adolescence is also addressed.
Acoustic-gravity waves in atmospheric and oceanic waveguides.
Godin, Oleg A
2012-08-01
A theory of guided propagation of sound in layered, moving fluids is extended to include acoustic-gravity waves (AGWs) in waveguides with piecewise continuous parameters. The orthogonality of AGW normal modes is established in moving and motionless media. A perturbation theory is developed to quantify the relative significance of the gravity and fluid compressibility as well as sensitivity of the normal modes to variations in sound speed, flow velocity, and density profiles and in boundary conditions. Phase and group speeds of the normal modes are found to have certain universal properties which are valid for waveguides with arbitrary stratification. The Lamb wave is shown to be the only AGW normal mode that can propagate without dispersion in a layered medium.
Toward a Unified Consciousness Theory
ERIC Educational Resources Information Center
Johnson, Richard H.
1977-01-01
The beginning of a holistic theory that can treat paranormal phenomena as normal human development is presented. Implications for counseling, counselor education, and counselor supervision are discussed. (Author)
Construction and updating of event models in auditory event processing.
Huff, Markus; Maurer, Annika E; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank
2018-02-01
Humans segment the continuous stream of sensory information into distinct events at points of change. Between two events, humans perceive an event boundary. Current theories propose that changes in the sensory information trigger updating of the current event model. Increased encoding effort finally leads to a memory benefit at event boundaries. Evidence from reading time studies (increased reading times with increasing amount of change) suggests that updating of event models is incremental. We present results from 5 experiments that studied event processing (including memory formation processes and reading times) using an audio drama as well as a transcript thereof as stimulus material. Experiments 1a and 1b replicated the event boundary advantage effect for memory. In contrast to recent evidence from studies using visual stimulus material, Experiments 2a and 2b found no support for incremental updating with normally sighted and blind participants for recognition memory. In Experiment 3, we replicated Experiment 2a using a written transcript of the audio drama as stimulus material, allowing us to disentangle encoding and retrieval processes. Our results indicate incremental updating processes at encoding (as measured with reading times). At the same time, we again found recognition performance to be unaffected by the amount of change. We discuss these findings in light of current event cognition theories. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Zhang, Yuan-Ming; Zhang, Yinghao; Guo, Mingyue
2017-03-01
Wang et al.'s article [1] is the first to integrate game theory (especially evolutionary game theory) with epigenetic modification of zygotic genomes. They described and assessed a modeling framework based on evolutionary game theory to quantify how sperm and oocytes interact through epigenetic processes to determine embryo development. They also studied the internal mechanisms for normal embryo development: 1) evolutionary interactions between DNA methylation of the paternal and maternal genomes, and 2) the application of game theory to formulate and quantify how different genes compete or cooperate to regulate embryogenesis through methylation. Although it is not very comprehensive and profound regarding game theory modeling, this article bridges the gap between evolutionary game theory and the epigenetic control of embryo development through powerful ordinary differential equations (ODEs). The epiGame framework includes four aspects: 1) characterizing how epigenetic game theory works through the strategy matrix, in which the pattern and relative magnitude of the methylation effects on embryogenesis are described by the cooperation and competition mechanisms; 2) quantifying the game, in that the direction and degree of P-M interactions over embryo development can be explained by the sign and magnitude of the interaction parameters in model (2); 3) modeling epigenetic interactions within the morula, especially through two coupled nonlinear ODEs with explicit functions in model (4), which provide a good fit to the observed data for the two sexes (adjusted R2 = 0.956); and 4) revealing multifactorial interactions in embryogenesis, from the coupled ODEs in model (2) to the triplet of ODEs in model (6). Clearly, this article extends game theory from evolutionary game theory to epigenetic game theory.
Hooker, Leesa; Small, Rhonda; Taft, Angela
2016-03-01
To investigate factors contributing to the sustained domestic violence screening and support practices of Maternal and Child Health nurses 2 years after a randomized controlled trial. Domestic violence screening by healthcare professionals has been implemented in many primary care settings. Barriers to screening exist and screening rates remain low. Evidence for longer term integration of nurse screening is minimal. Trial outcomes showed sustained safety planning behaviours by intervention group nurses. Process evaluation in 2-year follow-up of a cluster randomized controlled trial. Evaluation included a repeat online nurse survey and 14 interviews (July-September 2013). Survey analysis included comparison of proportionate group difference between arms and between trial baseline and 2-year follow-up surveys. Framework analysis was used to assess qualitative data. Normalization Process Theory informed evaluation design and interpretation of results. Survey response was 77% (n = 123/160). Sustainability of nurse identification of domestic violence appeared to be due to greater nurse discussion and domestic violence disclosure by women, facilitated by use of a maternal health and well-being checklist. Over time, intervention group nurses used the maternal checklist more at specific maternal health visits and found the checklist the most helpful resource assisting their domestic violence work. Nurses spoke of a degree of 'normalization' of domestic violence screening that will need constant investment to maintain. Sustainable domestic violence screening and support outcomes can be achieved in an environment of comprehensive, nurse designed and theory driven implementation. Continuing training, discussion and monitoring of domestic violence work is needed to retain sustainable practices. © 2015 John Wiley & Sons Ltd.
The Development of Genetics in the Light of Thomas Kuhn's Theory of Scientific Revolutions.
Portin, Petter
2015-01-01
The concept of a paradigm is in the key position in Thomas Kuhn's theory of scientific revolutions. A paradigm is the framework within which the results, concepts, hypotheses and theories of scientific research work are understood. According to Kuhn, a paradigm guides the working and efforts of scientists during the time period which he calls the period of normal science. Before long, however, normal science leads to unexplained matters, a situation that then leads the development of the scientific discipline in question to a paradigm shift--a scientific revolution. When a new theory is born, it has either gradually emerged as an extension of the past theory, or the old theory has become a borderline case in the new theory. In the former case, one can speak of a paradigm extension. According to the present author, the development of modern genetics has, until very recent years, been guided by a single paradigm, the Mendelian paradigm which Gregor Mendel launched 150 years ago, and under the guidance of this paradigm the development of genetics has proceeded in a normal fashion in the spirit of logical positivism. Modern discoveries in genetics have, however, created a situation which seems to be leading toward a paradigm shift. The most significant of these discoveries are the findings of adaptive mutations, the phenomenon of transgenerational epigenetic inheritance, and, above all, the present deeply critical state of the concept of the gene.
Reward value-based gain control: divisive normalization in parietal cortex.
Louie, Kenway; Grattan, Lauren E; Glimcher, Paul W
2011-07-20
The representation of value is a critical component of decision making. Rational choice theory assumes that options are assigned absolute values, independent of the value or existence of other alternatives. However, context-dependent choice behavior in both animals and humans violates this assumption, suggesting that biological decision processes rely on comparative evaluation. Here we show that neurons in the monkey lateral intraparietal cortex encode a relative form of saccadic value, explicitly dependent on the values of the other available alternatives. Analogous to extra-classical receptive field effects in visual cortex, this relative representation incorporates target values outside the response field and is observed in both stimulus-driven activity and baseline firing rates. This context-dependent modulation is precisely described by divisive normalization, indicating that this standard form of sensory gain control may be a general mechanism of cortical computation. Such normalization in decision circuits effectively implements an adaptive gain control for value coding and provides a possible mechanistic basis for behavioral context-dependent violations of rationality.
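As a concrete illustration of the computation named in this abstract, here is a minimal sketch of divisive normalization applied to a set of option values; the saturation constant and the specific values are placeholders, not parameters estimated in the study.

    def divisive_normalization(values, sigma=1.0):
        # Each option's value is divided by a saturation term plus the summed
        # value of all available options (a standard divisive form of
        # contextual gain control).
        total = sum(values)
        return [v / (sigma + total) for v in values]

    # Adding a high-value third alternative compresses the relative representation
    # of the original two targets, the kind of context dependence described above.
    print(divisive_normalization([10.0, 5.0]))
    print(divisive_normalization([10.0, 5.0, 20.0]))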
Franzen, Jessica; Brinkmann, Kerstin
2016-12-01
Theories and research on depression point to reduced responsiveness during reward anticipation and in part also during punishment anticipation. They also suggest weaker affective responses to reward consumption and unchanged affective responses to punishment consumption. However, studies investigating incentive anticipation using effort mobilization and incentive consumption using facial expressions are scarce. The present studies tested reward and punishment responsiveness in a subclinically depressed sample, manipulating a monetary reward (Study 1) and a monetary punishment (Study 2). Effort mobilization was operationalized as cardiovascular reactivity, while facial expressions were measured by facial electromyographic reactivity. Compared to nondysphorics, dysphorics showed reduced pre-ejection period (PEP) reactivity and blunted self-reported wanting during reward anticipation but reduced PEP reactivity and normal self-reported wanting during punishment anticipation. Compared to nondysphorics, dysphorics showed reduced zygomaticus major muscle reactivity and blunted self-reported liking during reward consumption but normal corrugator supercilii muscle reactivity and normal self-reported disliking during punishment consumption. Copyright © 2016. Published by Elsevier B.V.
As the wheel turns: a centennial reflection on Freud's Three Essays on the Theory of Sexuality.
Person, Ethel Spector
2005-01-01
Freud's theories of psychosexual development, while highly original, were anchored in the explosion of scientific studies of sex in the nineteenth century. Most of these studies were based on masturbation, homosexuality, and deviance, with little attention given to normal sexuality. Around the turn of the century, the narrow interest in pathological sexuality and sexual physiology gradually gave way to a broader interest in normal sexuality. It was in the context of these expanding studies of sexuality that Freud proposed the first psychological view of sexuality, a theory that defined sex as being at the interface between soma and psyche. Libido theory, which Freud developed, is a theory of drives and conflicts. For Freud, libido was the major force in personality development, and he posited sexual conflicts as the heart of neuroses, sexual fixations as the essence of perversions. This article traces the way Freud's libido theory has served as one of the mainsprings in the development of psychoanalytic theory. It also addresses the major revisions that have taken place in libido theory, with a focus primarily on object relations theory, and the impact of culture on the way sex and sexual mores are parsed.
Carbon Nanotube Bonding Strength Enhancement Using Metal "Wicking" Process
NASA Technical Reports Server (NTRS)
Lamb, James L.; Dickie, Matthew R.; Kowalczyk, Robert S.; Liao, Anna; Bronikowski, Michael J.
2012-01-01
Carbon nanotubes grown from a surface typically have poor bonding strength at the interface. A process has been developed for adding a metal coat to the surface of carbon nano tubes (CNTs) through a wicking process, which could lead to an enhanced bonding strength at the interface. This process involves merging CNTs with indium as a bump-bonding enhancement. Classical capillary theory would not normally allow materials that do not wet carbon or graphite to be drawn into the spacings by capillary action because the contact angle is greater than 90 degrees. However, capillary action can be induced through JPL's ability to fabricate oriented CNT bundles to desired spacings, and through the use of deposition techniques and temperature to control the size and mobility of the liquid metal streams and associated reservoirs. A reflow and plasma cleaning process has also been developed and demonstrated to remove indium oxide, and to obtain smooth coatings on the CNT bundles.
NASA Astrophysics Data System (ADS)
Molina, A.; Laborda, E.; Compton, R. G.
2014-03-01
Simple theory for the electrochemical study of reversible ion transfer processes at micro- and nano-liquid|liquid interfaces supported on a capillary is presented. Closed-form expressions are obtained for the response in normal pulse and differential double pulse voltammetries, which describe adequately the particular behaviour of these systems due to the ‘asymmetric’ ion diffusion inside and outside the capillary. The use of different potential pulse techniques for the determination of the formal potential and diffusion coefficients of the ion is examined. For this, very simple analytical expressions are presented for the half-wave potential in NPV and the peak potential in DDPV.
Cancer Theory from Systems Biology Point of View
NASA Astrophysics Data System (ADS)
Wang, Gaowei; Tang, Ying; Yuan, Ruoshi; Ao, Ping
In our previous work, we proposed a novel cancer theory, the endogenous network theory, to understand the mechanisms underlying cancer genesis and development. Recently, we applied this theory to hepatocellular carcinoma (HCC). A core endogenous network of the hepatocyte was established by integrating the current understanding of the hepatocyte at the molecular level. The quantitative description of the endogenous network consists of a set of stochastic differential equations which can generate many local attractors with obvious or non-obvious biological functions. By comparing with clinical observations and experimental data, the results showed that two robust attractors from the model reproduced the main known features of the normal hepatocyte and the cancerous hepatocyte, respectively, at both the modular and molecular levels. In light of our theory, the genesis and progression of cancer is viewed as a transition from the normal attractor to the HCC attractor. A set of new insights into understanding cancer genesis and progression, and into strategies for cancer prevention, cure, and care, is provided.
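To make the attractor picture concrete, the following is a minimal, purely illustrative sketch (not the published hepatocyte network): a one-variable stochastic differential equation with a double-well drift, integrated by the Euler-Maruyama method, whose two stable states stand in for a 'normal' and a 'cancerous' attractor; noise occasionally drives transitions between them.

    import random

    def simulate(x0=1.0, dt=0.01, steps=20000, noise=0.25, seed=1):
        # Euler-Maruyama integration of dx = x*(1 - x**2)*dt + noise*dW.
        # The drift has two stable states (x = +1 and x = -1); the parameters
        # are arbitrary, chosen only to illustrate bistable, noise-driven dynamics.
        random.seed(seed)
        x = x0
        trajectory = [x]
        for _ in range(steps):
            dw = random.gauss(0.0, dt ** 0.5)
            x += x * (1.0 - x * x) * dt + noise * dw
            trajectory.append(x)
        return trajectory

    traj = simulate()
    print(sum(1 for v in traj if v > 0) / len(traj))  # fraction of time near x = +1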
Reception of Theory: Film-Television Studies and the Frankfurt School.
ERIC Educational Resources Information Center
Steinman, Clay
1988-01-01
Discusses the Critical Theory of the Frankfurt School and how it offers a way of seeing normally obscured relations of social power in the details of modern capitalist culture. Concentrates on claims about critical theory that have functioned as strategies of denial. (MS)
Can dual processing theory explain physics students' performance on the Force Concept Inventory?
NASA Astrophysics Data System (ADS)
Wood, Anna K.; Galloway, Ross K.; Hardy, Judy
2016-12-01
According to dual processing theory there are two types, or modes, of thinking: system 1, which involves intuitive and nonreflective thinking, and system 2, which is more deliberate and requires conscious effort and thought. The Cognitive Reflection Test (CRT) is a widely used and robust three item instrument that measures the tendency to override system 1 thinking and to engage in reflective, system 2 thinking. Each item on the CRT has an intuitive (but wrong) answer that must be rejected in order to answer the item correctly. We therefore hypothesized that performance on the CRT may give useful insights into the cognitive processes involved in learning physics, where success involves rejecting the common, intuitive ideas about the world (often called misconceptions) and instead carefully applying physical concepts. This paper presents initial results from an ongoing study examining the relationship between students' CRT scores and their performance on the Force Concept Inventory (FCI), which tests students' understanding of Newtonian mechanics. We find that a higher CRT score predicts a higher FCI score for both precourse and postcourse tests. However, we also find that the FCI normalized gain is independent of CRT score. The implications of these results are discussed.
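For readers unfamiliar with the gain measure used here, this is a minimal sketch of the standard normalized-gain calculation (Hake's g) with invented pre/post FCI percentages attached to hypothetical CRT scores; the numbers are illustrative and are not data from the study.

    def normalized_gain(pre_percent, post_percent):
        # Hake's normalized gain: the fraction of the possible improvement achieved,
        # g = (post - pre) / (100 - pre), with scores expressed as percentages.
        return (post_percent - pre_percent) / (100.0 - pre_percent)

    # Hypothetical students: in the pattern the paper reports, a higher CRT score
    # goes with higher pre- and post-test FCI scores, yet the normalized gain can
    # be the same.
    for crt, pre, post in [(0, 30.0, 58.0), (3, 55.0, 73.0)]:
        print(crt, round(normalized_gain(pre, post), 2))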
PAST-TENSE GENERATION FROM FORM VERSUS MEANING: BEHAVIOURAL DATA AND SIMULATION EVIDENCE
Woollams, Anna M.; Joanisse, Marc; Patterson, Karalyn
2009-01-01
The standard task used to study inflectional processing of verbs involves presentation of the stem form from which the participant is asked to generate the past tense. This task reveals a processing disadvantage for irregular relative to regular English verbs, more pronounced for lower-frequency items. Dual- and single-mechanism theories of inflectional morphology are both able to account for this pattern; but the models diverge in their predictions concerning the magnitude of the regularity effect expected when the task involves past-tense generation from meaning. In this study, we asked normal speakers to generate the past tense from either form (verb stem) or meaning (action picture). The robust regularity effect observed in the standard form condition was no longer reliable when participants were required to generate the past tense from meaning. This outcome would appear problematic for dual-mechanism theories to the extent that they assume the process of inflection requires stem retrieval. By contrast, it supports single-mechanism models that consider stem retrieval to be task-dependent. We present a single-mechanism model of verb inflection incorporating distributed phonological and semantic representations that reproduces this task-dependent pattern. PMID:20161125
Efficient High-Fidelity, Geometrically Exact, Multiphysics Structural Models
2011-10-14
functionally graded core. International Journal for Numerical Methods in Engineering, 68:940-966, 2006. 7. F. Shang, Z. Wang, and Z. Li. Analysis of...normal deformable plate theory and MLPG method with radial basis functions. Composite Structures, 80:539-552, 2007. 17. W. Zhen and W. Chen. A higher-order...functionally graded plates by using higher-order shear and normal deformable plate theory and MLPG method with radial basis functions. Composite Structures, 80
A Higher-Order Bending Theory for Laminated Composite and Sandwich Beams
NASA Technical Reports Server (NTRS)
Cook, Geoffrey M.
1997-01-01
A higher-order bending theory is derived for laminated composite and sandwich beams. This is accomplished by assuming a special form for the axial and transverse displacement expansions. An independent expansion is also assumed for the transverse normal stress. Appropriate shear correction factors based on energy considerations are used to adjust the shear stiffness. A set of transverse normal correction factors is introduced, leading to significant improvements in the transverse normal strain and stress for laminated composite and sandwich beams. A closed-form solution is obtained and compared with cylindrical bending elasticity solutions for a wide range of beam aspect ratios and commonly used material systems. Accurate shear stresses for a wide range of laminates, including the challenging unsymmetric composite and sandwich laminates, are obtained using an original corrected integration scheme. For application of the theory to a wider range of problems, guidelines for finite element approximations are presented.
Hawking radiation and classical tunneling: A ray phase space approach
NASA Astrophysics Data System (ADS)
Tracy, E. R.; Zhigunov, D.
2016-01-01
Acoustic waves in fluids undergoing the transition from sub- to supersonic flow satisfy governing equations similar to those for light waves in the immediate vicinity of a black hole event horizon. This acoustic analogy has been used by Unruh and others as a conceptual model for "Hawking radiation." Here, we use variational methods, originally introduced by Brizard for the study of linearized MHD, and ray phase space methods, to analyze linearized acoustics in the presence of background flows. The variational formulation endows the evolution equations with natural Hermitian and symplectic structures that prove useful for later analysis. We derive a 2 × 2 normal form governing the wave evolution in the vicinity of the "event horizon." This shows that the acoustic model can be reduced locally (in ray phase space) to a standard (scalar) tunneling process weakly coupled to a unidirectional non-dispersive wave (the "incoming wave"). Given the normal form, the Hawking "thermal spectrum" can be derived by invoking standard tunneling theory, but only by ignoring the coupling to the incoming wave. Deriving the normal form requires a novel extension of the modular ray-based theory used previously to study tunneling and mode conversion in plasmas. We also discuss how ray phase space methods can be used to change representation, which brings the problem into a form where the wave functions are less singular than in the usual formulation, a fact that might prove useful in numerical studies.
Fliss, Rafika; Lemerre, Marion; Mollard, Audrey
2016-06-01
Compromised theory of mind (ToM) can be explained either by a failure to implement specific representational capacities (mental state representations) or by more general executive selection demands. In older adult populations, evidence of affected executive functioning and cognitive ToM in normal aging has been reported; however, the links between these two functions remain unclear. In the present paper, we address these shortcomings by using a specific ToM task together with classical executive tasks. Using an original cognitive ToM task, we studied the effect of age on ToM performance in relation to the progressive executive decline. Ninety-six elderly participants were recruited. They were asked to perform a cognitive ToM task and five executive tests (the Stroop test and the Hayling Sentence Completion Test to assess inhibition, the Trail Making Test and verbal fluency to assess shifting, and backward span to estimate working memory capacity). The results show changes in cognitive ToM performance according to executive demands. Correlational analyses indicate a significant relationship between ToM performance and the selected executive measures, and regression analyses identify level of vocabulary and age as the best predictors of ToM performance. The results are consistent with the hypothesis that ToM deficits are related to age-related domain-general decline rather than to a breakdown in a specialized representational system. The implications of these findings for the nature of social cognition tests in normal aging are also discussed.
Scalia, Peter; Elwyn, Glyn; Durand, Marie-Anne
2017-08-18
Implementing patient decision aids in clinic workflow has proven to be a challenge for healthcare organizations and physicians. Our aim was to determine the organizational strategies, motivations, and facilitating factors behind the routine implementation of Option Grid™ encounter decision aids at two independent settings. Case studies were conducted using semi-structured interviews, with Normalization Process Theory (NPT) as the framework for thematic analysis. Twenty-three interviews with physicians, nurses, hospital staff and stakeholders were conducted at: 1) CapitalCare Medical Group in Albany, New York; 2) HealthPartners Clinics in Minneapolis, Minnesota. 'Coherent' motivations were guided by financial incentives at CapitalCare, and by a 'champion' physician at HealthPartners. Nurses worked 'collectively' at both settings and played an important role at sites where successful implementation occurred. Some physicians did not perceive the utility of Option Grid™, which led to varying degrees of implementation success across sites. The appraisal work (reflexive monitoring) identified benefits, particularly in terms of information provision. Physicians at both settings, however, were concerned with time pressures and the suitability of the tool for patients with low levels of health literacy. Although both practice settings illustrated the mechanisms of normalization postulated by the theory, the extent to which Option Grid™ was routinely embedded in clinic workflow varied between sites and between clinicians. Implementation of new interventions will require attention to an identified rationale (coherence), and to collective action, cognitive participation, and assessment of value by members of the organization.
Shahaf, Goded; Pratt, Hillel
2013-01-01
In this work we demonstrate the principles of a systematic modeling approach of the neurophysiologic processes underlying a behavioral function. The modeling is based upon a flexible simulation tool, which enables parametric specification of the underlying neurophysiologic characteristics. While the impact of selecting specific parameters is of interest, in this work we focus on the insights, which emerge from rather accepted assumptions regarding neuronal representation. We show that harnessing of even such simple assumptions enables the derivation of significant insights regarding the nature of the neurophysiologic processes underlying behavior. We demonstrate our approach in some detail by modeling the behavioral go/no-go task. We further demonstrate the practical significance of this simplified modeling approach in interpreting experimental data - the manifestation of these processes in the EEG and ERP literature of normal and abnormal (ADHD) function, as well as with comprehensive relevant ERP data analysis. In-fact we show that from the model-based spatiotemporal segregation of the processes, it is possible to derive simple and yet effective and theory-based EEG markers differentiating normal and ADHD subjects. We summarize by claiming that the neurophysiologic processes modeled for the go/no-go task are part of a limited set of neurophysiologic processes which underlie, in a variety of combinations, any behavioral function with measurable operational definition. Such neurophysiologic processes could be sampled directly from EEG on the basis of model-based spatiotemporal segregation.
Chinese version of the separation-individuation inventory.
Tam, Wai-Cheong Carl; Shiah, Yung-Jong; Chiang, Shih-Kuang
2003-08-01
The importance of the separation-individuation process in object relations theory is well known in the disciplines of psychology, counseling, and human development. Based on the Separation-Individuation Inventory of Christenson and Wilson, which measures manifestations of disturbances in this process, a Chinese version of the inventory was developed. For college students, Cronbach coefficient alpha was .89, and test-retest reliability over 28 days was .77. Scores on the inventory correlated positively with both the number of borderline personality characteristics and the Individualism-Collectivism Scale. Also, the mean score on the inventory of patients diagnosed with borderline personality disorder was significantly higher than that of the two normal control groups (ns = 564). Thus the inventory possessed satisfactory construct validity. Cultural differences regarding the separation-individuation process need to be investigated further.
Normal mode study of the earth's rigid body motions
NASA Technical Reports Server (NTRS)
Chao, B. F.
1983-01-01
In this paper it is shown that the earth's rigid body (rb) motions can be represented by an analytical set of eigensolutions to the equation of motion for elastic-gravitational free oscillations. Thus each degree of freedom in the rb motion is associated with a rb normal mode. Cases of both nonrotating and rotating earth models are studied, and it is shown that the rb modes do incorporate neatly into the earth's system of normal modes of free oscillation. The excitation formulas for the rb modes are also obtained, based on normal mode theory. Physical implications of the results are summarized and the fundamental differences between rb modes and seismic modes are emphasized. In particular, it is ascertained that the Chandler wobble, being one of the rb modes belonging to the rotating earth, can be studied using the established theory of normal modes.
Theory and tests of a thermal ion detector sensitive only at near-normal incidence
NASA Technical Reports Server (NTRS)
Robinson, J. W.
1981-01-01
Measurements of thermal ions are influenced by factors such as spacecraft potential, velocity, angle of attack, and sheath size. A theory is presented for the response of an instrument which accepts ions only within a small angle of incidence from normal. Although a more general theory is available and forms the basis of this one, the small angle restriction allows a simpler formulation which does not depend on sheath size. Furthermore, practical instruments are easily designed around this restriction. Laboratory tests verify that such instruments respond as expected and they illustrate how design details influence perturbations from the ideal response characteristics.
Visual attention to food cues in obesity: an eye-tracking study.
Doolan, Katy J; Breslin, Gavin; Hanna, Donncha; Murphy, Kate; Gallagher, Alison M
2014-12-01
Based on the theory of incentive sensitization, the aim of this study was to investigate differences in attentional processing of food-related visual cues between normal-weight and overweight/obese males and females. Twenty-six normal-weight (14M, 12F) and 26 overweight/obese (14M, 12F) adults completed a visual probe task and an eye-tracking paradigm. Reaction times and eye movements to food and control images were collected during both a fasted and a fed condition in a counterbalanced design. Participants had greater visual attention towards high-energy-density food images than towards low-energy-density food images regardless of hunger condition. This was most pronounced in overweight/obese males, who had significantly greater maintained attention towards high-energy-density food images than their normal-weight counterparts; however, no between-weight-group differences were observed for female participants. High-energy-density food images appear to capture visual attention more readily than low-energy-density food images. Results also suggest the possibility of an altered visual food cue-associated reward system in overweight/obese males. Attentional processing of food cues may play a role in eating behaviors and should therefore be taken into consideration as part of an integrated approach to curbing obesity. © 2014 The Obesity Society.
Shankle, William R.; Pooley, James P.; Steyvers, Mark; Hara, Junko; Mangrola, Tushar; Reisberg, Barry; Lee, Michael D.
2012-01-01
Determining how cognition affects functional abilities is important in Alzheimer’s disease and related disorders (ADRD). 280 patients (normal or ADRD) received a total of 1,514 assessments using the Functional Assessment Staging Test (FAST) procedure and the MCI Screen (MCIS). A hierarchical Bayesian cognitive processing (HBCP) model was created by embedding a signal detection theory (SDT) model of the MCIS delayed recognition memory task into a hierarchical Bayesian framework. The SDT model used latent parameters of discriminability (memory process) and response bias (executive function) to predict, simultaneously, recognition memory performance for each patient and each FAST severity group. The observed recognition memory data did not distinguish the six FAST severity stages, but the latent parameters completely separated them. The latent parameters were also used successfully to transform the ordinal FAST measure into a continuous measure reflecting the underlying continuum of functional severity. HBCP models applied to recognition memory data from clinical practice settings accurately translated a latent measure of cognition to a continuous measure of functional severity for both individuals and FAST groups. Such a translation links two levels of brain information processing, and may enable more accurate correlations with other levels, such as those characterized by biomarkers. PMID:22407225
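A minimal sketch (Python, not the authors' HBCP code) of the signal detection theory layer described above: discriminability d′ (the memory process) and the criterion c (response bias, the executive component) estimated from hits and false alarms on a delayed recognition task. The hierarchical Bayesian layer is omitted, and the counts below are hypothetical.

```python
from scipy.stats import norm

def sdt_parameters(hits, misses, false_alarms, correct_rejections):
    """Equal-variance SDT parameters from recognition-memory counts.

    d' indexes discriminability (memory); the criterion c indexes
    response bias (an executive/decision component).
    """
    # Log-linear correction avoids infinite z-scores at 0% or 100% rates.
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
    criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))
    return d_prime, criterion

if __name__ == "__main__":
    # Hypothetical 10-old/10-new recognition test for one patient.
    print(sdt_parameters(hits=9, misses=1, false_alarms=2, correct_rejections=8))
```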
Quasi-Normal Modes of Stars and Black Holes.
Kokkotas, Kostas D; Schmidt, Bernd G
1999-01-01
Perturbations of stars and black holes have been one of the main topics of relativistic astrophysics for the last few decades. They are of particular importance today, because of their relevance to gravitational wave astronomy. In this review we present the theory of quasi-normal modes of compact objects from both the mathematical and astrophysical points of view. The discussion includes perturbations of black holes (Schwarzschild, Reissner-Nordström, Kerr and Kerr-Newman) and relativistic stars (non-rotating and slowly-rotating). The properties of the various families of quasi-normal modes are described, and numerical techniques for calculating quasi-normal modes reviewed. The successes, as well as the limits, of perturbation theory are presented, and its role in the emerging era of numerical relativity and supercomputers is discussed.
NASA Technical Reports Server (NTRS)
Chutjian, A.
1982-01-01
Electron attachment cross sections for the processes SF6-/SF6 and Cl-/CFCl3 are calculated in a local theory using a model in which diatomic-like potential energy curves for the normal modes are constructed from available spectroscopic data. Thermally populated vibrational and rotational levels are included. Good agreement is found with experimental cross sections in the energy range 5-100 meV for a particular choice of potential energy curve parameters.
On the theory relating changes in area-average and pan evaporation (Invited)
NASA Astrophysics Data System (ADS)
Shuttleworth, W.; Serrat-Capdevila, A.; Roderick, M. L.; Scott, R.
2009-12-01
Theory relating changes in area-average evaporation with changes in the evaporation from pans or open water is developed. Such changes can arise by Type (a) processes related to large-scale changes in atmospheric concentrations and circulation that modify surface evaporation rates in the same direction, and Type (b) processes related to coupling between the surface and atmospheric boundary layer (ABL) at the landscape scale that usually modify area-average evaporation and pan evaporation in different directions. The interrelationship between evaporation rates in response to Type (a) changes is derived. They have the same sign and broadly similar magnitude but the change in area-average evaporation is modified by surface resistance. As an alternative to assuming the complementary evaporation hypothesis, the results of previous modeling studies that investigated surface-atmosphere coupling are parameterized and used to develop a theoretical description of Type (b) coupling via vapor pressure deficit (VPD) in the ABL. The interrelationship between appropriately normalized pan and area-average evaporation rates is shown to vary with temperature and wind speed but, on average, the Type (b) changes are approximately equal and opposite. Long-term Australian pan evaporation data are analyzed to demonstrate the simultaneous presence of Type (a) and (b) processes, and observations from three field sites in southwestern USA show support for the theory describing Type (b) coupling via VPD. England's victory over Australia in 2009 Ashes cricket test match series will not be mentioned.
Mahakrishnan, Sathiya; Chakraborty, Subrata; Vijay, Amrendra
2016-09-15
Diffusion, an emergent nonequilibrium transport phenomenon, is a nontrivial manifestation of the correlation between the microscopic dynamics of individual molecules and their statistical behavior observed in experiments. We present a thorough investigation of this viewpoint using the mathematical tools of quantum scattering, within the framework of Boltzmann transport theory. In particular, we ask: (a) How and when does a normal diffusive transport become anomalous? (b) What physical attribute of the system is conceptually useful to faithfully rationalize large variations in the coefficient of normal diffusion, observed particularly within the dynamical environment of biological cells? To characterize the diffusive transport, we introduce, analogous to continuous phase transitions, the curvature of the mean square displacement as an order parameter and use the notion of quantum scattering length, which measures the effective interactions between the diffusing molecules and the surrounding, to define a tuning variable, η. We show that the curvature signature conveniently differentiates the normal diffusion regime from the superdiffusion and subdiffusion regimes and the critical point, η = ηc, unambiguously determines the coefficient of normal diffusion. To solve the Boltzmann equation analytically, we use a quantum mechanical expression for the scattering amplitude in the Boltzmann collision term and obtain a general expression for the effective linear collision operator, useful for a variety of transport studies. We also demonstrate that the scattering length is a useful dynamical characteristic to rationalize experimental observations on diffusive transport in complex systems. We assess the numerical accuracy of the present work with representative experimental results on diffusion processes in biological systems. Furthermore, we advance the idea of temperature-dependent effective voltage (of the order of 1 μV or less in a biological environment, for example) as a dynamical cause of the perpetual molecular movement, which eventually manifests as an ordered motion, called the diffusion.
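The "curvature of the mean square displacement" used above as an order parameter can be illustrated with a short sketch (Python; illustrative, not the authors' code). For MSD ∝ t^α, the sign of the curvature of MSD(t) follows the exponent: α < 1 gives a concave curve (subdiffusion), α = 1 a straight line (normal diffusion), α > 1 a convex curve (superdiffusion).

```python
import numpy as np

def classify_diffusion(t, msd, tol=0.05):
    """Classify a diffusion regime from an MSD curve sampled at times t.

    Fits MSD ~ K * t**alpha; since d^2(MSD)/dt^2 has the sign of (alpha - 1),
    the fitted exponent also gives the curvature-based classification.
    """
    alpha, log_k = np.polyfit(np.log(t), np.log(msd), 1)
    if alpha > 1.0 + tol:
        regime = "superdiffusion (convex MSD)"
    elif alpha < 1.0 - tol:
        regime = "subdiffusion (concave MSD)"
    else:
        regime = "normal diffusion (linear MSD)"
    return alpha, np.exp(log_k), regime

if __name__ == "__main__":
    t = np.linspace(1.0, 100.0, 200)
    print(classify_diffusion(t, 0.3 * t**0.7))   # subdiffusive example
    print(classify_diffusion(t, 2.0 * t))        # normal diffusion
```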
Rapcsak, Steven Z; Henry, Maya L; Teague, Sommer L; Carnahan, Susan D; Beeson, Pélagie M
2007-06-18
Coltheart and co-workers [Castles, A., Bates, T. C., & Coltheart, M. (2006). John Marshall and the developmental dyslexias. Aphasiology, 20, 871-892; Coltheart, M., Rastle, K., Perry, C., Langdon, R., & Ziegler, J. (2001). DRC: A dual route cascaded model of visual word recognition and reading aloud. Psychological Review, 108, 204-256] have demonstrated that an equation derived from dual-route theory accurately predicts reading performance in young normal readers and in children with reading impairment due to developmental dyslexia or stroke. In this paper, we present evidence that the dual-route equation and a related multiple regression model also accurately predict both reading and spelling performance in adult neurological patients with acquired alexia and agraphia. These findings provide empirical support for dual-route theories of written language processing.
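A hedged sketch (Python) of the kind of prediction described above, assuming a probabilistic-OR combination of the two routes: the chance of reading a regular word correctly is taken as the chance that the lexical route (indexed by irregular-word accuracy) or the sublexical route (indexed by nonword accuracy) succeeds. This is an illustrative reading of the dual-route equation, not necessarily its exact published form.

```python
def predicted_regular_word_accuracy(irregular_acc, nonword_acc):
    """Predict regular-word reading accuracy (proportions in [0, 1]).

    Assumes the lexical route succeeds with probability ~ irregular_acc,
    the sublexical route with probability ~ nonword_acc, and that a regular
    word is read correctly if either route succeeds (illustrative assumption).
    """
    return irregular_acc + nonword_acc - irregular_acc * nonword_acc

# Example: a reader scoring 60% on irregular words and 50% on nonwords would
# be predicted to read about 80% of regular words correctly.
print(predicted_regular_word_accuracy(0.60, 0.50))
```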
Local subsystems in gauge theory and gravity
Donnelly, William; Freidel, Laurent
2016-09-16
We consider the problem of defining localized subsystems in gauge theory and gravity. Such systems are associated to spacelike hypersurfaces with boundaries and provide the natural setting for studying entanglement entropy of regions of space. We present a general formalism to associate a gauge-invariant classical phase space to a spatial slice with boundary by introducing new degrees of freedom on the boundary. In Yang-Mills theory the new degrees of freedom are a choice of gauge on the boundary, transformations of which are generated by the normal component of the nonabelian electric field. In general relativity the new degrees of freedom are the location of a codimension-2 surface and a choice of conformal normal frame. These degrees of freedom transform under a group of surface symmetries, consisting of diffeomorphisms of the codimension-2 boundary, and position-dependent linear deformations of its normal plane. We find the observables which generate these symmetries, consisting of the conformal normal metric and curvature of the normal connection. We discuss the implications for the problem of defining entanglement entropy in quantum gravity. Finally, our work suggests that the Bekenstein-Hawking entropy may arise from the different ways of gluing together two partial Cauchy surfaces at a cross-section of the horizon.
Giotakos, O
2017-01-01
A variety of phenomena might be considered as reflecting impaired insight in psychosis, such as failure to recognize signs, symptoms or disease; failure to derive appropriate cognitive representations despite recognition of the disease; and misattribution of the source or cause of the disease. The unawareness of tardive dyskinesia symptoms in schizophrenic patients suggests that self-awareness deficits in schizophrenia may be domain specific. Poor insight is an independent phenomenological and prevalent feature of psychotic disorders in general, and of schizophrenia in particular, but we do not yet know whether delusions in schizophrenia are the result of an entirely normal attempt to account for abnormal perceptual experiences, or a product of abnormal experience combined with normal reasoning. Theoretical approaches to impaired insight include disturbed perceptual input, impaired linkage between thought and emotion, and breakdown of the process of self-monitoring and error checking. The inability to distinguish between internally and externally generated mental events has been described by the metarepresentation theory. This theory includes awareness of one's own goals, impairment of which leads to disorders of willed action; awareness of intention, impairment of which leads to movement disorders; and awareness of the intentions of others, impairment of which leads to paranoid delusions. The theory of metarepresentation implicates mainly output mechanisms, such as the frontal cortex, whereas the input mechanisms implicate posterior brain systems, including the parietal lobe. There are many similarities between the disturbances of awareness seen in schizophrenia and those seen as a result of known neurological impairment. Neuropsychological models of impaired insight typically attribute the disturbance to any of a variety of core deficits in the processing of information. In this respect, lack of insight is on a conceptual par with alogia, apraxia or aphasia in reflecting disturbed cognitive processing. In this direction, research has implicated self-monitoring in disorders of awareness and in many of the core symptoms of schizophrenia, and it has been suggested that these symptoms result from a disturbance of a medial frontal system involving the anterior hippocampus, cingulate gyrus, supplementary motor area, and dorsolateral prefrontal cortex. Poor insight seems to be something more than a symptom or an epiphenomenon, and its mechanism may constitute a core factor in the psychosis process. Moreover, poor insight may involve a mechanism common to many other mental disorders, or may even be an independent, trans-diagnostic factor in human personality, perhaps akin to the dimension of psychoticism.
NASA Astrophysics Data System (ADS)
Lotfy, Kh.
2017-07-01
The dual-phase-lag (DPL) model with two different time translations and the Lord-Shulman (LS) theory with one relaxation time are applied to study the effect of hydrostatic initial stress on a medium under the influence of the two-temperature parameter (a new model is introduced using two-temperature theory) and photothermal theory. The problem of thermal loading at the free surface of a semi-infinite semiconducting medium coupled with plasma waves is solved, including the effect of mechanical force during a photothermal process. The exact expressions for the considered variables are obtained using normal mode analysis, and the two-temperature coefficient ratios are obtained analytically. Numerical results for the field quantities are given in the physical domain and illustrated graphically under the effects of several parameters. Comparisons are made between the results of the two different models with and without the two-temperature parameter, and for two different values of the hydrostatic initial stress. A comparison is also carried out between the considered variables as calculated from the generalized thermoelasticity based on the DPL model and the LS theory in the absence and presence of the thermoelastic and thermoelectric coupling parameters.
Self-consistent non-stationary theory of the gyrotron
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dumbrajs, Olgierd; Nusinovich, Gregory S.
2016-08-15
For a long time, the gyrotron theory was developed assuming that the transit time of electrons through the interaction space is much shorter than the cavity fill time. Correspondingly, it was assumed that during this transit time, the amplitude of microwave oscillations remains constant. Recent interest in such additional effects as the after-cavity interaction between electrons and the outgoing wave in the output waveguide has stimulated studies of the beam-wave interaction processes over much longer distances than the regular part of the waveguide which serves as a cavity in gyrotrons. Correspondingly, it turned out that a gyrotron theory free from the assumption of constant amplitude of microwave oscillations during the electron transit time should be developed. The present paper contains some results obtained in the framework of such a theory. The main attention is paid to modification of the boundary between the regions of oscillations with constant amplitude and automodulation in the plane of normalized parameters characterizing the external magnetic field and the beam current. It is shown that the theory free from the assumption of a frozen wave amplitude during the electron transit time predicts some widening of the region of automodulation.
Modulation transfer function of a fish-eye lens based on the sixth-order wave aberration theory.
Jia, Han; Lu, Lijun; Cao, Yiqing
2018-01-10
A calculation program for the modulation transfer function (MTF) of a fish-eye lens is developed with the autocorrelation method, in which the sixth-order wave aberration theory of ultra-wide-angle optical systems is used to simulate the wave aberration distribution at the exit pupil of the optical system. The autocorrelation integral is evaluated with Gauss-Legendre quadrature, and the magnification chromatic aberration is taken into account to calculate the polychromatic MTF. The MTF calculation results for a given example are then compared with those previously obtained from the fourth-order wave aberration theory of plane-symmetrical optical systems and with those from the Zemax program. The study shows that the MTF based on the sixth-order wave aberration theory has satisfactory calculation accuracy even for a fish-eye lens with a large acceptance aperture. The impacts of different types of aberrations on the MTF of a fish-eye lens are also analyzed. Finally, we apply the self-adaptive and normalized real-coded genetic algorithm and the MTF developed in the paper to optimize the Nikon F/2.8 fish-eye lens; consequently, the optimized system shows better MTF performance than the original design.
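The autocorrelation route to the MTF can be illustrated with the simplest aberration-free case (Python sketch, not the authors' program): the MTF of a circular pupil is the normalized overlap area of two mutually shifted pupil copies, which for a diffraction-limited system matches the analytic expression (2/π)(arccos ν − ν√(1 − ν²)) for normalized frequency ν ≤ 1. The fish-eye-specific wave aberration terms and the Gauss-Legendre quadrature are omitted here.

```python
import numpy as np

def diffraction_limited_mtf(nu, samples=1024):
    """MTF of an unaberrated circular pupil by numerical autocorrelation.

    nu is the spatial frequency normalized to the incoherent cutoff frequency;
    the pupil copy is shifted by 2*nu pupil radii before measuring overlap.
    """
    x = np.linspace(-1.0, 1.0, samples)
    xx, yy = np.meshgrid(x, x)
    pupil = (xx**2 + yy**2) <= 1.0
    shifted = ((xx - 2.0 * nu)**2 + yy**2) <= 1.0
    return np.count_nonzero(pupil & shifted) / np.count_nonzero(pupil)

if __name__ == "__main__":
    for nu in (0.0, 0.25, 0.5, 0.75):
        analytic = (2.0 / np.pi) * (np.arccos(nu) - nu * np.sqrt(1.0 - nu**2))
        print(nu, round(diffraction_limited_mtf(nu), 3), round(analytic, 3))
```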
Principals' Engagement of Low Ability Students in Singapore Secondary Schools
ERIC Educational Resources Information Center
Ong, Chye Hin; Dimmock, Clive
2013-01-01
This article describes a grounded theory constructed from a study of Singapore neighbourhood secondary school principals' engagement of their lowest stream, the Normal Technical students, in their schools. This substantive theory is labelled the "theory of selective engagement". It implies that how principals engage their lowest streamed…
Norm Theory: Comparing Reality to Its Alternatives.
ERIC Educational Resources Information Center
Kahneman, Daniel; Miller, Dale T.
1986-01-01
A theory of norms and normality is applied to some phenomena of emotional responses, social judgment, and conversations about causes. Norm theory is applied in analyses of enhanced emotional response to events that have abnormal causes, of generation of prediction from observations of behavior, and of the role of norms. (Author/LMO)
Stochastic Modeling Approach to the Incubation Time of Prionic Diseases
NASA Astrophysics Data System (ADS)
Ferreira, A. S.; da Silva, M. A.; Cressoni, J. C.
2003-05-01
Transmissible spongiform encephalopathies are neurodegenerative diseases for which prions are the attributed pathogenic agents. A widely accepted theory assumes that prion replication is due to a direct interaction between the pathologic (PrPSc) form and the host-encoded (PrPC) conformation, in a kind of autocatalytic process. Here we show that the overall features of the incubation time of prion diseases are readily obtained if the prion reaction is described by a simple mean-field model. An analytical expression for the incubation time distribution then follows by associating the rate constant with a log-normally distributed stochastic variable. The incubation time distribution is then also shown to be log-normal and fits the observed BSE (bovine spongiform encephalopathy) data very well. Computer simulation results also yield the correct BSE incubation time distribution at low PrPC densities.
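The key statistical step above, a log-normally distributed rate constant producing a log-normal incubation-time distribution, can be checked with a short Monte Carlo sketch (Python; the median and dispersion below are illustrative placeholders, not fitted BSE values).

```python
import numpy as np

rng = np.random.default_rng(0)

# Rate constant k of the autocatalytic replication step, log-normally
# distributed across hosts (illustrative median and dispersion).
k = rng.lognormal(mean=np.log(0.02), sigma=0.4, size=100_000)

# If the incubation time scales as T ~ 1/k (slower replication, longer
# incubation), then log T = -log k is again normally distributed,
# i.e. T itself is log-normal.
incubation_time = 1.0 / k

log_t = np.log(incubation_time)
print("mean of log T:", log_t.mean())   # ~ -log(0.02) ≈ 3.91
print("std  of log T:", log_t.std())    # ~ 0.4, matching the sigma of k
```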
Vilariño Besteiro, M P; Pérez Franco, C; Gallego Morales, L; Calvo Sagardoy, R; García de Lorenzo, A
2009-01-01
This paper intends to show the combination of therapeutic strategies in the treatment of long-standing eating disorders. This approach, entitled "Modelo Santa Cristina", is based on several theoretical paradigms: the Enabling Model, the Action Control Model, the Transtheoretical Model of the change process, and the Cognitive-Behavioural Model (cognitive restructuring and learning theories), together with techniques from the Gestalt, systemic and psychodrama orientations. The purpose of the treatment is both the normalization of eating patterns and an increase in the self-knowledge, self-acceptance and self-efficacy of patients. The main areas of intervention are the exploration of ambivalence to change, the discovery of the functions of symptoms and the search for alternative behaviours, the normalization of eating patterns, body image, cognitive restructuring, decision making, communication skills and the elaboration of traumatic experiences.
Linear perturbations of black holes: stability, quasi-normal modes and tails
NASA Astrophysics Data System (ADS)
Zhidenko, Alexander
2009-03-01
Black holes have their proper oscillations, which are called quasi-normal modes. The proper oscillations of astrophysical black holes may be observed in the near future with the help of gravitational wave detectors. Quasi-normal modes are also very important in the context of testing the stability of black objects, the anti-de Sitter/Conformal Field Theory (AdS/CFT) correspondence, and higher dimensional theories, such as brane-world scenarios and string theory. This dissertation reviews a number of works which provide a thorough study of the quasi-normal spectrum of a wide class of black holes in four and higher dimensions for fields of various spin and for gravitational perturbations. We have studied numerically the dependence of the quasi-normal modes on a number of factors, such as the presence of the cosmological constant, the Gauss-Bonnet parameter or the aether in the space-time, and the dependence of the spectrum on the parameters of the black hole and the fields under consideration. By analysis of the quasi-normal spectrum, we have studied the stability of higher dimensional Reissner-Nordstrom-de Sitter black holes, Kaluza-Klein black holes with squashed horizons, Gauss-Bonnet black holes and black strings. Special attention is paid to the evolution of massive fields in the background of various black holes; we have considered their quasi-normal ringing and the late-time tails. In addition, we present two new numerical techniques: a generalisation of the Nollert improvement of the Frobenius method to higher dimensional problems, and a qualitatively new method which allows the calculation of quasi-normal frequencies for black holes whose metrics are not known analytically.
Mentalizing and motivation neural function during social interactions in autism spectrum disorders
Assaf, Michal; Hyatt, Christopher J.; Wong, Christina G.; Johnson, Matthew R.; Schultz, Robert T.; Hendler, Talma; Pearlson, Godfrey D.
2013-01-01
Autism Spectrum Disorders (ASDs) are characterized by core deficits in social functions. Two theories have been suggested to explain these deficits: mind-blindness theory posits impaired mentalizing processes (i.e. decreased ability for establishing a representation of others' state of mind), while social motivation theory proposes that diminished reward value for social information leads to reduced social attention, social interactions, and social learning. Mentalizing and motivation are integral to typical social interactions, and neuroimaging evidence points to independent brain networks that support these processes in healthy individuals. However, the simultaneous function of these networks has not been explored in individuals with ASDs. We used a social, interactive fMRI task, the Domino game, to explore mentalizing- and motivation-related brain activation during a well-defined interval where participants respond to rewards or punishments (i.e. motivation) and concurrently process information about their opponent's potential next actions (i.e. mentalizing). Thirteen individuals with high-functioning ASDs, ages 12–24, and 14 healthy controls played fMRI Domino games against a computer-opponent and separately, what they were led to believe was a human-opponent. Results showed that while individuals with ASDs understood the game rules and played similarly to controls, they showed diminished neural activity during the human-opponent runs only (i.e. in a social context) in bilateral middle temporal gyrus (MTG) during mentalizing and right Nucleus Accumbens (NAcc) during reward-related motivation (Pcluster < 0.05 FWE). Importantly, deficits were not observed in these areas when playing against a computer-opponent or in areas related to motor and visual processes. These results demonstrate that while MTG and NAcc, which are critical structures in the mentalizing and motivation networks, respectively, activate normally in a non-social context, they fail to respond in an otherwise identical social context in ASD compared to controls. We discuss implications to both the mind-blindness and social motivation theories of ASD and the importance of social context in research and treatment protocols. PMID:24273716
Normal Aging and Linguistic Decrement.
ERIC Educational Resources Information Center
Emery, Olga B.
A study investigated language patterning, as an indication of synthetic mental activity, in comparison groups of normal pre-middle-aged adults (30-42 years), normal elderly adults (75-93), and elderly adults (71-91) with Alzheimer's dementia. Semiotic theory was used as the conceptual context. Linguistic measures included the Token Test, the…
Girdauskas, Evaldas; Borger, Michael A; Secknus, Maria-Anna; Girdauskas, Gracijus; Kuntze, Thomas
2011-06-01
Although there is adequate evidence that bicuspid aortic valve (BAV) is an inheritable disorder, there is a great controversy regarding the pathogenesis of dilatation of the proximal aorta. The hemodynamic theory was the first explanation for BAV aortopathy. The genetic theory, however, has become increasingly popular over the last decade and can now be viewed as the clearly dominant one. The widespread belief that BAV disease is a congenital disorder of vascular connective tissue has led to more aggressive treatment recommendations of the proximal aorta in such patients, approaching aortic management recommendations for patients with Marfan syndrome. There is emerging evidence that the 'clinically normal' BAV is associated with abnormal flow patterns and asymmetrically increased wall stress in the proximal aorta. Recent in vitro and in vivo studies on BAV function provide a unique hemodynamic insight into the different phenotypes of BAV disease and asymmetry of corresponding aortopathy even in the presence of a 'clinically normal' BAV. On the other hand, there is a subgroup of young male patients with BAV and a root dilatation phenotype, who may present the predominantly genetic form of BAV disease. In the face of these important findings, we feel that a critical review of this clinical problem is timely and appropriate, as the prevailing BAV-aortopathy theory undoubtedly affects the surgical approach to this common clinical entity. Thorough analysis of the recent literature shows a growing amount of evidence supporting the hemodynamic theory of aortopathy in patients with BAV disease. Data from recent studies requires a reevaluation of our overwhelming support of the genetic theory, and obliges us to acknowledge that hemodynamics plays an important role in the development of this disease process. Given the marked heterogeneity of BAV disease, further studies are required in order to more precisely determine which theory is the 'correct' one for explaining the obviously different types of BAV-associated aortopathy. Copyright © 2011 European Association for Cardio-Thoracic Surgery. Published by Elsevier B.V. All rights reserved.
Yoo, Jongmyung; Song, Jeonghwan; Hwang, Hyunsang
2018-06-18
In this study, we investigate the effect of the cation amount in the electrolyte on Ag/TiO2-based threshold switching (TS) devices in terms of field-induced nucleation theory. For this purpose, normal Ag/TiO2, annealed Ag/TiO2, and Ag-Te/TiO2-based TS devices are prepared, which have different cation amounts in their electrolytes during the switching process. First, we find that all of the prepared TS devices follow field-induced nucleation theory with different nucleation barrier energies (W0), by investigating the delay time dependency at various voltages and temperatures. Based on this investigation, we reveal that the amount of cations in the electrolyte during the switching process is the control parameter that affects the W0 values, which are found to be inversely proportional to the turn-off speed of the TS devices. This implies that the turn-off speed of the TS devices can be modulated by controlling the amount of cations in the matrix. © 2018 IOP Publishing Ltd.
Yeh, Zai-Ting; Tsai, Ming-Cheng; Tsai, Ming-Dar; Lo, Chiao-Yu; Wang, Kaw-Chen
2017-01-01
"Theory of mind" (ToM) refers to the ability to predict others' thoughts, intentions, beliefs, and feelings. Evidence from neuropsychology and functional imaging indicates that ToM is a domain-specific or modular architecture; however, research in development psychology has suggested that ToM is the full development of the executive functions in individuals. Therefore, the relationship between ToM and the executive functions needs to be clarified. Since the frontal lobe plays a critical role in the abilities of ToM and the executive functions, patients with frontal lobe damage were recruited for the present study. Assessments of ToM and the executive functions were performed on 23 patients with frontal lobe damage and 20 healthy controls. When controlling for the executive functions, significant differences between the patient and normal groups were found in the affective component of ToM, but not in the cognitive component. The present study suggests that in various social situations, executing ToM abilities requires logical reasoning processes provided by the executive functions. However, the reasoning processes of affective ToM are independent of executive functions.
Correlative weighted stacking for seismic data in the wavelet domain
Zhang, S.; Xu, Y.; Xia, J.; ,
2004-01-01
Horizontal stacking plays a crucial role in modern seismic data processing, for it not only suppresses random noise and multiple reflections, but also provides the foundational data for subsequent migration and inversion. However, a number of examples have shown that random noise in adjacent traces exhibits correlation and coherence. Average stacking, and weighted stacking based on the conventional correlation function, both produce false events caused by such noise. Wavelet transforms and higher-order statistics are very useful tools in modern signal processing: the multiresolution analysis of wavelet theory can decompose a signal on different scales, and higher-order correlation functions can suppress correlated noise against which the conventional correlation function is of no use. Based on the theory of the wavelet transform and higher-order statistics, the high-order correlative weighted stacking (HOCWS) technique is presented in this paper. Its essence is to stack common midpoint gathers after normal moveout correction using weights calculated from higher-order correlation statistics in the wavelet domain. Synthetic examples demonstrate its advantages in improving the signal-to-noise (S/N) ratio and suppressing correlated random noise.
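A simplified sketch (Python) of weighted stacking in the spirit described above: per-trace weights come from a normalized fourth-order joint moment with a pilot (straight-stack) trace, so traces dominated by noise are down-weighted. This is an illustrative weighting only; the wavelet-domain decomposition and the exact high-order correlation statistic of HOCWS are omitted.

```python
import numpy as np

def high_order_weighted_stack(gather):
    """Weighted stack of an NMO-corrected CMP gather (traces x samples).

    Each trace is weighted by a normalized fourth-order joint moment with the
    pilot trace, E[x^2 y^2] / sqrt(E[x^4] E[y^4]), which lies in [0, 1] by the
    Cauchy-Schwarz inequality.
    """
    pilot = gather.mean(axis=0)
    weights = np.empty(gather.shape[0])
    for i, trace in enumerate(gather):
        num = np.mean((trace * pilot) ** 2)
        den = np.sqrt(np.mean(trace**4) * np.mean(pilot**4)) + 1e-12
        weights[i] = num / den
    weights /= weights.sum() + 1e-12
    return weights @ gather

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    signal = np.sin(np.linspace(0.0, 8.0 * np.pi, 500))
    gather = signal + 0.5 * rng.standard_normal((24, 500))
    print(high_order_weighted_stack(gather).shape)
```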
Kinetic theory approach to modeling of cellular repair mechanisms under genome stress.
Qi, Jinpeng; Ding, Yongsheng; Zhu, Ying; Wu, Yizhi
2011-01-01
Under acute perturbations from the outer environment, a normal cell can trigger a cellular self-defense mechanism in response to genome stress. To further investigate the kinetics of the cellular self-repair process at the single-cell level, a model of DNA damage generation and repair under acute ionizing radiation (IR) is proposed using the mathematical framework of the kinetic theory of active particles (KTAP). First, we focus on illustrating the profile of the Cellular Repair System (CRS), constituted by two sub-populations, each of which is made up of active particles with different discrete states. We then implement the mathematical framework of the cellular self-repair mechanism and illustrate the dynamic processes of Double Strand Break (DSB) and Repair Protein (RP) generation, DSB-protein complex (DSBC) synthesis, and toxin accumulation. Finally, we roughly analyze the capability of the cellular self-repair mechanism, the cellular activity of transferring DNA damage, and genome stability, especially the different fates of a given cell before and after the time thresholds of IR perturbation that a cell can tolerate maximally under different IR perturbation circumstances.
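A minimal sketch (Python) of the kind of kinetics described above, written as mass-action ODEs for the DSB, RP and DSBC pools; the rate constants and the IR-driven generation term are hypothetical placeholders, not the parameters of the KTAP model.

```python
import numpy as np
from scipy.integrate import odeint

def repair_kinetics(state, t, ir_dose_rate):
    """Toy DSB generation/repair dynamics under a constant IR dose rate."""
    dsb, rp, dsbc = state
    k_gen = 0.8 * ir_dose_rate   # DSB generation by radiation (hypothetical)
    k_bind = 0.05                # DSB + RP -> DSBC (hypothetical)
    k_fix = 0.1                  # DSBC resolution, repair completed (hypothetical)
    k_rp = 0.3                   # damage-induced RP production (hypothetical)
    d_dsb = k_gen - k_bind * dsb * rp
    d_rp = k_rp * dsb - k_bind * dsb * rp
    d_dsbc = k_bind * dsb * rp - k_fix * dsbc
    return [d_dsb, d_rp, d_dsbc]

t = np.linspace(0.0, 60.0, 300)                      # minutes (illustrative)
trajectory = odeint(repair_kinetics, [0.0, 1.0, 0.0], t, args=(1.0,))
print(trajectory[-1])                                # DSB, RP, DSBC at t = 60
```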
Examination of the neighborhood activation theory in normal and hearing-impaired listeners.
Dirks, D D; Takayanagi, S; Moshfegh, A; Noffsinger, P D; Fausti, S A
2001-02-01
Experiments were conducted to examine the effects of lexical information on word recognition among normal hearing listeners and individuals with sensorineural hearing loss. The lexical factors of interest were incorporated in the Neighborhood Activation Model (NAM). Central to this model is the concept that words are recognized relationally in the context of other phonemically similar words. NAM suggests that words in the mental lexicon are organized into similarity neighborhoods and the listener is required to select the target word from competing lexical items. Two structural characteristics of similarity neighborhoods that influence word recognition have been identified: "neighborhood density," or the number of phonemically similar words (neighbors) for a particular target item, and "neighborhood frequency," or the average frequency of occurrence of all the items within a neighborhood. A third lexical factor, "word frequency," or the frequency of occurrence of a target word in the language, is assumed to optimize the word recognition process by biasing the system toward choosing a high frequency over a low frequency word. Three experiments were performed. In the initial experiments, word recognition for consonant-vowel-consonant (CVC) monosyllables was assessed in young normal hearing listeners by systematically partitioning the items into the eight possible lexical conditions that could be created by two levels of the three lexical factors: word frequency (high and low), neighborhood density (high and low), and average neighborhood frequency (high and low). Neighborhood structure and word frequency were estimated computationally using a large on-line lexicon based on Webster's Pocket Dictionary. From this program, 400 highly familiar monosyllables were selected and partitioned into eight orthogonal lexical groups (50 words/group). The 400 words were presented randomly to normal hearing listeners in speech-shaped noise (Experiment 1) and "in quiet" (Experiment 2) as well as to an elderly group of listeners with sensorineural hearing loss in the speech-shaped noise (Experiment 3). The results of the three experiments verified predictions of NAM in both normal hearing and hearing-impaired listeners. In each experiment, words from low density neighborhoods were recognized more accurately than those from high density neighborhoods. The presence of high frequency neighbors (average neighborhood frequency) produced poorer recognition performance than comparable conditions with low frequency neighbors. Word frequency was found to have a highly significant effect on word recognition: lexical conditions with high word frequencies produced higher performance scores than conditions with low frequency words. The results supported the basic tenets of NAM theory and identified both neighborhood structural properties and word frequency as significant lexical factors affecting word recognition when listening in noise and "in quiet." The results of the third experiment permit extension of NAM theory to individuals with sensorineural hearing loss. Future development of speech recognition tests should allow for the effects of higher level cognitive (lexical) factors on lower level phonemic processing.
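The neighborhood computation at the heart of NAM can be sketched as follows (Python). The word probabilities and frequencies are hypothetical, and this is the frequency-weighted neighborhood probability rule as commonly summarized in the NAM literature, not the exact implementation used in these experiments.

```python
def nam_identification_probability(stim_prob, stim_freq, neighbors):
    """Frequency-weighted neighborhood probability rule (illustrative).

    stim_prob:  stimulus-word probability (e.g. from phoneme confusability)
    stim_freq:  (log) frequency of the stimulus word
    neighbors:  list of (neighbor_prob, neighbor_freq) pairs
    """
    target = stim_prob * stim_freq
    competition = sum(p * f for p, f in neighbors)
    return target / (target + competition)

# A word from a sparse, low-frequency neighborhood is identified more often
# than the same word from a dense, high-frequency neighborhood.
sparse = [(0.2, 1.0), (0.1, 1.5)]
dense = [(0.2, 3.0)] * 8
print(nam_identification_probability(0.7, 2.0, sparse))
print(nam_identification_probability(0.7, 2.0, dense))
```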
Language impairment is reflected in auditory evoked fields.
Pihko, Elina; Kujala, Teija; Mickos, Annika; Alku, Paavo; Byring, Roger; Korkman, Marit
2008-05-01
Specific language impairment (SLI) is diagnosed when a child has problems in producing or understanding language despite having a normal IQ and there being no other obvious explanation. There can be several associated problems, and no single underlying cause has yet been identified. Some theories propose problems in auditory processing, specifically in the discrimination of sound frequency or rapid temporal frequency changes. We compared automatic cortical speech-sound processing and discrimination between a group of children with SLI and control children with normal language development (mean age: 6.6 years; range: 5-7 years). We measured auditory evoked magnetic fields using two sets of CV syllables, one with a changing consonant /da/ba/ga/ and another one with a changing vowel /su/so/sy/ in an oddball paradigm. The P1m responses for onsets of repetitive stimuli were weaker in the SLI group whereas no significant group differences were found in the mismatch responses. The results indicate that the SLI group, having weaker responses to the onsets of sounds, might have slightly depressed sensory encoding.
A theory for modeling ground-water flow in heterogeneous media
Cooley, Richard L.
2004-01-01
Construction of a ground-water model for a field area is not a straightforward process. Data are virtually never complete or detailed enough to allow substitution into the model equations and direct computation of the results of interest. Formal model calibration through optimization, statistical, and geostatistical methods is being applied to an increasing extent to deal with this problem and provide for quantitative evaluation and uncertainty analysis of the model. However, these approaches are hampered by two pervasive problems: 1) nonlinearity of the solution of the model equations with respect to some of the model (or hydrogeologic) input variables (termed in this report system characteristics) and 2) detailed and generally unknown spatial variability (heterogeneity) of some of the system characteristics such as log hydraulic conductivity, specific storage, recharge and discharge, and boundary conditions. A theory is developed in this report to address these problems. The theory allows construction and analysis of a ground-water model of flow (and, by extension, transport) in heterogeneous media using a small number of lumped or smoothed system characteristics (termed parameters). The theory fully addresses both nonlinearity and heterogeneity in such a way that the parameters are not assumed to be effective values. The ground-water flow system is assumed to be adequately characterized by a set of spatially and temporally distributed discrete values, β, of the system characteristics. This set contains both small-scale variability that cannot be described in a model and large-scale variability that can. The spatial and temporal variability in β is accounted for by imagining β to be generated by a stochastic process wherein β is normally distributed, although normality is not essential. Because β has too large a dimension to be estimated using the data normally available, for modeling purposes β is replaced by a smoothed or lumped approximation yβ* (where y is a spatial and temporal interpolation matrix). The set yβ* has the same form as the expected value of β, yβ̄, where β̄ is the set of drift parameters of the stochastic process; β* is a best-fit vector to β. A model function f(β), such as a computed hydraulic head or flux, is assumed to accurately represent an actual field quantity, but the same function written using yβ*, f(yβ*), contains error from lumping or smoothing of β using yβ*. Thus, the replacement of β by yβ* yields nonzero-mean model errors of the form E(f(β) − f(yβ*)) throughout the model and covariances between model errors at points throughout the model. These nonzero means and covariances are evaluated to third- and fifth-order accuracy, respectively, using Taylor series expansions. They can have a significant effect on construction and interpretation of a model that is calibrated by estimating β*. Vector β* is estimated as β̂ using weighted nonlinear least squares techniques to fit a set of model functions f(yβ̂) to a corresponding set of observations of f(β), Y. These observations are assumed to be corrupted by zero-mean, normally distributed observation errors, although, as for β, normality is not essential. An analytical approximation of the nonlinear least squares solution is obtained using Taylor series expansions and perturbation techniques that assume model and observation errors to be small. This solution is used to evaluate biases and other results to second-order accuracy in the errors.
The correct weight matrix to use in the analysis is shown to be the inverse of the second-moment matrix E(Y − f(yβ*))(Y − f(yβ*))′, but the weight matrix is assumed to be arbitrary in most developments. The best diagonal approximation is the inverse of the matrix of diagonal elements of E(Y − f(yβ*))(Y − f(yβ*))′, and a method of estimating this diagonal matrix when it is unknown is developed using a special objective function to compute β̂. When considered to be an estimate of f
Theory of superconductivity in oxides. Final technical report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, P.W.
1988-05-18
Progress was made towards a final theory of high-Tc superconductivity. The key elements are the work on normal-state properties and the actual mechanism for Tc. With the understanding (ZA) of the large anisotropy and other transport properties in the normal state, the model is uniquely determined: one must have one version or another of a holon-spinon quantum-fluid state, which is not a normal Fermi liquid. And with the recognition (HWA) of the large-repulsion holon-holon interactions, the author has the first way of thinking quantitatively about the superconducting state. Work on the pure Heisenberg system, which is related but not necessarily crucial to understanding the superconducting properties, is described.
Jha, Paridhi; Christensson, Kyllike; Svanberg, Agneta Skoog; Larsson, Margareta; Sharma, Bharati; Johansson, Eva
2016-08-01
This study aimed to explore and understand the perceptions and experiences of women regarding the quality of care received during childbirth in public health facilities. Qualitative in-depth interviews were conducted and analysed using the Grounded Theory approach. Thirteen women who had given vaginal birth to a healthy newborn infant participated; they were interviewed in their homes in one district of Chhattisgarh, India. The interviews followed a pre-tested guide comprising one key question: How did the women experience and perceive the care provided during labour and childbirth? 'Cashless childbirth but at a cost: subordination during childbirth' was identified as the core category. Women chose a public health facility due to their socio-economic limitations, and to have a cashless and safe childbirth. Participants expressed a sense of trust in public health facilities, and verbalised that the free food and ambulance services provided by the government were appreciated. Care during normal birth was medicalised, and women lacked control over the process of their labour. Often, the women experienced verbal and physical abuse, which led to passive acceptance of all the services provided in order to avoid confrontation with the providers. Increasingly higher numbers of women give birth in public health facilities in Chhattisgarh, India, and women who have no alternative place to have a safe and normal birth are the main beneficiaries. The labour rooms are functional, but there is a need for improvement in interpersonal processes, information-sharing, and sensitive treatment of women seeking childbirth services in public health facilities. Copyright © 2016 Elsevier Ltd. All rights reserved.
Complex Network Theory Applied to the Growth of Kuala Lumpur's Public Urban Rail Transit Network.
Ding, Rui; Ujang, Norsidah; Hamid, Hussain Bin; Wu, Jianjun
2015-01-01
Recently, the number of studies involving complex network applications in transportation has increased steadily as scholars from various fields analyze traffic networks. Nonetheless, research on rail network growth is relatively rare. This research examines the evolution of the Public Urban Rail Transit Networks of Kuala Lumpur (PURTNoKL) based on complex network theory and covers both the topological structure of the rail system and future trends in network growth. In addition, network performance when facing different attack strategies is also assessed. Three topological network characteristics are considered: connections, clustering and centrality. In PURTNoKL, we found that the total number of nodes and edges exhibit a linear relationship and that the average degree stays within the interval [2.0488, 2.6774] with heavy-tailed distributions. The evolutionary process shows that the cumulative probability distribution (CPD) of degree and the average shortest path length show good fit with an exponential distribution and a normal distribution, respectively. Moreover, PURTNoKL exhibits clear cluster characteristics; most of the nodes have a 2-core value, and the CPDs of closeness and betweenness centrality follow a normal distribution function and an exponential distribution, respectively. Finally, we discuss four different types of network growth styles and the line extension process, which reveal that the rail network's growth is likely based on the nodes with the largest shortest-path lengths and that network protection should emphasize those nodes with the largest degrees and the highest betweenness values. This research may enhance the networkability of the rail system and better shape the future growth of public rail networks.
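The topological measures reported above (degree, clustering, k-core, closeness, betweenness, average shortest path length) are straightforward to reproduce for any rail graph; a minimal sketch with networkx on a toy two-line network follows (illustrative edge list, not the PURTNoKL data).

```python
import networkx as nx

# Toy rail network: two lines sharing an interchange station "X".
edges = [("A1", "A2"), ("A2", "A3"), ("A3", "X"), ("X", "A4"),
         ("B1", "B2"), ("B2", "X"), ("X", "B3"), ("B3", "B4")]
g = nx.Graph(edges)

print("average degree:", 2 * g.number_of_edges() / g.number_of_nodes())
print("clustering:", nx.average_clustering(g))
print("core numbers:", nx.core_number(g))
print("avg shortest path:", nx.average_shortest_path_length(g))
print("closeness:", nx.closeness_centrality(g))
print("betweenness:", nx.betweenness_centrality(g))
```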
NASA Technical Reports Server (NTRS)
Armour, Edward A.G.
2007-01-01
Muon catalyzed fusion is a process in which a negatively charged muon combines with two nuclei of isotopes of hydrogen, e.g., a proton and a deuteron or a deuteron and a triton, to form a muonic molecular ion in which the binding is so tight that nuclear fusion occurs. The muon is normally released after fusion has taken place and so can catalyze further fusions. As the muon has a mean lifetime of 2.2 microseconds, this is the maximum period over which a muon can participate in this process. This article gives an outline of the history of muon catalyzed fusion from 1947, when it was first realised that such a process might occur, to the present day. It includes a description of the contribution that Drachman has made to the theory of muon catalyzed fusion and the influence this has had on the author's research.
A mathematics for medicine: The Network Effect
West, Bruce J.
2014-01-01
The theory of medicine and its complement systems biology are intended to explain the workings of the large number of mutually interdependent complex physiologic networks in the human body and to apply that understanding to maintaining the functions for which nature designed them. Therefore, when what had originally been made as a simplifying assumption or a working hypothesis becomes foundational to understanding the operation of physiologic networks it is in the best interests of science to replace or at least update that assumption. The replacement process requires, among other things, an evaluation of how the new hypothesis affects modern day understanding of medical science. This paper identifies linear dynamics and Normal statistics as being such arcane assumptions and explores some implications of their retirement. Specifically we explore replacing Normal with fractal statistics and examine how the latter are related to non-linear dynamics and chaos theory. The observed ubiquity of inverse power laws in physiology entails the need for a new calculus, one that describes the dynamics of fractional phenomena and captures the fractal properties of the statistics of physiological time series. We identify these properties as a necessary consequence of the complexity resulting from the network dynamics and refer to them collectively as The Network Effect. PMID:25538622
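The contrast drawn above between Normal and inverse-power-law (fractal) statistics is easy to see numerically; a short sketch (Python, illustrative parameters) compares the tail probabilities of a Gaussian sample and a power-law (Pareto) sample of the same size.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

gaussian = rng.standard_normal(n)
pareto = rng.pareto(a=1.5, size=n) + 1.0   # heavy power-law tail (shape a = 1.5)

for threshold in (5.0, 10.0, 50.0):
    p_gauss = np.mean(np.abs(gaussian) > threshold)
    p_pareto = np.mean(pareto > threshold)
    # Gaussian tails vanish almost immediately, while power-law tails persist;
    # extreme excursions are therefore far more common than Normal statistics
    # would predict.
    print(threshold, p_gauss, p_pareto)
```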
Scrotal calcinosis: idiopathic or dystrophic?
Dubey, Suparna; Sharma, Rajeev; Maheshwari, Veena
2010-02-15
Scrotal calcinosis is a rare benign local process characterized by multiple, painless, hard scrotal nodules in the absence of any systemic metabolic disorder. Histological examination reveals extensive deposition of calcium in the dermis, which may be surrounded by histiocytes and an inflammatory giant cell reaction. Numerous theories have been propounded to explain the pathogenesis of this condition, but the principal debate revolves around whether the calcium is deposited at the site of previous epithelial cysts or the calcified nodules are purely idiopathic. This is the largest study of scrotal calcinosis to date with 100 cases, on which clinical, biochemical, radiological, cytopathological, and histopathological examinations were conducted. The histological picture shows a continuous spectrum of changes ranging from intact epithelial cysts (41.0%) - both normal and inflamed; through inflamed cysts containing calcific material in the lumen but with intact cyst wall (53.0%); calcified inflamed cysts with partial epithelial lining (11.0%); to 'naked' calcium deposits lying in the dermis (100%), sometimes compressing surrounding collagen fibres to form a pseudocyst (56.0%). The presence of normal values of calcium and phosphorus along with this spectrum of changes in histology both support the theory that these form by dystrophic calcification of epithelial cysts in a progression that involves inflammation, rupture, calcification and obliteration of the cyst wall.
NASA Astrophysics Data System (ADS)
Pu, Yang; Chen, Jun; Wang, Wubao
2014-02-01
The scattering coefficient, μs, the anisotropy factor, g, the scattering phase function, p(θ), and the angular dependence of scattering intensity distributions of human cancerous and normal prostate tissues were systematically investigated as a function of wavelength, scattering angle and scattering particle size using Mie theory and experimental parameters. The Matlab-based codes using Mie theory for both spherical and cylindrical models were developed and applied for studying the light propagation and the key scattering properties of the prostate tissues. The optical and structural parameters of tissue such as the index of refraction of cytoplasm, size of nuclei, and the diameter of the nucleoli for cancerous and normal human prostate tissues obtained from the previous biological, biomedical and bio-optic studies were used for Mie theory simulation and calculation. The wavelength dependence of scattering coefficient and anisotropy factor were investigated in the wide spectral range from 300 nm to 1200 nm. The scattering particle size dependence of μs, g, and scattering angular distributions were studied for cancerous and normal prostate tissues. The results show that cancerous prostate tissue containing larger size scattering particles has more contribution to the forward scattering in comparison with the normal prostate tissue. In addition to the conventional simulation model that approximately considers the scattering particle as sphere, the cylinder model which is more suitable for fiber-like tissue frame components such as collagen and elastin was used for developing a computation code to study angular dependence of scattering in prostate tissues. To the best of our knowledge, this is the first study to deal with both spherical and cylindrical scattering particles in prostate tissues.
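The abstract computes the Mie anisotropy factor g and phase function p(θ) for prostate tissue. As a lightweight stand-in (not the paper's Matlab Mie code), the sketch below uses the Henyey-Greenstein phase function, a common tissue-optics approximation parameterized only by g, to show how a larger g (larger scatterers, as in the cancerous tissue) concentrates intensity in the forward direction; the g values are illustrative assumptions rather than fitted tissue parameters.

```python
# Henyey-Greenstein phase function as a stand-in for the Mie-derived p(theta).
# The g values below are illustrative, not fitted prostate-tissue parameters.
import numpy as np

def henyey_greenstein(theta, g):
    """HG phase function, normalized so its integral over solid angle is 1."""
    return (1.0 - g**2) / (4.0 * np.pi * (1.0 + g**2 - 2.0 * g * np.cos(theta))**1.5)

theta = np.linspace(0.0, np.pi, 181)
for label, g in (("normal-like tissue", 0.90), ("cancer-like tissue (larger scatterers)", 0.95)):
    p = henyey_greenstein(theta, g)
    print(f"{label}: g = {g}, forward/backward ratio p(0)/p(pi) = {p[0] / p[-1]:.0f}")
```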
1991-03-04
term that describes inextensional motion. The first equation represents the normal stress at the midsurface of the shell, which is equal to the... that the normal velocity at the midsurface of the shell is proportional to the normal derivative of the total pressure. The scattered pressure ps can
NASA Technical Reports Server (NTRS)
Budd, P. A.
1981-01-01
The secondary electron emission coefficient was measured for a charged polymer (FEP-Teflon) with normally and obliquely incident primary electrons. Theories of secondary emission are reviewed and the experimental data is compared to these theories. Results were obtained for angles of incidence up to 60 deg in normal electric fields of 1500 V/mm. Additional measurements in the range from 50 to 70 deg were made in regions where the normal and tangential fields were approximately equal. The initial input angles and measured output point of the electron beam could be analyzed with computer simulations in order to determine the field within the chamber. When the field is known, the trajectories can be calculated for impacting electrons having various energies and angles of incidence. There was close agreement between the experimental results and the commonly assumed theoretical model in the presence of normal electric fields for angles of incidence up to 60 deg. High angle results obtained in the presence of tangential electric fields did not agree with the theoretical models.
Winter, D A
1989-12-01
The biomechanical (kinetic) analysis of human gait reveals the integrated and detailed motor patterns that are essential in pinpointing the abnormal patterns in pathological gait. In a similar manner, these motor patterns (moments, powers, and EMGs) can be used to identify synergies and to validate theories of CNS control. Based on kinetic and EMG patterns for a wide range of normal subjects and cadences, evidence is presented that both supports and negates the central pattern generator theory of locomotion. Adaptive motor patterns that are evident in peripheral gait pathologies reinforce a strong peripheral rather than a central control. Finally, a three-component subtask theory of human gait is presented and is supported by reference to the motor patterns seen in a normal gait. The identified subtasks are (a) support (against collapse during stance); (b) dynamic balance of the upper body, also during stance; and (c) feedforward control of the foot trajectory to achieve safe ground clearance and a gentle heel contact.
Kerr, Sharyn; Durkin, Kevin
2004-12-01
Standard false belief tasks indicate that normally developing children do not fully develop a theory of mind until the age of 4 years and that children with autism have an impaired theory of mind. Recent evidence, however, suggests that children as young as 3 years of age understand that thought bubbles depict mental representations and that these can be false. Twelve normally developing children and 11 children with autism were tested on a standard false belief task and a number of tasks that employed thought bubbles to represent mental states. While the majority of normally developing children and children with autism failed the standard false belief task, they understood that (i) thought bubbles represent thought, (ii) thought bubbles can be used to infer an unknown reality, (iii) thoughts can be different, and (iv) thoughts can be false. These results indicate that autistic children with a relatively low verbal mental age may be capable of understanding mental representations.
Vibration Control in Turbomachinery Using Active Magnetic Journal Bearings
NASA Technical Reports Server (NTRS)
Knight, Josiah D.
1996-01-01
The effective use of active magnetic bearings for vibration control in turbomachinery depends on an understanding of the forces available from a magnetic bearing actuator. The purpose of this project was to characterize the forces as functions of shaft position. Both numerical and experimental studies were done to determine the characteristics of the forces exerted on a stationary shaft by a magnetic bearing actuator. The numerical studies were based on finite element computations and included both linear and nonlinear magnetization functions. Measurements of the force versus position of a nonrotating shaft were made using two separate measurement rigs, one based on strain gage measurement of forces, the other based on deflections of a calibrated beam. The general trends of the measured principal forces agree with the predictions of the theory, while the magnitudes of the forces are somewhat smaller than those predicted. Other aspects of the theory are not confirmed by the measurements. The measured forces in the normal direction are larger than those predicted by theory when the rotor has a normal eccentricity. Over the ranges of position examined, the data indicate an approximately linear relationship between the normal eccentricity of the shaft and the ratio of normal to principal force. The constant of proportionality seems to be larger at lower currents, but for all cases examined its value is between 0.14 and 0.17. The nonlinear theory predicts the existence of normal forces, but has not predicted such a large constant of proportionality for the ratio. The type of coupling illustrated by these measurements would not tend to cause whirl, because the coupling coefficients have the same sign, unlike the case of a fluid film bearing, where the normal stiffness coefficients often have opposite signs. They might, however, tend to cause other self-excited behavior. This possibility must be considered when designing magnetic bearings for flexible rotor applications, such as gas turbines and other turbomachinery.
A normal mode treatment of semi-diurnal body tides on an aspherical, rotating and anelastic Earth
NASA Astrophysics Data System (ADS)
Lau, Harriet C. P.; Yang, Hsin-Ying; Tromp, Jeroen; Mitrovica, Jerry X.; Latychev, Konstantin; Al-Attar, David
2015-08-01
Normal mode treatments of the Earth's body tide response were developed in the 1980s to account for the effects of Earth rotation, ellipticity, anelasticity and resonant excitation within the diurnal band. Recent space-geodetic measurements of the Earth's crustal displacement in response to luni-solar tidal forcings have revealed geographical variations that are indicative of aspherical deep mantle structure, thus providing a novel data set for constraining deep mantle elastic and density structure. In light of this, we make use of advances in seismic free oscillation literature to develop a new, generalized normal mode theory for the tidal response within the semi-diurnal and long-period tidal band. Our theory involves a perturbation method that permits an efficient calculation of the impact of aspherical structure on the tidal response. In addition, we introduce a normal mode treatment of anelasticity that is distinct from both earlier work in body tides and the approach adopted in free oscillation seismology. We present several simple numerical applications of the new theory. First, we compute the tidal response of a spherically symmetric, non-rotating, elastic and isotropic Earth model and demonstrate that our predictions match those based on standard Love number theory. Second, we compute perturbations to this response associated with mantle anelasticity and demonstrate that the usual set of seismic modes adopted for this purpose must be augmented by a family of relaxation modes to accurately capture the full effect of anelasticity on the body tide response. Finally, we explore aspherical effects including rotation and we benchmark results from several illustrative case studies of aspherical Earth structure against independent finite-volume numerical calculations of the semi-diurnal body tide response. These tests confirm the accuracy of the normal mode methodology to at least the level of numerical error in the finite-volume predictions. They also demonstrate that full coupling of normal modes, rather than group coupling, is necessary for accurate predictions of the body tide response.
Investigation of Heat Transfer to a Flat Plate in a Shock Tube.
1987-12-01
Table-of-contents fragments: Objectives and Scope; Theory; Shock Tube Principles; Boundary Layer Theory. Excerpts: ...in excess of theory, but the rounded edge flat plate exhibited data which matched or was less than what theory predicted for each Mach number tested... normal shock advancing along an infinite flat plate. For x < Ugt there is a region of interaction between the downstream influence of the leading edge
2007-05-01
sufficient for explaining how theory-of-mind emerges in normally developing children. As confirmation of its plausibility, our theory explains the... autism. While there are a number of different substrate elements that we believe are operative during theory-of-mind computations, three elements in... Subject terms: PMESII, multiple representations, integrated reasoning, hybrid systems, social cognition, theory of mind
Wang, Yimin; Bowman, Joel M
2013-10-21
We present a theory of mode-specific tunneling that makes use of the general tunneling path along the imaginary-frequency normal mode of the saddle point, Qim, and the associated relaxed potential, V(Qim) [Y. Wang and J. M. Bowman, J. Chem. Phys. 129, 121103 (2008)]. The novel aspect of the theory is the projection of the normal modes of a minimum onto the Qim path and the determination of turning points on V(Qim). From that projection, the change in tunneling upon mode excitation can be calculated. If the projection is zero, no enhancement of tunneling is predicted. In that case vibrationally adiabatic (VA) theory could apply. However, if the projection is large then VA theory is not applicable. The approach is applied to mode-specific tunneling in full-dimensional malonaldehyde, using an accurate full-dimensional potential energy surface. Results are in semi-quantitative agreement with experiment for modes that show large enhancement of the tunneling, relative to the ground state tunneling splitting. For the six out-of-plane modes, which have zero projection on the planar Qim path, VA theory does apply, and results from that theory agree qualitatively and even semi-quantitatively with experiment. We also verify the failure of simple VA theory for modes that show large enhancement of tunneling.
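A hedged numpy sketch of the projection step described above: each normal mode of the minimum is projected onto the unit vector along the imaginary-frequency mode Qim, and near-zero projections flag modes for which the vibrationally adiabatic picture could apply. The mode vectors and the 0.05 threshold are random, hypothetical stand-ins, not data for malonaldehyde.

```python
# Minimal sketch of projecting minimum normal modes onto the Q_im direction.
# All vectors are random placeholders; a real calculation would use
# mass-weighted normal-mode eigenvectors from a quantum-chemistry code.
import numpy as np

rng = np.random.default_rng(1)
ndof = 9                                  # illustrative 3N-6 for a small molecule
q_im = rng.normal(size=ndof)
q_im /= np.linalg.norm(q_im)              # unit vector along the imaginary-frequency mode

modes, _ = np.linalg.qr(rng.normal(size=(ndof, ndof)))  # hypothetical orthonormal modes (columns)

projections = modes.T @ q_im              # overlap of each minimum mode with Q_im
for i, p in enumerate(projections):
    tag = "negligible projection (VA-like)" if abs(p) < 0.05 else "couples to the tunneling path"
    print(f"mode {i}: projection {p:+.3f} -> {tag}")
```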
Babler, Elizabeth; Strickland, Carolyn June
2015-01-01
To gain a greater understanding of adolescents' experiences living with Type 1 diabetes mellitus (T1DM) and create a theoretical paradigm. Grounded theory as described by Glaser was used. Fifteen in-depth interviews were conducted with adolescents aged 11-15 with T1DM. Symbolic interactionism is the theoretical framework for grounded theory. Data were collected, transcribed, coded, and analyzed simultaneously using constant comparative analysis, and findings were grounded in the words of participants. A theoretical model was created with the concept of "normalizing". Normalizing was defined as the ability to integrate diabetes into one's daily life to make diabetes 'part of me'. Phase four of the model, and the focus of this manuscript, was "Moving the Journey towards Independence" and included: 1) taking over care, 2) experiencing conflict with parents, and 3) realizing diabetes is hard. The major task for adolescents in this phase was separating from parents to independently manage diabetes. The normalizing task for this phase was "taking on the burden of care". Adolescents described challenges with independent care and increased parental conflict, including fearing needles, forgetting insulin, feeling embarrassed, and believing that diabetes was a burden in their life. Additionally, juggling the multiple responsibilities of home, school and work along with managing a chronic illness during adolescence is challenging. Transitioning to diabetes self-management is a challenge for adolescents. This model advances understanding of the moving processes in adolescents transitioning; additionally, hypotheses are presented that may be used for developing interventions to promote success in self-management. Copyright © 2015 Elsevier Inc. All rights reserved.
McGregor, Ian; Hayes, Joseph; Prentice, Mike
2015-01-01
A new set of hypotheses is presented regarding the cause of aggressive religious radicalization (ARR). It is grounded in classic and contemporary theory of human motivation and goal regulation, together with recent empirical advances in personality, social, and neurophysiological psychology. We specify personality traits, threats, and group affordances that combine to divert normal motivational processes toward ARR. Conducive personality traits are oppositional, anxiety-prone, and identity-weak (i.e., morally bewildered). Conducive threats are those that arise from seemingly insurmountable external forces and frustrate effective goal regulation. Conducive affordances include opportunity for immediate and concrete engagement in active groups that are powered by conspiracy narratives, infused with cosmic significance, encouraging of moral violence, and sealed with religious unfalsifiability. We propose that ARR is rewarding because it can spur approach motivated states that mask vulnerability for people whose dispositions and circumstances would otherwise leave them mired in anxious distress. PMID:26441709
Large Fluctuations for Spatial Diffusion of Cold Atoms
NASA Astrophysics Data System (ADS)
Aghion, Erez; Kessler, David A.; Barkai, Eli
2017-06-01
We use a new approach to study the large fluctuations of a heavy-tailed system, where the standard large-deviations principle does not apply. Large-deviations theory deals with tails of probability distributions and the rare events of random processes, for example, spreading packets of particles. Mathematically, it concerns the exponential falloff of the density of thin-tailed systems. Here we investigate the spatial density Pt(x ) of laser-cooled atoms, where at intermediate length scales the shape is fat tailed. We focus on the rare events beyond this range, which dominate important statistical properties of the system. Through a novel friction mechanism induced by the laser fields, the density is explored with the recently proposed non-normalized infinite-covariant density approach. The small and large fluctuations give rise to a bifractal nature of the spreading packet. We derive general relations which extend our theory to a class of systems with multifractal moments.
Decision Processes in Discrimination: Fundamental Misrepresentations of Signal Detection Theory
NASA Technical Reports Server (NTRS)
Balakrishnan, J. D.
1998-01-01
In the first part of this article, I describe a new approach to studying decision making in discrimination tasks that does not depend on the technical assumptions of signal detection theory (e.g., normality of the encoding distributions). Applying these new distribution-free tests to data from three experiments, I show that base rate and payoff manipulations had substantial effects on the participants' encoding distributions but no effect on their decision rules, which were uniformly unbiased in equal and unequal base rate conditions and in symmetric and asymmetric payoff conditions. In the second part of the article, I show that this seemingly paradoxical result is readily explained by the sequential sampling models of discrimination. I then propose a new, "model-free" test for response bias that seems to more properly identify both the nature and direction of the biases induced by the classical bias manipulations.
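For context, the conventional equal-variance Gaussian signal detection estimates that the article critiques can be computed from hit and false-alarm rates as in the sketch below; the rates are illustrative, not data from the reported experiments.

```python
# Conventional SDT indices (the normality-based estimates the article questions).
from scipy.stats import norm

def sdt_indices(hit_rate, fa_rate):
    """Equal-variance Gaussian estimates of sensitivity (d') and criterion (c)."""
    z_h, z_f = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_h - z_f
    criterion = -0.5 * (z_h + z_f)   # c = 0 corresponds to an unbiased decision rule
    return d_prime, criterion

print(sdt_indices(hit_rate=0.82, fa_rate=0.25))  # illustrative rates only
```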
Understanding Jordanian Psychiatric Nurses' Smoking Behaviors: A Grounded Theory Study
Aldiabat, Khaldoun M.; Clinton, Michael
2013-01-01
Purpose. Smoking is prevalent in psychiatric facilities among staff and patients. However, there have been few studies of how contextual factors in specific cultures influence rates of smoking and the health promotion role of psychiatric nurses. This paper reports the findings of a classical grounded theory study conducted to understand how contextual factors in the workplace influence the smoking behaviors of Jordanian psychiatric nurses (JPNs). Method. Semi-structured individual interviews were conducted with a sample of eight male JPN smokers at a psychiatric facility in Amman, Jordan. Findings. Constant comparative analysis identified becoming a heavy smoker as a psychosocial process characterized by four sub-categories: normalization of smoking; living in ambiguity; experiencing workplace conflict; and facing up to workplace stressors. Conclusion. Specific contextual workplace factors require targeted smoking cessation interventions if JPNs are to receive the help they need to reduce health risks associated with heavy smoking. PMID:23844286
An analytical and experimental study of crack extension in center-notched composites
NASA Technical Reports Server (NTRS)
Beuth, Jack L., Jr.; Herakovich, Carl T.
1987-01-01
The normal stress ratio theory for crack extension in anisotropic materials is studied analytically and experimentally. The theory is applied within a microscopic-level analysis of a single center notch of arbitrary orientation in a unidirectional composite material. The bulk of the analytical work of this study applies an elasticity solution for an infinite plate with a center line to obtain critical stress and crack growth direction predictions. An elasticity solution for an infinite plate with a center elliptical flaw is also used to obtain qualitative predictions of the location of crack initiation on the border of a rounded notch tip. The analytical portion of the study includes the formulation of a new crack growth theory that includes local shear stress. Normal stress ratio theory predictions are obtained for notched unidirectional tensile coupons and unidirectional Iosipescu shear specimens. These predictions are subsequently compared to experimental results.
Beyond Classical Information Theory: Advancing the Fundamentals for Improved Geophysical Prediction
NASA Astrophysics Data System (ADS)
Perdigão, R. A. P.; Pires, C. L.; Hall, J.; Bloeschl, G.
2016-12-01
Information Theory, in its original and quantum forms, has gradually made its way into various fields of science and engineering. From the very basic concepts of Information Entropy and Mutual Information to Transit Information, Interaction Information and the respective partitioning into statistical synergy, redundancy and exclusivity, the overall theoretical foundations had matured by the mid-twentieth century. In the Earth Sciences various interesting applications have been devised over the last few decades, such as the design of complex process networks of descriptive and/or inferential nature, wherein earth system processes are "nodes" and statistical relationships between them are designed as information-theoretical "interactions". However, most applications still take the very early concepts along with their many caveats, especially in heavily non-Normal, non-linear and structurally changing scenarios. In order to overcome the traditional limitations of information theory and tackle elusive Earth System phenomena, we introduce a new suite of information dynamic methodologies towards a more physically consistent and information comprehensive framework. The methodological developments are then illustrated on a set of practical examples from geophysical fluid dynamics, where high-order nonlinear relationships elusive to the current non-linear information measures are aptly captured. In doing so, these advances increase the predictability of critical events such as the emergence of hyper-chaotic regimes in ocean-atmospheric dynamics and the occurrence of hydro-meteorological extremes.
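As a small, concrete example of the kind of dependence that linear measures miss but basic information-theoretic measures capture, the sketch below estimates mutual information from a joint histogram for a purely nonlinear relationship; the plug-in estimator and bin count are simplistic assumptions and suffer exactly the caveats the abstract alludes to.

```python
# Plug-in mutual-information estimate (nats) from a 2-D histogram; crude and bias-prone.
import numpy as np

def mutual_information(x, y, bins=32):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(2)
x = rng.normal(size=20_000)
y = x**2 + 0.5 * rng.normal(size=20_000)   # nonlinear dependence, near-zero linear correlation
print("correlation:", round(np.corrcoef(x, y)[0, 1], 3),
      " MI (nats):", round(mutual_information(x, y), 3))
```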
Autoimmunity: a decision theory model.
Morris, J A
1987-01-01
Concepts from statistical decision theory were used to analyse the detection problem faced by the body's immune system in mounting immune responses to bacteria of the normal body flora. Given that these bacteria are potentially harmful, that there can be extensive cross reaction between bacterial antigens and host tissues, and that the decisions are made in uncertainty, there is a finite chance of error in immune response leading to autoimmune disease. A model of ageing in the immune system is proposed that is based on random decay in components of the decision process, leading to a steep age dependent increase in the probability of error. The age incidence of those autoimmune diseases which peak in early and middle life can be explained as the resultant of two processes: an exponentially falling curve of incidence of first contact with common bacteria, and a rapidly rising error function. Epidemiological data on the variation of incidence with social class, sibship order, climate and culture can be used to predict the likely site of carriage and mode of spread of the causative bacteria. Furthermore, those autoimmune diseases precipitated by common viral respiratory tract infections might represent reactions to nasopharyngeal bacterial overgrowth, and this theory can be tested using monoclonal antibodies to search the bacterial isolates for cross reacting antigens. If this model is correct then prevention of autoimmune disease by early exposure to low doses of bacteria might be possible. PMID:3818985
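A toy sketch of the decision-theoretic core of this argument: for an equal-prior two-class Gaussian discrimination ("self" versus cross-reacting bacterial antigen) with separation d, the minimum error probability is Phi(-d/2), and any age-related decay in d drives the error rate up steeply. The decay model and numbers below are assumptions for illustration, not the paper's.

```python
# Toy model: error probability of a Gaussian likelihood-ratio decision rule
# as discriminability decays with age. Parameters are purely illustrative.
import numpy as np
from scipy.stats import norm

def error_probability(d):
    """Minimum error of an equal-prior, unit-variance two-class Gaussian decision."""
    return norm.cdf(-d / 2.0)

d0, decay = 4.0, 0.03                      # assumed initial separation and decay rate
for age in np.arange(20, 81, 10):
    d = d0 * np.exp(-decay * (age - 20))
    print(f"age {age}: d = {d:.2f}, P(error) = {error_probability(d):.3e}")
```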
The imbalanced brain: from normal behavior to schizophrenia.
Grossberg, S
2000-07-15
An outstanding problem in psychiatry concerns how to link discoveries about the pharmacological, neurophysiological, and neuroanatomical substrates of mental disorders to the abnormal behaviors that they control. A related problem concerns how to understand abnormal behaviors on a continuum with normal behaviors. During the past few decades, neural models have been developed of how normal cognitive and emotional processes learn from the environment, focus attention and act upon motivationally important events, and cope with unexpected events. When arousal or volitional signals in these models are suitably altered, they give rise to symptoms that strikingly resemble negative and positive symptoms of schizophrenia, including flat affect, impoverishment of will, attentional problems, loss of a theory of mind, thought derailment, hallucinations, and delusions. This article models how emotional centers of the brain, such as the amygdala, interact with sensory and prefrontal cortices (notably ventral, or orbital, prefrontal cortex) to generate affective states, attend to motivationally salient sensory events, and elicit motivated behaviors. Closing this feedback loop between cognitive and emotional centers is predicted to generate a cognitive-emotional resonance that can support conscious awareness. When such emotional centers become depressed, negative symptoms of schizophrenia emerge in the model. Such emotional centers are modeled as opponent affective processes, such as fear and relief, whose response amplitude and sensitivity are calibrated by an arousal level and chemical transmitters that slowly inactivate, or habituate, in an activity-dependent way. These opponent processes exhibit an Inverted-U, whereby behavior becomes depressed if the arousal level is chosen too large or too small. The negative symptoms are owing to the way in which the depressed opponent process interacts with other circuits throughout the brain.
Theory and Experiment Analysis of Two-Dimensional Acousto-Optic Interaction.
1995-01-03
The universal coupled wave equation of the two-dimensional acousto-optic effect has been deduced, and the solution of normal Raman-Nath acousto-optic diffraction... was derived from it. The theory was compared with the experimental results of a two-dimensional acousto-optic device consisting of two one-dimensional modulators. The experimental results agree with the theory. (AN)
A Complete Multimode Equivalent-Circuit Theory for Electrical Design
Williams, Dylan F.; Hayden, Leonard A.; Marks, Roger B.
1997-01-01
This work presents a complete equivalent-circuit theory for lossy multimode transmission lines. Its voltages and currents are based on general linear combinations of standard normalized modal voltages and currents. The theory includes new expressions for transmission line impedance matrices, symmetry and lossless conditions, source representations, and the thermal noise of passive multiports. PMID:27805153
Therapeutic Treatment of Early Disturbances in the Mother-Child Interaction.
ERIC Educational Resources Information Center
Broden, Margareta Berg
A theory of normal mother-infant relationship based on Margaret Mahler's theories is the basis of a treatment program for disturbed mother/infant relationships. This theory includes the concept of symbiosis which for the child is an undifferentiated condition, a fusion with the mother where the two have a common outward border, thereby protecting…
Geometrically Nonlinear Transient Analysis of Laminated Composite Plates.
1982-03-01
theory (CPT), in which normals to the midsurface before deformation are assumed to remain straight and normal to the midsurface after deformation (i.e... the plate are negligible when compared to the in-plane stresses, and normals to the plate midsurface before deformation remain straight but not necessarily normal to the midsurface after deformation. Equations of motion: The plate under consideration is composed of a finite number of orthotropic
SUPERPOSITION OF POLYTROPES IN THE INNER HELIOSHEATH
DOE Office of Scientific and Technical Information (OSTI.GOV)
Livadiotis, G., E-mail: glivadiotis@swri.edu
2016-03-15
This paper presents a possible generalization of the equation of state and Bernoulli's integral when a superposition of polytropic processes applies in space and astrophysical plasmas. The theory of polytropic thermodynamic processes for a fixed polytropic index is extended for a superposition of polytropic indices. In general, the superposition may be described by any distribution of polytropic indices, but emphasis is placed on a Gaussian distribution. The polytropic density–temperature relation has been used in numerous analyses of space plasma data. This linear relation on a log–log scale is now generalized to a concave-downward parabola that is able to describe the observations better. The model of the Gaussian superposition of polytropes is successfully applied in the proton plasma of the inner heliosheath. The estimated mean polytropic index is near zero, indicating the dominance of isobaric thermodynamic processes in the sheath, similar to other previously published analyses. By computing Bernoulli's integral and applying its conservation along the equator of the inner heliosheath, the magnetic field in the inner heliosheath is estimated, B ∼ 2.29 ± 0.16 μG. The constructed normalized histogram of the values of the magnetic field is similar to that derived from a different method that uses the concept of large-scale quantization, bringing incredible insights to this novel theory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gigley, H.M.
1982-01-01
An artificial intelligence approach to the simulation of neurolinguistically constrained processes in sentence comprehension is developed using control strategies for simulation of cooperative computation in associative networks. The desirability of this control strategy in contrast to ATN and production system strategies is explained. A first pass implementation of HOPE, an artificial intelligence simulation model of sentence comprehension, constrained by studies of aphasic performance, psycholinguistics, neurolinguistics, and linguistic theory is described. Claims that the model could serve as a basis for sentence production simulation and for a model of language acquisition as associative learning are discussed. HOPE is a model that performs in a normal state and includes a lesion simulation facility. HOPE is also a research tool. Its modifiability and use as a tool to investigate hypothesized causes of degradation in comprehension performance by aphasic patients are described. Issues of using behavioral constraints in modelling and obtaining appropriate data for simulated process modelling are discussed. Finally, problems of validation of the simulation results are raised; and issues of how to interpret clinical results to define the evolution of the model are discussed. Conclusions with respect to the feasibility of artificial intelligence simulation process modelling are discussed based on the current state of research.
Narrating consciousness: language, media and embodiment.
Hayles, N Katherine; Pulizzi, James J
2010-01-01
Although there has long been a division in studies of consciousness between a focus on neuronal processes and, conversely, an emphasis on the ruminations of a conscious self, the long-standing split between mechanism and meaning within the brain was mirrored by a split without, between information as a technical term and the meanings that messages are commonly thought to convey. How to heal this breach has posed formidable problems to researchers. Working through the history of cybernetics, one of the historical sites where Claude Shannon's information theory quickly became received doctrine, we argue that the cybernetic program as it developed through second-order cybernetics and autopoietic theory remains incomplete. In this article, we return to fundamental questions about pattern and noise, context and meaning, to forge connections between consciousness, narrative and media. The thrust of our project is to reintroduce context and narrative as crucial factors in the processes of meaning-making. The project proceeds along two fronts: advancing a theoretical framework within which context plays its properly central role; and demonstrating the importance of context by analyzing two fictions, Stanislaw Lem's "His Master's Voice" and Joseph McElroy's "Plus," in which context has been deformed by being wrenched away from normal human environments, with radical consequences for processes of meaning-making.
NASA Astrophysics Data System (ADS)
Laqua, Henryk; Kussmann, Jörg; Ochsenfeld, Christian
2018-03-01
The correct description of multi-reference electronic ground states within Kohn-Sham density functional theory (DFT) requires an ensemble-state representation, employing fractionally occupied orbitals. However, the use of fractional orbital occupation leads to non-normalized exact-exchange holes, resulting in large fractional-spin errors for conventional approximative density functionals. In this communication, we present a simple approach to directly include the exact-exchange-hole normalization into DFT. Compared to conventional functionals, our model strongly improves the description for multi-reference systems, while preserving the accuracy in the single-reference case. We analyze the performance of our proposed method at the example of spin-averaged atoms and spin-restricted bond dissociation energy surfaces.
Normal stress differences and beyond-Navier-Stokes hydrodynamics
NASA Astrophysics Data System (ADS)
Alam, Meheboob; Saha, Saikat
2017-06-01
A recently proposed beyond-Navier-Stokes order hydrodynamic theory for dry granular fluids is revisited by focussing on the behaviour of the stress tensor and the scaling of related transport coefficients in the dense limit. For the homogeneous shear flow, it is shown that the eigen-directions of the second-moment tensor and those of the shear tensor become co-axial, thus driving the first normal stress difference (N1) to zero in the same limit. In contrast, the origin of the second normal stress difference (N2) is tied to the `excess' temperature along the mean-vorticity direction and the imposed shear field, respectively, in the dilute and dense flows. Scaling relations for the transport coefficients are suggested based on the present theory.
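For readers unfamiliar with the quantities involved, the sketch below evaluates the first and second normal stress differences from a stress tensor in the usual shear-flow convention (x = flow, y = gradient, z = mean vorticity); the tensor entries are made-up values, not results of the theory.

```python
# Normal stress differences from an illustrative (made-up) granular shear stress tensor.
import numpy as np

sigma = np.array([[1.20, -0.35, 0.00],   # x = flow direction
                  [-0.35, 1.00, 0.00],   # y = gradient direction
                  [0.00,  0.00, 0.90]])  # z = mean-vorticity direction

p = np.trace(sigma) / 3.0
N1 = sigma[0, 0] - sigma[1, 1]           # first normal stress difference
N2 = sigma[1, 1] - sigma[2, 2]           # second normal stress difference (sign conventions vary)
print(f"pressure {p:.3f}, N1/p = {N1 / p:.3f}, N2/p = {N2 / p:.3f}")
```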
Grand unified theories, topological defects, and ultrahigh-energy cosmic rays
NASA Technical Reports Server (NTRS)
Bhattacharjee, Pijushpani; Hill, Christopher T.; Schramm, David N.
1992-01-01
The ultrahigh-energy (UHE) proton and neutrino spectra resulting from collapse or annihilations of topological defects surviving from the GUT era are calculated. Irrespective of the specific process under consideration (which determines the overall normalization of the spectrum), the UHE proton spectrum always 'recovers' at approximately 1.8 x 10 exp 11 GeV after a partial Greisen-Zatsepin-Kuz'min 'cutoff' at approximately 5 x 10 exp 10 GeV and continues to a GUT-scale energy with a universal shape determined by the physics of hadronic jet fragmentation. Implications of the results are discussed.
The multisensory approach to birth and aromatherapy.
Gutteridge, Kathryn
2014-05-01
The birth environment continues to be a subject of midwifery discourse within theory and practice. This article discusses the birth environment from the perspective of understanding the aromas and aromatherapy for the benefit of women and midwives. The dynamic between the olfactory system and stimulation of normal birth processes proves to be fascinating. By examining other health models of care, we can incorporate simple but powerful methods that can shape clinical outcomes. There is still more that midwives can do by using aromatherapy in the context of a multisensory approach to make birth environments synchronise with women's potential to birth in a positive way.
A dual process model of perfectionism based on reinforcement theory.
Slade, P D; Owens, R G
1998-07-01
This article begins with a brief review of the current literature on the structure and measurement of perfectionism. It is concluded from this review that two major types can be distinguished, a normal/healthy form and a pathological form. These two forms are then defined as positive and negative perfectionism and related directly to Skinnerian concepts of positive and negative reinforcement. The positive/negative distinction is then further elaborated on in terms of approach/avoidance behavior, goal differences, self-concept involvement, emotional correlates, and the promoting environment. Finally, some of the more obvious theoretical and practical implications are briefly explored.
Cryptic genetic variation, evolution's hidden substrate
Paaby, Annalise B.; Rockman, Matthew V.
2016-01-01
Cryptic genetic variation (CGV) is invisible under normal conditions but is fuel for evolution when circumstances change. In theory, CGV can represent a massive cache of adaptive potential or a pool of deleterious alleles in need of constant suppression. CGV emerges from both neutral and selective processes, and it may inform how human populations respond to change. In experimental settings, CGV facilitates adaptation, but does it play an important role in the real world? We review the empirical support for widespread CGV in natural populations, including its potential role in emerging human diseases and the growing evidence of its contribution to evolution. PMID:24614309
Breakup effects on alpha spectroscopic factors of 16O
NASA Astrophysics Data System (ADS)
Adhikari, S.; Basu, C.; Sugathan, P.; Jhinghan, A.; Behera, B. R.; Saneesh, N.; Kaur, G.; Thakur, M.; Mahajan, R.; Dubey, R.; Mitra, A. K.
2017-01-01
The triton angular distribution for the 12C(7Li,t)16O* reaction is measured at 20 MeV, populating discrete states of 16O. Continuum discretized coupled reaction channel calculations are used to extract the alpha spectroscopic properties of 16O states, instead of distorted-wave Born approximation theory, in order to include the effects of breakup on the transfer process. The alpha reduced width, spectroscopic factors and the asymptotic normalization constant (ANC) of 16O states are extracted. The error in the spectroscopic factor is about 35% and in that of the ANC about 27%.
Grand unified theories, topological defects and ultrahigh-energy cosmic rays
NASA Technical Reports Server (NTRS)
Bhattacharjee, Pijushpani; Hill, Christopher T.; Schramm, David N.
1991-01-01
The ultrahigh-energy (UHE) proton and neutrino spectra resulting from collapse or annihilations of topological defects surviving from the GUT era are calculated. Irrespective of the specific process under consideration (which determines the overall normalization of the spectrum), the UHE proton spectrum always 'recovers' at approximately 1.8 x 10 exp 11 GeV after a partial Greisen-Zatsepin-Kuz'min 'cutoff' at approximately 5 x 10 exp 10 GeV and continues to a GUT-scale energy with a universal shape determined by the physics of hadronic jet fragmentation. Implications of our results are discussed.
Semiclassical description of resonance-assisted tunneling in one-dimensional integrable models
NASA Astrophysics Data System (ADS)
Le Deunff, Jérémy; Mouchet, Amaury; Schlagheck, Peter
2013-10-01
Resonance-assisted tunneling is investigated within the framework of one-dimensional integrable systems. We present a systematic recipe, based on Hamiltonian normal forms, to construct one-dimensional integrable models that exhibit resonance island chain structures with accurately controlled sizes and positions of the islands. Using complex classical trajectories that evolve along suitably defined paths in the complex time domain, we construct a semiclassical theory of the resonance-assisted tunneling process. This semiclassical approach yields a compact analytical expression for tunnelling-induced level splittings which is found to be in very good agreement with the exact splittings obtained through numerical diagonalization.
It's about time: revisiting temporal processing deficits in dyslexia.
Casini, Laurence; Pech-Georgel, Catherine; Ziegler, Johannes C
2018-03-01
Temporal processing in French children with dyslexia was evaluated in three tasks: a word identification task requiring implicit temporal processing, and two explicit temporal bisection tasks, one in the auditory and one in the visual modality. Normally developing children matched on chronological age and reading level served as a control group. Children with dyslexia exhibited robust deficits in temporal tasks whether they were explicit or implicit and whether they involved the auditory or the visual modality. First, they presented larger perceptual variability when performing temporal tasks, whereas they showed no such difficulties when performing the same task on a non-temporal dimension (intensity). This dissociation suggests that their difficulties were specific to temporal processing and could not be attributed to lapses of attention, reduced alertness, faulty anchoring, or overall noisy processing. In the framework of cognitive models of time perception, these data point to a dysfunction of the 'internal clock' of dyslexic children. These results are broadly compatible with the recent temporal sampling theory of dyslexia. © 2017 John Wiley & Sons Ltd.
[A psychosocial view of a number of Jewish mourning rituals during normal and pathological grief].
Maoz, Benyamin; Lauden, Ari; Ben-Zion, Itzhak
2004-04-01
This article describes the three stages of normal and pathological mourning, emphasizing the constellation embodied in Judaism for this process. These stages are: shock, acute mourning, and working through and reconciliation. We present the important question: "How to define pathological mourning?" It is certainly not only a matter of extending beyond the accepted time limits of the mourning process, but also a question of the intensity of mourning in one's daily life, the degree of being preoccupied with it, and the degree of priority that this mourning process has in an individual's life. A number of forms of pathological mourning, during the three mentioned stages, are described, with special attention to Jewish mourning rituals, especially: the "rending of the garments" (Kriyah), the Kaddish, the Shiva, and the termination of mourning after a fixed period of time. One of the possible interpretations of these rituals is that they prevent and neutralize manifestations of aggression and violence. This is analogous to the function of biological (genetic) rituals which, according to the theory of Konrad Lorenz, also minimize the dangerous aggression between the species in nature. The religious ritual converts an aggressive behavior into a minimal and symbolic action, often re-directed, so that an originally dangerous behavior becomes a ritual with an important communicative function.
From intrusive to oscillating thoughts.
Peirce, Anne Griswold
2007-10-01
This paper focused on the possibility that intrusive thoughts (ITs) are a form of an evolutionary, adaptive, and complex strategy to prepare for and resolve stressful life events through schema formation. Intrusive thoughts have been studied in relation to individual conditions, such as traumatic stress disorder and obsessive-compulsive disorder. They have also been documented in the average person experiencing everyday stress. In many descriptions of thought intrusion, it is accompanied by thought suppression. Several theories have been put forth to describe ITs, although none provides a satisfactory explanation as to whether ITs are a normal process, a normal process gone astray, or a sign of pathology. There is also no consistent view of the role that thought suppression plays in the process. I propose that thought intrusion and thought suppression may be better understood by examining them together as a complex and adaptive mechanism capable of escalating in times of need. The ability of a biological mechanism to scale up in times of need is one hallmark of a complex and adaptive system. Other hallmarks of complexity, including self-similarity across scales, sensitivity to initial conditions, presence of feedback loops, and system oscillation, are also discussed in this article. Finally, I propose that thought intrusion and thought suppression are better described together as an oscillatory cycle.
The social process of escalation: a promising focus for crisis management research
2012-01-01
Background: This study identifies a promising, new focus for crisis management research in the health care domain. After reviewing the literature on health care crisis management, there seems to be a knowledge gap regarding organisational change and adaptation, especially when health care situations go from normal, to non-normal, to pathological, and further into a state of emergency or crisis. Discussion: Based on studies of escalating situations in obstetric care, it is suggested that two theoretical perspectives (contingency theory and the idea of failure as a result of incomplete interaction) tend to simplify the issue of escalation rather than attend to its complexities (including the various power relations among the stakeholders involved). However, studying the process of escalation as inherently complex and social allows us to see the definition of a situation as normal or non-normal as an exercise of power in itself, rather than representing a putatively correct response to a particular emergency. Implications: The concept of escalation, when treated this way, can help us further the analysis of clinical and institutional acts and competence. It can also turn our attention to some important elements in a class of social phenomena, crises and emergencies, that so far have not received the attention they deserve. Focusing on organisational choreography, that interplay of potential factors such as power, professional identity, organisational accountability, and experience, is not only a promising focus for future naturalistic research but also for developing more pragmatic strategies that can enhance organisational coordination and response in complex events. PMID:22704075
NASA Astrophysics Data System (ADS)
Filatov, Michael; Cremer, Dieter
2002-01-01
A recently developed variationally stable quasi-relativistic method, which is based on the low-order approximation to the method of normalized elimination of the small component, was incorporated into density functional theory (DFT). The new method was tested for diatomic molecules involving Ag, Cd, Au, and Hg by calculating equilibrium bond lengths, vibrational frequencies, and dissociation energies. The method is easy to implement into standard quantum chemical programs and leads to accurate results for the benchmark systems studied.
Normal Psychosexual Development
ERIC Educational Resources Information Center
Rutter, Michael
1971-01-01
Normal sexual development is reviewed with respect to physical maturation, sexual interests, sex drive, psychosexual competence and maturity, gender role, object choice, children's concepts of sexual differences, sex role preference and standards, and psychosexual stages. Biologic, psychoanalytic and psychosocial theories are briefly considered.…
Multiple Scattering in Clouds: Insights from Three-Dimensional Diffusion/P1 Theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Anthony B.; Marshak, Alexander
2001-03-15
In the atmosphere, multiple scattering matters nowhere more than in clouds, and being a product of its turbulence, clouds are highly variable environments. This challenges three-dimensional (3D) radiative transfer theory in a way that easily swamps any available computational resources. Fortunately, the far simpler diffusion (or P1) theory becomes more accurate as the scattering intensifies, and allows for some analytical progress as well as computational efficiency. After surveying current approaches to 3D solar cloud-radiation problems from the diffusion standpoint, a general 3D result in steady-state diffusive transport is derived relating the variability-induced change in domain-average flux (i.e., diffuse transmittance) to the one-point covariance of internal fluctuations in particle density and in radiative flux. These flux variations follow specific spatial patterns in deliberately hydrodynamical language: radiative channeling. The P1 theory proves even more powerful when the photon diffusion process unfolds in time as well as space. For slab geometry, characteristic times and lengths that describe normal and transverse transport phenomena are derived. This phenomenology is used to (a) explain persistent features in satellite images of dense stratocumulus as radiative channeling, (b) set limits on current cloud remote-sensing techniques, and (c) propose new ones, both active and passive.
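To make the diffusion-limit intuition concrete, here is a small Monte Carlo sketch of photons multiply scattering in a plane-parallel slab: transmittance falls off roughly as 1/τ once scattering is strong, which is the regime where P1/diffusion theory becomes accurate. Isotropic, conservative scattering is assumed for simplicity (g = 0, no absorption); the photon counts and optical depths are illustrative, not cloud retrievals.

```python
# Monte Carlo of isotropic, conservative scattering in a plane-parallel slab.
# Illustrative only: g = 0 and no absorption are simplifying assumptions.
import numpy as np

def slab_transmittance(tau, n_photons=5_000, seed=0):
    """Fraction of photons that exit the bottom of a slab of optical depth tau."""
    rng = np.random.default_rng(seed)
    transmitted = 0
    for _ in range(n_photons):
        z, mu = 0.0, 1.0                    # enter at the top, heading straight down
        while True:
            z += mu * rng.exponential(1.0)  # free path in optical-depth units
            if z >= tau:
                transmitted += 1
                break
            if z < 0.0:
                break                        # escaped back out of the top (reflected)
            mu = rng.uniform(-1.0, 1.0)      # isotropic re-scattering
    return transmitted / n_photons

for tau in (1, 5, 20):
    print(f"tau = {tau}: diffuse transmittance ~ {slab_transmittance(tau):.3f}")
```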
Huang, Ping; Tan, Shanzhong; Zhang, Yong-xin; Li, Jun-song; Chai, Chuan; Li, Jin-ji; Cai, Bao-chang
2014-08-08
Ascending and descending theory is a core principle of traditional Chinese medicine (TCM) theories. It plays an essential role in TCM clinical applications. Some TCM medicines have specific properties that could alter the inclination and direction of their actions. The properties of the ascending and floating process of one herbal medicine are affected by means of herb processing. Wine-processing, which is sautéing with rice wine, is one of the most popular technologies of herb processing. Wine-processing increases the inclination and direction of its actions, thereby producing or strengthening their efficacy in clearing the upper-energizer heat. Radix scutellariae, the dried roots of Scutellaria baicalensis Georgi, is a well-known TCM used for the treatment of inflammation, pyrexia, jaundice, etc. Recently, wine-processed Radix scutellariae was normally applied in clinical studies for the treatment of upper-energizer syndrome. In order to investigate the effects of wine-processing on ascending and descending of Radix scutellariae, a comparative study of the distribution of flavonoids in rat tissues of the triple energizers (SanJiao-upper, middle, lower jiao) after oral administration of crude and wine-processed Radix scutellariae aqueous extracts was carried out. The rats were randomly assigned to two groups and orally administered with crude and wine-processed Radix scutellariae aqueous extracts, respectively. At different pre-determined time points after administration, the concentrations of compounds in rat tissue homogenate were determined, and the main tissue pharmacokinetic parameters were investigated. Tissue pharmacokinetic parameters including AUC0-t, t1/2, Tmax and Cmax were calculated using DAS 2.0. An unpaired Student t-test was used to compare the differences in tissue pharmacokinetic parameters between the two groups. All the results were expressed as arithmetic mean±S.D. The parameters of Cmax and AUC0-t of some flavonoids in wine-processed Radix scutellariae were remarkably increased (p<0.05, p<0.01, p<0.001) in the rat upper-energizer tissues (lung and heart) compared with those of the crude group. However, in the rat middle- and lower-energizer tissues (spleen, liver and kidney), the Cmax and AUC0-t of some flavonoids were significantly decreased (p<0.05, p<0.01) compared with the crude group. The main explanation for these differences seems to be the effect of wine-processing on ascending and descending. All of these differences in the distribution in the triple energizers after oral administration of crude and wine-processed Radix scutellariae aqueous extracts may lead to an increase of efficacy in the upper-energizer tissues and are in compliance with the ascending and descending theory. Therefore, wine-processing is recommended when Radix scutellariae is used for clearing the upper-energizer heat and humidity. The obtained knowledge can be used to evaluate the impact of these differences on the efficacy of both the drugs in clinical applications and might be helpful in explaining the effects of wine-processing on ascending and descending theory. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
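The non-compartmental quantities reported in the abstract (Cmax, Tmax and AUC0-t) can be obtained from a concentration-time profile as in the short sketch below; the time points and concentrations are hypothetical, and the study itself used DAS 2.0 rather than this calculation.

```python
# Non-compartmental Cmax, Tmax and AUC(0-t) from a hypothetical tissue profile.
import numpy as np

t = np.array([0.25, 0.5, 1, 2, 4, 8, 12])           # h (illustrative sampling times)
c = np.array([0.8, 1.9, 2.6, 2.1, 1.2, 0.5, 0.2])   # concentration in tissue homogenate (arbitrary units)

auc_0_t = np.trapz(c, t)                  # linear trapezoidal rule
cmax, tmax = c.max(), t[np.argmax(c)]
print(f"AUC0-t = {auc_0_t:.2f}, Cmax = {cmax:.1f} at Tmax = {tmax} h")
```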
Theory of Mind and Language in Children with Cochlear Implants
ERIC Educational Resources Information Center
Remmel, Ethan; Peters, Kimberly
2009-01-01
Thirty children with cochlear implants (CI children), age range 3-12 years, and 30 children with normal hearing (NH children), age range 4-6 years, were tested on theory of mind and language measures. The CI children showed little to no delay on either theory of mind, relative to the NH children, or spoken language, relative to hearing norms. The…
Asymptotic inference in system identification for the atom maser.
Catana, Catalin; van Horssen, Merlijn; Guta, Madalin
2012-11-28
System identification is closely related to control theory and plays an increasing role in quantum engineering. In the quantum set-up, system identification is usually equated to process tomography, i.e. estimating a channel by probing it repeatedly with different input states. However, for quantum dynamical systems such as quantum Markov processes, it is more natural to consider the estimation based on continuous measurements of the output, with a given input that may be stationary. We address this problem using asymptotic statistics tools, for the specific example of estimating the Rabi frequency of an atom maser. We compute the Fisher information of different measurement processes as well as the quantum Fisher information of the atom maser, and establish the local asymptotic normality of these statistical models. The statistical notions can be expressed in terms of spectral properties of certain deformed Markov generators, and the connection to large deviations is briefly discussed.
NASA Technical Reports Server (NTRS)
Tessler, A.; Annett, M. S.; Gendron, G.
2001-01-01
A {1,2}-order theory for laminated composite and sandwich plates is extended to include thermoelastic effects. The theory incorporates all three-dimensional strains and stresses. Mixed-field assumptions are introduced which include linear in-plane displacements, parabolic transverse displacement and shear strains, and a cubic distribution of the transverse normal stress. Least-squares strain compatibility conditions and exact traction boundary conditions are enforced to yield higher polynomial degree distributions for the transverse shear strains and transverse normal stress through the plate thickness. The principle of virtual work is used to derive a 10th-order system of equilibrium equations and associated Poisson boundary conditions. The predictive capability of the theory is demonstrated using a closed-form analytic solution for a simply-supported rectangular plate subjected to a linearly varying temperature field across the thickness. Several thin and moderately thick laminated composite and sandwich plates are analyzed. Numerical comparisons are made with corresponding solutions of the first-order shear deformation theory and three-dimensional elasticity theory. These results, which closely approximate the three-dimensional elasticity solutions, demonstrate that through-the-thickness deformations, even in relatively thin and especially in thick composite and sandwich laminates, can be significant under severe thermal gradients. The {1,2}-order kinematic assumptions ensure an overall accurate theory that is in general superior to, and in some cases equivalent to, the first-order theory.
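For orientation, the '{1,2}-order' label refers to in-plane displacements that are linear, and a transverse displacement that is quadratic, in the thickness coordinate z; a schematic of this kinematic field (notation chosen here, not quoted from the paper) is

    u_\alpha(x_1, x_2, z) = u_\alpha^0(x_1, x_2) + z\,\theta_\alpha(x_1, x_2), \qquad \alpha = 1, 2,
    \qquad
    w(x_1, x_2, z) = w_0(x_1, x_2) + z\, w_1(x_1, x_2) + z^2 w_2(x_1, x_2),

so the transverse normal strain varies linearly through the thickness, while the transverse shear strains and the transverse normal stress are subsequently enriched to higher polynomial degree by the least-squares compatibility and exact traction conditions mentioned above.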
Does dissociation offer a useful explanation for psychopathology?
Jureidini, Jon
2004-01-01
Dissociation is often conceptualised as an altered state of consciousness, a trance-like state in which normal barriers between conscious and unconscious memories, desires and beliefs break down and other amnestic barriers emerge. This review explores whether it is likely that there is a neurophysiology of pathological dissociative processes that will elucidate management. A critical reading of current research, sourced through Medline and PsycINFO searches from 1990 to 2002, using subject headings: dissociative disorders, hypnosis and stress disorder (post-traumatic), as well as keywords: dissociation, hypnosis and trance. Current knowledge does not support the notion of dissociation as a discrete brain state or process. Psychiatric and neurophysiological research and theory development are better directed towards individual components that contribute to dissociative experience. Copyright (c) 2004 S. Karger AG, Basel.
Chagas, Mauro H.; Magalhães, Fabrício A.; Peixoto, Gustavo H. C.; Pereira, Beatriz M.; Andrade, André G. P.; Menzel, Hans-Joachim K.
2016-01-01
Background: Stretching exercises are able to promote adaptations in the muscle-tendon unit (MTU), which can be tested through physiological and biomechanical variables. Identifying the key variables in MTU adaptations is crucial to improvements in training. Objective: To perform an exploratory factor analysis (EFA) involving the variables often used to evaluate the response of the MTU to stretching exercises. Method: Maximum joint range of motion (ROMMAX), ROM at first sensation of stretching (FSTROM), peak torque (torqueMAX), passive stiffness, normalized stiffness, passive energy, and normalized energy were investigated in 36 participants during passive knee extension on an isokinetic dynamometer. Stiffness and energy values were normalized by the muscle cross-sectional area, and their passive mode was assured by monitoring the EMG activity. Results: EFA revealed two major factors that explained 89.68% of the total variance: 53.13% was explained by the variables torqueMAX, passive stiffness, normalized stiffness, passive energy, and normalized energy, whereas the remaining 36.55% was explained by the variables ROMMAX and FSTROM. Conclusion: This result supports the literature wherein two main hypotheses (mechanical and sensory theories) have been suggested to describe the adaptations of the MTU to stretching exercises. Contrary to some studies, in the present investigation torqueMAX was significantly correlated with the variables of the mechanical theory rather than those of the sensory theory. Therefore, a new approach was proposed to explain the behavior of the torqueMAX during stretching exercises. PMID:27437715
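A minimal sketch of a two-factor EFA of the kind reported, using scikit-learn; the data matrix, standardization and varimax rotation are assumptions for illustration and may differ from the study's actual extraction and rotation choices.

    # Illustrative two-factor EFA; random placeholder data stands in for the seven
    # MTU variables (ROMmax, FSTROM, torqueMAX, stiffness, ...) from 36 participants.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(36, 7))                      # rows: participants, columns: variables
    X = StandardScaler().fit_transform(X)             # standardize before factoring

    fa = FactorAnalysis(n_components=2, rotation="varimax")  # rotation requires scikit-learn >= 0.24
    fa.fit(X)
    loadings = fa.components_.T                       # 7 x 2 matrix of factor loadings
    share = (loadings ** 2).sum(axis=0) / X.shape[1]  # rough share of total variance per factor
    print(np.round(loadings, 2), np.round(share, 2))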
Relativistic scattered-wave theory. II - Normalization and symmetrization. [of Dirac wavefunctions]
NASA Technical Reports Server (NTRS)
Yang, C. Y.
1978-01-01
Formalisms for normalization and symmetrization of one-electron Dirac scattered-wave wavefunctions are presented. The normalization integral consists of one-dimensional radial integrals for the spherical regions and an analytic expression for the intersphere region. Symmetrization drastically reduces the size of the secular matrix to be solved. Examples for planar Pb2Se2 and tetrahedral Pd4 are discussed.
Davis, Joe M
2011-10-28
General equations are derived for the distribution of minimum resolution between two chromatographic peaks, when peak heights in a multi-component chromatogram follow a continuous statistical distribution. The derivation draws on published theory by relating the area under the distribution of minimum resolution to the area under the distribution of the ratio of peak heights, which in turn is derived from the peak-height distribution. Two procedures are proposed for the equations' numerical solution. The procedures are applied to the log-normal distribution, which recently was reported to describe the distribution of component concentrations in three complex natural mixtures. For published statistical parameters of these mixtures, the distribution of minimum resolution is similar to that for the commonly assumed exponential distribution of peak heights used in statistical-overlap theory. However, these two distributions of minimum resolution can differ markedly, depending on the scale parameter of the log-normal distribution. Theory for the computation of the distribution of minimum resolution is extended to other cases of interest. With the log-normal distribution of peak heights as an example, the distribution of minimum resolution is computed when small peaks are lost due to noise or detection limits, and when the height of at least one peak is less than an upper limit. The distribution of minimum resolution shifts slightly to lower resolution values in the first case and to markedly larger resolution values in the second one. The theory and numerical procedure are confirmed by Monte Carlo simulation. Copyright © 2011 Elsevier B.V. All rights reserved.
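One ingredient of the derivation, that the distribution of the peak-height ratio follows directly from the peak-height distribution, is easy to check numerically: for independent log-normal heights the log of the ratio is normal with twice the log-variance. A small Monte Carlo sketch (illustrative only, with arbitrary parameters):

    # Ratio of two independent log-normal peak heights: the log-ratio variance doubles.
    import numpy as np

    rng = np.random.default_rng(1)
    mu, sigma = 0.0, 0.8                    # assumed log-normal parameters of the heights
    h1 = rng.lognormal(mu, sigma, 100_000)
    h2 = rng.lognormal(mu, sigma, 100_000)

    log_ratio = np.log(h1 / h2)
    print(log_ratio.std(), sigma * np.sqrt(2))   # the two values should agree closely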
Learning from myocarditis: mimicry, chaos and black holes
Rose, Noel R.
2014-01-01
Autoimmune myocarditis and its sequel, dilated cardiomyopathy, are major causes of heart failure, especially in children and young adults. We have developed animal models to investigate their pathogenesis by infecting genetically susceptible mice with coxsackievirus B3 or by immunizing them with cardiac myosin or its immunodominant peptide. A number of valuable lessons have emerged from our study of this paradigm of an infection-induced autoimmune disease. We understand more clearly how natural autoimmunity, as an important component of normal physiology, must be recalibrated regularly due to changes caused by infection or other internal and external stimuli. A new normal homeostatic platform will be established based on its evolutionary fitness. A loss of homeostasis with out-of-control normal autoimmunity leads to autoimmune disease. It is signified early on by a spread of an adaptive autoimmune response to novel epitopes and neighboring antigens. The progression from infection to normal, well-balanced autoimmunity to autoimmune disease and on to irreversible damage is a complex, step-wise process. Yet, chaos theory provides hope that the pattern is potentially predictable. Infection-induced autoimmune disease represents a sequence of events heading for a train wreck at the end of the line. Our aim in autoimmune disease research must be to stop the train before this happens. PMID:24904749
Li, Simeng; Li, Nianbei
2018-03-28
For one-dimensional (1d) nonlinear atomic lattices, models with on-site nonlinearities such as the Frenkel-Kontorova (FK) and ϕ4 lattices show normal energy transport, while models with inter-site nonlinearities such as the Fermi-Pasta-Ulam-β (FPU-β) lattice exhibit anomalous energy transport. The 1d Discrete Nonlinear Schrödinger (DNLS) equations with on-site nonlinearities have been studied previously, and normal energy transport has also been found. Here, we investigate the energy transport of 1d FPU-like DNLS equations with inter-site nonlinearities. Extending the approach used for the FPU-β lattice, a renormalized vibration theory is developed for the FPU-like DNLS models, and the predicted renormalized vibrations are verified by direct numerical simulations, as for the FPU-β lattice. The energy diffusion processes are then explored, and normal energy transport is observed for the 1d FPU-like DNLS models, in contrast to their atomic-lattice counterpart, the FPU-β lattice. The reason might be that, unlike nonlinear atomic lattices, where models with on-site nonlinearities have one fewer conserved quantity than models with inter-site nonlinearities, the DNLS models with on-site or inter-site nonlinearities have the same number of conserved quantities as a result of the gauge transformation.
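As a purely schematic reminder of the distinction being drawn (notation chosen here; the authors' model may differ in detail), an on-site DNLS equation and an FPU-like inter-site variant can be written as

    i\,\dot\psi_n = -(\psi_{n+1} + \psi_{n-1} - 2\psi_n) + \nu\,|\psi_n|^2\,\psi_n \quad \text{(on-site)},

    i\,\dot\psi_n = -(\psi_{n+1} + \psi_{n-1} - 2\psi_n)
      + \nu\big[\,|\psi_{n+1}-\psi_n|^2(\psi_{n+1}-\psi_n) - |\psi_n-\psi_{n-1}|^2(\psi_n-\psi_{n-1})\big] \quad \text{(inter-site)},

where in the second form the nonlinearity acts on nearest-neighbour differences, in analogy with the FPU-β force term.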
Integration of gene normalization stages and co-reference resolution using a Markov logic network.
Dai, Hong-Jie; Chang, Yen-Ching; Tsai, Richard Tzong-Han; Hsu, Wen-Lian
2011-09-15
Gene normalization (GN) is the task of normalizing a textual gene mention to a unique gene database ID. Traditional top-performing GN systems usually need to consider several constraints to make decisions in the normalization process, including filtering out false positives, or disambiguating an ambiguous gene mention, to improve system performance. However, these constraints are usually executed in several separate stages and cannot use each other's input/output interactively. In this article, we propose a novel approach that employs a Markov logic network (MLN) to model the constraints used in the GN task. Firstly, we show how various constraints can be formulated and combined in an MLN. Secondly, we are the first to apply two main concepts from co-reference resolution, namely discourse salience in centering theory and transitivity, to GN models. Furthermore, to make our results more relevant to developers of information extraction applications, we adopt the instance-based precision/recall/F-measure (PRF) in addition to the article-wide PRF to assess system performance. Experimental results show that our system outperforms baseline and state-of-the-art systems under two evaluation schemes. Through further analysis, we have found several unexplored challenges in the GN task. Supplementary data are available at Bioinformatics online.
Breslow, L; Cowan, P A
1984-02-01
This study describes a strategy for examining cognitive functioning in psychotic and normal children without the usual confounding effects of marked differences in cognitive structure that occur when children of the same age are compared. Participants were 14 psychotic children, 12 males and 2 females, mean age 9-2, matched with normal children at preoperational and concrete operational stage levels on a set of Piagetian classification tasks. The mean age of the normal children was 6-4, replicating the usually found developmental delay in psychotic samples. Participants were then compared on both structural level and functional abilities on a set of tasks involving seriation of sticks; the higher-level children were also administered a seriation drawing task. Analysis of children's processes of seriating and seriation drawings indicated that over and above the structural retardation, psychotic children at all levels showed functional deficits, especially in the use of anticipatory imagery. The implications for general developmental theory are that progress in structural development is not sufficient for imaginal development, and that structural development of logical concepts is relatively independent of the development of imagery. It was suggested that "thought disorder" may not be a disordered structure of thinking or a retardation in psychotic populations but rather a mismatch between higher-level logical structures and lower-level functions.
NASA Astrophysics Data System (ADS)
Kumar, Rajneesh; Singh, Kulwinder; Pathania, Devinder Singh
2017-07-01
The purpose of this paper is to study the variations in temperature, radial and normal displacement, normal stress, shear stress and couple stress in a micropolar thermoelastic solid in the context of the fractional order theory of thermoelasticity. An eigenvalue approach together with Laplace and Hankel transforms is employed to obtain the general solution of the problem. The field variables corresponding to different fractional order theories of thermoelasticity have been obtained in the transformed domain. The general solution is applied to an infinite space subjected to a concentrated load at the origin. To obtain the solution in the physical domain, a numerical inversion technique has been applied, and the numerically computed results are depicted graphically to analyze the effects of the fractional order parameter on the field variables.
Hultman, Lill; Forinder, Ulla; Pergert, Pernilla
2016-01-01
The purpose of the study was to explore how adolescents with disabilities experience everyday life with personal assistants. In this qualitative study, individual interviews were conducted on 35 occasions with 16 Swedish adolescents with disabilities, aged 16-21. Data were analyzed using grounded theory methodology. The adolescents' main concern was to achieve normality, which is about doing rather than being normal. They try to resolve this through assisted normality, utilizing personal assistance. Assisted normality can be obtained through the existing relationship, the cooperation between the assistant and the adolescent, and the situational placement of the assistant. Normality is obstructed by physical, social and psychological barriers. This study is from the adolescents' perspective and has implications for understanding the value of having access to personal assistance in order to achieve assisted normality and enable social interaction in everyday life. Access to personal assistance is important to enable social interaction in everyday life. A good and functional relationship is enabled through the existing relationship, co-operation and situational placement of the assistant. If the assistant is not properly sensitized, young people risk turning into objects of care. Access to personal assistants cannot compensate for disabling barriers in society, such as a lack of acceptance.
Song, Cen; Zhuang, Jun
2018-01-01
In security check systems, tighter screening processes increase the security level but also cause more congestion, which can lead to longer wait times. Having to deal with more congestion in lines could also cause issues for the screeners. The Transportation Security Administration (TSA) Precheck Program was introduced to create fast lanes in airports with the goal of expediting passengers whom the TSA does not deem to be threats. In this lane, the TSA allows passengers to enjoy fewer restrictions in order to speed up the screening time. Motivated by the TSA Precheck Program, we study parallel queueing imperfect screening systems, where the potential normal and adversary participants/applicants decide whether or not to apply to the Precheck Program. The approved participants are assigned to a faster screening channel based on a screening policy determined by an approver, who balances the concerns of passenger safety and congestion of the lines. There exist three types of optimal application strategy for normal applicants, which depend on whether the marginal payoff is negative or positive, or whether the marginal benefit equals the marginal cost. An adversary applicant would not apply when the screening policy is sufficiently large or the number of utilized benefits is sufficiently small. The basic model is extended by considering (1) applicants' parameters following different distributions and (2) applicants having risk levels, where the approver determines the threshold value needed to qualify for Precheck. This article integrates game theory and queueing theory to study the optimal screening policy and provides some insights into imperfect parallel queueing screening systems. © 2017 Society for Risk Analysis.
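The basic trade-off faced by a normal applicant can be illustrated with a toy M/M/1 waiting-time comparison; all rates and costs below are hypothetical, and the paper's game-theoretic model is considerably richer than this sketch.

    # Toy illustration: apply to a fast (Precheck-like) lane or stay in the regular lane.
    def mm1_wait(lmbda, mu):
        """Mean time in system for an M/M/1 queue (requires lmbda < mu)."""
        return 1.0 / (mu - lmbda)

    lam_fast, mu_fast = 40.0, 60.0    # hypothetical arrival/service rates, fast lane (per hour)
    lam_reg, mu_reg = 80.0, 90.0      # hypothetical arrival/service rates, regular lane
    application_cost = 0.10           # hours-equivalent cost of applying

    time_saved = mm1_wait(lam_reg, mu_reg) - mm1_wait(lam_fast, mu_fast)
    print("apply" if time_saved > application_cost else "do not apply")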
[Analysis of the heart sound with arrhythmia based on nonlinear chaos theory].
Ding, Xiaorong; Guo, Xingming; Zhong, Lisha; Xiao, Shouzhong
2012-10-01
In this paper, a new method based on nonlinear chaos theory was proposed to study arrhythmia, combining the correlation dimension and the largest Lyapunov exponent; these two parameters were computed and analyzed for 30 normal heart sound recordings and 30 recordings with arrhythmia. The results showed that both parameters were higher for the heart sounds with arrhythmia than for the normal heart sounds, and that there was a significant difference between the two kinds of heart sounds. This is probably due to the irregularity of the arrhythmia, which reduces predictability and makes the signal more complex than a normal heart sound. Therefore, the correlation dimension and the largest Lyapunov exponent can be used to analyze arrhythmia and for its feature extraction.
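A compact sketch of how one of the two descriptors, the correlation dimension, could be estimated from a heart-sound signal via delay embedding and the Grassberger-Procaccia correlation sum; the embedding parameters and test signal are placeholders, and the largest Lyapunov exponent would typically be estimated separately (e.g. by Rosenstein's method), which is omitted here.

    # Grassberger-Procaccia correlation-dimension sketch for a 1-D signal.
    import numpy as np

    def delay_embed(x, dim, tau):
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

    def correlation_dimension(x, dim=5, tau=8):
        emb = delay_embed(np.asarray(x, float), dim, tau)
        d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        d = d[np.triu_indices_from(d, k=1)]                    # all pairwise distances
        radii = np.logspace(np.log10(d[d > 0].min()), np.log10(d.max()), 20)
        c = np.array([(d < r).mean() for r in radii])          # correlation sum C(r)
        mask = c > 0
        return np.polyfit(np.log(radii[mask]), np.log(c[mask]), 1)[0]  # slope ~ dimension

    sig = np.sin(np.linspace(0, 40 * np.pi, 800)) + 0.05 * np.random.default_rng(2).normal(size=800)
    print(correlation_dimension(sig))

In practice the slope would be fitted only over the scaling region of log C(r) versus log r rather than over the full range, as done crudely here.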
Application of binaural beat phenomenon with aphasic patients.
Barr, D F; Mullin, T A; Herbert, P S
1977-04-01
We investigated whether six aphasics and six normal subjects could binaurally fuse two slightly differing frequencies of constant amplitude. The aphasics were subdivided into two groups: (1) two men who had had mild cerebrovascular accidents (CVAs) during the past 15 months; (2) four men who had had severe CVAs during the last 15 months. Two tones of different frequencies but equal intensity were presented dichotically to the subjects at 40 dB sensation level. All subjects had normal hearing at 500 Hz (0 to 25 dB). All six normal subjects and the two aphasics who had had mild CVAs could hear the binaural beats. The four aphasics who had had severe CVAs could not hear them. The resulting 2 × 2 design was analyzed using a chi-square test with Yates' correction, and the groups were found to differ significantly (P < .05). Two theories are presented to explain these findings: the "depression theory" and the "temporal time-sequencing theory." Therapeutic implications are also discussed relative to cerebral and/or brain stem involvement in the fusion of binaural stimuli.
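The reported comparison can be reproduced in outline with a 2 × 2 chi-square test with Yates' continuity correction; the table below is inferred from the counts in the abstract (mild CVA: 2 heard the beats, 0 did not; severe CVA: 0 heard, 4 did not), and scipy applies the Yates correction to 2 × 2 tables by default.

    # 2 x 2 chi-square test with Yates' continuity correction.
    import numpy as np
    from scipy.stats import chi2_contingency

    table = np.array([[2, 0],    # mild CVA: heard beats, did not hear
                      [0, 4]])   # severe CVA: heard beats, did not hear
    chi2, p, dof, expected = chi2_contingency(table, correction=True)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")

With cell counts this small, Fisher's exact test would usually be preferred today; the chi-square with Yates' correction simply mirrors the analysis described in the abstract.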
Robust controller design for flexible structures using normalized coprime factor plant descriptions
NASA Technical Reports Server (NTRS)
Armstrong, Ernest S.
1993-01-01
Stabilization is a fundamental requirement in the design of feedback compensators for flexible structures. The search for the largest neighborhood around a given design plant for which a single controller produces closed-loop stability can be formulated as an H(sub infinity) control problem. The use of normalized coprime factor plant descriptions, in which the plant perturbations are defined as additive modifications to the coprime factors, leads to a closed-form expression for the maximum neighborhood boundary, allowing optimal and suboptimal H(sub infinity) compensators to be computed directly without the usual gamma iteration. A summary of the theory on robust stabilization using normalized coprime factor plant descriptions is presented, and the application of the theory to the computation of robustly stable compensators for the phase version of the Control-Structures Interaction (CSI) Evolutionary Model is described. Results from the application indicate that the suboptimal version of the theory has the potential to provide the basis for the computation of low-authority compensators that are robustly stable to expected variations in design model parameters and to additive unmodeled dynamics.
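For reference, the closed-form expression alluded to is usually stated (in the Glover-McFarlane framework; quoted here from standard robust-control theory rather than from the report itself) as

    \epsilon_{\max} \;=\; \sqrt{\,1 - \big\|\,[\,N \;\; M\,]\,\big\|_H^2\,}
    \;=\; \big(1 + \lambda_{\max}(XZ)\big)^{-1/2},

where G = M^{-1} N is a normalized left coprime factorization, ‖·‖_H is the Hankel norm, and X and Z solve the associated control and filter algebraic Riccati equations; any stability margin ε < ε_max admits a stabilizing H-infinity compensator that can be computed without γ-iteration.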
Vibrational Modes of Oblate Clouds of Charge
NASA Astrophysics Data System (ADS)
Jenkins, Thomas; Spencer, Ross L.
2000-10-01
When a nonneutral plasma confined in a Penning trap is allowed time to expand, its shape at global thermal equilibrium is that of a thin oblate spheroid [D. L. Paulson et al., Phys. Plasmas 5, 345 (1998)]. Oscillations similar to those of a drumhead can be externally induced in such a plasma. Although a theory developed by Dubin predicts the frequencies of the various normal modes of oscillation [Phys. Rev. Lett. 66, 2076 (1991)], this theory assumes that the plasma has zero temperature and is confined by an ideal quadrupole electric field. Neither of these conditions is strictly true in experiments [C. S. Weimer et al., Phys. Rev. A 49, 3842 (1994)] where physical properties of the plasma are deduced from measurements of these frequencies, causing the measurements and ideal theory to differ by about 20%. We reformulate the problem of the normal oscillatory modes as a principal-value integral eigenvalue equation, including finite-temperature and non-ideal confinement effects. The equation is solved numerically to obtain the plasma's normal mode frequencies and shapes; reasonable agreement with experiment is obtained.
NASA Astrophysics Data System (ADS)
Shi, Junchao; Zhang, Xudong; Liu, Ying; Chen, Qi
2017-03-01
In their interesting article [1], Wang et al. proposed a mathematical model based on evolutionary game theory [2] to tackle the fundamental question in embryo development of how sperm and egg interact with each other, through epigenetic processes, to form a zygote and direct successful embryo development. This work is based on the premise that epigenetic reprogramming (referring to the erasure and reconstruction of epigenetic marks, such as DNA methylation and histone modifications) after fertilization might be of paramount importance for maintaining the normal development of embryos, a premise with which we fully agree, given the compelling experimental evidence reported [3]. Wang et al. specifically chose to employ the well-studied DNA methylation reprogramming process during mammalian early embryo development as a basis for their mathematical model, namely epigenetic game theory (epiGame). They concluded that the DNA methylation pattern in the mammalian early embryo can be formulated and quantified, and that their model can further be used to quantify the interactions, such as competition and/or cooperation, of expressed genes that maximize the fitness of embryos. The efforts by Wang et al. in quantitatively and systematically analyzing the beginning of life apparently hold value and represent a novel direction for future embryo development research for both theoretical and experimental biologists. On the other hand, we see their theory as still in its infancy, because there are many more parameters to consider and there is room for debate, for example over the case of haploid embryo development [4]. Here, we briefly comment on the dynamic process of epigenetic reprogramming that goes beyond DNA methylation, a dynamic interplay that involves histone modifications, non-coding RNAs, transposable elements and others, as well as the potential input of the various types of 'hereditary' epigenetic information in the gametes - a game that starts before fertilization.
Too little, too late: reduced visual span and speed characterize pure alexia.
Starrfelt, Randi; Habekost, Thomas; Leff, Alexander P
2009-12-01
Whether normal word reading includes a stage of visual processing selectively dedicated to word or letter recognition is highly debated. Characterizing pure alexia, a seemingly selective disorder of reading, has been central to this debate. Two main theories claim either that 1) Pure alexia is caused by damage to a reading specific brain region in the left fusiform gyrus or 2) Pure alexia results from a general visual impairment that may particularly affect simultaneous processing of multiple items. We tested these competing theories in 4 patients with pure alexia using sensitive psychophysical measures and mathematical modeling. Recognition of single letters and digits in the central visual field was impaired in all patients. Visual apprehension span was also reduced for both letters and digits in all patients. The only cortical region lesioned across all 4 patients was the left fusiform gyrus, indicating that this region subserves a function broader than letter or word identification. We suggest that a seemingly pure disorder of reading can arise due to a general reduction of visual speed and span, and explain why this has a disproportionate impact on word reading while recognition of other visual stimuli are less obviously affected.
A unified theory of bone healing and nonunion: BHN theory.
Elliott, D S; Newman, K J H; Forward, D P; Hahn, D M; Ollivere, B; Kojima, K; Handley, R; Rossiter, N D; Wixted, J J; Smith, R M; Moran, C G
2016-07-01
This article presents a unified clinical theory that links established facts about the physiology of bone and homeostasis, with those involved in the healing of fractures and the development of nonunion. The key to this theory is the concept that the tissue that forms in and around a fracture should be considered a specific functional entity. This 'bone-healing unit' produces a physiological response to its biological and mechanical environment, which leads to the normal healing of bone. This tissue responds to mechanical forces and functions according to Wolff's law, Perren's strain theory and Frost's concept of the "mechanostat". In response to the local mechanical environment, the bone-healing unit normally changes with time, producing different tissues that can tolerate various levels of strain. The normal result is the formation of bone that bridges the fracture - healing by callus. Nonunion occurs when the bone-healing unit fails either due to mechanical or biological problems or a combination of both. In clinical practice, the majority of nonunions are due to mechanical problems with instability, resulting in too much strain at the fracture site. In most nonunions, there is an intact bone-healing unit. We suggest that this maintains its biological potential to heal, but fails to function due to the mechanical conditions. The theory predicts the healing pattern of multifragmentary fractures and the observed morphological characteristics of different nonunions. It suggests that the majority of nonunions will heal if the correct mechanical environment is produced by surgery, without the need for biological adjuncts such as autologous bone graft. Cite this article: Bone Joint J 2016;98-B:884-91. ©2016 The British Editorial Society of Bone & Joint Surgery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hau, Jan-Niklas, E-mail: hau@fdy.tu-darmstadt.de; Oberlack, Martin; GSC CE, Technische Universität Darmstadt, Dolivostraße 15, 64293 Darmstadt
2015-12-15
Aerodynamic sound generation in shear flows is investigated in the light of the breakthrough in hydrodynamic stability theory in the 1990s, where generic phenomena of non-normal shear flow systems were understood. By applying the short-time/non-modal approach that emerged from this work, the sole linear mechanism of wave generation by vortices in shear flows was captured [G. D. Chagelishvili, A. Tevzadze, G. Bodo, and S. S. Moiseev, “Linear mechanism of wave emergence from vortices in smooth shear flows,” Phys. Rev. Lett. 79, 3178-3181 (1997); B. F. Farrell and P. J. Ioannou, “Transient and asymptotic growth of two-dimensional perturbations in viscous compressible shear flow,” Phys. Fluids 12, 3021-3028 (2000); N. A. Bakas, “Mechanism underlying transient growth of planar perturbations in unbounded compressible shear flow,” J. Fluid Mech. 639, 479-507 (2009); and G. Favraud and V. Pagneux, “Superadiabatic evolution of acoustic and vorticity perturbations in Couette flow,” Phys. Rev. E 89, 033012 (2014)]. Its source is the non-normality-induced linear mode coupling, which becomes efficient at moderate Mach numbers, where the Mach number is defined for each perturbation harmonic as the ratio of the shear rate to its characteristic frequency. Based on the results of the non-modal approach, we investigate a two-dimensional homentropic constant shear flow and focus on the dynamical characteristics in the wavenumber plane. This allows the participants in the dynamical processes (vortex and wave modes) to be separated from each other and the efficacy of the linear wave-generation process to be estimated. This process is analyzed and visualized using the example of a packet of vortex modes localized in both the spectral and physical planes. Further, by employing direct numerical simulations, the wave generation by chaotically distributed vortex modes is analyzed and the involved linear and nonlinear processes are identified. The generated acoustic field is anisotropic in the wavenumber plane, which results in highly directional linear sound radiation, whereas the nonlinearly generated waves are almost omni-directional. As part of this analysis, we compare the effectiveness of the linear and nonlinear mechanisms of wave generation within the range of validity of the rapid distortion theory and show the dominance of the linear aerodynamic sound generation. Finally, topological differences between the linear source term of the acoustic analogy equation and that of the anisotropic non-normality-induced linear mechanism of wave generation are found.
Framework for Conducting Empirical Observations of Learning Processes.
ERIC Educational Resources Information Center
Fischer, Hans Ernst; von Aufschnaiter, Stephan
1993-01-01
Reviews four hypotheses about learning: Comenius's transmission-reception theory, information processing theory, Gestalt theory, and Piagetian theory. Uses the categories preunderstanding, conceptual change, and learning processes to classify and assess investigations on learning processes. (PR)
Van Dun, Bram; Wouters, Jan; Moonen, Marc
2009-07-01
Auditory steady-state responses (ASSRs) are used for hearing threshold estimation at audiometric frequencies. Hearing impaired newborns, in particular, benefit from this technique as it allows for a more precise diagnosis than traditional techniques, and a hearing aid can be better fitted at an early age. However, measurement duration of current single-channel techniques is still too long for clinical widespread use. This paper evaluates the practical performance of a multi-channel electroencephalogram (EEG) processing strategy based on a detection theory approach. A minimum electrode set is determined for ASSRs with frequencies between 80 and 110 Hz using eight-channel EEG measurements of ten normal-hearing adults. This set provides a near-optimal hearing threshold estimate for all subjects and improves response detection significantly for EEG data with numerous artifacts. Multi-channel processing does not significantly improve response detection for EEG data with few artifacts. In this case, best response detection is obtained when noise-weighted averaging is applied on single-channel data. The same test setup (eight channels, ten normal-hearing subjects) is also used to determine a minimum electrode setup for 10-Hz ASSRs. This configuration allows to record near-optimal signal-to-noise ratios for 80% of subjects.
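A minimal sketch of the noise-weighted averaging mentioned for single-channel data: each epoch is weighted by the inverse of its estimated noise power before averaging, so artifact-laden epochs contribute less. The noise estimator and data below are assumptions for illustration, not taken from the paper.

    # Noise-weighted averaging of EEG epochs (shape: n_epochs x n_samples).
    import numpy as np

    def noise_weighted_average(epochs):
        noise_var = epochs.var(axis=1)                 # crude per-epoch noise estimate
        w = 1.0 / np.maximum(noise_var, 1e-12)         # inverse-variance weights
        return (w[:, None] * epochs).sum(axis=0) / w.sum()

    rng = np.random.default_rng(3)
    epochs = rng.normal(scale=np.linspace(1, 5, 50)[:, None], size=(50, 1024))
    print(noise_weighted_average(epochs).shape)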
May, Carl R; Cummings, Amanda; Girling, Melissa; Bracher, Mike; Mair, Frances S; May, Christine M; Murray, Elizabeth; Myall, Michelle; Rapley, Tim; Finch, Tracy
2018-06-07
Normalization Process Theory (NPT) identifies, characterises and explains key mechanisms that promote and inhibit the implementation, embedding and integration of new health techniques, technologies and other complex interventions. A large body of literature that employs NPT to inform feasibility studies and process evaluations of complex healthcare interventions has now emerged. The aims of this review were to review this literature; to identify and characterise the uses and limits of NPT in research on the implementation and integration of healthcare interventions; and to explore NPT's contribution to understanding the dynamics of these processes. A qualitative systematic review was conducted. We searched Web of Science, Scopus and Google Scholar for articles with empirical data in peer-reviewed journals that cited either key papers presenting and developing NPT, or the NPT Online Toolkit ( www.normalizationprocess.org ). We included in the review only articles that used NPT as the primary approach to collection, analysis or reporting of data in studies of the implementation of healthcare techniques, technologies or other interventions. A structured data extraction instrument was used, and data were analysed qualitatively. Searches revealed 3322 citations. We show that after eliminating 2337 duplicates and broken or junk URLs, 985 were screened as titles and abstracts. Of these, 101 were excluded because they did not fit the inclusion criteria for the review. This left 884 articles for full-text screening. Of these, 754 did not fit the inclusion criteria for the review. This left 130 papers presenting results from 108 identifiable studies to be included in the review. NPT appears to provide researchers and practitioners with a conceptual vocabulary for rigorous studies of implementation processes. It identifies, characterises and explains empirically identifiable mechanisms that motivate and shape implementation processes. Taken together, these mean that analyses using NPT can effectively assist in the explanation of the success or failure of specific implementation projects. Ten percent of papers included critiques of some aspect of NPT, with those that did mainly focusing on its terminology. However, two studies critiqued NPT emphasis on agency, and one study critiqued NPT for its normative focus. This review demonstrates that researchers found NPT useful and applied it across a wide range of interventions. It has been effectively used to aid intervention development and implementation planning as well as evaluating and understanding implementation processes themselves. In particular, NPT appears to have offered a valuable set of conceptual tools to aid understanding of implementation as a dynamic process.
2012-06-09
employed theories are the Euler-Bernoulli beam theory (EBT) and the Timoshenko beam theory (TBT). The major deficiency associated with the EBT is failure to ... account for deformations associated with shearing. The TBT relaxes the normality assumption of the EBT and admits a constant state of shear strain ... on a given cross-section. As a result, the TBT necessitates the use of shear correction coefficients in order to accurately predict transverse…
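The contrast between the two theories can be stated compactly; the following are standard textbook kinematic assumptions (one common sign convention), not text quoted from the report:

    \text{EBT:}\quad u(x, z) = -z\,\frac{dw}{dx}, \qquad \gamma_{xz} = 0,

    \text{TBT:}\quad u(x, z) = z\,\phi(x), \qquad \gamma_{xz} = \phi(x) + \frac{dw}{dx} = \text{constant over the cross-section},

with the transverse shear force recovered as Q = κ G A γ_xz, where κ is the shear correction coefficient that compensates for the actually non-uniform shear distribution.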
Justification of Paternalism in Education.
ERIC Educational Resources Information Center
Nordenbo, Sven Erik
1986-01-01
A systematic presentation is given of the theories of justification normally applied to paternalistic acts: (1) pseudo-paternalism, (2) consequentialism, and (3) consent-based theories. The validity of four common arguments for educational paternalism is discussed: education is necessary, children are ignorant, children are unable to choose, and…
Are Prospective English Teachers Linguistically Intelligent?
ERIC Educational Resources Information Center
Tezel, Kadir Vefa
2017-01-01
Language is normally associated with linguistic capabilities of individuals. In the theory of multiple intelligences, language is considered to be related primarily to linguistic intelligence. Using the theory of Multiple Intelligences as its starting point, this descriptive survey study investigated to what extent prospective English teachers'…
Quasi-normal modes of holographic system with Weyl correction and momentum dissipation
NASA Astrophysics Data System (ADS)
Wu, Jian-Pin; Liu, Peng
2018-05-01
We study the charge response in the complex frequency plane and the quasi-normal modes (QNMs) of the boundary quantum field theory with momentum dissipation dual to a probe generalized Maxwell system with Weyl correction. When the strength of the momentum dissipation α̂ is small, the pole structure of the conductivity is similar to the case without momentum dissipation. The qualitative correspondence between the poles of the real part of the conductivity of the original theory and those of its electromagnetic (EM) dual theory approximately holds under γ → -γ, with γ being the Weyl coupling parameter. Strong momentum dissipation, however, alters the pole structure such that most of the poles lie on the purely imaginary axis. In this regime, the correspondence between the poles of the original theory and its EM dual under γ → -γ is violated. In addition, for the dominant pole, the EM duality under γ → -γ almost holds for all α̂ except in a small region of α̂.
Toward a General Research Process for Using Dubin's Theory Building Model
ERIC Educational Resources Information Center
Holton, Elwood F.; Lowe, Janis S.
2007-01-01
Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…
1992-03-14
overdoped La1.66Sr0.34CuO4. 1. Introduction. Understanding the normal-state charge and spin dynamics of cuprates is closely tied to an explanation of high ... frequency of the tank circuit of 160 MHz. As predicted by theory [19], the SQUID noise is reduced significantly when using the higher frequency. This ... emphasized that the spin excitation gap is not decreasing with temperature as expected in classical BCS theory. Another astonishing result is…
Renormalization group, normal form theory and the Ising model
NASA Astrophysics Data System (ADS)
Raju, Archishman; Hayden, Lorien; Clement, Colin; Liarte, Danilo; Sethna, James
The results of the renormalization group are commonly advertised as the existence of power law singularities at critical points. Logarithmic and exponential corrections are seen as special cases and dealt with on a case-by-case basis. We propose to systematize computing the singularities in the renormalization group using perturbative normal form theory. This gives us a way to classify all such singularities in a unified framework and to generate a systematic machinery to do scaling collapses. We show that this procedure leads to some new results even in classic cases like the Ising model and has general applicability.
Working memory, situation models, and synesthesia
Radvansky, Gabriel A.; Gibson, Bradley S.; McNerney, M. Windy
2013-03-04
Research on language comprehension suggests a strong relationship between working memory span measures and language comprehension. However, there is also evidence that this relationship weakens at higher levels of comprehension, such as the situation model level. The current study explored this relationship by comparing 10 grapheme–color synesthetes who have additional color experiences when they read words that begin with different letters and 48 normal controls on a number of tests of complex working memory capacity and processing at the situation model level. On all tests of working memory capacity, the synesthetes outperformed the controls. Importantly, there was no carryover benefit for the synesthetes for processing at the situation model level. This reinforces the idea that although some aspects of language comprehension are related to working memory span scores, this applies less directly to situation model levels. As a result, this suggests that theories of working memory must take into account this limitation, and the working memory processes that are involved in situation model construction and processing must be derived.
Not Alone: Tracing the Origins of Very-Low-Mass Stars and Brown Dwarfs Through Multiplicity Studies
NASA Astrophysics Data System (ADS)
Burgasser, A. J.; Reid, I. N.; Siegler, N.; Close, L.; Allen, P.; Lowrance, P.; Gizis, J.
The properties of multiple stellar systems have long provided important empirical constraints for star-formation theories, enabling (along with several other lines of evidence) a concrete, qualitative picture of the birth and early evolution of normal stars. At very low masses (VLM; M ≲ 0.1 solar mass), down to and below the hydrogen-burning minimum mass, our understanding of formation processes is not as clear, with several competing theories now under consideration. One means of testing these theories is through the empirical characterization of VLM multiple systems. Here, we review the results of various VLM multiplicity studies to date. These systems can be generally characterized as closely separated (93% have projected separations ≲ 20 AU), near equal-mass (77% have M2/M1 ≳ 0.8) and occurring infrequently (perhaps 10-30% of systems are binary). Both the frequency and maximum separation of stellar and brown dwarf binaries steadily decrease for lower system masses, suggesting that VLM binary formation and/or evolution may be a mass-dependent process. There is evidence for a fairly rapid decline in the number of loosely bound systems below ~0.3 solar mass, corresponding to a factor of 10-20 increase in the minimum binding energy of VLM binaries as compared to more massive stellar binaries. This wide-separation "desert" is present among both field (~1-5 G.y.) and older (>100 m.y.) cluster systems, while the youngest (<10 m.y.) VLM binaries, particularly those in nearby, low-density star-forming regions, appear to have somewhat different systemic properties. We compare these empirical trends to predictions laid out by current formation theories, and outline future observational studies needed to probe the full parameter space of the lowest-mass multiple systems.
Spectral statistics of the acoustic stadium
NASA Astrophysics Data System (ADS)
Méndez-Sánchez, R. A.; Báez, G.; Leyvraz, F.; Seligman, T. H.
2014-01-01
We calculate the normal-mode frequencies and wave amplitudes of the two-dimensional acoustical stadium. We also obtain the statistical properties of the acoustical spectrum and show that they agree with the results given by random matrix theory. Some normal-mode wave amplitudes showing scarring are presented.
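The random-matrix comparison typically amounts to checking the nearest-neighbour spacing distribution of the unfolded spectrum against the GOE Wigner surmise P(s) = (π/2) s exp(-π s²/4). A schematic numerical check, with a random GOE matrix standing in for the measured stadium spectrum, is

    # Nearest-neighbour spacings vs the GOE Wigner surmise.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 1000
    a = rng.normal(size=(n, n))
    levels = np.sort(np.linalg.eigvalsh((a + a.T) / 2.0))   # GOE stand-in spectrum

    bulk = levels[n // 4: 3 * n // 4]        # keep the bulk of the spectrum
    s = np.diff(bulk)
    s /= s.mean()                            # crude unfolding: unit mean spacing

    grid = np.linspace(0.0, 4.0, 200)
    wigner = (np.pi / 2) * grid * np.exp(-np.pi * grid ** 2 / 4)
    hist, _ = np.histogram(s, bins=30, range=(0, 4), density=True)
    print(hist[:5], wigner[:5])

For a measured spectrum, the unfolding would use the smoothed cumulative level density rather than a single mean spacing.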
Constitutive law for seismicity rate based on rate and state friction: Dieterich 1994 revisited.
NASA Astrophysics Data System (ADS)
Heimisson, E. R.; Segall, P.
2017-12-01
Dieterich [1994] derived a constitutive law for seismicity rate based on rate and state friction, which has been applied widely to aftershocks, earthquake triggering, and induced seismicity in various geological settings. Here, this influential work is revisited and re-derived in a more straightforward manner. By virtue of this new derivation, the model is generalized to include changes in effective normal stress associated with background seismicity. Furthermore, the general case in which the seismicity rate is not constant under a constant stressing rate is formulated. The new derivation directly provides practical integral expressions for the cumulative number of events and the rate of seismicity for arbitrary stressing history. Arguably, the most prominent limitation of Dieterich's 1994 theory is the assumption that seismic sources do not interact. Here we derive a constitutive relationship that considers source interactions between sub-volumes of the crust, where the stress in each sub-volume is assumed constant. Interactions are considered both under constant stressing rate conditions and for arbitrary stressing history. This theory can be used to model seismicity rate due to stress changes or to estimate stress changes using observed seismicity from triggered earthquake swarms, where earthquake interactions and magnitudes are taken into account. We identify special conditions under which the influence of interactions cancels and the predictions reduce to those of Dieterich 1994. This remarkable result may explain the apparent success of the model when applied to observations of triggered seismicity. This approach has application to understanding and modeling induced and triggered seismicity, and to the quantitative interpretation of geodetic and seismic data. It enables simultaneous modeling of geodetic and seismic data in a self-consistent framework. To date, physics-based modeling of seismicity with or without geodetic data has been found to give insight into various processes related to aftershocks, VT and injection-induced seismicity. However, the role of various processes such as earthquake interactions, magnitudes, and effective normal stress has been unclear. The new theory presented resolves some of the pertinent issues raised in the literature regarding application of the Dieterich 1994 model.
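For readers unfamiliar with the result being generalized, Dieterich's 1994 constitutive law is commonly summarized as follows (standard form; the notation may differ from that used in the abstract):

    R = \frac{r}{\gamma\,\dot S_r},
    \qquad
    d\gamma = \frac{1}{A\bar\sigma}\,\big(dt - \gamma\, dS\big),

where R is the seismicity rate, r the background rate at the reference stressing rate \dot S_r, S the (Coulomb) stress, A\bar\sigma the product of the rate-state parameter and the effective normal stress, and γ a state variable; under steady stressing at \dot S_r, the state variable relaxes to γ = 1/\dot S_r and R returns to r. The extension described above additionally allows \bar\sigma to change and couples the sub-volumes through their stress interactions.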
Viscous pressure correction in the irrotational flow outside Prandtl's boundary layer
NASA Astrophysics Data System (ADS)
Joseph, Daniel; Wang, Jing
2004-11-01
We argue that boundary layers on a solid with irrotational motion outside are like a gas bubble, because the shear stress vanishes at the edge of the boundary layer but the irrotational shear stress does not. This discrepancy induces a pressure correction and an additional drag, which can be advertised as being due to the viscous dissipation of the irrotational flow. Typically, this extra correction to the drag would be relatively small. A much more interesting implication of the extra pressure theory arises from the consideration of the effects of viscosity on the normal stress on a solid boundary, which are entirely neglected in Prandtl's theory. It is very well known and easily demonstrated that, as a consequence of the continuity equation, the viscous normal stress must vanish on a rigid solid. It follows that all the greatly important effects of viscosity on the normal stress are buried in the pressure, and the leading-order effects of viscosity on the normal stress can be obtained from the viscous correction of viscous potential flow.
Burau, Viola; Carstensen, Kathrine; Fredens, Mia; Kousgaard, Marius Brostrøm
2018-01-24
There is an increased interest in improving the physical health of people with mental illness. Little is known about implementing health promotion interventions in adult mental health organisations where many users also have physical health problems. The literature suggests that contextual factors are important for implementation in community settings. This study focused on the change process and analysed the implementation of a structural health promotion intervention in community mental health organisations in different contexts in Denmark. The study was based on a qualitative multiple-case design and included two municipal and two regional provider organisations. Data were various written sources and 13 semi-structured interviews with 22 key managers and frontline staff. The analysis was organised around the four main constructs of Normalization Process Theory: Coherence, Cognitive Participation, Collective Action, and Reflexive Monitoring. Coherence: Most respondents found the intervention to be meaningful in that the intervention fitted well into existing goals, practices and treatment approaches. Cognitive Participation: Management engagement varied across providers and low engagement impeded implementation. Engaging all staff was a general problem although some of the initial resistance was apparently overcome. Collective Action: Daily enactment depended on staff being attentive and flexible enough to manage the complex needs and varying capacities of users. Reflexive Monitoring: During implementation, staff evaluations of the progress and impact of the intervention were mostly informal and ad hoc and staff used these to make on-going adjustments to activities. Overall, characteristics of context common to all providers (work force and user groups) seemed to be more important for implementation than differences in the external political-administrative context. In terms of research, future studies should adopt a more bottom-up, grounded description of context and pay closer attention to the interplay between different dimensions of implementation. In terms of practice, future interventions need to better facilitate the translation of the initial sense of general meaning into daily practice by active local management support that occurs throughout the implementation process and that systematically connects the intervention to existing practices.
Applicability of post-ionization theory to laser-assisted field evaporation of magnetite
Schreiber, Daniel K.; Chiaramonti, Ann N.; Gordon, Lyle M.; ...
2014-12-15
Analysis of the mean Fe ion charge state from laser-assisted field evaporation of magnetite (Fe3O4) reveals unexpected trends as a function of laser pulse energy that break from conventional post-ionization theory for metals. For Fe ions evaporated from magnetite, the effects of post-ionization are partially offset by the increased prevalence of direct evaporation into higher charge states with increasing laser pulse energy. Therefore the final charge state is related to both the field strength and the laser pulse energy, despite those variables themselves being intertwined when analyzing at a constant detection rate. Comparison of data collected at different base temperatures also shows that the increased prevalence of Fe2+ at higher laser energies is possibly not a direct thermal effect. Conversely, the ratio of 16O+:16O2+ is well-correlated with field strength and unaffected by laser pulse energy on its own, making it a better overall indicator of the field evaporation conditions than the mean Fe charge state. Plotting the normalized field strength versus laser pulse energy also elucidates a non-linear dependence, in agreement with previous observations on semiconductors, that suggests a field-dependent laser absorption efficiency. Together these observations demonstrate that the field evaporation process for laser-pulsed oxides exhibits fundamental differences from metallic specimens that cannot be completely explained by post-ionization theory. Further theoretical studies, combined with detailed analytical observations, are required to understand fully the field evaporation process of non-metallic samples.
Grossberg, Stephen
2017-03-01
The hard problem of consciousness is the problem of explaining how we experience qualia or phenomenal experiences, such as seeing, hearing, and feeling, and knowing what they are. To solve this problem, a theory of consciousness needs to link brain to mind by modeling how emergent properties of several brain mechanisms interacting together embody detailed properties of individual conscious psychological experiences. This article summarizes evidence that Adaptive Resonance Theory, or ART, accomplishes this goal. ART is a cognitive and neural theory of how advanced brains autonomously learn to attend, recognize, and predict objects and events in a changing world. ART has predicted that "all conscious states are resonant states" as part of its specification of mechanistic links between processes of consciousness, learning, expectation, attention, resonance, and synchrony. It hereby provides functional and mechanistic explanations of data ranging from individual spikes and their synchronization to the dynamics of conscious perceptual, cognitive, and cognitive-emotional experiences. ART has reached sufficient maturity to begin classifying the brain resonances that support conscious experiences of seeing, hearing, feeling, and knowing. Psychological and neurobiological data in both normal individuals and clinical patients are clarified by this classification. This analysis also explains why not all resonances become conscious, and why not all brain dynamics are resonant. The global organization of the brain into computationally complementary cortical processing streams (complementary computing), and the organization of the cerebral cortex into characteristic layers of cells (laminar computing), figure prominently in these explanations of conscious and unconscious processes. Alternative models of consciousness are also discussed. Copyright © 2016 The Author. Published by Elsevier Ltd.. All rights reserved.
Newly diagnosed childhood diabetes: a psychosocial transition for parents?
Lowes, Lesley; Gregory, John W; Lyne, Patricia
2005-05-01
This paper reports a study to gain a new theoretical understanding of parental grief responses and the process of adaptation to a diagnosis of childhood diabetes. A diagnosis of childhood (type 1) diabetes is an anxious and distressing event for the whole family. Little is known about the experience of parents of newly diagnosed children as they cope with and adapt to their new situation. Parkes' Theory of Psychosocial Transition proposes that life-change events, or 'psychosocial transitions', require people to undertake a major revision of their assumptions about the world. The relevance of this theory to adjusting to a diagnosis of childhood diabetes has not been explored. Forty audio taped in-depth interviews were undertaken with 38 parents of 20 newly-diagnosed children. The data were subsequently examined using the framework of the Theory of Psychosocial Transition. Before diagnosis, most parents associated their child's symptoms with normal childhood illnesses. The unexpectedness and speed of the diagnosis left all parents ill-prepared to deal with the situation. Their world suddenly changed, leaving them insecure and uncertain about the future. Diabetes intruded emotionally and practically upon all of their lives. Parents successfully adjusted and adapted their lives and rebuilt a new model of the world to accommodate their child's diabetes. However, this dynamic process has no guaranteed endpoint for parents. A diagnosis of childhood diabetes leads to a psychosocial transition for parents. The concept of transition provides a logical explanation of parents' responses to loss, and allows increased understanding of the grieving and adaptation processes experienced by parents of children diagnosed with a chronic condition such as diabetes. This knowledge should help health care professionals to assist parents in the period of transition.
Xia, Xiaodong; Hao, Jia; Wang, Yang; Zhong, Zheng; Weng, George J
2017-05-24
Highly aligned graphene-based nanocomposites are of great interest due to their excellent electrical properties along the aligned direction. Graphene fillers in these composites are not necessarily perfectly aligned, but their orientations are highly confined to within a maximum distribution angle, with 90° corresponding to the randomly oriented state and 0° to the perfectly aligned one. Recent experiments have shown that electrical conductivity and dielectric permittivity of highly aligned graphene-polymer nanocomposites are strongly dependent on this distribution angle, but at present no theory seems to exist to address this issue. In this work we present a new effective-medium theory that is derived from the underlying physical processes, including the effects of graphene orientation, filler loading, aspect ratio, percolation threshold, interfacial tunneling, and Maxwell-Wagner-Sillars polarization, to determine these two properties. The theory is formulated in the context of preferred orientational average. We highlight this new theory with an application to rGO/epoxy nanocomposites, and demonstrate that the calculated in-plane and out-of-plane conductivity and permittivity are in agreement with the experimental data as the range of graphene orientations changes from the randomly oriented to the highly aligned state. We also show that the percolation thresholds of highly aligned graphene nanocomposites are in general different along the planar and the normal directions, but they converge into a single one when the statistical distribution of graphene fillers is spherically symmetric.
Sensory conflict in motion sickness: An observer theory approach
NASA Technical Reports Server (NTRS)
Oman, Charles M.
1989-01-01
Motion sickness is the general term describing a group of common nausea syndromes originally attributed to motion-induced cerebral ischemia, stimulation of abdominal organ afferents, or overstimulation of the vestibular organs of the inner ear. Sea-, car-, and airsickness are the most commonly experienced examples. However, the discovery of other variants such as Cinerama-, flight simulator-, spectacle-, and space sickness, in which the physical motion of the head and body is normal or absent, has led to a succession of sensory conflict theories which offer a more comprehensive etiologic perspective. Implicit in the conflict theory is the hypothesis that neural and/or humoral signals originate in regions of the brain subserving spatial orientation, and that these signals somehow reach other centers mediating sickness symptoms. Unfortunately, the present understanding of the neurophysiological basis of motion sickness is far from complete. No sensory conflict neuron or process has yet been physiologically identified. To what extent can the existing theory be reconciled with current knowledge of the physiology and pharmacology of nausea and vomiting? The stimuli which cause sickness are reviewed, a contemporary Observer Theory view of the Sensory Conflict hypothesis is synthesized, and a revised model for the dynamic coupling between the putative conflict signals and nausea magnitude estimates is presented. The use of quantitative models for sensory conflict offers a possible new approach to improving the design of visual and motion systems for flight simulators and other virtual environment display systems.
Understanding the process of fascial unwinding.
Minasny, Budiman
2009-09-23
Fascial or myofascial unwinding is a process in which a client undergoes a spontaneous reaction in response to the therapist's touch. It can be induced by using specific techniques that encourage a client's body to move into areas of ease. Unwinding is a popular technique in massage therapy, but its mechanism is not well understood. In the absence of a scientific explanation or hypothesis of the mechanism of action, it can be interpreted as "mystical." This paper proposes a model that builds on the neurobiologic, ideomotor action, and consciousness theories to explain the process and mechanism of fascial unwinding. HYPOTHETICAL MODEL: During fascial unwinding, the therapist stimulates mechanoreceptors in the fascia by applying gentle touch and stretching. Touch and stretching induce relaxation and activate the parasympathetic nervous system. They also activate the central nervous system, which is involved in the modulation of muscle tone as well as movement. As a result, the central nervous system is aroused and thereby responds by encouraging muscles to find an easier, or more relaxed, position and by introducing the ideomotor action. Although the ideomotor action is generated via normal voluntary motor control systems, it is altered and experienced as an involuntary response. Fascial unwinding occurs when a physically induced suggestion by a therapist prompts ideomotor action that the client experiences as involuntary. This action is guided by the central nervous system, which produces continuous action until a state of ease is reached. Consequently, fascial unwinding can be thought of as a neurobiologic process employing the self-regulation dynamic system theory.
String-Coupled Pendulum Oscillators: Theory and Experiment.
ERIC Educational Resources Information Center
Moloney, Michael J.
1978-01-01
A coupled-oscillator system is given which is readily set up, using only household materials. The normal-mode analysis of this system is worked out, and an experiment or demonstration is recommended in which one verifies the theory by measuring two times and four lengths. (Author/GA)
Scalar utility theory and proportional processing: what does it actually imply?
Rosenström, Tom; Wiesner, Karoline; Houston, Alasdair I
2017-01-01
Scalar Utility Theory (SUT) is a model used to predict animal and human choice behaviour in the context of reward amount, delay to reward, and variability in these quantities (risk preferences). This article reviews and extends SUT, deriving novel predictions. We show that, contrary to what has been implied in the literature, (1) SUT can predict both risk averse and risk prone behaviour for both reward amounts and delays to reward depending on experimental parameters, (2) SUT implies violations of several concepts of rational behaviour (e.g. it violates strong stochastic transitivity and its equivalents, and leads to probability matching) and (3) SUT can predict, but does not always predict, a linear relationship between risk sensitivity in choices and coefficient of variation in the decision-making experiment. SUT derives from Scalar Expectancy Theory which models uncertainty in behavioural timing using a normal distribution. We show that the above conclusions also hold for other distributions, such as the inverse Gaussian distribution derived from drift-diffusion models. A straightforward way to test the key assumptions of SUT is suggested and possible extensions, future prospects and mechanistic underpinnings are discussed. PMID:27288541
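A minimal simulation sketch of the core SUT assumption, that rewards are represented with normally distributed noise whose standard deviation scales with the mean (Weber scaling), is given below; the Weber fraction, reward values and option structure are illustrative choices, not parameters from the article.

    import numpy as np

    rng = np.random.default_rng(0)
    WEBER = 0.3  # illustrative coefficient of variation of the scalar representation

    def perceived(mean, n):
        # Sample scalar (Weber-scaled) representations of a reward amount.
        return rng.normal(loc=mean, scale=WEBER * mean, size=n)

    def prefer_risky(fixed=10.0, risky=(5.0, 15.0), trials=100_000):
        # Fraction of trials on which the sampled value of a 50/50 gamble
        # exceeds the sampled value of a fixed option with the same mean.
        risky_means = rng.choice(risky, size=trials)
        return np.mean(perceived(risky_means, trials) >
                       perceived(np.full(trials, fixed), trials))

    print(prefer_risky())  # below 0.5 here, i.e. risk aversion for amounts

With these particular values the simulation shows mild risk aversion for reward amounts; as the article stresses, other experimental parameters can push the same model toward risk-prone choices.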
The Kolmogorov-Obukhov Statistical Theory of Turbulence
NASA Astrophysics Data System (ADS)
Birnir, Björn
2013-08-01
In 1941 Kolmogorov and Obukhov postulated the existence of a statistical theory of turbulence, which allows the computation of statistical quantities that can be simulated and measured in a turbulent system. These are quantities such as the moments, the structure functions and the probability density functions (PDFs) of the turbulent velocity field. In this paper we will outline how to construct this statistical theory from the stochastic Navier-Stokes equation. The additive noise in the stochastic Navier-Stokes equation is generic noise given by the central limit theorem and the large deviation principle. The multiplicative noise consists of jumps multiplying the velocity, modeling jumps in the velocity gradient. We first estimate the structure functions of turbulence and establish the Kolmogorov-Obukhov 1962 scaling hypothesis with the She-Leveque intermittency corrections. Then we compute the invariant measure of turbulence, writing the stochastic Navier-Stokes equation as an infinite-dimensional Ito process, and solving the linear Kolmogorov-Hopf functional differential equation for the invariant measure. Finally we project the invariant measure onto the PDF. The PDFs turn out to be the normalized inverse Gaussian (NIG) distributions of Barndorff-Nielsen, and compare well with PDFs from simulations and experiments.
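For orientation, the She-Leveque intermittency correction referred to above is usually stated through the structure-function scaling exponents; a sketch of the commonly cited form (not copied from this paper) is

    S_p(\ell) = \langle |u(x+\ell) - u(x)|^p \rangle \sim \ell^{\zeta_p},
    \qquad
    \zeta_p = \frac{p}{9} + 2\left[1 - \left(\tfrac{2}{3}\right)^{p/3}\right],

which reproduces the Kolmogorov value \zeta_3 = 1 and falls below the 1941 prediction \zeta_p = p/3 at large p.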
Street, Helen
2003-09-01
This study explores depression in cancer patients with reference to conditional goal setting (CGS) theory. CGS theory proposes that depressed individuals believe that personal happiness is conditional upon attainment of specific goals (personal CGS). Other individuals may set important goals believing that goal achievement is a necessary prerequisite of social acceptance and approval (social CGS). CGS has been found to contribute to depression in normal populations. 15.2% of the 67 newly diagnosed cancer patients in this study showed clinical levels of depression. A significant relationship was identified between personal CGS, rumination and depression, as predicted in CGS theory. Two months later, 46.7% of patients demonstrated clinical levels of depression. This later experience of depression was significantly related to social CGS. The results suggest CGS involving a misdirected pursuit of happiness is initially associated with depression whereas subsequent experiences of depression are related to a misdirected pursuit of social acceptance. Implications are discussed in terms of understanding the cancer patients' motivations controlling goal setting. It is suggested that successful psychotherapy for depression in cancer patients needs to examine the motivations controlling goal setting in addition to the process of goal pursuit. Copyright 2003 John Wiley & Sons, Ltd.
Scalar utility theory and proportional processing: What does it actually imply?
Rosenström, Tom; Wiesner, Karoline; Houston, Alasdair I
2016-09-07
Scalar Utility Theory (SUT) is a model used to predict animal and human choice behaviour in the context of reward amount, delay to reward, and variability in these quantities (risk preferences). This article reviews and extends SUT, deriving novel predictions. We show that, contrary to what has been implied in the literature, (1) SUT can predict both risk averse and risk prone behaviour for both reward amounts and delays to reward depending on experimental parameters, (2) SUT implies violations of several concepts of rational behaviour (e.g. it violates strong stochastic transitivity and its equivalents, and leads to probability matching) and (3) SUT can predict, but does not always predict, a linear relationship between risk sensitivity in choices and coefficient of variation in the decision-making experiment. SUT derives from Scalar Expectancy Theory which models uncertainty in behavioural timing using a normal distribution. We show that the above conclusions also hold for other distributions, such as the inverse Gaussian distribution derived from drift-diffusion models. A straightforward way to test the key assumptions of SUT is suggested and possible extensions, future prospects and mechanistic underpinnings are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.
Analysis of Particle Image Velocimetry (PIV) Data for Acoustic Velocity Measurements
NASA Technical Reports Server (NTRS)
Blackshire, James L.
1997-01-01
Acoustic velocity measurements were taken using Particle Image Velocimetry (PIV) in a Normal Incidence Tube configuration at various frequency, phase, and amplitude levels. This report presents the results of the PIV analysis and data reduction portions of the test and details the processing that was done. Estimates of lower measurement sensitivity levels were determined based on PIV image quality, correlation, and noise level parameters used in the test. Comparisons of measurements with linear acoustic theory are presented. The onset of nonlinear, harmonic-frequency acoustic levels was also studied for sound pressure levels and frequencies ranging from 90 to 132 dB and 500 to 3000 Hz, respectively.
Bessel beams with spatial oscillating polarization
Fu, Shiyao; Zhang, Shikun; Gao, Chunqing
2016-01-01
Bessel beams are widely used in optical metrology mainly because of their large Rayleigh range (focal length). Radial/azimuthal polarization of such beams is of interest in the fields of material processing, plasma absorption or communication. In this paper an experimental set-up is presented which generates a Bessel-type vector beam whose spatial polarization oscillates along the optical axis as the beam propagates in free space. A first holographic axicon (HA1) produces a normal, linearly polarized Bessel beam, which is converted by a second holographic axicon (HA2) into the spatially oscillating polarized beam. The theory is briefly discussed, and the set-up and the experimental results are presented in detail. PMID:27488174
Cryptic genetic variation: evolution's hidden substrate.
Paaby, Annalise B; Rockman, Matthew V
2014-04-01
Cryptic genetic variation (CGV) is invisible under normal conditions, but it can fuel evolution when circumstances change. In theory, CGV can represent a massive cache of adaptive potential or a pool of deleterious alleles that are in need of constant suppression. CGV emerges from both neutral and selective processes, and it may inform about how human populations respond to change. CGV facilitates adaptation in experimental settings, but does it have an important role in the real world? Here, we review the empirical support for widespread CGV in natural populations, including its potential role in emerging human diseases and the growing evidence of its contribution to evolution.
Fractal mechanisms in the electrophysiology of the heart
NASA Technical Reports Server (NTRS)
Goldberger, A. L.
1992-01-01
The mathematical concept of fractals provides insights into complex anatomic branching structures that lack a characteristic (single) length scale, and certain complex physiologic processes, such as heart rate regulation, that lack a single time scale. Heart rate control is perturbed by alterations in neuro-autonomic function in a number of important clinical syndromes, including sudden cardiac death, congestive failure, cocaine intoxication, fetal distress, space sickness and physiologic aging. These conditions are associated with a loss of the normal fractal complexity of interbeat interval dynamics. Such changes, which may not be detectable using conventional statistics, can be quantified using new methods derived from "chaos theory".
NASA Astrophysics Data System (ADS)
Anderson, Philip W.; Casey, Philip A.
2010-04-01
We present a formalism for dealing directly with the effects of the Gutzwiller projection implicit in the t-J model which is widely believed to underlie the phenomenology of the high-Tc cuprates. We suggest that a true Bardeen-Cooper-Schrieffer condensation from a Fermi liquid state takes place, but in the unphysical space prior to projection. At low doping, however, instead of a hidden Fermi liquid one gets a 'hidden' non-superconducting resonating valence bond state which develops hole pockets upon doping. The theory which results upon projection does not follow conventional rules of diagram theory and in fact in the normal state is a Z = 0 non-Fermi liquid. Anomalous properties of the 'strange metal' normal state are predicted and compared against experimental findings.
Evidence for perceptual deficits in associative visual (prosop)agnosia: a single-case study.
Delvenne, Jean François; Seron, Xavier; Coyette, Françoise; Rossion, Bruno
2004-01-01
Associative visual agnosia is classically defined as normal visual perception stripped of its meaning [Archiv für Psychiatrie und Nervenkrankheiten 21 (1890) 22/English translation: Cognitive Neuropsychol. 5 (1988) 155]: these patients cannot access their stored visual memories to categorize the objects that are nonetheless perceived correctly. However, according to an influential theory of visual agnosia [Farah, Visual Agnosia: Disorders of Object Recognition and What They Tell Us about Normal Vision, MIT Press, Cambridge, MA, 1990], visual associative agnosics necessarily present perceptual deficits that are the cause of their impairment at object recognition. Here we report a detailed investigation of a patient with bilateral occipito-temporal lesions who is strongly impaired at object and face recognition. NS presents normal drawing copy and normal performance at object and face matching tasks as used in classical neuropsychological tests. However, when tested with several computer tasks using carefully controlled visual stimuli and taking both his accuracy rate and response times into account, NS was found to have abnormal performance at high-level visual processing of objects and faces. Albeit presenting a different pattern of deficits than previously described in integrative agnosic patients such as HJA and LH, his deficits were characterized by an inability to integrate individual parts into a whole percept, as suggested by his failure at processing structurally impossible three-dimensional (3D) objects, an absence of face inversion effects and an advantage at detecting and matching single parts. Taken together, these observations question the idea of separate visual representations for object/face perception and object/face knowledge derived from investigations of visual associative (prosop)agnosia, and they raise some methodological issues in the analysis of single-case studies of (prosop)agnosic patients.
Innovations in Basic Flight Training for the Indonesian Air Force
1990-12-01
microeconomic theory that could approximate the optimum mix of training hours between an aircraft and simulator, and therefore improve cost effectiveness...The microeconomic theory being used is normally employed when showing production with two variable inputs. An example of variable inputs would be labor...NAS Corpus Christi, Texas, Aerodynamics of the T-34C, 1989. 26. Naval Air Training Command, NAS Corpus Christi, Texas, Meteorological Theory Workbook
Extensions of the Theory of the Electron-Phonon Interaction in Metals: A Collection.
1983-11-03
Generalization of the Theory of the Electron-Phonon Interaction: Thermodynamic Formulation of Superconducting- and Normal-State Properties. A microscopic treatment of the consequences for superconductivity of a nonconstant electronic density of states is presented.
Mass Media Theory, Leveraging Relationships, and Reliable Strategic Communication Effects
2008-03-19
other people who are in the same social and cultural groups. Families respond to patriarchs and matriarchs, congregations respond to pastors, and teens...media to self-correct behavior in order to make society seem more "normal." Verbal and Written Message-Centric Theories Premise of Theory Magic...Effects Harmony and Balance People gravitate toward information they already believe. Structural Functionalism When society begins to seem
An Analysis of Japanese University Students' Oral Performance in English Using Processability Theory
ERIC Educational Resources Information Center
Sakai, Hideki
2008-01-01
This paper presents a brief summary of processability theory as proposed by [Pienemann, M., 1998a. "Language Processing and Second Language Development: Processability Theory." John Benjamins, Amsterdam; Pienemann, M., 1998b. "Developmental dynamics in L1 and L2 acquisition: processability theory and generative entrenchment." "Bilingualism:…
Analysis of Layered Composite Plates Accounting for Large Deflections and Transverse Shear Strains.
1981-05-01
composite plates than isotropic plates. The classical thin-plate theory (CPT) assumes that normals to the midsurface before deformation remain straight...and normal to the midsurface after deformation, implying that thickness shear deformation effects are negligible. As a result, the natural
Theories on migration processes of Cd in Jiaozhou Bay
NASA Astrophysics Data System (ADS)
Yang, Dongfang; Li, Haixia; Wang, Qi; Ding, Jun; Zhang, Longlei
2018-03-01
Understanding the migration process is essential for pollution control, and developing theories of the migration process provides its scientific basis. This paper further developed five key theories on the migration processes of Cd: homogeneous theory, environmental dynamic theory, horizontal loss theory, migration trend theory and vertical migration theory. The performance and practical value of these theories were demonstrated by applying them to the analysis of the migration process of Cd in Jiaozhou Bay. The results show that these theories are helpful for better understanding the migration of pollutants in marine bays.
An open-source framework for large-scale, flexible evaluation of biomedical text mining systems.
Baumgartner, William A; Cohen, K Bretonnel; Hunter, Lawrence
2008-01-29
Improved evaluation methodologies have been identified as a necessary prerequisite to the improvement of text mining theory and practice. This paper presents a publicly available framework that facilitates thorough, structured, and large-scale evaluations of text mining technologies. The extensibility of this framework and its ability to uncover system-wide characteristics by analyzing component parts as well as its usefulness for facilitating third-party application integration are demonstrated through examples in the biomedical domain. Our evaluation framework was assembled using the Unstructured Information Management Architecture. It was used to analyze a set of gene mention identification systems involving 225 combinations of system, evaluation corpus, and correctness measure. Interactions between all three were found to affect the relative rankings of the systems. A second experiment evaluated gene normalization system performance using as input 4,097 combinations of gene mention systems and gene mention system-combining strategies. Gene mention system recall is shown to affect gene normalization system performance much more than does gene mention system precision, and high gene normalization performance is shown to be achievable with remarkably low levels of gene mention system precision. The software presented in this paper demonstrates the potential for novel discovery resulting from the structured evaluation of biomedical language processing systems, as well as the usefulness of such an evaluation framework for promoting collaboration between developers of biomedical language processing technologies. The code base is available as part of the BioNLP UIMA Component Repository on SourceForge.net.
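The 225 evaluations mentioned above come from exhaustively crossing systems, corpora and correctness measures; a minimal sketch of such a structured evaluation loop, with entirely hypothetical system, corpus and measure names standing in for the UIMA components, is:

    from itertools import product

    # Hypothetical stand-ins; the real framework wraps UIMA analysis engines.
    systems = {"tagger_a": lambda text: {"BRCA1"},
               "tagger_b": lambda text: {"BRCA1", "TP53"}}
    corpora = {"toy_corpus": [("BRCA1 is mutated in tumours.", {"BRCA1"})]}
    measures = {"f1": lambda tp, fp, fn: 2 * tp / (2 * tp + fp + fn) if tp else 0.0}

    results = {}
    for (s_name, tag), (c_name, docs), (m_name, score) in product(
            systems.items(), corpora.items(), measures.items()):
        tp = fp = fn = 0
        for text, gold in docs:
            pred = tag(text)
            tp += len(pred & gold)
            fp += len(pred - gold)
            fn += len(gold - pred)
        results[(s_name, c_name, m_name)] = score(tp, fp, fn)

    # Rankings can then be compared across corpora and measures, as the paper does.
    print(sorted(results.items(), key=lambda kv: -kv[1]))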
An open-source framework for large-scale, flexible evaluation of biomedical text mining systems
Baumgartner, William A; Cohen, K Bretonnel; Hunter, Lawrence
2008-01-01
Background Improved evaluation methodologies have been identified as a necessary prerequisite to the improvement of text mining theory and practice. This paper presents a publicly available framework that facilitates thorough, structured, and large-scale evaluations of text mining technologies. The extensibility of this framework and its ability to uncover system-wide characteristics by analyzing component parts as well as its usefulness for facilitating third-party application integration are demonstrated through examples in the biomedical domain. Results Our evaluation framework was assembled using the Unstructured Information Management Architecture. It was used to analyze a set of gene mention identification systems involving 225 combinations of system, evaluation corpus, and correctness measure. Interactions between all three were found to affect the relative rankings of the systems. A second experiment evaluated gene normalization system performance using as input 4,097 combinations of gene mention systems and gene mention system-combining strategies. Gene mention system recall is shown to affect gene normalization system performance much more than does gene mention system precision, and high gene normalization performance is shown to be achievable with remarkably low levels of gene mention system precision. Conclusion The software presented in this paper demonstrates the potential for novel discovery resulting from the structured evaluation of biomedical language processing systems, as well as the usefulness of such an evaluation framework for promoting collaboration between developers of biomedical language processing technologies. The code base is available as part of the BioNLP UIMA Component Repository on SourceForge.net. PMID:18230184
Preattentive visual search and perceptual grouping in schizophrenia.
Carr, V J; Dewis, S A; Lewin, T J
1998-06-15
To help determine whether patients with schizophrenia show deficits in the stimulus-based aspects of preattentive processing, we undertook a series of experiments within the framework of feature integration theory. Thirty subjects with a DSM-III-R diagnosis of schizophrenia and 30 age-, gender-, and education-matched normal control subjects completed two computerized experimental tasks, a visual search task assessing parallel and serial information processing (Experiment 1) and a task which examined the effects of perceptual grouping on visual search strategies (Experiment 2). We also assessed current symptomatology and its relationship to task performance. While the schizophrenia subjects had longer reaction times in Experiment 1, their overall pattern of performance across both experimental tasks was similar to that of the control subjects, and generally unrelated to current symptomatology. Predictions from feature integration theory about the impact of varying display size (Experiment 1) and number of perceptual groups (Experiment 2) on the detection of feature and conjunction targets were strongly supported. This study revealed no firm evidence that schizophrenia is associated with a preattentive abnormality in visual search using stimuli that differ on the basis of physical characteristics. While subject and task characteristics may partially account for differences between this and previous studies, it is more likely that preattentive processing abnormalities in schizophrenia may occur only under conditions involving selected 'top-down' factors such as context and meaning.
Elaboration of the α-model derived from the BCS theory of superconductivity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, David C.
2013-10-14
The single-band α-model of superconductivity (Padamsee et al 1973 J. Low Temp. Phys. 12 387) is a popular model that was adapted from the single-band Bardeen–Cooper–Schrieffer (BCS) theory of superconductivity mainly to allow fits to electronic heat capacity versus temperature T data that deviate from the BCS prediction. The model assumes that the normalized superconducting order parameter Δ(T)/Δ(0) and therefore the normalized London penetration depth λL(T)/λL(0) are the same as in BCS theory, calculated using the BCS value αBCS ≈ 1.764 of α ≡ Δ(0)/kBTc, where kB is Boltzmann's constant and Tc is the superconducting transition temperature. On the other hand, to calculate the electronic free energy, entropy, heat capacity and thermodynamic critical field versus T, the α-model takes α to be an adjustable parameter. Here we write the BCS equations and limiting behaviors for the superconducting state thermodynamic properties explicitly in terms of α, as needed for calculations within the α-model, and present plots of the results versus T and α that are compared with the respective BCS predictions. Mechanisms such as gap anisotropy and strong coupling that can cause deviations of the thermodynamics from the BCS predictions, especially the heat capacity jump at Tc, are considered. Extensions of the α-model that have appeared in the literature, such as the two-band model, are also discussed. Tables of values of Δ(T)/Δ(0), the normalized London parameter Λ(T)/Λ(0) and λL(T)/λL(0) calculated from the BCS theory using α = αBCS are provided, which are the same in the α-model by assumption. Tables of values of the entropy, heat capacity and thermodynamic critical field versus T for seven values of α, including αBCS, are also presented.
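For readers unfamiliar with the notation, the central definitions can be summarized compactly (a sketch based on standard BCS results rather than on the paper's own equations):

    \alpha \equiv \frac{\Delta(0)}{k_B T_c}, \qquad
    \alpha_{\mathrm{BCS}} = \pi e^{-\gamma_E} \approx 1.764
    \quad (\gamma_E \approx 0.5772),

and a commonly quoted consequence of letting \alpha float is that the normalized heat-capacity jump scales as \Delta C(T_c)/(\gamma_n T_c) \approx 1.43\,(\alpha/\alpha_{\mathrm{BCS}})^2, where \gamma_n is the normal-state Sommerfeld coefficient, so fitting \alpha to data rescales the jump relative to the BCS value.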
Can Gravity Probe B usefully constrain torsion gravity theories?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flanagan, Eanna E.; Rosenthal, Eran
2007-06-15
In most theories of gravity involving torsion, the source for torsion is the intrinsic spin of matter. Since the spins of fermions are normally randomly oriented in macroscopic bodies, the amount of torsion generated by macroscopic bodies is normally negligible. However, in a recent paper, Mao et al. (arXiv:gr-qc/0608121) point out that there is a class of theories, including the Hayashi-Shirafuji (1979) theory, in which the angular momentum of macroscopic spinning bodies generates a significant amount of torsion. They further argue that, by the principle of action equals reaction, one would expect the angular momentum of test bodies to couple to a background torsion field, and therefore the precession of the Gravity Probe B gyroscopes should be affected in these theories by the torsion generated by the Earth. We show that in fact the principle of action equals reaction does not apply to these theories, essentially because the torsion is not an independent dynamical degree of freedom. We examine in detail a generalization of the Hayashi-Shirafuji theory suggested by Mao et al. called Einstein-Hayashi-Shirafuji theory. There are a variety of different versions of this theory, depending on the precise form of the coupling to matter chosen for the torsion. We show that, for any coupling to matter that is compatible with the spin transport equation postulated by Mao et al., the theory has either ghosts or an ill-posed initial-value formulation. These theoretical problems can be avoided by specializing the parameters of the theory and in addition choosing the standard minimal coupling to matter of the torsion tensor. This yields a consistent theory, but one in which the action equals reaction principle is violated, and in which the angular momentum of the gyroscopes does not couple to the Earth's torsion field. Thus, the Einstein-Hayashi-Shirafuji theory does not predict a detectable torsion signal for Gravity Probe B. There may be other torsion theories which do.
Non-normal perturbation growth in idealised island and headland wakes
NASA Astrophysics Data System (ADS)
Aiken, C. M.; Moore, A. M.; Middleton, J. H.
2003-12-01
Generalised linear stability theory is used to calculate the linear perturbations that furnish most rapid growth in energy in a model of a steady recirculating island wake. This optimal perturbation is found to be antisymmetric and to evolve into a von Kármán vortex street. Eigenanalysis of the linearised system reveals that the eigenmodes corresponding to vortex sheet formation are damped, so the growth of the perturbation is understood through the non-normality of the linearised system. Qualitatively similar perturbation growth is shown to occur in a non-linear model of stochastically-forced subcritical flow, resulting in transition to an unsteady wake. Free-stream variability with amplitude 8% of the mean inflow speed sustains vortex street structures in the non-linear model with perturbation velocities the order of the inflow speed, suggesting that environmental stochastic forcing may similarly be capable of exciting growing disturbances in real island wakes. To support this, qualitatively similar perturbation growth is demonstrated in the straining wake of a realistic island obstacle. It is shown that for the case of an idealised headland, where the vortex street eigenmodes are lacking, vortex sheets are produced through a similar non-normal process.
A Sociological Journey into Sexuality.
ERIC Educational Resources Information Center
Reiss, Ira L.
1986-01-01
Proposes that sexuality is universally linked to the social structure in three specific areas: (a) marital jealousy, (b) gender role power, and (c) beliefs about normality. Variations and interrelations of these three linkages are explained by the logical structure of this sociological theory. The relevance of this theory for the applied…
The Evolution of Human Longevity: Toward a Biocultural Theory.
ERIC Educational Resources Information Center
Mayer, Peter J.
Homo sapiens is the only extant species for which there exists a significant post-reproductive period in the normal lifespan. Explanations for the evolution of this species-specific trait are possible through "non-deterministic" theories of aging positing "wear and tear" or the failure of nature to eliminate imperfection, or…
Cultural Descriptions as Political Cultural Acts: An Exploration
ERIC Educational Resources Information Center
Holliday, Adrian
2010-01-01
Interculturality may be something normal which everyone possesses to a degree. However, dominant neo-essentialist theories of culture give the impression that we are too different to easily cross cultural boundaries. These theories support the development of academic disciplines and the need for professional certainty in intercultural training.…
Hodgson, Catherine; Lambon Ralph, Matthew A
2008-01-01
Semantic errors are commonly found in semantic dementia (SD) and some forms of stroke aphasia and provide insights into semantic processing and speech production. Low error rates are found in standard picture naming tasks in normal controls. In order to increase error rates and thus provide an experimental model of aphasic performance, this study utilised a novel method: tempo picture naming. Experiment 1 showed that, compared to standard deadline naming tasks, participants made more errors on the tempo picture naming tasks. Further, RTs were longer and more errors were produced to living items than non-living items, a pattern seen in both semantic dementia and semantically-impaired stroke aphasic patients. Experiment 2 showed that providing the initial phoneme as a cue enhanced performance whereas providing an incorrect phonemic cue further reduced performance. These results support the contention that the tempo picture naming paradigm reduces the time allowed for controlled semantic processing, causing increased error rates. This experimental procedure would, therefore, appear to mimic the performance of aphasic patients with multi-modal semantic impairment that results from poor semantic control rather than the degradation of semantic representations observed in semantic dementia [Jefferies, E. A., & Lambon Ralph, M. A. (2006). Semantic impairment in stroke aphasia vs. semantic dementia: A case-series comparison. Brain, 129, 2132-2147]. Further implications for theories of semantic cognition and models of speech processing are discussed.
Yiu, Sean; Tom, Brian Dm
2017-01-01
Several researchers have described two-part models with patient-specific stochastic processes for analysing longitudinal semicontinuous data. In theory, such models can offer greater flexibility than the standard two-part model with patient-specific random effects. However, in practice, the high dimensional integrations involved in the marginal likelihood (i.e. integrated over the stochastic processes) significantly complicates model fitting. Thus, non-standard computationally intensive procedures based on simulating the marginal likelihood have so far only been proposed. In this paper, we describe an efficient method of implementation by demonstrating how the high dimensional integrations involved in the marginal likelihood can be computed efficiently. Specifically, by using a property of the multivariate normal distribution and the standard marginal cumulative distribution function identity, we transform the marginal likelihood so that the high dimensional integrations are contained in the cumulative distribution function of a multivariate normal distribution, which can then be efficiently evaluated. Hence, maximum likelihood estimation can be used to obtain parameter estimates and asymptotic standard errors (from the observed information matrix) of model parameters. We describe our proposed efficient implementation procedure for the standard two-part model parameterisation and when it is of interest to directly model the overall marginal mean. The methodology is applied on a psoriatic arthritis data set concerning functional disability.
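The computational device described above, replacing the high-dimensional integral over the patient-specific stochastic process with a single multivariate normal CDF evaluation, can be sketched as follows; the AR(1)-type covariance, number of visits and thresholds are invented for illustration and are not taken from the paper.

    import numpy as np
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(1)

    # Hypothetical within-patient covariance of a latent Gaussian process at 5 visits.
    times = np.arange(5)
    cov = 0.8 ** np.abs(np.subtract.outer(times, times))  # AR(1)-like correlation

    # Probability that the latent process stays below visit-specific thresholds:
    # a 5-dimensional integral evaluated as one multivariate normal CDF.
    thresholds = rng.normal(size=5)
    prob = multivariate_normal(mean=np.zeros(5), cov=cov).cdf(thresholds)
    print(prob)

Evaluating the likelihood this way permits ordinary maximum likelihood estimation, which is the efficiency gain the authors emphasize.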
NASA Astrophysics Data System (ADS)
Gavroglu, Kostas
Practitioners of many (sub)-disciplines in the sciences are, at times, confronted with an apparent bliss which often turns into a nightmare: they are stuck with too good and too fertile a theory. 'Normal' science is surely a rewarding practice, but for that very reason it may, at times, also become boring. Theories or theoretical schemata may make successful predictions, may clarify 'mechanisms', they may show the way to further developments, and they may be amenable to non-controversial approximations. If one is really lucky, they may even, at least in principle, be able to answer all questions. There have, especially in the history of physics, been many such theories. Laplacian physics, ether physics and superstrings have historically defined the frameworks for such utopias where everything could be answerable, at least in principle. But one is truly at a loss when one is confronted with this in principle. In principle but not in practice? In principle but never? Confronted with the deadlocks that are implicit in such utopias, scientists started to collectively display a Procrustean psychopathology. They would prepare the beds and, yet, the theories would manage to trick the tricksters: almost all theories appeared to be fitting to any Procrustean bed. They were short and tall and normal at the same time.
Nonlocal superconducting correlations in graphene in the quantum Hall regime
NASA Astrophysics Data System (ADS)
Beconcini, Michael; Polini, Marco; Taddei, Fabio
2018-05-01
We study Andreev processes and nonlocal transport in a three-terminal graphene-superconductor hybrid system under a quantizing perpendicular magnetic field [G.-H. Lee et al., Nat. Phys. 13, 693 (2017), 10.1038/nphys4084]. We find that the amplitude of the crossed Andreev reflection (CAR) processes crucially depends on the orientation of the lattice. By employing Landauer-Büttiker scattering theory, we find that CAR is generally very small for a zigzag edge, while for an armchair edge it can be larger than the normal transmission, thereby resulting in a negative nonlocal resistance. In the case of an armchair edge and with a wide superconducting region (as compared to the superconducting coherence length), CAR exhibits large oscillations as a function of the magnetic field due to interference effects. This results in sign changes of the nonlocal resistance.
Liu, Chen; Yang, Huazhe; Wan, Peng; Wang, Kehong; Tan, Lili; Yang, Ke
2014-02-01
The in vitro biodegradation behavior of Mg17Al12 as a second phase in Mg-Al-Zn alloys was investigated via electrochemical measurement and immersion test. Hank's solutions with neutral and acidic pH values were adopted as electrolytes to simulate the in vivo environment during the normal and inflammatory response processes. Furthermore, the local-orbital density functional theory approach was employed to study the thermodynamic stability of the Mg17Al12 phase. All the results proved the occurrence of a pitting corrosion process with cracking for the Mg17Al12 phase in Hank's solution, but with a much lower degradation rate compared with both AZ31 alloy and pure magnesium. Furthermore, a preliminary explanation of the biodegradation behavior of the Mg17Al12 phase was proposed. © 2013.
Gestalt psychology: the forgotten paradigm in abnormal psychology.
Silverstein, Steven M; Uhlhaas, Peter J
2004-01-01
Gestalt views of psychopathology are almost completely ignored in mainstream psychology and psychiatry. However, a review of available evidence indicates a remarkable consistency between these views and current data from experimental psychopathology and cognitive neuroscience. This consistency is especially pronounced in the area of schizophrenia. In addition, there is a convergence of cognitive and neurobiological evidence regarding the validity of early Gestalt views of both normal brain-behavior relationships and disordered ones, as in schizophrenia. This article reviews some contributions of Gestalt psychology regarding schizophrenia and examines these views in light of more recent findings from cognitive psychology, cognitive neuroscience, and experimental psychopathology. We conclude that Gestalt theory is a viable theoretical framework from which to understand schizophrenia. Specifically, it appears that a breakdown of Gestalt organizational processes may characterize both the cognitive and the brain processes in schizophrenia.
Theory into Practice: Advancing Normalization for the Child under Three
ERIC Educational Resources Information Center
Conklin-Moore, Alyssa
2017-01-01
Alyssa Conklin-Moore discusses normalization in the child under three from several perspectives. She takes an extensive look at the child, including orienting parents to the Montessori environment, the child's entrance into the environment, addressing the sensitive periods, and fostering independence, contribution, and community. She reminds the…
Local Influence and Robust Procedures for Mediation Analysis
ERIC Educational Resources Information Center
Zu, Jiyun; Yuan, Ke-Hai
2010-01-01
Existing studies of mediation models have been limited to normal-theory maximum likelihood (ML). Because real data in the social and behavioral sciences are seldom normally distributed and often contain outliers, classical methods generally lead to inefficient or biased parameter estimates. Consequently, the conclusions from a mediation analysis…
[An improved low spectral distortion PCA fusion method].
Peng, Shi; Zhang, Ai-Wu; Li, Han-Lun; Hu, Shao-Xing; Meng, Xian-Gang; Sun, Wei-Dong
2013-10-01
Aiming at the spectral distortion produced in the PCA fusion process, this paper proposes an improved low-spectral-distortion PCA fusion method. The method uses the NCUT (normalized cut) image segmentation algorithm to divide a complex hyperspectral remote sensing image into multiple sub-images, which increases the separability of samples and weakens the spectral distortion of traditional PCA fusion. A pixel-similarity weighting matrix and masks are produced using graph theory and clustering theory. These masks are used to cut the hyperspectral image and the high-resolution image into sub-region objects. All corresponding sub-region objects between the hyperspectral image and the high-resolution image are fused by the PCA method, and all sub-regional fusion results are spliced together to produce a new image. In the experiment, Hyperion hyperspectral data and RapidEye data were used. The results show that the proposed method has the same ability to enhance spatial resolution as traditional PCA fusion and a greater ability to preserve spectral fidelity.
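A minimal sketch of the per-object PCA fusion step is given below, assuming the segmentation masks have already been produced; the array names, the simple moment matching used in place of full histogram matching, and the use of scikit-learn's PCA are illustrative choices rather than the authors' implementation.

    import numpy as np
    from sklearn.decomposition import PCA

    def fuse_segment(hyper, hires, mask):
        # hyper: (rows, cols, bands) hyperspectral cube; hires: (rows, cols) high-resolution band;
        # mask: boolean (rows, cols) selecting one segmented sub-region object.
        # Assumes the segment contains at least as many pixels as there are bands.
        pixels = hyper[mask].astype(float)            # (n_pixels, bands)
        pca = PCA(n_components=pixels.shape[1])
        pcs = pca.fit_transform(pixels)

        # Match the high-resolution segment to the first principal component
        # (moment matching standing in for histogram matching), then substitute and invert.
        h = hires[mask].astype(float)
        h = (h - h.mean()) / (h.std() + 1e-12) * pcs[:, 0].std() + pcs[:, 0].mean()
        pcs[:, 0] = h

        fused = hyper.astype(float).copy()
        fused[mask] = pca.inverse_transform(pcs)
        return fused

    # Looping fuse_segment over all masks and splicing the results together
    # corresponds to the per-object fusion-and-mosaic step described in the abstract.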
Fundamentals of poly(lactic acid) microstructure, crystallization behavior, and properties
NASA Astrophysics Data System (ADS)
Kang, Shuhui
Poly(lactic acid) is an environmentally-benign biodegradable and sustainable thermoplastic material, which has found broad applications as food packaging films and as non-woven fibers. The crystallization and deformation mechanisms of the polymer are largely determined by the distribution of conformation and configuration. Knowledge of these mechanisms is needed to understand the mechanical and thermal properties on which processing conditions mainly depend. In conjunction with laser light scattering, Raman spectroscopy and normal coordinate analysis are used in this thesis to elucidate these properties. Vibrational spectroscopic theory, Flory's rotational isomeric state (RIS) theory, Gaussian chain statistics and statistical mechanics are used to relate experimental data to molecular chain structure. A refined RIS model is proposed, chain rigidity recalculated and chain statistics discussed. A Raman spectroscopic characterization method for crystalline and amorphous phase orientation has been developed. A shrinkage model is also proposed to interpret the dimensional stability for fibers and uni- or biaxially stretched films. A study of stereocomplexation formed by poly(l-lactic acid) and poly(d-lactic acid) is also presented.
Infrared absorptivities of transition metals at room and liquid-helium temperatures.
NASA Technical Reports Server (NTRS)
Jones, M. C.; Palmer, D. C.; Tien, C. L.
1972-01-01
Evaluation of experimental data concerning the normal spectral absorptivities of the transition metals, nickel, iron, platinum, and chromium, at both room and liquid-helium temperatures in the wavelength range from 2.5 to 50 microns. The absorptivities were derived from reflectivity measurements made relative to a room-temperature vapor-deposited gold reference mirror. The absorptivity of the gold reference mirror was measured calorimetrically, by use of infrared laser sources. Investigation of various methods of sample-surface preparation resulted in the choice of a vacuum-annealing process as the final stage. The experimental results are discussed on the basis of the anomalous-skin-effect theory modified for multiple conduction bands. As predicted, the results approach a single-band model toward the longer wavelengths. Agreement between theory and experiment is considerably improved by taking into account the modification of the relaxation time due to the photon-electron-phonon interaction proposed by Holstein (1954) and Gurzhi (1958); but, particularly at helium temperatures, the calculated curve is consistently below the experimental results.
Relation Between Hertz Stress-Life Exponent, Ball-Race Conformity, and Ball Bearing Life
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.; Poplawski, Joseph V.; Root, Lawrence E.
2008-01-01
ANSI/ABMA and ISO standards based on Lundberg-Palmgren bearing life theory are normalized for ball bearings having inner- and outer-race conformities of 52 percent (0.52) and made from pre-1940 bearing steel. The Lundberg-Palmgren theory incorporates an inverse 9th power relation between Hertz stress and fatigue life for ball bearings. The effect of race conformity on ball set life independent of race life is not incorporated into the Lundberg-Palmgren theory. In addition, post-1960 vacuum-processed bearing steel exhibits a 12th power relation between Hertz stress and life. The work reported extends the previous work of Zaretsky, Poplawski, and Root to calculate changes in bearing life, including the life of the ball set, caused by race conformity, Hertz stress-life exponent, ball bearing type and bearing series. The bearing fatigue life in actual application will usually be equal to or greater than that calculated using the ANSI/ABMA and ISO standards that incorporate the Lundberg-Palmgren theory. The relative fatigue life of an individual race is more sensitive to changes in race conformity for a Hertz stress-life exponent n of 12 than for n = 9. However, when the effects are combined to predict actual bearing life for a specified set of conditions and bearing geometry, the predicted life of the bearing will be greater for a value of n = 12 than n = 9.
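The practical weight of the stress-life exponent is easy to see in a one-line calculation (an illustrative example, not a figure from the report): with life related to maximum Hertz stress by L \propto S^{-n}, a 10 percent reduction in stress gives

    \frac{L_2}{L_1} = \left(\frac{S_1}{S_2}\right)^{n}
    = \left(\frac{1}{0.9}\right)^{9} \approx 2.6 \quad (n = 9),
    \qquad
    \left(\frac{1}{0.9}\right)^{12} \approx 3.5 \quad (n = 12),

so the post-1960 vacuum-processed steels are considerably more sensitive to stress, and hence to conformity-driven stress changes, than the Lundberg-Palmgren exponent implies.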
Relation between Hertz Stress-Life Exponent, Ball-Race Conformity, and Ball Bearing Life
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.; Poplawski, Joseph V.; Root, Lawrence E.
2006-01-01
ANSI/ABMA and ISO standards based on Lundberg-Palmgren bearing life theory are normalized for ball bearings having inner- and outer-race conformities of 52 percent (0.52) and made from pre-1940 bearing steel. The Lundberg-Palmgren theory incorporates an inverse 9th power relation between Hertz stress and fatigue life for ball bearings. The effect of race conformity on ball set life independent of race life is not incorporated into the Lundberg-Palmgren theory. In addition, post-1960 vacuum-processed bearing steel exhibits a 12th power relation between Hertz stress and life. The work reported extends the previous work of Zaretsky, Poplawski, and Root to calculate changes in bearing life, that includes the life of the ball set, caused by race conformity, Hertz stress-life exponent, ball bearing type and bearing series. The bearing fatigue life in actual application will usually be equal to or greater than that calculated using the ANSI/ABMA and ISO standards that incorporate the Lundberg-Palmgren theory. The relative fatigue life of an individual race is more sensitive to changes in race conformity for Hertz stress-life exponent n of 12 than where n = 9. However, when the effects are combined to predict actual bearing life for a specified set of conditions and bearing geometry, the predicted life of the bearing will be greater for a value of n = 12 than n = 9.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donnelly, William; Freidel, Laurent
We consider the problem of defining localized subsystems in gauge theory and gravity. Such systems are associated to spacelike hypersurfaces with boundaries and provide the natural setting for studying entanglement entropy of regions of space. We present a general formalism to associate a gauge-invariant classical phase space to a spatial slice with boundary by introducing new degrees of freedom on the boundary. In Yang-Mills theory the new degrees of freedom are a choice of gauge on the boundary, transformations of which are generated by the normal component of the nonabelian electric field. In general relativity the new degrees of freedom are the location of a codimension-2 surface and a choice of conformal normal frame. These degrees of freedom transform under a group of surface symmetries, consisting of diffeomorphisms of the codimension-2 boundary, and position-dependent linear deformations of its normal plane. We find the observables which generate these symmetries, consisting of the conformal normal metric and curvature of the normal connection. We discuss the implications for the problem of defining entanglement entropy in quantum gravity. Finally, our work suggests that the Bekenstein-Hawking entropy may arise from the different ways of gluing together two partial Cauchy surfaces at a cross-section of the horizon.
In the Shadow of E. H. Carr: The Evolution of International Politics
2012-06-01
promote the merits of cooperation and look to institutions as a method for ensuring peace. We examine Norman Angell's liberal theory, Robert Keohane...pages, Carr divides the field into its ideational and material sides: utopianism and realism, ethics and politics, theory and practice, intellectualism...Carr believed that the current course of international politics could lead to the ruin of humanity. He did not believe that IR theories and practices
Psychoanalysis and homosexuality: do we need a new theory?
Auchincloss, E L; Vaughan, S C
2001-01-01
No need exists, it is argued, for a new psychoanalytic theory of homosexuality. Certainly psychoanalysis should not be expected to generate such a theory using its own methodology alone. The preoccupation with producing such a theory avoids more important questions about psychoanalytic theory building raised by an examination of the long relationship between psychoanalysis and homosexuality. These questions concern the problems related to using psychoanalytic methodology (1) to construct categories (including the categories normal and abnormal), (2) to construct causal theory (the problems include the limitations of psychoanalytic developmental theory and a long-standing confusion between psychoanalytic developmental theory, psychoanalytic genetic reconstruction, and psychodynamics), and (3) to identify "bedrock." Finally, the question is addressed of what might be needed that is new in the psychoanalytic approach to homosexuality.
Estimating macroporosity in a forest watershed by use of a tension infiltrometer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watson, K.W.; Luxmoore, R.J.
The ability to obtain sufficient field hydrologic data at reasonable cost can be an important limiting factor in applying transport models. A procedure is described for using ponded-flow- and tension-infiltration measurements to calculate transport parameters in a forest watershed. Thirty infiltration measurements were taken under ponded-flow conditions and at 3, 6, and 15 cm (H2O) tension. It was assumed from capillarity theory that pores > 0.1-, 0.05-, and 0.02-cm diam, respectively, were excluded from the transport process during the tension infiltration measurements. Under ponded flow, 73% of the flux was conducted through macropores (i.e., pores > 0.1-cm diam.). An estimated 96% of the water flux was transmitted through only 0.32% of the soil volume. In general, the larger the total water flux the larger the macropore contribution to total water flux. The Shapiro-Wilk normality test indicated that water flux through both matrix pore space and macropores was log-normally distributed in space.
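The pore-size cutoffs quoted above follow from capillarity theory; for water at room temperature (surface tension about 72.8 mN/m, contact angle taken as zero) the largest water-filled pore diameter d at supply tension h is approximately

    d = \frac{4\sigma\cos\theta}{\rho g h} \approx \frac{0.3}{h}\ \mathrm{cm}
    \qquad (h\ \text{in cm of water}),

so tensions of 3, 6 and 15 cm exclude pores larger than roughly 0.1, 0.05 and 0.02 cm in diameter, consistent with the values assumed in the study.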
Mustillo, Sarah A; Hendrix, Kimber L; Schafer, Markus H
2012-03-01
As a stigmatizing condition, obesity may lead to the internalization of devalued labels and threats to self-concept. Modified labeling theory suggests that the effects of stigma may outlive direct manifestations of the discredited characteristic itself. This article considers whether obesity's effects on self-concept linger when obese youth enter the normal body mass range. Using longitudinal data from the National Growth and Health Study on 2,206 black and white girls, we estimated a parallel-process growth mixture model of body mass linked to growth models of body image discrepancy and self-esteem. We found that discrepancy was higher and self-esteem lower in formerly obese girls compared to girls always in the normal range and comparable to chronically obese girls. Neither body image discrepancy nor self-esteem rebounded in white girls despite reduction in body mass, suggesting that the effects of stigma linger. Self-esteem, but not discrepancy, did rebound in black girls.
Markevych, Vladlena; Asbjørnsen, Arve E; Lind, Ola; Plante, Elena; Cone, Barbara
2011-07-01
The present study investigated a possible connection between speech processing and cochlear function. Twenty-two subjects aged 18 to 39, balanced for gender, with normal hearing and without any known neurological condition, were tested with the dichotic listening (DL) test, in which listeners were asked to identify CV syllables in a nonforced condition as well as attention-right and attention-left conditions. Transient evoked otoacoustic emissions (TEOAEs) were recorded for both ears, with and without the presentation of contralateral broadband noise. The main finding was a strong negative correlation between language laterality as measured with the dichotic listening task and that of the TEOAE responses. The findings support a hypothesis of shared variance between central and peripheral auditory lateralities, and contribute to the attentional theory of auditory lateralization. The results have implications for the understanding of the cortico-fugal efferent control of cochlear activity. 2011 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong
2017-06-01
Effective application of carbon capture, utilization and storage (CCUS) systems could help to alleviate the influence of climate change by reducing carbon dioxide (CO2) emissions. The research objective of this study is to develop an equilibrium chance-constrained programming model with bi-random variables (ECCP model) for supporting the CCUS management system under random circumstances. The major advantage of the ECCP model is that it tackles random variables as bi-random variables with a normal distribution, where the mean values follow a normal distribution. This could avoid irrational assumptions and oversimplifications in the process of parameter design and enrich the theory of stochastic optimization. The ECCP model is solved by an equilibrium chance-constrained programming algorithm, which provides convenience for decision makers to rank the solution set using the natural order of real numbers. The ECCP model is applied to a CCUS management problem, and the solutions could be useful in helping managers to design and generate rational CO2-allocation patterns under complexities and uncertainties.
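To make the chance-constraint idea concrete, a minimal sketch of the deterministic equivalent of a single bi-random constraint is given below; the parameter names and numbers are illustrative assumptions, not the paper's model:

    import numpy as np
    from scipy.stats import norm

    def chance_constraint_rhs(m, sigma, tau, alpha):
        # Deterministic right-hand side for Pr(a.x <= b) >= alpha when
        # b ~ Normal(mu, sigma^2) and mu itself ~ Normal(m, tau^2) (bi-random),
        # so that marginally b ~ Normal(m, sigma^2 + tau^2).
        s = np.sqrt(sigma**2 + tau**2)
        return m + s * norm.ppf(1.0 - alpha)

    # Example: an emission cap b with m=100, sigma=5, tau=3, to hold with probability 0.95;
    # the allocation a.x must stay below the printed value.
    print(chance_constraint_rhs(100.0, 5.0, 3.0, 0.95))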
Surface recrystallization theory of the wear of copper in liquid methane
NASA Technical Reports Server (NTRS)
Bill, R. C.; Wisander, D. W.
1974-01-01
Copper was subjected to sliding against 440C in liquid methane. The normal load range was from 1/4 to 2 kilograms, and the sliding velocity range was from 3.1 to 25 meters per second. Over this range of experimental parameters, the wear rate of the copper rider was found to be proportional to the sliding velocity squared and to the normal load. Transmission electron microscopy was used to study the dislocation structure in the copper very near the wear scar surface. It was found that near the wear scar surface, the microstructure was characterized by a fine-cell recrystallized zone in which individual dislocations could be distinguished in the cell walls. The interiors of the cells, about 0.5 micrometer in diameter, were nearly dislocation free. Below the recrystallized layer was a zone that was intensely cold worked by the friction process. With increasing depth, this intensely cold worked zone gradually became indistinguishable from the partially cold worked bulk of the copper, representative of the initial condition of the material.
HIV-positive mothers and stigma.
Ingram, D; Hutchinson, S A
1999-01-01
Our purpose in this paper is to demonstrate how stigma pervades the lives of human immunodeficiency virus (HIV)-positive mothers and their children. Data from a grounded theory study on HIV-positive mothers are used to illustrate Goffman's theory of stigma. This research is an example of "emergent fit," where extant theory is discovered by the interpretive researchers to fit much of the data. The sample included 18 HIV-positive mothers who participated in in-depth interviews. The HIV-positive mothers valued being perceived as normal but acknowledged that normalcy was lost for them because of the stigma of HIV. Consequently, they tried to pass as normal by managing information and manipulating their environment. They attempted to cover up their illness by lying and pretending. Health care professionals can provide quality, client-centered care when they understand the power that stigma holds over these women and the strategies that effectively mitigate the stigma.
A continuum theory of grain size evolution and damage
NASA Astrophysics Data System (ADS)
Ricard, Y.; Bercovici, D.
2009-01-01
Lithospheric shear localization, as occurs in the formation of tectonic plate boundaries, is often associated with diminished grain size (e.g., mylonites). Grain size reduction is typically attributed to dynamic recrystallization; however, theoretical models of shear localization arising from this hypothesis are problematic because (1) they require the simultaneous action of two creep mechanisms (diffusion and dislocation creep) that occur in different deformation regimes (i.e., in grain size stress space) and (2) the grain growth ("healing") laws employed by these models are derived from normal grain growth or coarsening theory, which are valid in the absence of deformation, although the shear localization setting itself requires deformation. Here we present a new first principles grained-continuum theory, which accounts for both coarsening and damage-induced grain size reduction in a monomineralic assemblage undergoing irrecoverable deformation. Damage per se is the generic process for generation of microcracks, defects, dislocations (including recrystallization), subgrains, nuclei, and cataclastic breakdown of grains. The theory contains coupled macroscopic continuum mechanical and grain-scale statistical components. The continuum level of the theory considers standard mass, momentum, and energy conservation, as well as entropy production, on a statistically averaged grained continuum. The grain-scale element of the theory describes both the evolution of the grain size distribution and mechanisms for both continuous grain growth and discontinuous grain fracture and coalescence. The continuous and discontinuous processes of grain size variation are prescribed by nonequilibrium thermodynamics (in particular, the treatment of entropy production provides the phenomenological laws for grain growth and reduction); grain size evolution thus incorporates the free energy differences between grains, including both grain boundary surface energy (which controls coarsening) and the contribution of deformational work to these free energies (which controls damage). In the absence of deformation, only two mechanisms that increase the average grain size are allowed by the second law of thermodynamics. One mechanism, involving continuous diffusive mass transport from small to large grains, captures the essential components of normal grain growth theories of Lifshitz-Slyozov and Hillert. The second mechanism involves the aggregation of grains and is described using a Smoluchowski formalism. With the inclusion of deformational work and damage, the theory predicts two mechanisms for which the thermodynamic requirement of entropy positivity always forces large grains to shrink and small ones to grow. The first such damage-driven mechanism involving continuous mass transfer from large to small grains tends to homogenize the distribution of grain size toward its initial mean grain size. The second damage mechanism favors the creation of small grains by discontinuous division of larger grains and reduces the mean grain size with time. When considered separately, most of these mechanisms allow for self-similar grain size distributions whose scales (i.e., statistical moments such as the mean, variance, and skewness) can all be described by a single grain scale, such as the mean or maximum.
However, the combination of mechanisms, e.g., one that captures the competition between continuous coarsening and mean grain size reduction by breakage, does not generally permit a self-similar solution for the grain size distribution, which contradicts the classic assumption that grain growth laws allowing for both coarsening and recrystallization can be treated with a single grain scale such as the mean size.
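The competition the theory describes can be caricatured with a toy mean-grain-size equation; this is not the paper's formulation, only an illustrative sketch in which a coarsening term grows the mean size while a damage term shrinks it:

    import numpy as np

    def evolve_mean_grain_size(a0, G, p, D, dt, steps):
        # Toy model only: da/dt = G/(p*a**(p-1)) - D*a, where the first term mimics
        # normal coarsening (growth slows as grains get larger) and the second mimics
        # damage-driven size reduction with rate constant D.
        a = a0
        history = [a]
        for _ in range(steps):
            a += dt * (G / (p * a**(p - 1)) - D * a)
            history.append(a)
        return np.array(history)

    # With D=0 the mean size grows without bound; with damage switched on it relaxes
    # toward a steady size where coarsening and breakage balance.
    print(evolve_mean_grain_size(1.0, G=1.0, p=2, D=0.3, dt=0.01, steps=5000)[-1])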
ERIC Educational Resources Information Center
Golledge, Reginald G.
1996-01-01
Discusses the origin of theories in geography and particularly the development of location theories. Considers the influence of economic theory on agricultural land use, industrial location, and geographic location theories. Explores a set of interrelated activities that show how the marketing process illustrates process theory. (MJP)
Relaxation approximation in the theory of shear turbulence
NASA Technical Reports Server (NTRS)
Rubinstein, Robert
1995-01-01
Leslie's perturbative treatment of the direct interaction approximation for shear turbulence (Modern Developments in the Theory of Turbulence, 1972) is applied to derive a time dependent model for the Reynolds stresses. The stresses are decomposed into tensor components which satisfy coupled linear relaxation equations; the present theory therefore differs from phenomenological Reynolds stress closures in which the time derivatives of the stresses are expressed in terms of the stresses themselves. The theory accounts naturally for the time dependence of the Reynolds normal stress ratios in simple shear flow. The distortion of wavenumber space by the mean shear plays a crucial role in this theory.
A {3,2}-Order Bending Theory for Laminated Composite and Sandwich Beams
NASA Technical Reports Server (NTRS)
Cook, Geoffrey M.; Tessler, Alexander
1998-01-01
A higher-order bending theory is derived for laminated composite and sandwich beams thus extending the recent {1,2}-order theory to include third-order axial effect without introducing additional kinematic variables. The present theory is of order {3,2} and includes both transverse shear and transverse normal deformations. A closed-form solution to the cylindrical bending problem is derived and compared with the corresponding exact elasticity solution. The numerical comparisons are focused on the most challenging material systems and beam aspect ratios which include moderate-to-thick unsymmetric composite and sandwich laminates. Advantages and limitations of the theory are discussed.
Generalized interferometry - I: theory for interstation correlations
NASA Astrophysics Data System (ADS)
Fichtner, Andreas; Stehly, Laurent; Ermert, Laura; Boehm, Christian
2017-02-01
We develop a general theory for interferometry by correlation that (i) properly accounts for heterogeneously distributed sources of continuous or transient nature, (ii) fully incorporates any type of linear and nonlinear processing, such as one-bit normalization, spectral whitening and phase-weighted stacking, (iii) operates for any type of medium, including 3-D elastic, heterogeneous and attenuating media, (iv) enables the exploitation of complete correlation waveforms, including seemingly unphysical arrivals, and (v) unifies the earthquake-based two-station method and ambient noise correlations. Our central theme is not to equate interferometry with Green function retrieval, and to extract information directly from processed interstation correlations, regardless of their relation to the Green function. We demonstrate that processing transforms the actual wavefield sources and actual wave propagation physics into effective sources and effective wave propagation. This transformation is uniquely determined by the processing applied to the observed data, and can be easily computed. The effective forward model, that links effective sources and propagation to synthetic interstation correlations, may not be perfect. A forward modelling error, induced by processing, describes the extent to which processed correlations can actually be interpreted as proper correlations, that is, as resulting from some effective source and some effective wave propagation. The magnitude of the forward modelling error is controlled by the processing scheme and the temporal variability of the sources. Applying adjoint techniques to the effective forward model, we derive finite-frequency Fréchet kernels for the sources of the wavefield and Earth structure, that should be inverted jointly. The structure kernels depend on the sources of the wavefield and the processing scheme applied to the raw data. Therefore, both must be taken into account correctly in order to make accurate inferences on Earth structure. Not making any restrictive assumptions on the nature of the wavefield sources, our theory can be applied to earthquake and ambient noise data, either separately or combined. This allows us (i) to locate earthquakes using interstation correlations and without knowledge of the origin time, (ii) to unify the earthquake-based two-station method and noise correlations without the need to exclude either of the two data types, and (iii) to eliminate the requirement to remove earthquake signals from noise recordings prior to the computation of correlation functions. In addition to the basic theory for acoustic wavefields, we present numerical examples for 2-D media, an extension to the most general viscoelastic case, and a method for the design of optimal processing schemes that eliminate the forward modelling error completely. This work is intended to provide a comprehensive theoretical foundation of full-waveform interferometry by correlation, and to suggest improvements to current passive monitoring methods.
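Two of the processing steps named above, one-bit normalization and spectral whitening, are simple to state in code. A minimal numpy sketch follows; the smoothing window and function names are illustrative choices, not the authors' implementation:

    import numpy as np

    def one_bit(trace):
        # One-bit normalization: keep only the sign of the trace.
        return np.sign(trace)

    def spectral_whitening(trace, smooth_bins=20):
        # Divide the spectrum by a smoothed version of its own amplitude,
        # flattening the amplitude spectrum while keeping the phase.
        spec = np.fft.rfft(trace)
        amp = np.abs(spec)
        kernel = np.ones(smooth_bins) / smooth_bins
        smooth_amp = np.convolve(amp, kernel, mode="same")
        smooth_amp[smooth_amp == 0] = 1.0  # avoid division by zero
        return np.fft.irfft(spec / smooth_amp, n=len(trace))

    def correlate(u1, u2):
        # Basic interstation correlation of two processed traces.
        return np.correlate(u1, u2, mode="full")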
Languages and Lives through a Critical Eye: The Case of Estonia
ERIC Educational Resources Information Center
Skerrett, Delaney Michael
2011-01-01
This article seeks to situate Estonian language use and policy within the emerging field of critical language policy and planning (LPP). Critical LPP draws on poststructuralist theory to deconstruct normalized categories that maintain systems of inequality. It is akin to the queer theory project for gender and sexuality. Since the country regained…
Theory of Mind in Williams Syndrome Assessed Using a Nonverbal Task
ERIC Educational Resources Information Center
Porter, Melanie A.; Coltheart, Max; Langdon, Robyn
2008-01-01
This study examined Theory of Mind in Williams syndrome (WS) and in normal chronological age-matched and mental age-matched control groups, using a picture sequencing task. This task assesses understanding of pretence, intention and false belief, while controlling for social-script knowledge and physical cause-and-effect reasoning. The task was…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yimin; Miller, William H.
2006-02-22
One of the outstanding issues in the quantum instanton (QI) theory (or any transition state-type theory) for thermal rate constants of chemical reactions is the choice of an appropriate "dividing surface" (DS) that separates reactants and products. (In the general version of the QI theory, there are actually two dividing surfaces involved.) This paper shows one simple and general way for choosing DS's for use in QI theory, namely using the family of (hyper) planes normal to the minimum energy path (MEP) on the potential energy surface at various distances s along it. Here the reaction coordinate is not one of the dynamical coordinates of the system (which will in general be the Cartesian coordinates of the atoms), but rather simply a parameter which specifies the DS. It is also shown how this idea can be implemented for an N-atom system in 3d space in a way that preserves overall translational and rotational invariance. Numerical application to a simple system (the collinear H + H2 reaction) is presented to illustrate the procedure.
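A sketch of the geometric construction, assuming the MEP is supplied as a discretized array of configurations (an illustrative implementation, not the paper's code):

    import numpy as np

    def dividing_surface(mep_points, i):
        # Return (point, unit normal) of the hyperplane normal to the MEP at node i.
        # mep_points: (n_nodes, n_dof) array of configurations along the path; the
        # plane's normal is taken as the local path tangent by central differences.
        lo, hi = max(i - 1, 0), min(i + 1, len(mep_points) - 1)
        tangent = mep_points[hi] - mep_points[lo]
        return mep_points[i], tangent / np.linalg.norm(tangent)

    def side_of_surface(x, point, normal):
        # Negative on one side (reactants), positive on the other (products);
        # the sign convention is an assumption of this sketch.
        return np.dot(x - point, normal)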
Local phase space and edge modes for diffeomorphism-invariant theories
NASA Astrophysics Data System (ADS)
Speranza, Antony J.
2018-02-01
We discuss an approach to characterizing local degrees of freedom of a subregion in diffeomorphism-invariant theories using the extended phase space of Donnelly and Freidel [36]. Such a characterization is important for defining local observables and entanglement entropy in gravitational theories. Traditional phase space constructions for subregions are not invariant with respect to diffeomorphisms that act at the boundary. The extended phase space remedies this problem by introducing edge mode fields at the boundary whose transformations under diffeomorphisms render the extended symplectic structure fully gauge invariant. In this work, we present a general construction for the edge mode symplectic structure. We show that the new fields satisfy a surface symmetry algebra generated by the Noether charges associated with the edge mode fields. For surface-preserving symmetries, the algebra is universal for all diffeomorphism-invariant theories, comprised of diffeomorphisms of the boundary, SL(2, ℝ) transformations of the normal plane, and, in some cases, normal shearing transformations. We also show that if boundary conditions are chosen such that surface translations are symmetries, the algebra acquires a central extension.
Theories of transporting processes of Cu in Jiaozhou Bay
NASA Astrophysics Data System (ADS)
Yang, Dongfang; Su, Chunhua; Zhu, Sixi; Wu, Yunjie; Zhou, Wei
2018-02-01
Many marine bays have become polluted with the rapid growth of industry and population, and understanding the transport processes of pollutants is essential to pollution control. To better understand how pollutants are transported in marine waters, this paper presents a comprehensive study of the theories describing the transport of Cu in Jiaozhou Bay. Results showed that the transport of Cu in this bay can be summarized by seven key theories: homogeneous theory, environmental dynamic theory, horizontal loss theory, source-to-waters transport theory, sedimentation transport theory, migration trend theory and vertical transport theory. These theories help to better understand the migration of pollutants in marine bays.
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.; Zeiler, Thomas A.; Perry, Boyd, III
1989-01-01
This paper describes and illustrates two ways of performing time-correlated gust-load calculations. The first is based on Matched Filter Theory; the second on Random Process Theory. Both approaches yield theoretically identical results and represent novel applications of the theories, are computationally fast, and may be applied to other dynamic-response problems. A theoretical development and example calculations using both Matched Filter Theory and Random Process Theory approaches are presented.
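The matched-filter idea can be illustrated with a minimal discrete-time sketch: the critical gust profile is taken as the time-reversed, unit-energy impulse response of the load quantity, and the time-correlated load history follows by convolution. The impulse response below is hypothetical, not an aircraft model from the paper:

    import numpy as np

    def matched_gust_and_load(h, dt):
        # h: sampled impulse response of a load quantity to a unit gust input.
        # Returns the matched (worst-case, unit-energy) gust excitation and the
        # corresponding time-correlated load history.
        energy = np.sqrt(np.sum(h**2) * dt)
        gust = h[::-1] / energy            # time-reversed, normalized impulse response
        load = np.convolve(gust, h) * dt   # load response to the matched excitation
        return gust, load

    # hypothetical lightly damped impulse response, for illustration only
    t = np.arange(0, 10, 0.01)
    h = np.exp(-0.3 * t) * np.sin(2 * np.pi * 1.0 * t)
    gust, load = matched_gust_and_load(h, 0.01)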
ERIC Educational Resources Information Center
Furnham, Adrian
1984-01-01
Over 200 'normal' adolescents were administered self-report measures of personality (extraversion, neuroticism, and psychoticism), social skills, anomie, and delinquency in order to establish which of three theories best predicted delinquency. Eysenck's personality factors, particularly psychoticism, correlated most highly with delinquency. (RH)
Lesbian Mothers' Bids for Normalcy in Their Children's Schools
ERIC Educational Resources Information Center
Bower, Laura A.; Klecka, Cari L.
2009-01-01
Albeit growing in number, lesbian mothers and their children remain a statistical minority in schools. Lesbian mothers in this study described their families as "normal" or "just like any other family." From the perspective of queer theory, normal is a socially constructed and insidious concept. This study analyzes both the strategies participants…
Specific Language Impairment as a Period of Extended Optional Infinitive.
ERIC Educational Resources Information Center
Rice, Mabel L.; And Others
1995-01-01
This study evaluated an Extended Optional Infinitive theory of specific language impairment (SLI) in children, which suggests that SLI children omit finiteness markers longer than do normally developing children. Comparison of 18 SLI 5-year olds with 2 normally developing groups (ages 5 and 3) found that SLI subjects omitted finiteness markers…
Li, Heheng; Luo, Liangping; Huang, Li
2011-02-01
The present paper aims to study the fractal spectrum of cerebral computerized tomography in 158 normal infants of different age groups, based on calculations from chaos theory. The distribution range in the neonatal period was 1.88-1.90 (mean = 1.8913 +/- 0.0064); it stabilized at 1.89-1.90 during 1-12 months of age (mean = 1.8927 +/- 0.0045); the normal range for infants 1-2 years old was 1.86-1.90 (mean = 1.8863 +/- 0.0085); and it remained within 1.88-1.91 (mean = 1.8958 +/- 0.0083) during 2-3 years of age. ANOVA indicated no significant difference between boys and girls (F = 0.243, P > 0.05), but the difference between age groups was significant (F = 8.947, P < 0.001). The fractal dimension of cerebral computerized tomography in normal infants, computed by box-counting methods, remained stable between 1.86 and 1.91, indicating that attractor modes exist in pediatric brain development.
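A box-counting estimate of the fractal dimension from a binarized CT slice can be sketched as follows; the threshold and the choice of dyadic box sizes are illustrative assumptions:

    import numpy as np

    def box_counting_dimension(image, threshold=0.5):
        # image: 2D array. Binarize, crop to a power-of-two square, then count
        # occupied boxes at dyadic box sizes and fit the log-log slope.
        Z = (image > threshold).astype(int)
        p = min(Z.shape)
        n = 2 ** int(np.floor(np.log2(p)))
        Z = Z[:n, :n]
        sizes = 2 ** np.arange(int(np.log2(n)) - 1, 0, -1)  # box edge lengths
        counts = []
        for s in sizes:
            S = np.add.reduceat(np.add.reduceat(Z, np.arange(0, n, s), axis=0),
                                np.arange(0, n, s), axis=1)
            counts.append(np.count_nonzero(S))
        # dimension is minus the slope of log(counts) versus log(sizes)
        slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
        return -slope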
NASA Astrophysics Data System (ADS)
Dykeman, Eric C.; Sankey, Otto F.
2010-02-01
We describe a technique for calculating the low-frequency mechanical modes and frequencies of a large symmetric biological molecule where the eigenvectors of the Hessian matrix are determined with full atomic detail. The method, which follows order N methods used in electronic structure theory, determines the subset of lowest-frequency modes while using group theory to reduce the complexity of the problem. We apply the method to three icosahedral viruses of various T numbers and sizes; the human viruses polio and hepatitis B, and the cowpea chlorotic mottle virus, a plant virus. From the normal-mode eigenvectors, we use a bond polarizability model to predict a low-frequency Raman scattering profile for the viruses. The full atomic detail in the displacement patterns combined with an empirical potential-energy model allows a comparison of the fully atomic normal modes with elastic network models and normal-mode analysis with only dihedral degrees of freedom. We find that coarse-graining normal-mode analysis (particularly the elastic network model) can predict the displacement patterns for the first few (˜10) low-frequency modes that are global and cooperative.
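The core numerical step, diagonalizing a mass-weighted Hessian to obtain normal-mode frequencies and displacement patterns, looks roughly like the sketch below; the order-N and group-theoretic reductions described in the abstract are not shown:

    import numpy as np
    from scipy.linalg import eigh

    def normal_modes(hessian, masses, n_lowest=10):
        # hessian: (3N, 3N) matrix of second derivatives of the potential energy;
        # masses: length-N array. Returns the lowest angular frequencies and the
        # corresponding mass-weighted displacement patterns (columns).
        m = np.repeat(masses, 3)
        minv_sqrt = 1.0 / np.sqrt(m)
        mw_hessian = hessian * np.outer(minv_sqrt, minv_sqrt)
        evals, evecs = eigh(mw_hessian)
        # clip small negative eigenvalues (translations/rotations, numerical noise)
        evals = np.clip(evals, 0.0, None)
        omegas = np.sqrt(evals)
        return omegas[:n_lowest], evecs[:, :n_lowest]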
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu Jialu; Yang Chunnuan; Cai Hao
2007-04-15
After finding the basic solutions of the linearized nonlinear Schroedinger equation by the method of separation of variables, the perturbation theory for the dark soliton solution is constructed by linear Green's function theory. In application to the self-induced Raman scattering, the adiabatic corrections to the soliton's parameters are obtained and the remaining correction term is given as a pure integral with respect to the continuous spectral parameter.
Kovalev, Vadim M; Tse, Wang-Kong
2017-11-22
We develop a microscopic theory for the relaxation dynamics of an optically pumped two-level system (TLS) coupled to a bath of weakly interacting Bose gas. Using Keldysh formalism and diagrammatic perturbation theory, expressions for the relaxation times of the TLS Rabi oscillations are derived when the boson bath is in the normal state and the Bose-Einstein condensate (BEC) state. We apply our general theory to consider an irradiated quantum dot coupled with a boson bath consisting of a two-dimensional dipolar exciton gas. When the bath is in the BEC regime, relaxation of the Rabi oscillations is due to both condensate and non-condensate fractions of the bath bosons for weak TLS-light coupling and predominantly due to the non-condensate fraction for strong TLS-light coupling. Our theory also shows that a phase transition of the bath from the normal to the BEC state strongly influences the relaxation rate of the TLS Rabi oscillations. The TLS relaxation rate is approximately independent of the pump field frequency and monotonically dependent on the field strength when the bath is in the low-temperature regime of the normal phase. Phase transition of the dipolar exciton gas leads to a non-monotonic dependence of the TLS relaxation rate on both the pump field frequency and field strength, providing a characteristic signature for the detection of BEC phase transition of the coupled dipolar exciton gas.
An improved plate theory of order (1,2) for thick composite laminates
NASA Technical Reports Server (NTRS)
Tessler, A.
1992-01-01
A new (1,2)-order theory is proposed for the linear elasto-static analysis of laminated composite plates. The basic assumptions are those concerning the distribution through the laminate thickness of the displacements, transverse shear strains and the transverse normal stress, with these quantities regarded as some weighted averages of their exact elasticity theory representations. The displacement expansions are linear for the inplane components and quadratic for the transverse component, whereas the transverse shear strains and transverse normal stress are respectively quadratic and cubic through the thickness. The main distinguishing feature of the theory is that all strain and stress components are expressed in terms of the assumed displacements prior to the application of a variational principle. This is accomplished by an a priori least-square compatibility requirement for the transverse strains and by requiring exact stress boundary conditions at the top and bottom plate surfaces. Equations of equilibrium and associated Poisson boundary conditions are derived from the virtual work principle. It is shown that the theory is particularly suited for finite element discretization as it requires simple C^0- and C^(-1)-continuous displacement interpolation fields. Analytic solutions for the problem of cylindrical bending are derived and compared with the exact elasticity solutions and those of our earlier (1,2)-order theory based on the assumed displacements and transverse strains.
Exploring social cognition in patients with apathy following acquired brain damage.
Njomboro, Progress; Humphreys, Glyn W; Deb, Shoumitro
2014-01-23
Research on cognition in apathy has largely focused on executive functions. To the best of our knowledge, no studies have investigated the relationship between apathy symptoms and processes involved in social cognition. Apathy symptoms include attenuated emotional behaviour, low social engagement and social withdrawal, all of which may be linked to underlying socio-cognitive deficits. We compared patients with brain damage who also had apathy symptoms against similar patients with brain damage but without apathy symptoms. Both patient groups were also compared against normal controls on key socio-cognitive measures involving moral reasoning, social awareness related to making judgements between normative and non-normative behaviour, Theory of Mind processing, and the perception of facial expressions of emotion. We also controlled for the likely effects of executive deficits and depressive symptoms on these comparisons. Our results indicated that patients with apathy were distinctively impaired in making moral reasoning decisions and in judging the social appropriateness of behaviour. Deficits in Theory of Mind and perception of facial expressions of emotion did not distinguish patients with apathy from those without apathy. Our findings point to a possible socio-cognitive profile for apathy symptoms and provide initial insights into how socio-cognitive deficits in patients with apathy may affect social functioning.
From Hayflick to Walford: the role of T cell replicative senescence in human aging.
Effros, Rita B
2004-06-01
The immunologic theory of aging, proposed more than 40 years ago by Roy Walford, suggests that the normal process of aging in man and in animals is pathogenetically related to faulty immunological processes. Since that time, research on immunological aging has undergone extraordinary expansion, leading to new information in areas spanning from molecular biology and cell signaling to large-scale clinical studies. Investigation in this area has also provided unexpected insights into HIV disease, many aspects of which represent accelerated immunological aging. This article describes the initial insights and vision of Roy Walford into one particular facet of human immunological aging, namely, the potential relevance of the well-studied human fibroblast replicative senescence model, initially developed by Leonard Hayflick, to cells of the immune system. Extensive research on T cell senescence in cell culture has now documented changes in vitro that closely mirror alterations occurring during in vivo aging in humans, underscoring the biological significance of T cell replicative senescence. Moreover, the inclusion of high proportions of putatively senescent T cells in the 'immune risk phenotype' that is associated with early mortality in octogenarians provides initial clinical confirmation of both the immunologic theory of aging and the role of the T cell Hayflick Limit in human aging, two areas of gerontological research pioneered by Roy Walford.
Thermalization time scales for WIMP capture by the Sun in effective theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widmark, A., E-mail: axel.widmark@fysik.su.se
I study the process of dark matter capture by the Sun, under the assumption of a Weakly Interacting Massive Particle (WIMP), in the framework of non-relativistic effective field theory. Hypothetically, WIMPs from the galactic halo can scatter against atomic nuclei in the solar interior, settle to thermal equilibrium with the solar core and annihilate to produce an observable flux of neutrinos. In particular, I examine the thermalization process using Monte-Carlo integration of WIMP trajectories. I consider WIMPs in a mass range of 10–1000 GeV and WIMP-nucleon interaction operators with different dependence on spin and transferred momentum. I find that the density profiles of captured WIMPs are in accordance with a thermal profile described by the Sun's gravitational potential and core temperature. Depending on the operator that governs the interaction, the majority of the thermalization time is spent in either the solar interior or exterior. If normalizing the WIMP-nuclei interaction strength to a specific capture rate, I find that the thermalization time differs at most by 3 orders of magnitude between operators. In most cases of interest, the thermalization time is many orders of magnitude shorter than the age of the solar system.
NASA Astrophysics Data System (ADS)
Kohno, M.
2018-03-01
Adopting hyperon-nucleon and hyperon-nucleon-nucleon interactions parametrized in chiral effective field theory, single-particle potentials of the Λ and Σ hyperons are evaluated in symmetric nuclear matter and in pure neutron matter within the framework of lowest-order Brueckner theory. The chiral NLO interaction bears strong ΛN-ΣN coupling. Although the Λ potential is repulsive if the coupling is switched off, the ΛN-ΣN correlation brings about the attraction consistent with empirical data. The Σ potential is repulsive, which is also consistent with empirical information. The interesting result is that the Λ potential becomes shallower beyond normal density. This provides the possibility of solving the hyperon puzzle without introducing ad hoc assumptions. The effects of the ΛNN-ΛNN and ΛNN-ΣNN three-baryon forces are considered. These three-baryon forces are first reduced to normal-ordered effective two-baryon interactions in nuclear matter and then incorporated in the G-matrix equation. The repulsion from the ΛNN-ΛNN interaction is of the order of 5 MeV at normal density and becomes larger with increasing density. The effects of the ΛNN-ΣNN coupling compensate the repulsion at normal density. The net effect of the three-baryon interactions on the Λ single-particle potential is repulsive at higher densities.
Stasenko, Alena; Bonn, Cory; Teghipco, Alex; Garcea, Frank E; Sweet, Catherine; Dombovy, Mary; McDonough, Joyce; Mahon, Bradford Z
2015-01-01
The debate about the causal role of the motor system in speech perception has been reignited by demonstrations that motor processes are engaged during the processing of speech sounds. Here, we evaluate which aspects of auditory speech processing are affected, and which are not, in a stroke patient with dysfunction of the speech motor system. We found that the patient showed a normal phonemic categorical boundary when discriminating two non-words that differ by a minimal pair (e.g., ADA-AGA). However, using the same stimuli, the patient was unable to identify or label the non-word stimuli (using a button-press response). A control task showed that he could identify speech sounds by speaker gender, ruling out a general labelling impairment. These data suggest that while the motor system is not causally involved in perception of the speech signal, it may be used when other cues (e.g., meaning, context) are not available.
Counting abilities in autism: possible implications for central coherence theory.
Jarrold, C; Russell, J
1997-02-01
We examined the claim that children with autism have a "weak drive for central coherence" which biases them towards processing information at an analytic rather than global level. This was done by investigating whether children with autism would rapidly and automatically enumerate a number of dots presented in a canonical form, or count each dot individually to obtain the total. The time taken to count stimuli was compared across three participant groups: children with autism, children with moderate learning difficulties, and normally developing children. There were 22 children in each group, and individuals were matched across groups on the basis of verbal mental age. Results implied that children with autism did show a tendency towards an analytic level of processing. However, though the groups differed on measures of counting speeds, the number of children showing patterns of global or analytic processing did not differ significantly across the groups. Whether these results implicate a weak drive for central coherence in autism, which is both specific to, and pervasive in, the disorder, is discussed.
Zhang, Hua; Wang, Chen; Sun, Han-Lei; Fu, Gang; Chen, Shu; Zhang, Yue-Jiao; Chen, Bing-Hui; Anema, Jason R.; Yang, Zhi-Lin; Li, Jian-Feng; Tian, Zhong-Qun
2017-01-01
Surface molecular information acquired in situ from a catalytic process can greatly promote the rational design of highly efficient catalysts by revealing structure-activity relationships and reaction mechanisms. Raman spectroscopy can provide this rich structural information, but normal Raman is not sensitive enough to detect trace active species adsorbed on the surface of catalysts. Here we develop a general method for in situ monitoring of heterogeneous catalytic processes through shell-isolated nanoparticle-enhanced Raman spectroscopy (SHINERS) satellite nanocomposites (Au-core silica-shell nanocatalyst-satellite structures), which are stable and have extremely high surface Raman sensitivity. By combining operando SHINERS with density functional theory calculations, we identify the working mechanisms for CO oxidation over PtFe and Pd nanocatalysts, which are typical low- and high-temperature catalysts, respectively. Active species, such as surface oxides, superoxide/peroxide species and Pd–C/Pt–C bonds are directly observed during the reactions. We demonstrate that in situ SHINERS can provide a deep understanding of the fundamental concepts of catalysis. PMID:28537269
The Development of Early Pulsation Theory, or, How Cepheids Are Like Steam Engines
NASA Astrophysics Data System (ADS)
Stanley, M.
2012-06-01
The pulsation theory of Cepheid variable stars was a major breakthrough of early twentieth-century astrophysics. At the beginning of that century, the basic physics of normal stars was very poorly understood, and variable stars were even more mysterious. Breaking with accepted explanations in terms of eclipsing binaries, Harlow Shapley and A. S. Eddington pioneered novel theories that considered Cepheids as pulsating spheres of gas. Surprisingly, the pulsation theory not only depended on novel developments in stellar physics, but the theory also drove many of those developments. In particular, models of stars in radiative balance and theories of stellar energy were heavily inspired and shaped by ideas about variable stars. Further, the success of the pulsation theory helped justify the new approaches to astrophysics being developed before World War II.
Estimating outflow facility through pressure dependent pathways of the human eye
Gardiner, Bruce S.
2017-01-01
We develop and test a new theory for pressure dependent outflow from the eye. The theory comprises three main parameters: (i) a constant hydraulic conductivity, (ii) an exponential decay constant and (iii) a no-flow intraocular pressure, from which the total pressure dependent outflow, average outflow facilities and local outflow facilities for the whole eye may be evaluated. We use a new notation to specify precisely the meaning of model parameters and so model outputs. Drawing on a range of published data, we apply the theory to animal eyes, enucleated eyes and in vivo human eyes, and demonstrate how to evaluate model parameters. It is shown that the theory can fit high quality experimental data remarkably well. The new theory predicts that outflow facilities and total pressure dependent outflow for the whole eye are more than twice as large as estimates based on the Goldman equation and fluorometric analysis of anterior aqueous outflow. It appears likely that this discrepancy can be largely explained by pseudofacility and aqueous flow through the retinal pigmented epithelium, while any residual discrepancy may be due to pathological processes in aged eyes. The model predicts that if the hydraulic conductivity is too small, or the exponential decay constant is too large, then intraocular eye pressure may become unstable when subjected to normal circadian changes in aqueous production. The model also predicts relationships between variables that may be helpful when planning future experiments, and the model generates many novel testable hypotheses. With additional research, the analysis described here may find application in the differential diagnosis, prognosis and monitoring of glaucoma. PMID:29261696
Meyer, Rebecca A; Fish, Anne F; Lou, Qinqing
2017-10-01
This article describes the Hage framework for theory construction and its application to the clinical problem of glycemic control in college-aged students with type 1 diabetes. College-aged students with type 1 diabetes struggle to self-manage their condition. Glycated hemoglobin (HbA1c), if controlled within acceptable limits (6-8%), is associated with the prevention or delay of serious diabetic complications such as kidney and cardiovascular disease. Diabetes educators provide knowledge and skills, but young adults must self-manage their condition on a daily basis, independent of parents. The Hage framework includes five tasks of theory construction: narrowing and naming the concepts, specifying the definitions, creating the theoretical statements, specifying the linkages, and ordering components in preparation for model building. During the process, concepts within the theory were revised as the literature was reviewed, and measures and hypotheses, foundational to research, were generated. We were successful in applying the framework and creating a model of factors affecting glycemic control, emphasizing that physical activity, thought of as a normal part of wellness, can be a two-edged sword producing positive effect but also serious negative effects in some college-aged students with type 1 diabetes. Contextual factors important to self-management in college-aged students are emphasized. The Hage framework, already used to a small extent in nursing curricula, deserves more attention and, because of its generic nature, may be used as a template for theory construction to examine a wide variety of nursing topics. Copyright © 2017 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Abele, Stephan
2018-01-01
This article deals with a theory-based investigation of the diagnostic problem-solving process in professional contexts. To begin with, a theory of the diagnostic problem-solving process was developed drawing on findings from different professional contexts. The theory distinguishes between four sub-processes of the diagnostic problem-solving…
NASA Astrophysics Data System (ADS)
Hu, YanChao; Bi, WeiTao; Li, ShiYao; She, ZhenSu
2017-12-01
A challenge in the study of turbulent boundary layers (TBLs) is to understand the non-equilibrium relaxation process after separation and reattachment due to shock-wave/boundary-layer interaction. The classical boundary layer theory cannot deal with the strong adverse pressure gradient, and hence, the computational modeling of this process remains inaccurate. Here, we report the direct numerical simulation results of the relaxation TBL behind a compression ramp, which reveal the presence of intense large-scale eddies, with significantly enhanced Reynolds stress and turbulent heat flux. A crucial finding is that the wall-normal profiles of the excess Reynolds stress and turbulent heat flux obey a β-distribution, which is a product of two power laws with respect to the wall-normal distances from the wall and from the boundary layer edge. In addition, the streamwise decays of the excess Reynolds stress and turbulent heat flux also exhibit power laws with respect to the streamwise distance from the corner of the compression ramp. These results suggest that the relaxation TBL obeys the dilation symmetry, which is a specific form of self-organization in this complex non-equilibrium flow. The β-distribution yields important hints for the development of a turbulence model.
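The quoted wall-normal shape, a product of two power laws in the distances from the wall and from the boundary-layer edge, can be written and fitted as in the sketch below; the symbol names and starting guesses are illustrative assumptions:

    import numpy as np
    from scipy.optimize import curve_fit

    def beta_profile(eta, amplitude, a, b):
        # Excess Reynolds stress or heat flux shape: A * eta**a * (1 - eta)**b,
        # with eta = y/delta the normalized wall-normal coordinate in (0, 1).
        return amplitude * eta**a * (1.0 - eta)**b

    def fit_beta_profile(eta, excess):
        # Fit (A, a, b) to a measured excess profile; returns parameters and covariance.
        p0 = (np.max(excess), 1.0, 1.0)
        return curve_fit(beta_profile, eta, excess, p0=p0)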
Russo, Michael A.; Högenauer, Christoph; Coates, Stephen W.; Santa Ana, Carol A.; Porter, Jack L.; Rosenblatt, Randall L.; Emmett, Michael; Fordtran, John S.
2003-01-01
Due to genetic defects in apical membrane chloride channels, the cystic fibrosis (CF) intestine does not secrete chloride normally. Depressed chloride secretion leaves CF intestinal absorptive processes unopposed, which results in net fluid hyperabsorption, dehydration of intestinal contents, and a propensity to inspissated intestinal obstruction. This theory is based primarily on in vitro studies of jejunal mucosa. To determine if CF patients actually hyperabsorb fluid in vivo, we measured electrolyte and water absorption during steady-state perfusion of the jejunum. As expected, chloride secretion was abnormally low in CF, but surprisingly, there was no net hyperabsorption of sodium or water during perfusion of a balanced electrolyte solution. This suggested that fluid absorption processes are reduced in CF jejunum, and further studies revealed that this was due to a marked depression of passive chloride absorption. Although Na+-glucose cotransport was normal in the CF jejunum, absence of passive chloride absorption completely blocked glucose-stimulated net sodium absorption and reduced glucose-stimulated water absorption 66%. This chloride absorptive abnormality acts in physiological opposition to the classic chloride secretory defect in the CF intestine. By increasing the fluidity of intraluminal contents, absence of passive chloride absorption may reduce the incidence and severity of intestinal disease in patients with CF. PMID:12840066
Solid Rocket Motor Combustion Instability Modeling in COMSOL Multiphysics
NASA Technical Reports Server (NTRS)
Fischbach, Sean R.
2015-01-01
Combustion instability modeling of Solid Rocket Motors (SRM) remains a topic of active research. Many rockets display violent fluctuations in pressure, velocity, and temperature originating from the complex interactions between the combustion process, acoustics, and steady-state gas dynamics. Recent advances in defining the energy transport of disturbances within steady flow-fields have been applied by combustion stability modelers to improve the analysis framework [1, 2, 3]. Employing this more accurate global energy balance requires a higher fidelity model of the SRM flow-field and acoustic mode shapes. The current industry standard analysis tool utilizes a one dimensional analysis of the time dependent fluid dynamics along with a quasi-three dimensional propellant grain regression model to determine the SRM ballistics. The code then couples with another application that calculates the eigenvalues of the one dimensional homogeneous wave equation. The mean flow parameters and acoustic normal modes are coupled to evaluate the stability theory developed and popularized by Culick [4, 5]. The assumption of a linear, non-dissipative wave in a quiescent fluid remains valid while acoustic amplitudes are small and local gas velocities stay below Mach 0.2. The current study employs the COMSOL multiphysics finite element framework to model the steady flow-field parameters and acoustic normal modes of a generic SRM. The study requires one way coupling of the CFD High Mach Number Flow (HMNF) and mathematics module. The HMNF module evaluates the gas flow inside of an SRM using St. Robert's law to model the solid propellant burn rate, no slip boundary conditions, and the hybrid outflow condition. Results from the HMNF model are verified by comparing the pertinent ballistics parameters with the industry standard code outputs (i.e. pressure drop, thrust, etc.). These results are then used by the coefficient form of the mathematics module to determine the complex eigenvalues of the Acoustic Velocity Potential Equation (AVPE). The mathematics model is truncated at the nozzle sonic line, where a zero flux boundary condition is self-satisfying. The remaining boundaries are modeled with a zero flux boundary condition, assuming zero acoustic absorption on all surfaces. The results of the steady-state CFD and AVPE analyses are used to calculate the linear acoustic growth rate as is defined by Flandro and Jacob [2, 3]. In order to verify the process implemented within COMSOL we first employ the Culick theory and compare the results with the industry standard. After the process is verified, the Flandro/Jacob energy balance theory is employed and results displayed.
Tao, Zhi-Fu; Han, Zhong-Ling; Yao, Meng
2011-01-01
Exploiting the difference in dielectric constant between malignant tumor tissue and normal breast tissue, the breast tumor microwave sensor system (BRATUMASS) determines the electromagnetic characteristics of the imaged target by analyzing the back wave returned from the target tissue after near-field microwave irradiation. The key to relating the obtained target properties and reconstructing the detected space is to analyze the characteristics of the whole process from microwave transmission to back-wave reception. Using the traveling-wave method, we derive the spatial transmission properties and the relationship between the distances of the detected points, and evaluate the properties of each unit by statistical estimation theory. This chapter gives the experimental data analysis results.
Smith, Sharon T; Blanchard, Jennifer; Kools, Susan; Butler, Derrick
2017-02-01
Spirituality is important to holistic health, yet little is known about its impact on young people with HIV. To address this knowledge deficit, a grounded theory study used semi-structured interviews of 20 Christian-identified adolescent and emerging adult gay males and one perinatally infected male. This study revealed that, to cope with HIV health issues, participants used a process of reconnecting with their spirituality. In order to successfully reconnect with their spirituality, study participants reported a need to re-embrace and re-engage in spiritual practices, hold onto hope, believe they are normal, and commit to beliefs and practices despite rejection from the church.
NASA Astrophysics Data System (ADS)
Storms, Edmund
2010-10-01
The phenomenon called cold fusion has been studied for the last 21 years since its discovery by Profs. Fleischmann and Pons in 1989. The discovery was met with considerable skepticism, but supporting evidence has accumulated, plausible theories have been suggested, and research is continuing in at least eight countries. This paper provides a brief overview of the major discoveries and some of the attempts at an explanation. The evidence supports the claim that a nuclear reaction between deuterons to produce helium can occur in special materials without application of high energy. This reaction is found to produce clean energy at potentially useful levels without the harmful byproducts normally associated with a nuclear process. Various requirements of a model are examined.
Recurrent laryngeal nerve reinnervation for management of aspiration in a subset of children.
Zur, Karen B; Carroll, Linda M
2018-01-01
Pediatric aspiration is a multifactorial process that is often complex to manage. Recurrent laryngeal nerve (RLN) injury can cause glottic insufficiency and aspiration. We describe three cases of unilateral vocal fold paralysis resulting in aspiration and the successful use of the RLN reinnervation for its treatment. The theory for utilizing the reinnervation procedure is that when glottic closure improves and a less breathy vocalization occurs, then the larynx is better equipped to protect the lower airway and avoid aspiration. Our cases demonstrate stronger voice and improved swallow function, with normalization of modified barium swallow evaluation, at approximately 6-months post reinnervation. Copyright © 2017. Published by Elsevier B.V.
Estimating the number of motor units using random sums with independently thinned terms.
Müller, Samuel; Conforto, Adriana Bastos; Z'graggen, Werner J; Kaelin-Lang, Alain
2006-07-01
The problem of estimating the number of motor units N in a muscle is embedded in a general stochastic model using the notion of thinning from point process theory. In the paper a new moment-type estimator for the number of motor units in a muscle is defined, which is derived using random sums with independently thinned terms. Asymptotic normality of the estimator is shown and its practical value is demonstrated with bootstrap and approximate confidence intervals for a data set from a 31-year-old healthy, right-handed female volunteer. Moreover, simulation results are presented and Monte-Carlo based quantiles, means, and variances are calculated for N in {300, 600, 1000}.
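The paper's specific moment-type estimator is not reproduced here, but the bootstrap confidence-interval step it mentions can be sketched generically; estimate_N below is a placeholder for any plug-in estimator of the motor-unit number:

    import numpy as np

    def bootstrap_ci(data, estimate_N, n_boot=2000, alpha=0.05, seed=None):
        # Percentile bootstrap confidence interval for a motor-unit-number estimator.
        # data: 1-D numpy array of observed responses; estimate_N: callable that maps
        # a resampled array to an estimate of N.
        rng = np.random.default_rng(seed)
        n = len(data)
        estimates = np.array([estimate_N(data[rng.integers(0, n, n)])
                              for _ in range(n_boot)])
        lo, hi = np.percentile(estimates, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        return estimate_N(data), (lo, hi)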
Engine Data Interpretation System (EDIS)
NASA Technical Reports Server (NTRS)
Cost, Thomas L.; Hofmann, Martin O.
1990-01-01
A prototype of an expert system was developed which applies qualitative or model-based reasoning to the task of post-test analysis and diagnosis of data resulting from a rocket engine firing. A combined component-based and process theory approach is adopted as the basis for system modeling. Such an approach provides a framework for explaining both normal and deviant system behavior in terms of individual component functionality. The diagnosis function is applied to digitized sensor time-histories generated during engine firings. The generic system is applicable to any liquid rocket engine but was adapted specifically in this work to the Space Shuttle Main Engine (SSME). The system is applied to idealized data resulting from turbomachinery malfunction in the SSME.
NASA Astrophysics Data System (ADS)
Shimanovskii, A. V.
A method for calculating the plane bending of elastic-plastic filaments of finite stiffness is proposed on the basis of plastic flow theory. The problem considered is shown to reduce to relations similar to Kirchhoff equations for elastic work. Expressions are obtained for determining the normalized stiffness characteristics for the cross section of a filament with plastic regions containing beam theory equations as a particular case. A study is made of the effect of the plastic region size on the position of the elastic deformation-unloading interface and on the normalized stiffness of the filament cross section. Calculation results are presented in graphic form.
Dual-Process Theories and Cognitive Development: Advances and Challenges
ERIC Educational Resources Information Center
Barrouillet, Pierre
2011-01-01
Dual-process theories have gained increasing importance in psychology. The contrast that they describe between an old intuitive and a new deliberative mind seems to make these theories especially suited to account for development. Accordingly, this special issue aims at presenting the latest applications of dual-process theories to cognitive…
Hybrid colored noise process with space-dependent switching rates
NASA Astrophysics Data System (ADS)
Bressloff, Paul C.; Lawley, Sean D.
2017-07-01
A fundamental issue in the theory of continuous stochastic processes is the interpretation of multiplicative white noise, which is often referred to as the Itô-Stratonovich dilemma. From a physical perspective, this reflects the need to introduce additional constraints in order to specify the nature of the noise, whereas from a mathematical perspective it reflects an ambiguity in the formulation of stochastic differential equations (SDEs). Recently, we have identified a mechanism for obtaining an Itô SDE based on a form of temporal disorder. Motivated by switching processes in molecular biology, we considered a Brownian particle that randomly switches between two distinct conformational states with different diffusivities. In each state, the particle undergoes normal diffusion (additive noise) so there is no ambiguity in the interpretation of the noise. However, if the switching rates depend on position, then in the fast switching limit one obtains Brownian motion with a space-dependent diffusivity of the Itô form. In this paper, we extend our theory to include colored additive noise. We show that the nature of the effective multiplicative noise process obtained by taking both the white-noise limit (κ → 0) and fast switching limit (ɛ → 0) depends on the order in which the two limits are taken. If the white-noise limit is taken first, then we obtain Itô, and if the fast switching limit is taken first, then we obtain Stratonovich. Moreover, the form of the effective diffusion coefficient differs in the two cases. The latter result holds even in the case of space-independent transition rates, where one obtains additive noise processes with different diffusion coefficients. Finally, we show that yet another form of multiplicative noise is obtained in the simultaneous limit ɛ, κ → 0 with ɛ/κ² fixed.
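The switching mechanism described above is straightforward to simulate directly; the toy Euler scheme below uses two diffusivities and position-dependent switching rates, with all parameter values chosen purely for illustration:

    import numpy as np

    def switching_diffusion(x0, D0, D1, rate_fn, dt, steps, seed=None):
        # Brownian particle that switches between diffusivities D0 and D1 with
        # position-dependent rates rate_fn(x) = (k01(x), k10(x)). In each state the
        # noise is additive, so no Ito/Stratonovich ambiguity arises at this level.
        rng = np.random.default_rng(seed)
        x, state = x0, 0
        path = np.empty(steps + 1)
        path[0] = x
        for i in range(steps):
            k01, k10 = rate_fn(x)
            switch_rate = k01 if state == 0 else k10
            if rng.random() < switch_rate * dt:    # switch with probability ~ rate*dt
                state = 1 - state
            D = D0 if state == 0 else D1
            x += np.sqrt(2.0 * D * dt) * rng.standard_normal()
            path[i + 1] = x
        return path

    # example: faster switching into the high-diffusivity state on the right half-line
    path = switching_diffusion(0.0, D0=0.1, D1=1.0,
                               rate_fn=lambda x: (50.0 if x > 0 else 5.0, 10.0),
                               dt=1e-3, steps=100000)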
A roadmap for the integration of culture into developmental psychopathology.
Causadias, José M
2013-11-01
In this paper, I propose a roadmap for the integration of culture in developmental psychopathology. This integration is pressing because culture continues to be somewhat disconnected from theory, research, training, and interventions in developmental psychopathology, thus limiting our understanding of the epigenesis of mental health. I argue that in order to successfully integrate culture into developmental psychopathology, it is crucial to (a) study cultural development, (b) consider both individual-level and social-level cultural processes, (c) examine the interplay between culture and biology, and (d) promote improved and direct cultural assessment. I provide evidence in support of each of these guidelines, present alternative conceptual frameworks, and suggest new lines of research. Hopefully, that these directions will contribute to the emerging field of cultural development and psychopathology, which focuses on the elucidation of the cultural processes that initiate, maintain, or derail trajectories of normal and abnormal behavior.
The role of responsibility and fear of guilt in hypothesis-testing.
Mancini, Francesco; Gangemi, Amelia
2006-12-01
Recent theories argue that both perceived responsibility and fear of guilt increase obsessive-like behaviours. We propose that hypothesis-testing might account for this effect. Both perceived responsibility and fear of guilt would influence subjects' hypothesis-testing by inducing a prudential style. This style implies focusing on and confirming the worst hypothesis, and reiterating the testing process. In our experiment, we manipulated the responsibility and fear of guilt of 236 normal volunteers who executed a deductive task. The results show that perceived responsibility is the main factor that influenced individuals' hypothesis-testing. Fear of guilt, however, has a significant additive effect. Guilt-fearing participants preferred to carry on with the diagnostic process even when faced with initial favourable evidence, whereas participants in the responsibility condition only did so when confronted with unfavourable evidence. Implications for the understanding of obsessive-compulsive disorder (OCD) are discussed.
Diffusion Tensor Tractography Reveals Disrupted Structural Connectivity during Brain Aging
NASA Astrophysics Data System (ADS)
Lin, Lan; Tian, Miao; Wang, Qi; Wu, Shuicai
2017-10-01
Brain aging is one of the most crucial biological processes; it entails many physical, biological, chemical, and psychological changes and is also a major risk factor for most common neurodegenerative diseases. To improve the quality of life for the elderly, it is important to understand how the brain changes during the normal aging process. We compared diffusion tensor imaging (DTI)-based brain networks in a cohort of 75 healthy old subjects by using graph theory metrics to describe the anatomical networks and connectivity patterns, and network-based statistic (NBS) analysis was used to identify pairs of regions with altered structural connectivity. The NBS analysis revealed a significant network, comprising nine distinct fiber bundles linking 10 different brain regions, with altered white matter structure in the young-old group compared with the middle-aged group (p < .05, family-wise error-corrected). Our results might guide future studies and help to gain a better understanding of brain aging.
ERIC Educational Resources Information Center
DeMars, Christine E.
2012-01-01
In structural equation modeling software, either limited-information (bivariate proportions) or full-information item parameter estimation routines could be used for the 2-parameter item response theory (IRT) model. Limited-information methods assume the continuous variable underlying an item response is normally distributed. For skewed and…
ERIC Educational Resources Information Center
Boutis, Kathy; Pecaric, Martin; Seeto, Brian; Pusic, Martin
2010-01-01
Signal detection theory (SDT) parameters can describe a learner's ability to discriminate (d′) normal from abnormal and the learner's criterion (λ) to under- or overcall abnormalities. The aim was to examine the serial changes in SDT parameters with serial exposure to radiological cases. 46 participants were recruited for this study: 20…
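For reference, the standard equal-variance Gaussian SDT quantities are (the paper's λ may be parameterized differently):
\[
d' = z(H) - z(F), \qquad c = -\tfrac{1}{2}\bigl[\,z(H) + z(F)\,\bigr],
\]
where H and F are the hit and false-alarm rates and z(\cdot) is the inverse standard normal CDF; a liberal criterion shifts reports toward overcalling abnormality, a conservative one toward undercalling it.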
Ramsay-Curve Item Response Theory for the Three-Parameter Logistic Item Response Model
ERIC Educational Resources Information Center
Woods, Carol M.
2008-01-01
In Ramsay-curve item response theory (RC-IRT), the latent variable distribution is estimated simultaneously with the item parameters of a unidimensional item response model using marginal maximum likelihood estimation. This study evaluates RC-IRT for the three-parameter logistic (3PL) model with comparisons to the normal model and to the empirical…
ERIC Educational Resources Information Center
Woods, Carol M.; Thissen, David
2006-01-01
The purpose of this paper is to introduce a new method for fitting item response theory models with the latent population distribution estimated from the data using splines. A spline-based density estimation system provides a flexible alternative to existing procedures that use a normal distribution, or a different functional form, for the…
Optimal and Most Exact Confidence Intervals for Person Parameters in Item Response Theory Models
ERIC Educational Resources Information Center
Doebler, Anna; Doebler, Philipp; Holling, Heinz
2013-01-01
The common way to calculate confidence intervals for item response theory models is to assume that the standardized maximum likelihood estimator for the person parameter θ is normally distributed. However, this approximation is often inadequate for short and medium test lengths. As a result, the coverage probabilities fall below the given…
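For orientation, the normal-approximation (Wald) interval criticized here has the familiar form, with I(θ) the test information:
\[
\hat{\theta} \pm z_{1-\alpha/2}\,\widehat{\mathrm{SE}}(\hat{\theta}), \qquad
\widehat{\mathrm{SE}}(\hat{\theta}) \approx \frac{1}{\sqrt{I(\hat{\theta})}} .
\]
For short tests the information is small and the estimator is far from normal, which is why coverage can fall below the nominal level.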
ERIC Educational Resources Information Center
Sengul Avsar, Asiye; Tavsancil, Ezel
2017-01-01
This study analysed polytomous items' psychometric properties according to nonparametric item response theory (NIRT) models. Thus, simulated datasets--three different test lengths (10, 20 and 30 items), three sample distributions (normal, right and left skewed) and three samples sizes (100, 250 and 500)--were generated by conducting 20…
ERIC Educational Resources Information Center
Bulcock, J. W.; And Others
Multicollinearity refers to the presence of highly intercorrelated independent variables in structural equation models, that is, models estimated by using techniques such as least squares regression and maximum likelihood. There is a problem of multicollinearity in both the natural and social sciences where theory formulation and estimation is in…
Shulman, Abraham; Strashun, Arnold M
2009-01-01
It is hypothesized that in all traumatic brain injury (TBI) patients with a clinical history of closed or penetrating head injury, the initial head trauma is associated with a vibratory sensation and noise exposure, with resultant alteration in vascular supply to the structures and contents of the fluid compartments of brain and ear (i.e., the fluid dynamics vascular theory of brain-inner-ear function [FDVTBE]). The primary etiology, head trauma, results in an initial fluctuation, interference, or interaction in the normal fluid dynamics between brain and labyrinth of the inner ear, with a resultant clinical diversity of complaints varying in time of onset and severity. Normal function of the brain and ear is a reflection of a normal state of homeostasis between the fluid compartments in the brain of cerebrospinal fluid and perilymph-endolymph in the labyrinth of the ear. The normal homeostasis in the structures and contents between the two fluid compartment systems--intracerebral and intralabyrinthine--is controlled by mechanisms involved in the maintenance of normal pressures, water and electrolyte content, and neurotransmitter activities. The initial pathophysiology (a reflection of an alteration in the vascular supply to the brain-ear) is hypothesized to be an initial acute inflammatory response, persistence of which results in ischemia and an irreversible alteration in the involved neural substrates of brain-ear. Clinically, a chronic multisymptom complex becomes manifest. The multisymptom complex, individual for each TBI patient regardless of the diagnostic TBI category (i.e., mild, moderate, or severe), initially reflects processes of inflammation and ischemia which, in the brain, result in brain volume loss identified as neurodegeneration and hydrocephalus ex vacuo or an alteration in cerebrospinal fluid production (i.e., pseudotumor cerebri) and, in the ear, secondary endolymphatic hydrops with associated cochleovestibular complaints of hearing loss, tinnitus, vertigo, ear blockage, and hyperacusis. The FDVTBE integrates and translates a neurovascular hypothesis for Alzheimer's disease to TBI. This study presents an FDVTBE hypothesis of TBI to explain the clinical association of head trauma and central nervous system neurodegeneration with multisensory complaints, highlighted by and focusing on cochleovestibular complaints. A clinical case report, previously published for demonstration of the cerebrovascular medical significance of a particular type of tinnitus, and evidence-based basic science and clinical medicine are cited to provide objective evidence in support and demonstration of the FDVTBE.
The saturation of monochromatic lights obliquely incident on the retina.
Alpern, M; Tamaki, R
1983-01-01
Foveal dark-adaptation, undertaken to test the hypothesis that the excitation of rods causes the desaturation of 'yellow' lights in a 1 degree field traversing the margin of the pupil, fails to exclude that possibility. The desaturation is largest for a 1 degree outside diameter annular test, is still measurable with a 0.5 degree circular disk, but disappears for a 0.29 degree disk. The supersaturation of obliquely incident 501.2 nm test light follows the opposite pattern; it disappears with an annulus and is largest for a 0.29 degree circular field. It is unlikely that rods replace short-wave sensitive cones in the trichromatic match of an obliquely incident test with normally incident primaries. If rods as well as all three cone species are involved, the matches might not be trichromatic in the strong sense. Grassmann's law of scalar multiplication was tested and shown not to hold for the match of an obliquely incident test with normally incident primaries, though it remains valid whenever both primaries and test strike the retina at the same angle of incidence (independent of that angle). The result in section 3 (above) cannot be due to rod intrusion. It persists (and becomes more conspicuous) on backgrounds (4.0 log scotopic td) which saturate rods. Moreover, obliquely incident 'yellow' lights remain desaturated in intervals in the dark after a full bleach, whilst the test field is below rod threshold. The amount of desaturation does not differ appreciably from that normally found. The assumption of the unified theory of Alpern, Kitahara & Tamaki (1983) that the outer segments of only a single set of three cone species (with acceptance angles wide enough to include the entire exit pupil) contain the visual pigments absorbing both the normally incident primaries and the obliquely incident test is disproved by these results. Failure of Grassmann's law is most conspicuous under the conditions for which the changes in saturation upon changing from normal to oblique incidence are greatest and least when the saturation changes are the smallest. Either all unified theories of the Stiles-Crawford effects are wrong or all the effects of oblique incidence operate at a stage in the visual process at which the effects of radiation of different wave-lengths are no longer compounded by the simple linear laws. PMID:6875976
NASA Astrophysics Data System (ADS)
Al-Bustany, Fatin Khalil Ismail
1989-09-01
My aim in this dissertation is to develop an evolutionary conception of science based on recent studies in evolution theory, the thermodynamics of non-equilibrium and information theory, as exemplified in the works of Prigogine, Jantsch, Wicken and Gatlin. The nature of scientific change is of interest to philosophers and historians of science. Some construe it after a revolutionary model (e.g. Kuhn), while others adopt an evolutionary view (e.g. Toulmin). It appears to me that it is possible to construct an evolutionary model encompassing the revolutionary mode as well. The following strategies are employed: (1) A distinction is made between two types of growth, one representing gradual change, the other radical transformations, and between two principles underlying the process of change, one of conservation, the other of innovation. (2) Science in general, and scientific theories in particular, are looked upon as dissipative structures. These are characterised by openness, irreversibility and self-organisation. In terms of these, one may identify a state of "normal" growth and another of violent fluctuations leading to a new order (revolutionary phase). These fluctuations are generated by the flow of information coming from the observable world. The chief merits of this evolutionary model of the development of science lie in the emphasis it puts on the relation of science to its environment, in the description of scientific change as a process of interaction between internal and external elements (structural, conceptual, and cultural), in the enhancement of our understanding of progress and rationality in science, and in the post-Neo-Darwinian conception of evolution, stressing self-organisation, the innovativeness of the evolutionary process and the trend toward complexification. These features are also manifested in the process of discovery, which is a fundamental part of the scientific enterprise. In addition, a distinction is made between two types of discovery which serves as a criterion for delineating various episodes in the development of science. The evolutionary model further displays a complementarity mode of description on several levels: between science and its milieu, stability and instability, discovery and confirmation.
Smoldering Combustion Experiments in Microgravity
NASA Technical Reports Server (NTRS)
Walther, David C.; Fernandez-Pello, A. Carlos; Urban, David L.
1997-01-01
The Microgravity Smoldering Combustion (MSC) experiment is part of a study of the smolder characteristics of porous combustible materials in a microgravity environment. Smoldering is a non-flaming form of combustion that takes place in the interior of porous materials and occurs in a number of processes ranging from smoldering of porous insulation materials to high temperature synthesis of metals. The objective of the study is to provide a better understanding of the controlling mechanisms of smolder, both in microgravity and normal gravity. As with many forms of combustion, gravity affects the availability of oxidizer and transport of heat, and therefore the rate of combustion. Microgravity smolder experiments, in both a quiescent oxidizing environment and in a forced oxidizing flow, have been conducted aboard the NASA Space Shuttle (STS-69 and STS-77 missions) to determine the effect of the ambient oxygen concentration and oxidizer forced flow velocity on smolder combustion in microgravity. The experimental apparatus is contained within the NASA Get Away Special Canister (GAS-CAN) Payload. These two sets of experiments investigate the propagation of smolder along the polyurethane foam sample under both diffusion driven and forced flow driven smoldering. The results of the microgravity experiments are compared with identical ones carried out in normal gravity, and are used to verify present theories of smolder combustion. The results of this study will provide new insights into the smoldering combustion process. Thermocouple histories show that the microgravity smolder reaction temperatures (Ts) and propagation velocities (Us) lie between those of identical normal-gravity upward and downward tests. These observations indicate the effect of buoyancy on the transport of oxidizer to the reaction front.
Neurodynamic system theory: scope and limits.
Erdi, P
1993-06-01
This paper proposes that neurodynamic system theory may be used to connect structural and functional aspects of neural organization. The paper claims that generalized causal dynamic models are proper tools for describing the self-organizing mechanism of the nervous system. In particular, it is pointed out that ontogeny, development, normal performance, learning, and plasticity can be treated by coherent concepts and formalism. Taking into account the self-referential character of the brain, autopoiesis, endophysics and hermeneutics are offered as elements of a poststructuralist brain (-mind-computer) theory.
Yu, Jia-Lu; Yang, Chun-Nuan; Cai, Hao; Huang, Nian-Ning
2007-04-01
After the basic solutions of the linearized nonlinear Schrödinger equation are found by the method of separation of variables, the perturbation theory for the dark soliton solution is constructed by linear Green's function theory. In application to self-induced Raman scattering, the adiabatic corrections to the soliton's parameters are obtained and the remaining correction term is given as a pure integral with respect to the continuous spectral parameter.
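For orientation, dark solitons arise in the defocusing nonlinear Schrödinger equation; a standard form and its stationary ("black") soliton are (illustrative notation, not the paper's):
\[
i\,\psi_t + \tfrac{1}{2}\,\psi_{xx} - |\psi|^2\psi = 0, \qquad
\psi(x,t) = \psi_0 \tanh(\psi_0 x)\, e^{-i\psi_0^2 t}.
\]
Perturbation theory of the kind described then tracks slow, adiabatic changes of the soliton parameters (depth, velocity, position) under perturbations such as Raman scattering, plus a non-adiabatic remainder expressed through the continuous spectrum.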
Modelling stock order flows with non-homogeneous intensities from high-frequency data
NASA Astrophysics Data System (ADS)
Gorshenin, Andrey K.; Korolev, Victor Yu.; Zeifman, Alexander I.; Shorgin, Sergey Ya.; Chertok, Andrey V.; Evstafyev, Artem I.; Korchagin, Alexander Yu.
2013-10-01
A micro-scale model is proposed for the evolution of an information system such as the limit order book in financial markets. Within this model, the flows of orders (claims) are described by doubly stochastic Poisson processes taking account of the stochastic character of the intensities of buy and sell orders that determine the price discovery mechanism. The proposed multiplicative model of stochastic intensities makes it possible to analyze the characteristics of the order flows as well as the instantaneous proportion of the forces of buyers and sellers, that is, the imbalance process, without modelling the external information background. The proposed model gives the opportunity to link the micro-scale (high-frequency) dynamics of the limit order book with macro-scale models of stock price processes of the form of subordinated Wiener processes by means of limit theorems of probability theory and hence, to use the normal variance-mean mixture models of the corresponding heavy-tailed distributions. The approach can be useful in different areas with similar properties (e.g., in plasma physics).
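A minimal sketch (not the authors' model) of a doubly stochastic Poisson order flow: buy and sell intensities follow an invented mean-reverting square-root process, order counts in each small interval are conditionally Poisson, and the running difference gives an imbalance process.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_cox_order_flow(T=1.0, dt=1e-3, lam0=100.0, kappa=5.0, sigma=20.0):
    """Toy doubly stochastic (Cox) order-flow simulator.
    All parameter values and the intensity dynamics are illustrative only."""
    n = int(T / dt)
    lam_buy, lam_sell = lam0, lam0
    buys = np.zeros(n, dtype=int)
    sells = np.zeros(n, dtype=int)
    for i in range(n):
        # mean-reverting, non-negative intensities (Euler step of a square-root diffusion)
        lam_buy += kappa * (lam0 - lam_buy) * dt + sigma * np.sqrt(max(lam_buy, 0.0) * dt) * rng.standard_normal()
        lam_sell += kappa * (lam0 - lam_sell) * dt + sigma * np.sqrt(max(lam_sell, 0.0) * dt) * rng.standard_normal()
        lam_buy, lam_sell = max(lam_buy, 0.0), max(lam_sell, 0.0)
        # conditionally Poisson counts of buy/sell orders in each small interval
        buys[i] = rng.poisson(lam_buy * dt)
        sells[i] = rng.poisson(lam_sell * dt)
    imbalance = np.cumsum(buys - sells)   # running buyer-seller imbalance
    return buys, sells, imbalance

buys, sells, imbalance = simulate_cox_order_flow()
print(imbalance[-1])
```

A real calibration would instead fit the intensity dynamics to high-frequency order data; the point here is only the two-layer structure (random intensity, then conditionally Poisson counts).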
Kim, Jeong Chul; Wang, Li; Shen, Dinggang; Lin, Weili
2016-12-02
The first year of life is the most critical time period for structural and functional development of the human brain. Combining longitudinal MR imaging and finite strain theory, this study aimed to provide new insights into normal brain development through a biomechanical framework. Thirty-three normal infants were longitudinally imaged using MRI from 2 weeks to 1 year of age. Voxel-wise Jacobian determinant was estimated to elucidate volumetric changes while Lagrange strains (both normal and shear strains) were measured to reveal directional growth information every 3 months during the first year of life. Directional normal strain maps revealed that, during the first 6 months, the growth pattern of gray matter is anisotropic and spatially inhomogeneous with higher left-right stretch around the temporal lobe and interhemispheric fissure, anterior-posterior stretch in the frontal and occipital lobes, and superior-inferior stretch in right inferior occipital and right inferior temporal gyri. In contrast, anterior lateral ventricles and insula showed an isotropic stretch pattern. Volumetric and directional growth rates were linearly decreased with age for most of the cortical regions. Our results revealed anisotropic and inhomogeneous brain growth patterns of the human brain during the first year of life using longitudinal MRI and a biomechanical framework.
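In finite strain terms (standard definitions, not specific to this study), the registration-derived deformation map φ between time points gives
\[
\mathbf{F} = \frac{\partial \boldsymbol{\varphi}}{\partial \mathbf{X}}, \qquad
J = \det \mathbf{F}, \qquad
\mathbf{E} = \tfrac{1}{2}\bigl(\mathbf{F}^{\mathsf{T}}\mathbf{F} - \mathbf{I}\bigr),
\]
where J measures local volumetric change and the diagonal and off-diagonal components of the Green-Lagrange tensor E are the normal and shear strains, respectively.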
Problems of the theory of superconductivity which involve spatial inhomogeneity
NASA Astrophysics Data System (ADS)
Svidzinskii, A. V.
This book is concerned with questions which are related to equilibrium phenomena in superconductors, giving particular attention to effects determined by a spatial variation of the order parameter. The microscopic theory of superconductivity is developed on the basis of a model which takes into account the direct interaction between electrons. The theory of current relations in superconductors is discussed, taking into consideration the magnetic properties of superconductors in weak fields and the Meissner effect. Aspects regarding the general theory of tunneling are also explored, including the Josephson effect. An investigation is conducted of the theory of current conditions in areas in which the superconductor is in contact with normally conducting metal.
NASA Astrophysics Data System (ADS)
Li, Tianxing; Zhou, Junxiang; Deng, Xiaozhong; Li, Jubo; Xing, Chunrong; Su, Jianxin; Wang, Huiliang
2018-07-01
A manufacturing error of a cycloidal gear is the key factor affecting the transmission accuracy of a robot rotary vector (RV) reducer. A methodology is proposed to realize the digitized measurement and data processing of the cycloidal gear manufacturing error based on the gear measuring center, which can quickly and accurately measure and evaluate the manufacturing error of the cycloidal gear by using both the whole tooth profile measurement and a single tooth profile measurement. By analyzing the particularity of the cycloidal profile and its effect on the actual meshing characteristics of the RV transmission, the cycloid profile measurement strategy is planned, and the theoretical profile model and error measurement model of cycloid-pin gear transmission are established. Through digital processing technology, the theoretical trajectory of the probe and the normal vector of the measured point are calculated. By means of precision measurement principles and error compensation theory, a mathematical model for the accurate calculation and data processing of manufacturing error is constructed, and the actual manufacturing error of the cycloidal gear is obtained by an iterative optimization solution. Finally, the measurement experiment of the cycloidal gear tooth profile is carried out on both the gear measuring center and a HEXAGON coordinate measuring machine. The measurement results verify the correctness and validity of the measurement theory and method. This methodology will provide the basis for the accurate evaluation and the effective control of manufacturing precision of the cycloidal gear in a robot RV reducer.
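As an illustration of the normal-vector step (a generic planar formula; the actual cycloid profile equations and orientation conventions are specific to the paper), for a parametric profile r(t) = (x(t), y(t)) the unit normal at a theoretical contact point is
\[
\mathbf{n}(t) = \frac{\bigl(y'(t),\ -x'(t)\bigr)}{\sqrt{x'(t)^2 + y'(t)^2}} ,
\]
and profile deviations between measured and theoretical points are typically evaluated along this normal direction.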
O'Nions, Elizabeth; Sebastian, Catherine L; McCrory, Eamon; Chantiluke, Kaylita; Happé, Francesca; Viding, Essi
2014-09-01
Individuals with autism spectrum disorders (ASD) have difficulty understanding other minds (Theory of Mind; ToM), with atypical processing evident at both behavioural and neural levels. Individuals with conduct problems and high levels of callous-unemotional (CU) traits (CP/HCU) exhibit reduced responsiveness to others' emotions and difficulties interacting with others, but nonetheless perform normally in experimental tests of ToM. The present study aimed to examine the neural underpinnings of ToM in children (aged 10-16) with ASD (N = 16), CP/HCU (N = 16) and typically developing (TD) controls (N = 16) using a non-verbal cartoon vignette task. Whilst individuals with ASD were predicted to show reduced fMRI responses across regions involved in ToM processing, CP/HCU individuals were predicted to show no differences compared with TD controls. The analyses indicated that neural responses did not differ between TD and CP/HCU groups during ToM. TD and CP/HCU children exhibited significantly greater medial prefrontal cortex responses during ToM than did the ASD group. Within the ASD group, responses in medial prefrontal cortex and right temporoparietal junction (TPJ) correlated with symptom severity as measured by the Autism Diagnostic Observation Schedule (ADOS). Findings suggest that although both ASD and CP/HCU are characterized by social difficulties, only children with ASD display atypical neural processing associated with ToM. © 2014 The Authors. Developmental Science Published by John Wiley & Sons Ltd.
Cosmonumerology, Cosmophysics, and the Large Numbers Hypothesis: British Cosmology in the 1930s
NASA Astrophysics Data System (ADS)
Durham, Ian
2001-04-01
A number of unorthodox cosmological models were developed in the 1930s, many by British theoreticians. Three of the most notable of these theories included Eddington's cosmonumerology, Milne's cosmophysics, and Dirac's large numbers hypothesis (LNH). Dirac's LNH was based partly on the other two, and it has been argued that modern steady-state theories are based partly on Milne's cosmophysics. But what influenced Eddington and Milne? Both were products of the late Victorian education system in Britain and could conceivably have been influenced by Victorian thought which, in addition to its strict (though technically unofficial) social caste system, had a flair for the unusual. Victorianism was filled with a fascination for the occult and the supernatural, and science was not insulated from this trend (witness the Henry Slade trial in 1877). It is conceivable that the normally strict mentality of the scientific process in the minds of Eddington and Milne was affected, indirectly, by this trend for the unusual, possibly pushing them into thinking "outside the box" as it were. In addition, cosmonumerology and the LNH exhibit signs of Pythagorean and Aristotelian thought. It is the aim of this ongoing project at St. Andrews to determine the influences and characterize the relations existing in and within these and related theories.
Mourning beyond melancholia: Freud's psychoanalysis of loss.
Clewell, Tammy
2004-01-01
Freud's mourning theory has been criticized for assuming a model of subjectivity based on a strongly bounded form of individuation. This model informs "Mourning and Melancholia" (1917), in which Freud argued that mourning comes to a decisive end when the subject severs its emotional attachment to the lost one and reinvests the free libido in a new object. Yet Freud revised his mourning theory in writings concerned with the Great War and in The Ego and the Id (1923), where he redefined the identification process previously associated with melancholia as an integral component of mourning. By viewing the character of the ego as an elegiac formation, that is, as "a precipitate of abandoned object-cathexes," Freud's later work registers the endlessness of normal grieving; however, it also imports into mourning the violent characteristics of melancholia, the internal acts of moralized aggression waged in an effort to dissolve the internal trace of the other and establish an autonomous identity. Because it is not immediately clear how Freud's text offers a theory of mourning beyond melancholy violence, his account of the elegiac ego is shown here to ultimately undermine the wish for an identity unencumbered by the claims of the lost other and the past, and to suggest the affirmative and ethical aspects of mourning.
Theory to predict particle migration and margination in the pressure-driven channel flow of blood
NASA Astrophysics Data System (ADS)
Qi, Qin M.; Shaqfeh, Eric S. G.
2017-09-01
The inhomogeneous concentration distribution of erythrocytes and platelets in microchannel flows, particularly in directions normal to the mean flow, plays a significant role in hemostasis, drug delivery, and microfluidic applications. In this paper, we develop a coarse-grained theory to predict these distributions in pressure-driven channel flow at zero Reynolds number and compare them to experiments and simulations. We demonstrate that the balance between the deformability-induced lift force and the shear-induced diffusion created by hydrodynamic interactions in the suspension results in both a peak concentration of red blood cells at the channel center and a cell-free or Fahraeus-Lindqvist layer near the walls. On the other hand, the absence of a lift force and the strong red blood cell-platelet interactions result in an excess concentration of platelets in the cell-free layer. We demonstrate a strong role of hematocrit (i.e., erythrocyte volume fraction) in determining the cell-free layer thickness and the degree of platelet margination. We also demonstrate that the capillary number of the erythrocytes, based on the membrane shear modulus, plays a relatively insignificant role in the regimes that we have studied. Our theory serves as a good and simple alternative to large-scale computer simulations of the cross-stream transport processes in these mixtures.
Buckling analysis for anisotropic laminated plates under combined inplane loads
NASA Technical Reports Server (NTRS)
Viswanathan, A. V.; Tamekuni, M.; Baker, L. L.
1974-01-01
The buckling analysis presented considers rectangular flat or curved general laminates subjected to combined inplane normal and shear loads. Linear theory is used in the analysis. All prebuckling deformations and any initial imperfections are ignored. The analysis method can be readily extended to longitudinally stiffened structures subjected to combined inplane normal and shear loads.
ERIC Educational Resources Information Center
Peterson, Candida C.; Siegal, Michael
1997-01-01
Examined reasoning in normal, autistic, and deaf individuals. Found that deaf individuals who grow up in hearing homes without fluent signers show selective impairments in theory of mind similar to those of autistic individuals. Results suggest that conversational differences in the language children hear account for distinctive patterns of…
ERIC Educational Resources Information Center
Buium, Nissan; Rynders, John
To demonstrate that the child learning language constructs his theory of language on the basis of the linguistic data available to him, this study investigated 21 linguistic parameters that Down's Syndrome and normal children are exposed to in their maternal linguistic environment. It was found that mothers produced certain levels of linguistic…
Recollection is a continuous process: implications for dual-process theories of recognition memory.
Mickes, Laura; Wais, Peter E; Wixted, John T
2009-04-01
Dual-process theory, which holds that recognition decisions can be based on recollection or familiarity, has long seemed incompatible with signal detection theory, which holds that recognition decisions are based on a singular, continuous memory-strength variable. Formal dual-process models typically regard familiarity as a continuous process (i.e., familiarity comes in degrees), but they construe recollection as a categorical process (i.e., recollection either occurs or does not occur). A continuous process is characterized by a graded relationship between confidence and accuracy, whereas a categorical process is characterized by a binary relationship such that high confidence is associated with high accuracy but all lower degrees of confidence are associated with chance accuracy. Using a source-memory procedure, we found that the relationship between confidence and source-recollection accuracy was graded. Because recollection, like familiarity, is a continuous process, dual-process theory is more compatible with signal detection theory than previously thought.
The conflict and process theory of Melanie Klein.
Kavaler-Adler, S
1993-09-01
This article depicts the theory of Melanie Klein in both its conflict and process dimensions. In addition, it outlines Klein's strategic place in psychoanalytic history and in psychoanalytic theory formation. Her major contributions are seen in light of their clinical imperatives, and aspects of her metapsychology that seem negligible are differentiated from these clinical imperatives. Klein's role as a dialectical fulcrum between drive and object relations theories is explicated. Within the conflict theory, drive derivatives of sex and aggression are reformulated as object-related passions of love and hate. The process dimensions of Klein's theory are outlined in terms of dialectical increments of depressive position process as it alternates with regressive paranoid-schizoid-position mental phenomenology. The mourning process as a developmental process is particularly highlighted in terms of self-integrative progression within the working through of the depressive position.
ERIC Educational Resources Information Center
Grenn, Michael W.
2013-01-01
This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of…
Chong, Raymond K Y; Mills, Bradley; Dailey, Leanna; Lane, Elizabeth; Smith, Sarah; Lee, Kyoung-Hyun
2010-07-01
We tested the hypothesis that a computational overload results when two activities, one motor and the other cognitive, that draw on the same neural processing pathways are performed concurrently. Healthy young adult subjects carried out two seemingly distinct tasks of maintaining standing balance control under conditions of low (eyes closed), normal (eyes open) or high (eyes open, sway-referenced surround) visuospatial processing load while concurrently performing a cognitive task of either subtracting backwards by seven or generating words of the same first letter. A decrease in the performance of the balance control task and a decrement in the speed and accuracy of responses were noted during the subtraction but not the word generation task. The interference in the subtraction task was isolated to the first trial of the high but not normal or low visuospatial conditions. Balance control improvements with repeated exposures were observed only in the low visuospatial conditions while performance in the other conditions remained compromised. These results suggest that sensory organization for balance control appears to draw on similar visuospatial computational resources needed for the subtraction but not the word generation task. In accordance with the theory of modularity in human performance, the contrast in results between the subtraction and word generation tasks suggests that the neural overload is related to competition for similar visuospatial processes rather than limited attentional resources. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
Bayesian soft X-ray tomography using non-stationary Gaussian Processes
NASA Astrophysics Data System (ADS)
Li, Dong; Svensson, J.; Thomsen, H.; Medina, F.; Werner, A.; Wolf, R.
2013-08-01
In this study, a Bayesian-based non-stationary Gaussian Process (GP) method for the inference of soft X-ray emissivity distribution along with its associated uncertainties has been developed. For the investigation of equilibrium conditions and fast magnetohydrodynamic behaviors in nuclear fusion plasmas, it is of importance to infer, especially in the plasma center, spatially resolved soft X-ray profiles from a limited number of noisy line integral measurements. For this ill-posed inversion problem, Bayesian probability theory can provide a posterior probability distribution over all possible solutions under given model assumptions. Specifically, the use of a non-stationary GP to model the emission allows the model to adapt to the varying length scales of the underlying diffusion process. In contrast to other conventional methods, the prior regularization is realized in probabilistic form, which enhances the capability of uncertainty analysis; as a consequence, scientists concerned about the reliability of their results will benefit from it. Under the assumption of normally distributed noise, the posterior distribution evaluated at a discrete number of points becomes a multivariate normal distribution whose mean and covariance are analytically available, making inversions and calculation of uncertainty fast. Additionally, the hyper-parameters embedded in the model assumption can be optimized through a Bayesian Occam's Razor formalism and thereby automatically adjust the model complexity. This method is shown to produce convincing reconstructions and good agreements with independently calculated results from the Maximum Entropy and Equilibrium-Based Iterative Tomography Algorithm methods.
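A minimal linear-Gaussian sketch of the underlying update (stationary kernel, invented toy geometry; the paper's non-stationary GP and hyper-parameter optimization are not reproduced): with emissivity prior f ~ N(0, K) and line-integral data y = G f + noise, the posterior mean and covariance are available in closed form.

```python
import numpy as np

def gp_posterior(G, y, K, sigma_noise):
    """Posterior mean/covariance of f ~ N(0, K) observed through y = G f + e,
    e ~ N(0, sigma_noise^2 I). Illustrative linear-Gaussian update only."""
    S = G @ K @ G.T + sigma_noise**2 * np.eye(len(y))   # marginal data covariance
    W = K @ G.T @ np.linalg.inv(S)                       # regression ("gain") matrix
    mean = W @ y
    cov = K - W @ G @ K
    return mean, cov

# toy example: 50 emissivity pixels, 20 chords, squared-exponential prior
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
K = np.exp(-0.5 * (x[:, None] - x[None, :])**2 / 0.1**2)
G = rng.random((20, 50)) * 0.02                          # stand-in geometry matrix
f_true = np.exp(-0.5 * (x - 0.5)**2 / 0.15**2)
y = G @ f_true + 0.01 * rng.standard_normal(20)
mean, cov = gp_posterior(G, y, K, 0.01)
print(mean.shape, np.sqrt(np.diag(cov)).max())           # posterior profile and its uncertainty
```

The closed form is what makes both the inversion and the per-point uncertainty cheap once the hyper-parameters are fixed; making the kernel non-stationary changes only how K is built.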
Goals, intentions and mental states: challenges for theories of autism.
Hamilton, Antonia F de C
2009-08-01
The ability to understand the goals and intentions behind other people's actions is central to many social interactions. Given the profound social difficulties seen in autism, we might expect goal understanding to be impaired in these individuals. Two influential theories, the 'broken mirror' theory and the mentalising theory, can both predict this result. However, a review of the current data provides little empirical support for goal understanding difficulties; several studies demonstrate normal performance by autistic children on tasks requiring the understanding of goals or intentions. I suggest that this conclusion forces us to reject the basic broken mirror theory and to re-evaluate the breadth of the mentalising theory. More subtle theories which distinguish between different types of mirroring and different types of mentalising may be able to account for the present data, and further research is required to test and refine these theories.
Individual differences in working memory capacity and dual-process theories of the mind.
Barrett, Lisa Feldman; Tugade, Michele M; Engle, Randall W
2004-07-01
Dual-process theories of the mind are ubiquitous in psychology. A central principle of these theories is that behavior is determined by the interplay of automatic and controlled processing. In this article, the authors examine individual differences in the capacity to control attention as a major contributor to differences in working memory capacity (WMC). The authors discuss the enormous implications of this individual difference for a host of dual-process theories in social, personality, cognitive, and clinical psychology. In addition, the authors propose several new areas of investigation that derive directly from applying the concept of WMC to dual-process theories of the mind.
Individual Differences in Working Memory Capacity and Dual-Process Theories of the Mind
Barrett, Lisa Feldman; Tugade, Michele M.; Engle, Randall W.
2005-01-01
Dual-process theories of the mind are ubiquitous in psychology. A central principle of these theories is that behavior is determined by the interplay of automatic and controlled processing. In this article, the authors examine individual differences in the capacity to control attention as a major contributor to differences in working memory capacity (WMC). The authors discuss the enormous implications of this individual difference for a host of dual-process theories in social, personality, cognitive, and clinical psychology. In addition, the authors propose several new areas of investigation that derive directly from applying the concept of WMC to dual-process theories of the mind. PMID:15250813
Stigma, status, and population health.
Phelan, Jo C; Lucas, Jeffrey W; Ridgeway, Cecilia L; Taylor, Catherine J
2014-02-01
Stigma and status are the major concepts in two important sociological traditions that describe related processes but that have developed in isolation. Although both approaches have great promise for understanding and improving population health, this promise has not been realized. In this paper, we consider the applicability of status characteristics theory (SCT) to the problem of stigma with the goal of better understanding social systemic aspects of stigma and their health consequences. To this end, we identify common and divergent features of status and stigma processes. In both, labels that are differentially valued produce unequal outcomes in resources via culturally shared expectations associated with the labels; macro-level inequalities are enacted in micro-level interactions, which in turn reinforce macro-level inequalities; and status is a key variable. Status and stigma processes also differ: Higher- and lower-status states (e.g., male and female) are both considered normal, whereas stigmatized characteristics (e.g., mental illness) are not; interactions between status groups are guided by "social ordering schemas" that provide mutually agreed-upon hierarchies and interaction patterns (e.g., men assert themselves while women defer), whereas interactions between "normals" and stigmatized individuals are not so guided and consequently involve uncertainty and strain; and social rejection is key to stigma but not status processes. Our juxtaposition of status and stigma processes reveals close parallels between stigmatization and status processes that contribute to systematic stratification by major social groupings, such as race, gender, and SES. These parallels make salient that stigma is not only an interpersonal or intrapersonal process but also a macro-level process and raise the possibility of considering stigma as a dimension of social stratification. As such, stigma's impact on health should be scrutinized with the same intensity as that of other more status-based bases of stratification such as SES, race and gender, whose health impacts have been firmly established. Copyright © 2013 Elsevier Ltd. All rights reserved.
Hidden Fermi liquid: Self-consistent theory for the normal state of high-Tc superconductors
NASA Astrophysics Data System (ADS)
Casey, Philip A.
The anomalous "strange metal" properties of the normal, non-superconducting state of the high-Tc cuprate superconductors have been extensively studied for over two decades. The resistivity is robustly T-linear at high temperatures, while at low T it appears to maintain linearity near optimal doping and is T2 at higher doping. The inverse Hall angle is strictly T2 and hence has a distinct scattering lifetime from the resistivity. The transport scattering lifetime is highly anisotropic as directly measured by angle-dependent magnetoresistance (ADMR) and indirectly in more traditional transport experiments. The IR conductivity exhibits a non-integer power-law in frequency, which we take as a defining characteristic of the "strange metal". A phenomenological theory of the transport and spectroscopic properties at a self-consistent and predictive level has been much sought after, yet elusive. Hidden Fermi liquid theory (HFL) explicitly accounts for the effects of Gutzwiller projection in the t-J Hamiltonian, widely believed to contain the essential physics of the high-Tc superconductors. We show this theory to be the first self-consistent description for the normal state of the cuprates based on transparent, fundamental assumptions. Our well-defined formalism also serves as a guide for further experimental confirmation. Chapter 1 reviews the "strange metal" properties and the relevant aspects of competing models. Chapter 2 presents the theoretical foundations of the formalism. Chapters 3 and 4 derive expressions for the entire normal state relating many of the properties, for example: angle-resolved photoemission, IR conductivity, resistivity, Hall angle, and by generalizing the formalism to include the Fermi surface topology---ADMR. Self-consistency is demonstrated with experimental comparisons, including the most recent laser-ARPES and ADMR. Chapter 5 discusses entropy transport, as in the thermal conductivity, thermal Hall conductivity, and consequent metrics of non-Fermi liquid behavior such as the Wiedemann-Franz and Kadowaki-Woods ratios.
NNLO QCD corrections to Higgs boson production at large transverse momentum
NASA Astrophysics Data System (ADS)
Chen, X.; Cruz-Martinez, J.; Gehrmann, T.; Glover, E. W. N.; Jaquier, M.
2016-10-01
We derive the second-order QCD corrections to the production of a Higgs boson recoiling against a parton with finite transverse momentum, working in the effective field theory in which the top quark contributions are integrated out. To account for quark mass effects, we supplement the effective field theory result by the full quark mass dependence at leading order. Our calculation is fully differential in the final state kinematics and includes the decay of the Higgs boson to a photon pair. It allows one to make next-to-next-to-leading order (NNLO)-accurate theory predictions for Higgs-plus-jet final states and for the transverse momentum distribution of the Higgs boson, accounting for the experimental definition of the fiducial cross sections. The NNLO QCD corrections are found to be moderate and positive; they lead to a substantial reduction of the theory uncertainty on the predictions. We compare our results to 8 TeV LHC data from ATLAS and CMS. While the shape of the data is well-described for both experiments, we agree on the normalization only for CMS. By normalizing data and theory to the inclusive fiducial cross section for Higgs production, good agreement is found for both experiments, though at the expense of an increased theory uncertainty. We make predictions for Higgs production observables at the 13 TeV LHC, which are in good agreement with recent ATLAS data. At this energy, the leading order mass corrections to the effective field theory prediction become significant at large transverse momenta, and we discuss the resulting uncertainties on the predictions.
Kaufman, Scott Barry; Benedek, Mathias; Jung, Rex E.; Kenett, Yoed N.; Jauk, Emanuel; Neubauer, Aljoscha C.; Silvia, Paul J.
2015-01-01
Abstract The brain's default network (DN) has been a topic of considerable empirical interest. In fMRI research, DN activity is associated with spontaneous and self‐generated cognition, such as mind‐wandering, episodic memory retrieval, future thinking, mental simulation, theory of mind reasoning, and creative cognition. Despite large literatures on developmental and disease‐related influences on the DN, surprisingly little is known about the factors that impact normal variation in DN functioning. Using structural equation modeling and graph theoretical analysis of resting‐state fMRI data, we provide evidence that Openness to Experience—a normally distributed personality trait reflecting a tendency to engage in imaginative, creative, and abstract cognitive processes—underlies efficiency of information processing within the DN. Across two studies, Openness predicted the global efficiency of a functional network comprised of DN nodes and corresponding edges. In Study 2, Openness remained a robust predictor—even after controlling for intelligence, age, gender, and other personality variables—explaining 18% of the variance in DN functioning. These findings point to a biological basis of Openness to Experience, and suggest that normally distributed personality traits affect the intrinsic architecture of large‐scale brain systems. Hum Brain Mapp 37:773–779, 2016. © 2015 Wiley Periodicals, Inc. PMID:26610181
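For reference, the global efficiency used in such graph-theoretical analyses is standardly defined (weighting details may differ in the study) as
\[
E_{\mathrm{glob}} = \frac{1}{n(n-1)} \sum_{i \neq j} \frac{1}{d_{ij}},
\]
where d_{ij} is the shortest path length between nodes i and j of the functional network of n nodes; higher values indicate more efficient parallel information transfer.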
Bidelman, Gavin M.; Heinz, Michael G.
2011-01-01
Human listeners prefer consonant over dissonant musical intervals and the perceived contrast between these classes is reduced with cochlear hearing loss. Population-level activity of normal and impaired model auditory-nerve (AN) fibers was examined to determine (1) if peripheral auditory neurons exhibit correlates of consonance and dissonance and (2) if the reduced perceptual difference between these qualities observed for hearing-impaired listeners can be explained by impaired AN responses. In addition, acoustical correlates of consonance-dissonance were also explored including periodicity and roughness. Among the chromatic pitch combinations of music, consonant intervals/chords yielded more robust neural pitch-salience magnitudes (determined by harmonicity/periodicity) than dissonant intervals/chords. In addition, AN pitch-salience magnitudes correctly predicted the ordering of hierarchical pitch and chordal sonorities described by Western music theory. Cochlear hearing impairment compressed pitch salience estimates between consonant and dissonant pitch relationships. The reduction in contrast of neural responses following cochlear hearing loss may explain the inability of hearing-impaired listeners to distinguish musical qualia as clearly as normal-hearing individuals. Of the neural and acoustic correlates explored, AN pitch salience was the best predictor of behavioral data. Results ultimately show that basic pitch relationships governing music are already present in initial stages of neural processing at the AN level. PMID:21895089
Goal selection versus process control in a brain-computer interface based on sensorimotor rhythms.
Royer, Audrey S; He, Bin
2009-02-01
In a brain-computer interface (BCI) utilizing a process control strategy, the signal from the cortex is used to control the fine motor details normally handled by other parts of the brain. In a BCI utilizing a goal selection strategy, the signal from the cortex is used to determine the overall end goal of the user, and the BCI controls the fine motor details. A BCI based on goal selection may be an easier and more natural system than one based on process control. Although goal selection in theory may surpass process control, the two have never been directly compared, as we are reporting here. Eight young healthy human subjects participated in the present study, three trained and five naïve in BCI usage. Scalp-recorded electroencephalograms (EEG) were used to control a computer cursor during five different paradigms. The paradigms were similar in their underlying signal processing and used the same control signal. However, three were based on goal selection, and two on process control. For both the trained and naïve populations, goal selection had more hits per run, was faster, more accurate (for seven out of eight subjects) and had a higher information transfer rate than process control. Goal selection outperformed process control in every measure studied in the present investigation.
A model for the geomorphic development of normal-fault facets
NASA Astrophysics Data System (ADS)
Tucker, G. E.; Hobley, D. E. J.; McCoy, S. W.
2014-12-01
Triangular facets are among the most striking landforms associated with normal faulting. The genesis of facets is of great interest both for the information facets contain about tectonic motion, and because the progressive emergence of facets makes them potential recorders of both geomorphic and tectonic history. In this report, we present observations of triangular facets in the western United States and in the Italian Central Apennines. Facets in these regions typically form quasi-planar surfaces that are aligned in series along and above the trace of an active fault. Some facet surfaces consist mainly of exposed bedrock, with a thin and highly discontinuous cover of loose regolith. Other facets are mantled by a several-decimeter-thick regolith cover. Over the course of its morphologic development, a facet slope segment may evolve from a steep (~60 degree) bedrock fault scarp, well above the angle of repose for soil, to a gentler (~20-40 degree) slope that can potentially sustain a coherent regolith cover. This evolutionary trajectory across the angle of repose renders nonlinear diffusion theory inapplicable. To formulate an alternative process-based theory for facet evolution, we use a particle-based approach that acknowledges the possibility for both short- and long-range sediment-grain motions, depending on the topography. The processes of rock weathering, grain entrainment, and grain motion are represented as stochastic state-pair transitions with specified transition rates. The model predicts that facet behavior can range smoothly along the spectrum from a weathering-limited mode to a transport-limited mode, depending on the ratio of fault-slip rate to bare-bedrock regolith production rate. The model also implies that facets formed along a fault with pinned tips should show systematic variation in slope angle that correlates with along-fault position and slip rate. Preliminary observations from central Italy and the eastern Basin and Range are consistent with this prediction.
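As a toy illustration of "stochastic state-pair transitions with specified rates" (a drastic simplification, not the authors' particle-based facet model): cells switch between bare-bedrock and regolith-covered states at fixed weathering and transport rates, and the equilibrium cover fraction tracks the ratio of the two rates, echoing the weathering-limited versus transport-limited spectrum described above.

```python
import numpy as np

rng = np.random.default_rng(2)

def facet_cover_fraction(weathering_rate, transport_rate, n_cells=200, t_end=20.0):
    """Toy two-state continuous-time Markov model of a facet surface.
    Each cell weathers (bedrock -> regolith) at `weathering_rate` and loses
    its cover (regolith -> bedrock) at `transport_rate`. Returns the cover
    fraction at t_end. Rates, states, and geometry are illustrative only."""
    covered = np.zeros(n_cells, dtype=bool)
    t = 0.0
    while t < t_end:
        rates = np.where(covered, transport_rate, weathering_rate)
        total = rates.sum()
        t += rng.exponential(1.0 / total)            # Gillespie waiting time
        i = rng.choice(n_cells, p=rates / total)     # pick the cell that transitions
        covered[i] = ~covered[i]
    return covered.mean()

for ratio in (0.2, 1.0, 5.0):
    print(ratio, facet_cover_fraction(weathering_rate=ratio, transport_rate=1.0))
```

With weathering rate w and transport rate e, the expected equilibrium cover fraction is w/(w + e), so the printed values should approach roughly 1/6, 1/2, and 5/6; a low ratio corresponds to the bare, weathering-limited end of the spectrum and a high ratio to the regolith-mantled, transport-limited end.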
Foster, Michele; Burridge, Letitia; Donald, Maria; Zhang, Jianzhen; Jackson, Claire
2016-01-14
Service delivery innovation is at the heart of efforts to combat the growing burden of chronic disease and escalating healthcare expenditure. Small-scale, locally-led service delivery innovation is a valuable source of learning about the complexities of change and the actions of local change agents. This exploratory qualitative study captures the perspectives of clinicians and managers involved in a general practitioner-led integrated diabetes care innovation. Data on these change agents' perspectives on the local innovation and how it works in the local context were collected through focus groups and semi-structured interviews at two primary health care sites. Transcribed data were analysed thematically. Normalization Process Theory provided a framework to explore perspectives on the individual and collective work involved in putting the innovation into practice in local service delivery contexts. Twelve primary health care clinicians, hospital-based medical specialists and practice managers participated in the study, which represented the majority involved in the innovation at the two sites. The thematic analysis highlighted three main themes of local innovation work: 1) trusting and embedding new professional relationships; 2) synchronizing services and resources; and 3) reconciling realities of innovation work. As a whole, the findings show that while locally-led service delivery innovation is designed to respond to local problems, convincing others to trust change and managing the boundary tensions is core to local work, particularly when it challenges taken-for-granted practices and relationships. Despite this, the findings also show that local innovators can and do act in both discretionary and creative ways to progress the innovation. The use of Normalization Process Theory uncovered some critical professional, organizational and structural factors early in the progression of the innovation. The key to local service delivery innovation lies in building coalitions of trust at the point of service delivery and persuading organizational and institutional mindsets to consider the opportunities of locally-led innovation.
Pieterse, Arwen H; de Vries, Marieke; Kunneman, Marleen; Stiggelbout, Anne M; Feldman-Stewart, Deb
2013-01-01
Healthcare decisions, particularly those involving weighing benefits and harms that may significantly affect quality and/or length of life, should reflect patients' preferences. To support patients in making choices, patient decision aids and values clarification methods (VCM) in particular have been developed. VCM intend to help patients to determine the aspects of the choices that are important to their selection of a preferred option. Several types of VCM exist. However, they are often designed without clear reference to theory, which makes it difficult for their development to be systematic and internally coherent. Our goal was to provide theory-informed recommendations for the design of VCM. Process theories of decision making specify components of decision processes, thus, identify particular processes that VCM could aim to facilitate. We conducted a review of the MEDLINE and PsycINFO databases and of references to theories included in retrieved papers, to identify process theories of decision making. We selected a theory if (a) it fulfilled criteria for a process theory; (b) provided a coherent description of the whole process of decision making; and (c) empirical evidence supports at least some of its postulates. Four theories met our criteria: Image Theory, Differentiation and Consolidation theory, Parallel Constraint Satisfaction theory, and Fuzzy-trace Theory. Based on these, we propose that VCM should: help optimize mental representations; encourage considering all potentially appropriate options; delay selection of an initially favoured option; facilitate the retrieval of relevant values from memory; facilitate the comparison of options and their attributes; and offer time to decide. In conclusion, our theory-based design recommendations are explicit and transparent, providing an opportunity to test each in a systematic manner. Copyright © 2012 Elsevier Ltd. All rights reserved.
Rippon, T. S.
1928-01-01
(I) Theory. Rivers' theory of the "danger instincts" is a key to the problem of moral and the prevention of war neuroses.
(II) Causes of War Neuroses. These are believed to be largely mental, e.g., conflict between the instinct of self-preservation and the sense of duty.
(III) Instinct of Self-Preservation. This subject presents difficulties, because people react in so many different ways; a man may be impelled to run away, or to become aggressive or even motionless when in danger.
(IV) Importance. The importance of knowing all the reactions of the normal man to danger is, first, the need to know the normal before considering the abnormal states; second, the chemical warfare of the future will involve increased emotional stress; third, in such war, there will be an additional strain of inactivity during a gas attack.
(V) The Danger Instincts as described by Rivers. Reaction by flight. Aggression. Manipulative activity. Immobility and collapse. Emotional states associated with reactions. Conflict between different tendencies the reason for collapse when in danger.
(VI) Evidence supporting Rivers' Theories. Relative severity of war neurosis in pilots, observers, balloon officers, Army officers and submarine crews. Investigation on reactions of pilots to danger and fear.
(VII) Rivers' Theory applied to Moral (Mental Hygiene). Knowledge of normal reactions to danger enables the medical officer to help to maintain moral by: (a) preparing the mind to meet danger, explaining that fear is a natural emotion under certain circumstances, and the need for self-control but not shame; (b) prevention of repression; (c) counter-suggestion and panic.
(VIII) Concluding Statement on Cowardice. Difficulty in distinguishing cowardice from neurosis. Definition suggested. Medical tests. PMID:19986246
ERIC Educational Resources Information Center
Sharma, Kshitij; Chavez-Demoulin, Valérie; Dillenbourg, Pierre
2017-01-01
The statistics used in education research are based on central trends such as the mean or standard deviation, discarding outliers. This paper adopts another viewpoint that has emerged in statistics, called extreme value theory (EVT). EVT claims that the bulk of a normal distribution consists mainly of uninteresting variation, while the most…
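As an illustrative aside (not taken from the paper), the block-maxima approach at the core of EVT can be sketched in a few lines: instead of modelling the mean of a sample, one fits a generalized extreme value (GEV) distribution to the maxima of fixed-size blocks. The data, block size and exceedance level below are hypothetical.

```python
# Hedged sketch of block-maxima EVT with hypothetical data; not code from the paper.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Hypothetical observations: 200 blocks of 50 measurements each.
observations = rng.normal(loc=600.0, scale=80.0, size=(200, 50))

block_maxima = observations.max(axis=1)           # keep only the extreme of each block
shape, loc, scale = genextreme.fit(block_maxima)  # fit the GEV distribution

# Tail probability that a block maximum exceeds an arbitrary level of 900.
p_exceed = genextreme.sf(900.0, shape, loc=loc, scale=scale)
print(f"GEV fit: shape={shape:.3f}, loc={loc:.1f}, scale={scale:.1f}; P(max > 900) = {p_exceed:.4f}")
```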
Referential Communication Abilities and Theory of Mind Development in Preschool Children
ERIC Educational Resources Information Center
Resches, Mariela; Pereira, Miguel Perez
2007-01-01
This work aims to analyse the specific contribution of social abilities (here considered as the capacity for attributing knowledge to others) in a particular communicative context. 74 normally developing children (aged 3;4 to 5;9, M=4.6) were given two Theory of Mind (ToM) tasks, which are considered to assess increasing complexity levels of…
NASA Technical Reports Server (NTRS)
Unz, H.; Roskam, J.
1979-01-01
The theory of an acoustic plane wave normally incident on a clamped panel in a rectangular duct is developed. The coupling theory between the elastic vibrations of the panel (plate) and the acoustic wave propagation in infinite space and in the rectangular duct is considered. The partial differential equation which governs the vibration of the panel (plate) is modified by adding stiffness (spring) forces and damping forces, and the fundamental resonance frequency and the attenuation factor are discussed. The noise reduction expression based on the theory is found to agree well with the corresponding experimental data for a sample aluminum panel in the mass-controlled region, the damping-controlled region, and the stiffness-controlled region. All the frequency positions of the upward and downward resonance spikes in the sample experimental data are identified theoretically as resulting from four cross-interacting major resonance phenomena: the cavity resonance, the acoustic resonance, the plate resonance, and the wooden back panel resonance.
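For orientation only, the mass-controlled region mentioned above is commonly summarized by the textbook normal-incidence mass law; the short sketch below evaluates it for a hypothetical 1 mm aluminum panel and is not the coupled panel/duct theory developed in the report.

```python
# Illustrative sketch: textbook normal-incidence mass law for panel transmission loss.
# This is NOT the coupled panel/duct theory of the report; it only shows the
# mass-controlled trend (about +6 dB per doubling of frequency or surface mass).
import numpy as np

rho_air = 1.21      # kg/m^3, air density (assumed standard conditions)
c_air = 343.0       # m/s, speed of sound in air
thickness = 1.0e-3  # m, hypothetical aluminum panel thickness
rho_al = 2700.0     # kg/m^3, aluminum density
m_surface = rho_al * thickness  # surface mass density, kg/m^2

def mass_law_tl(freq_hz):
    """Normal-incidence transmission loss (dB) in the mass-controlled region."""
    omega = 2.0 * np.pi * freq_hz
    ratio = omega * m_surface / (2.0 * rho_air * c_air)
    return 10.0 * np.log10(1.0 + ratio**2)

freqs = np.array([125.0, 250.0, 500.0, 1000.0, 2000.0])
for f, tl in zip(freqs, mass_law_tl(freqs)):
    print(f"{f:6.0f} Hz  TL ~ {tl:5.1f} dB")
```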
Analysis and modification of theory for impact of seaplanes on water
NASA Technical Reports Server (NTRS)
Mayo, Wilbur L
1945-01-01
An analysis of available theory on seaplane impact and a proposed modification thereto are presented. In previous methods the overall momentum of the float and virtual mass has been assumed to remain constant during the impact, but the present analysis shows that this assumption is rigorously correct only when the resultant velocity of the float is normal to the keel. The proposed modification chiefly accounts for two facts: that the forward velocity of the seaplane float causes momentum to be passed into the hydrodynamic downwash (an action that is the entire consideration in the case of the planing float), and that, for an impact with trim, the rate of penetration is determined not only by the velocity component normal to the keel but also by the velocity component parallel to the keel, which tends to reduce the penetration. Experimental data for planing, oblique impact, and vertical drop are used to show that the accuracy of the proposed theory is good.
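A minimal geometric sketch may help fix the idea of the two velocity components: it simply resolves a hypothetical resultant float velocity into keel-normal and keel-parallel parts for an assumed trim and flight-path angle. It is not Mayo's hydrodynamic impact model, and all numbers are made up.

```python
# Minimal geometric sketch (not Mayo's impact model): decompose the float's
# resultant velocity into components normal and parallel to the keel.
# Numerical values are hypothetical.
import math

trim_deg = 6.0        # keel angle above the water surface (assumed)
glide_path_deg = 3.0  # flight-path angle below horizontal (assumed)
speed = 30.0          # m/s, resultant velocity of the float (assumed)

# Angle between the velocity vector and the keel line.
angle_between = math.radians(trim_deg + glide_path_deg)

v_normal_to_keel = speed * math.sin(angle_between)    # drives penetration
v_parallel_to_keel = speed * math.cos(angle_between)  # per the abstract, tends to reduce penetration

print(f"normal-to-keel component:   {v_normal_to_keel:5.2f} m/s")
print(f"parallel-to-keel component: {v_parallel_to_keel:5.2f} m/s")
```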
Contrasting Ohlsson's Resubsumption Theory with Chi's Categorical Shift Theory
ERIC Educational Resources Information Center
Chi, Michelene T. H.; Brem, Sarah K.
2009-01-01
Ohlsson's proposal of resubsumption as the dominant process in conceptual, or nonmonotonic, change presents a worthy challenge to more established theories, such as Chi's theory of ontological shift. The two approaches differ primarily in that Ohlsson's theory emphasizes a process of learning in which narrower, more specific concepts are subsumed…
NASA Astrophysics Data System (ADS)
Glazoff, Michael Vasily
In the post-Fukushima world, thermal and structural stability of materials under extreme conditions is an important issue for the safety of nuclear reactors. Because the nuclear industry will continue using zirconium (Zr) cladding for the foreseeable future, it becomes critical to gain a fundamental understanding of several interconnected problems. First, what are the thermodynamic and kinetic factors affecting oxidation and hydrogen pick-up by these materials under normal and off-normal conditions and in long-term storage? Secondly, what protective coatings could be used in order to gain valuable time under off-normal conditions (temperatures exceeding ~1200°C (2200°F))? Thirdly, the kinetics of the coating's oxidation must be understood. Lastly, automated inspection algorithms are needed to identify cladding defects. This work attempts to explore the problem from a computational perspective, utilizing first-principles atomistic simulations, computational thermodynamics, plasticity theory, and morphological algorithms of image processing for defect identification. It consists of four parts dealing with these four problem areas, preceded by an introduction. In the 1st part, computational thermodynamics and ab initio calculations were used to shed light upon the different stages of zircaloy oxidation and hydrogen pickup, and upon microstructure optimization to increase thermal stability. The 2nd part describes the kinetic theory of oxidation of several materials considered to be prospective coatings for Zr alloys: SiC and ZrSiO4. The 3rd part deals with understanding the respective roles of the two different plasticity mechanisms in Zr nuclear alloys: twinning (at low T) and crystallographic slip (at higher T). To that end, an advanced plasticity model is proposed. In the 4th part, projectional algorithms for defect identification in zircaloy coatings are described. Conclusions and recommendations are presented in the 5th part. The value of this integrative approach lies in developing a multi-faceted understanding of the complex processes taking place in nuclear fuel rods. It helped address several issues pertaining to safe operation with nuclear fuel: establishing temperature limits that should be strictly obeyed in storage to retard zircaloy hydriding; understanding the benefits and limitations of coatings; developing an in-depth understanding of Zr plasticity; and developing original algorithms for defect identification in SiC-braided zircaloy. The obtained results will be useful for the nuclear industry.
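As a hedged illustration of the kind of morphological image processing mentioned above, the sketch below runs a simple top-hat style screening pass on a synthetic grayscale "cladding surface" to flag scratch-like anomalies. The image, structuring element size and threshold are hypothetical; this is not the dissertation's actual algorithm.

```python
# Hedged sketch of morphological defect screening on a synthetic image;
# not the dissertation's algorithm, and all parameters are hypothetical.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
image = rng.normal(loc=0.5, scale=0.01, size=(256, 256))  # synthetic "clean" surface
image[100:104, 50:180] += 0.15                            # synthetic scratch-like bright defect

# Grey opening removes bright structures thinner than the structuring element;
# the difference (white top-hat) highlights features brighter than the local background.
background = ndimage.grey_opening(image, size=(9, 9))
residue = image - background

defect_mask = residue > 0.08                  # hypothetical threshold
labels, n_defects = ndimage.label(defect_mask)
print(f"candidate defect regions found: {n_defects}")
```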
Hydrogels for engineering: normalization of swelling due to arbitrary stimulus
NASA Astrophysics Data System (ADS)
Ehrenhofer, Adrian; Wallmersperger, Thomas
2017-04-01
In engineering, materials are chosen from databases: engineers rely on specific parameters such as Young's modulus, yield stress or thermal expansion coefficients for a desired application. For hydrogels, the choice of materials is rather tedious, since no generalized material parameters are currently available to quantify the swelling behavior. The normalization of swelling, which we present in the current work, allows an easy comparison of different hydrogel materials. Thus, for a specific application like a sensor or an actuator, an adequate material can be chosen. In the current work, we present the process of normalization and provide a course of action for the data analysis. Special challenges for hydrogels, like hysteresis, conditional multi-sensitivity and anisotropic swelling, are addressed. Then, the Temperature Expansion Model is briefly described and applied. Using the normalized swelling curves, a nonlinear expansion coefficient β(F) is derived. The derived material behavior is used in an analytical model to predict the bending behavior of a beam made of thermo-responsive hydrogel material under an anisotropic temperature load. A bending of the beam can be observed, and the impact of other geometry and material parameters can be investigated. To overcome the limitations of the one-dimensional beam theory, the material behavior and geometry can be implemented in Finite Element analysis tools. Thus, novel applications for hydrogels in various fields can be envisioned, designed and tested. This can lead to a wider use of smart materials in sensor or actuator devices, even by engineers without a chemical background.
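As a rough illustration of how a normalized swelling curve can feed an engineering calculation, the sketch below treats the local slope of a hypothetical stretch-versus-temperature curve as an effective expansion coefficient and plugs it into a standard Euler-Bernoulli curvature estimate for a through-thickness temperature difference. It is an assumption-laden analogy, not the paper's Temperature Expansion Model, and all data are invented.

```python
# Minimal sketch, assuming a thermal-expansion analogy (not the paper's model):
# normalize a hypothetical swelling curve and use its local slope as an effective
# expansion coefficient in a simple beam-curvature estimate.
import numpy as np

# Hypothetical equilibrium swelling data for a thermo-responsive gel:
temperature = np.array([20.0, 25.0, 30.0, 32.0, 34.0, 36.0, 40.0])   # deg C
stretch = np.array([1.60, 1.55, 1.40, 1.20, 1.05, 1.01, 1.00])       # lambda = L / L_dry

# Normalization: map the swelling stretch onto [0, 1] so different gels can be compared.
normalized = (stretch - stretch.min()) / (stretch.max() - stretch.min())

# Effective (nonlinear) expansion coefficient: (1/lambda) * d(lambda)/dT, in 1/K.
beta_eff = np.gradient(stretch, temperature) / stretch

# Euler-Bernoulli estimate of curvature for a beam of thickness h with a linear
# through-thickness temperature difference dT (hypothetical values).
h = 1.0e-3      # m, beam thickness
dT = 2.0        # K, difference between top and bottom faces
T_mean = 32.0   # deg C, operating point
beta_at_T = np.interp(T_mean, temperature, beta_eff)
curvature = beta_at_T * dT / h  # 1/m

print(f"normalized swelling at {T_mean} C: {np.interp(T_mean, temperature, normalized):.2f}")
print(f"beta({T_mean} C) ~ {beta_at_T:.3f} 1/K, curvature ~ {curvature:.1f} 1/m")
```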
NASA Astrophysics Data System (ADS)
Nikitenko, V. R.; von Seggern, H.
2007-11-01
An analytic theory of nonequilibrium hopping charge transport in disordered organic materials includes quasiequilibrium (normal) and extremely nonequilibrium (dispersive) regimes as limiting cases at long and short times, respectively. In the intermediate time interval, the quasiequilibrium value of the mobility is nearly established while the coefficient of field-assisted diffusion continues to increase (the quasidispersive regime). Therefore, normalized time dependencies of the transient current under time-of-flight (TOF) conditions are practically independent of field strength and sample thickness, in good agreement both with TOF experimental data for molecularly doped polymers and with results of numerical simulations of the Gaussian disorder model. An analytic model of transient electroluminescence (TEL) is developed on the basis of this theory. A strong asymmetry of mobilities is assumed. In analogy with TOF transients, the dispersion parameter of the normalized TEL intensity is anomalously large and almost field-independent in the quasidispersive regime of transport. A method for determining the mobility from TEL data is proposed.
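For context, the conventional way to extract a mobility from a TOF transient is the relation mu = d^2 / (V * t_transit), evaluated below with hypothetical numbers; the paper's proposed TEL-based method is not reproduced here.

```python
# Illustrative sketch: the standard time-of-flight mobility estimate
# mu = d^2 / (V * t_transit). This is the conventional TOF relation, not the
# TEL-based method proposed in the paper; all numbers are hypothetical.
d = 10.0e-6         # m, sample thickness
V = 50.0            # V, applied bias
t_transit = 2.0e-2  # s, transit time read from the kink in the transient current

E_field = V / d                   # V/m
mu = d / (E_field * t_transit)    # m^2/(V s), equivalent to d^2 / (V * t_transit)
print(f"E = {E_field:.2e} V/m, mu = {mu:.2e} m^2/(V s) = {mu * 1e4:.2e} cm^2/(V s)")
```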
Electric field induced sheeting and breakup of dielectric liquid jets
NASA Astrophysics Data System (ADS)
Khoshnevis, Ahmad; Tsai, Scott S. H.; Esmaeilzadeh, Esmaeil
2014-01-01
We report experimental observations of the controlled deformation of a dielectric liquid jet subjected to a local high-voltage electrostatic field in the direction normal to the jet. The jet deforms into the shape of an elliptic cylinder upon application of a normal electrostatic field. As the applied electric field strength is increased, the elliptic cylindrical jet deforms permanently into a flat sheet, and eventually breaks up into droplets. We interpret this observation, in which the jet stretches in the direction normal to the applied electric field, qualitatively using the Taylor-Melcher leaky dielectric theory, and develop a simple scaling model that predicts the critical electric field strength for the jet-to-sheet transition. Our model shows good agreement with experimental results, and has a form that is consistent with the classical drop deformation criterion in the Taylor-Melcher theory. Finally, we statistically analyze the droplets resulting from sheet breakup, and find that increasing the applied electric field strength improves droplet uniformity and reduces droplet size.
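Purely as a dimensional-analysis sketch (not the authors' scaling model), one can estimate the order of magnitude of the field needed for large jet deformation by balancing the electric normal stress against the capillary pressure of the jet; all material and geometric values below are assumptions.

```python
# Dimensional-analysis sketch (not the paper's scaling model): balance the electric
# normal stress ~ eps0 * eps_r * E^2 against the capillary pressure gamma / r of the
# jet to get an order-of-magnitude critical field for large deformation.
# All values are hypothetical.
import math

eps0 = 8.854e-12   # F/m, vacuum permittivity
eps_r = 2.2        # relative permittivity of the ambient phase (assumed)
gamma = 0.025      # N/m, interfacial tension (assumed)
r_jet = 250e-6     # m, jet radius (assumed)

E_critical = math.sqrt(gamma / (eps0 * eps_r * r_jet))  # V/m
print(f"order-of-magnitude critical field: {E_critical:.2e} V/m "
      f"({E_critical * r_jet:.0f} V across one jet radius)")
```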
Dunford, Jeffrey L; Dhirani, Al-Amin
2008-11-12
Interfaces between disordered normal materials and superconductors (S) can exhibit 'reflectionless tunnelling' (RT), a phenomenon that arises from repeated disorder-driven elastic scattering, multiple Andreev reflections, and electron/hole interference. RT has been used to explain zero-bias conductance peaks (ZBCPs) observed using doped semiconductors and evaporated granular metal films as the disordered normal materials. Recently, in addition to ZBCPs, magnetoconductance oscillations predicted by RT theory have been observed using a novel disordered normal material: self-assembled nanoparticle films. In the present study, we find that the period of these oscillations decreases as temperature (T) increases. This suggests that the magnetic flux associated with interfering pathways increases accordingly. We propose that the increasing flux can be attributed to magnetic field penetration into S as [Formula: see text]. This model agrees remarkably well with the known T dependence of the penetration depth predicted by Bardeen-Cooper-Schrieffer theory. Our study shows that this additional region of flux is significant and must be considered in experimental and theoretical studies of RT.
Perinatal sadness among Shuar women: support for an evolutionary theory of psychic pain.
Hagen, Edward H; Barrett, H Clark
2007-03-01
Psychiatry faces an internal contradiction in that it regards mild sadness and low mood as normal emotions, yet when these emotions are directed toward a new infant, it regards them as abnormal. We apply parental investment theory, a widely used framework from evolutionary biology, to maternal perinatal emotions, arguing that negative emotions directed toward a new infant could serve an important evolved function. If so, then under some definitions of psychiatric disorder, these emotions are not disorders. We investigate the applicability of parental investment theory to maternal postpartum emotions among Shuar mothers. Shuar mothers' conceptions of perinatal sadness closely match predictions of parental investment theory.
ERIC Educational Resources Information Center
Bukach, Cindy M.; Bub, Daniel N.; Masson, Michael E. J.; Lindsay, D. Stephen
2004-01-01
Studies of patients with category-specific agnosia (CSA) have given rise to multiple theories of object recognition, most of which assume the existence of a stable, abstract semantic memory system. We applied an episodic view of memory to questions raised by CSA in a series of studies examining normal observers' recall of newly learned attributes…
ERIC Educational Resources Information Center
Sovik, Nils; Arntzen, Oddvar
1986-01-01
General movement/feedback theory and a "two-routes" theoretical model were tested on 24 normal, 24 dyslexic, and 24 dysgraphic children. Familiarity of the test items and complexity and length of required movement pattern played an important role in the writing/spelling performance of the nine-year-old subjects defined as dyslexic or dysgraphic.…