1983-08-01
NPRDC TR 83-32, August 1983. Tailored Testing Theory and Practice: A Basic Model, Normal Ogive Submodels, and Tailored Testing Algorithms. From a single common-factor model, the author derives the two- and three-parameter normal ogive item response functions as submodels. For both of these…
ERIC Educational Resources Information Center
Hufano, Linda D.
The study examined emotional-motivational personality characteristics of 15 learning disabled, 15 normal achieving, and 15 high achieving students (grades 3-5). The study tested the hypothesis derived from the A-R-D (attitude-reinforcer-discriminative) theory of motivation that learning disabled (LD) children differ from normal and high achieving…
Thompson, G Brian; Fletcher-Flinn, Claire M; Wilson, Kathryn J; McKay, Michael F; Margrain, Valerie G
2015-03-01
Predictions from theories of the processes of word reading acquisition have rarely been tested against evidence from exceptionally early readers. The theories of Ehri, Share, and Byrne, and an alternative, Knowledge Sources theory, were so tested. The former three theories postulate that full development of context-free letter sounds and awareness of phonemes are required for normal acquisition, while the claim of the alternative is that with or without such, children can use sublexical information from their emerging reading vocabularies to acquire word reading. Results from two independent samples of children aged 3-5, and 5 years, with mean word reading levels of 7 and 9 years respectively, showed underdevelopment of their context-free letter sounds and phoneme awareness, relative to their word reading levels and normal comparison samples. Despite such underdevelopment, these exceptional readers engaged in a form of phonological recoding that enabled pseudoword reading, at the level of older-age normal controls matched on word reading level. Moreover, in the 5-year-old sample further experiments showed that, relative to normal controls, they had a bias toward use of sublexical information from their reading vocabularies for phonological recoding of heterophonic pseudowords with irregular consistent spelling, and were superior in accessing word meanings independently of phonology, although only if the readers were without exposure to explicit phonics. The three theories were less satisfactory than the alternative theory in accounting for the learning of the exceptionally early readers. Copyright © 2014 Elsevier B.V. All rights reserved.
Theory and tests of a thermal ion detector sensitive only at near-normal incidence
NASA Technical Reports Server (NTRS)
Robinson, J. W.
1981-01-01
Measurements of thermal ions are influenced by factors such as spacecraft potential, velocity, angle of attack, and sheath size. A theory is presented for the response of an instrument which accepts ions only within a small angle of incidence from normal. Although a more general theory is available and forms the basis of this one, the small angle restriction allows a simpler formulation which does not depend on sheath size. Furthermore, practical instruments are easily designed around this restriction. Laboratory tests verify that such instruments respond as expected and they illustrate how design details influence perturbations from the ideal response characteristics.
Destination memory and cognitive theory of mind in normal ageing.
El Haj, Mohamad; Raffard, Stéphane; Gély-Nargeot, Marie-Christine
2016-01-01
Destination memory is the ability to remember the destination to which a piece of information has been addressed (e.g., "Did I tell you about the promotion?"). This ability is found to be impaired in normal ageing. Our work aimed to link this deterioration to the decline in theory of mind. Forty younger adults (M age = 23.13 years, SD = 4.00) and 36 older adults (M age = 69.53 years, SD = 8.93) performed a destination memory task. They also performed the False-belief test addressing cognitive theory of mind and the Reading the mind in the eyes test addressing affective theory of mind. Results showed significant deterioration in destination memory, cognitive theory of mind and affective theory of mind in the older adults. The older adults' performance on destination memory was significantly correlated with and predicted by their performance on cognitive theory of mind. Difficulties in the ability to interpret and predict others' mental states are related to destination memory decline in older adults.
Normal Aging and Linguistic Decrement.
ERIC Educational Resources Information Center
Emery, Olga B.
A study investigated language patterning, as an indication of synthetic mental activity, in comparison groups of normal pre-middle-aged adults (30-42 years), normal elderly adults (75-93), and elderly adults (71-91) with Alzheimer's dementia. Semiotic theory was used as the conceptual context. Linguistic measures included the Token Test, the…
ERIC Educational Resources Information Center
Sengul Avsar, Asiye; Tavsancil, Ezel
2017-01-01
This study analysed polytomous items' psychometric properties according to nonparametric item response theory (NIRT) models. Thus, simulated datasets--three different test lengths (10, 20 and 30 items), three sample distributions (normal, right and left skewed) and three samples sizes (100, 250 and 500)--were generated by conducting 20…
Ziatabar Ahmadi, Seyyede Zohreh; Jalaie, Shohreh; Ashayeri, Hassan
2015-09-01
Theory of mind (ToM) or mindreading is an aspect of social cognition that evaluates mental states and beliefs of oneself and others. Validity and reliability are very important criteria when evaluating standard tests; without them, these tests are not usable. The aim of this study was to systematically review the validity and reliability of published English comprehensive ToM tests developed for normal preschool children. We searched MEDLINE (PubMed interface), Web of Science, ScienceDirect, PsycINFO, and evidence-based medicine (The Cochrane Library) databases from 1990 to June 2015. The search strategy was the Latin transcription of 'Theory of Mind' AND test AND children. We also manually studied the reference lists of all finally selected articles and carried out a search of their references. Inclusion criteria were as follows: valid and reliable diagnostic ToM tests published from 1990 to June 2015 for normal preschool children. Exclusion criteria were as follows: studies that only used ToM tests and single tasks (false belief tasks) for ToM assessment and/or had no description of the structure, validity, or reliability of their tests. Methodological quality of the selected articles was assessed using the Critical Appraisal Skills Programme (CASP). In the primary search, we found 1237 articles across all databases. After removing duplicates and applying all inclusion and exclusion criteria, we selected 11 tests for this systematic review. There were few valid, reliable, and comprehensive ToM tests for normal preschool children. However, we had limitations concerning the included articles. The identified ToM tests differed in populations, tasks, modes of presentation, scoring, modes of response, times, and other variables, and they had various validities and reliabilities. Therefore, it is recommended that researchers and clinicians select ToM tests according to their psychometric characteristics, validity, and reliability.
ERIC Educational Resources Information Center
Sovik, Nils; Arntzen, Oddvar
1986-01-01
General movement/feedback theory and a "two-routes" theoretical model were tested on 24 normal, 24 dyslexic, and 24 dysgraphic children. Familiarity of the test items and complexity and length of required movement pattern played an important role in the writing/spelling performance of the nine-year-old subjects defined as dyslexic or dysgraphic.…
Fonseca, A C; Yule, W
1995-12-01
Two studies were conducted to test the hypotheses derived from Eysenck's and Gray's theories of personality regarding antisocial behavior. For this purpose the Eysenck Personality Questionnaire (Junior) (EPQ-Junior) and a card task aimed at measuring sensitivity to reward were used in each of the studies. The first study compared a group of juvenile delinquents with a group of nondelinquents and the second study compared a group of severely conduct-disordered children with a group of normal children. The results did not support Eysenck's claim that delinquents score higher than their normal counterparts on extraversion, neuroticism, and psychoticism. Some support was found for the hypothesis derived from Gray's theory: Children and adolescents with severe antisocial behavior were more sensitive to rewards than their normal counterparts.
Investigation of Heat Transfer to a Flat Plate in a Shock Tube.
1987-12-01
Contents include: Objectives and Scope; Theory (Shock Tube Principles; Boundary Layer Theory). The sharp-edged flat plate produced heat-transfer data in excess of theory, but the rounded-edge flat plate exhibited data that matched or fell below what theory predicted for each Mach number tested. The analysis treats a normal shock advancing along an infinite flat plate; for x < U_g t there is a region of interaction between the downstream influence of the leading edge…
Theory of Mind and Language in Children with Cochlear Implants
ERIC Educational Resources Information Center
Remmel, Ethan; Peters, Kimberly
2009-01-01
Thirty children with cochlear implants (CI children), age range 3-12 years, and 30 children with normal hearing (NH children), age range 4-6 years, were tested on theory of mind and language measures. The CI children showed little to no delay on either theory of mind, relative to the NH children, or spoken language, relative to hearing norms. The…
NASA Astrophysics Data System (ADS)
Filatov, Michael; Cremer, Dieter
2002-01-01
A recently developed variationally stable quasi-relativistic method, which is based on the low-order approximation to the method of normalized elimination of the small component, was incorporated into density functional theory (DFT). The new method was tested for diatomic molecules involving Ag, Cd, Au, and Hg by calculating equilibrium bond lengths, vibrational frequencies, and dissociation energies. The method is easy to implement into standard quantum chemical programs and leads to accurate results for the benchmark systems studied.
The Distribution of the Product Explains Normal Theory Mediation Confidence Interval Estimation.
Kisbu-Sakarya, Yasemin; MacKinnon, David P; Miočević, Milica
2014-05-01
The distribution of the product has several useful applications. One of these applications is its use to form confidence intervals for the indirect effect as the product of 2 regression coefficients. The purpose of this article is to investigate how the moments of the distribution of the product explain normal theory mediation confidence interval coverage and imbalance. Values of the critical ratio for each random variable are used to demonstrate how the moments of the distribution of the product change across values of the critical ratio observed in research studies. Results of the simulation study showed that as skewness in absolute value increases, coverage decreases, and that as skewness in absolute value and kurtosis increase, imbalance increases. The difference between testing the significance of the indirect effect using the normal theory versus the asymmetric distribution of the product is further illustrated with a real data example. This article is the first study to show the direct link between the distribution of the product and indirect effect confidence intervals and clarifies the results of previous simulation studies by showing why normal theory confidence intervals for indirect effects are often less accurate than those obtained from the asymmetric distribution of the product or from resampling methods.
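The contrast the abstract draws between normal-theory and distribution-of-product intervals can be sketched numerically. This is a minimal illustration with hypothetical path coefficients and standard errors (not values from the article), approximating the distribution of the product by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical estimates (illustrative only): a-path and b-path
# regression coefficients with their standard errors.
a, se_a = 0.30, 0.10
b, se_b = 0.25, 0.12

ab = a * b
# Normal-theory (Sobel) standard error of the product a*b.
se_ab = np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)
normal_ci = (ab - 1.96 * se_ab, ab + 1.96 * se_ab)

# Monte Carlo approximation to the distribution of the product:
# draw each coefficient from its sampling distribution and multiply.
draws = rng.normal(a, se_a, 200_000) * rng.normal(b, se_b, 200_000)
product_ci = (np.percentile(draws, 2.5), np.percentile(draws, 97.5))
```

Because both critical ratios here are positive and moderate, the simulated product distribution is right-skewed, so the percentile interval is asymmetric about a*b, whereas the Sobel interval is forced to be symmetric.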
Using R to Simulate Permutation Distributions for Some Elementary Experimental Designs
ERIC Educational Resources Information Center
Eudey, T. Lynn; Kerr, Joshua D.; Trumbo, Bruce E.
2010-01-01
Null distributions of permutation tests for two-sample, paired, and block designs are simulated using the R statistical programming language. For each design and type of data, permutation tests are compared with standard normal-theory and nonparametric tests. These examples (often using real data) provide for classroom discussion use of metrics…
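The permutation logic described in the abstract (there implemented in R) can be sketched in Python for the two-sample case; the data are toy values chosen for illustration:

```python
import numpy as np

def perm_test(x, y, n_perm=10_000, seed=1):
    """Two-sample permutation test for a difference in means.
    Estimates the two-sided p-value from n_perm random shuffles
    of the pooled data."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])          # copies the data
    observed = np.mean(x) - np.mean(y)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = pooled[:len(x)].mean() - pooled[len(x):].mean()
        if abs(diff) >= abs(observed):
            count += 1
    return count / n_perm

# Toy groups: only the 2 of the 20 possible splits reproduce a
# difference this extreme, so the exact two-sided p-value is 0.10.
p = perm_test(np.array([1.0, 2.0, 3.0]), np.array([8.0, 9.0, 10.0]))
```

For samples this small the full permutation distribution could be enumerated exactly; the shuffle-based estimate above converges to the same value.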
Left Hemisphere Regions Are Critical for Language in the Face of Early Left Focal Brain Injury
ERIC Educational Resources Information Center
Beharelle, Anjali Raja; Dick, Anthony Steven; Josse, Goulven; Solodkin, Ana; Huttenlocher, Peter R.; Levine, Susan C.; Small, Steven L.
2010-01-01
A predominant theory regarding early stroke and its effect on language development is that early left hemisphere lesions trigger compensatory processes that allow the right hemisphere to assume dominant language functions, and this is thought to underlie the near normal language development observed after early stroke. To test this theory, we…
Optimal and Most Exact Confidence Intervals for Person Parameters in Item Response Theory Models
ERIC Educational Resources Information Center
Doebler, Anna; Doebler, Philipp; Holling, Heinz
2013-01-01
The common way to calculate confidence intervals for item response theory models is to assume that the standardized maximum likelihood estimator for the person parameter [theta] is normally distributed. However, this approximation is often inadequate for short and medium test lengths. As a result, the coverage probabilities fall below the given…
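The normal approximation at issue can be made concrete with a small sketch. Assuming a Rasch model (a simpler case than the general IRT setting of the article), the Wald interval for the person parameter is the estimate plus or minus z over the square root of the test information; with a hypothetical 10-item test the interval is wide, which is exactly the regime where the approximation tends to be inadequate:

```python
import math

def rasch_wald_ci(theta_hat, item_difficulties, z=1.96):
    """Normal-approximation (Wald) confidence interval for a person
    parameter in the Rasch model: theta_hat +/- z / sqrt(information),
    where information is the sum of p*(1-p) over items."""
    info = 0.0
    for b in item_difficulties:
        p = 1.0 / (1.0 + math.exp(-(theta_hat - b)))  # P(correct)
        info += p * (1.0 - p)
    half = z / math.sqrt(info)
    return theta_hat - half, theta_hat + half

# Hypothetical 10-item test with evenly spaced difficulties: the
# resulting interval spans nearly 3 logits.
lo, hi = rasch_wald_ci(0.0, [-2 + 0.4 * i for i in range(10)])
```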
Ho, Andrew D; Yu, Carol C
2015-06-01
Many statistical analyses benefit from the assumption that unconditional or conditional distributions are continuous and normal. More than 50 years ago in this journal, Lord and Cook chronicled departures from normality in educational tests, and Micceri similarly showed that the normality assumption is met rarely in educational and psychological practice. In this article, the authors extend these previous analyses to state-level educational test score distributions that are an increasingly common target of high-stakes analysis and interpretation. Among 504 scale-score and raw-score distributions from state testing programs from recent years, nonnormal distributions are common and are often associated with particular state programs. The authors explain how scaling procedures from item response theory lead to nonnormal distributions as well as unusual patterns of discreteness. The authors recommend that distributional descriptive statistics be calculated routinely to inform model selection for large-scale test score data, and they illustrate consequences of nonnormality using sensitivity studies that compare baseline results to those from normalized score scales.
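The routine distributional screening the authors recommend amounts to computing a few moments. A minimal sketch with simulated rather than state test-score data:

```python
import numpy as np

def skew_kurtosis(x):
    """Sample skewness and excess kurtosis: the two descriptive
    statistics most often used to screen score distributions."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    return (z**3).mean(), (z**4).mean() - 3.0

rng = np.random.default_rng(2)
# A well-behaved reference distribution versus a ceiling-effect-like
# (left-skewed) one, both simulated for illustration.
normal_scores = rng.normal(size=50_000)
skewed_scores = -rng.exponential(size=50_000)

s_norm, k_norm = skew_kurtosis(normal_scores)
s_skew, _ = skew_kurtosis(skewed_scores)
```

Values of skewness far from 0 (the skewed sample here comes out near -2) are the kind of routine diagnostic that would flag a scale-score distribution before a normality-assuming model is fit.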
Chou, C P; Bentler, P M; Satorra, A
1991-11-01
Research studying robustness of maximum likelihood (ML) statistics in covariance structure analysis has concluded that test statistics and standard errors are biased under severe non-normality. An estimation procedure known as asymptotic distribution free (ADF), making no distributional assumption, has been suggested to avoid these biases. Corrections to the normal theory statistics to yield more adequate performance have also been proposed. This study compares the performance of a scaled test statistic and robust standard errors for two models under several non-normal conditions and also compares these with the results from ML and ADF methods. Both ML and ADF test statistics performed rather well in one model and considerably worse in the other. In general, the scaled test statistic seemed to behave better than the ML test statistic and the ADF statistic performed the worst. The robust and ADF standard errors yielded more appropriate estimates of sampling variability than the ML standard errors, which were usually downward biased, in both models under most of the non-normal conditions. ML test statistics and standard errors were found to be quite robust to the violation of the normality assumption when data had either symmetric and platykurtic distributions, or non-symmetric and zero kurtotic distributions.
Project Physics Tests 3, The Triumph of Mechanics.
ERIC Educational Resources Information Center
Harvard Univ., Cambridge, MA. Harvard Project Physics.
Test items relating to Project Physics Unit 3 are presented in this booklet. Included are 70 multiple-choice and 20 problem-and-essay questions. Concepts of mechanics are examined on energy, momentum, kinetic theory of gases, pulse analyses, "heat death," water waves, power, conservation laws, normal distribution, thermodynamic laws, and…
Kerr, Sharyn; Durkin, Kevin
2004-12-01
Standard false belief tasks indicate that normally developing children do not fully develop a theory of mind until the age of 4 years and that children with autism have an impaired theory of mind. Recent evidence, however, suggests that children as young as 3 years of age understand that thought bubbles depict mental representations and that these can be false. Twelve normally developing children and 11 children with autism were tested on a standard false belief task and a number of tasks that employed thought bubbles to represent mental states. While the majority of normally developing children and children with autism failed the standard false belief task, they understood that (i) thought bubbles represent thought, (ii) thought bubbles can be used to infer an unknown reality, (iii) thoughts can be different, and (iv) thoughts can be false. These results indicate that autistic children with a relatively low verbal mental age may be capable of understanding mental representations.
Theory of Mind Training in Children with Autism: A Randomized Controlled Trial
ERIC Educational Resources Information Center
Begeer, Sander; Gevers, Carolien; Clifford, Pamela; Verhoeve, Manja; Kat, Kirstin; Hoddenbach, Elske; Boer, Frits
2011-01-01
Many children with Autism Spectrum Disorders (ASD) participate in social skills or Theory of Mind (ToM) treatments. However, few studies have shown evidence for their effectiveness. The current study used a randomized controlled design to test the effectiveness of a 16-week ToM treatment in 8-13 year old children with ASD and normal IQs (n = 40).…
Is the ML Chi-Square Ever Robust to Nonnormality? A Cautionary Note with Missing Data
ERIC Educational Resources Information Center
Savalei, Victoria
2008-01-01
Normal theory maximum likelihood (ML) is by far the most popular estimation and testing method used in structural equation modeling (SEM), and it is the default in most SEM programs. Even though this approach assumes multivariate normality of the data, its use can be justified on the grounds that it is fairly robust to the violations of the…
Siéroff, Eric; Piquard, Ambre
2004-12-01
Owing to progress in cognitive theories over the last twenty years, the description of attentional deficits associated with normal or pathological aging has substantially improved. In this article, attentional deficits are presented according to Posner's theory, which describes three subsystems in a global attention network: vigilance, selective attention, and command. This theory not only characterizes the functions of these subsystems but also gives precise indications about their anatomical and neurochemical substrates. Several clinical tests can be described for each of these subsystems. The main attentional deficits are presented in the second part of this paper: while some decline of attentional command occurs in normal aging, a real deficit in this subsystem is found in most degenerative processes (frontotemporal dementia, Alzheimer's and Parkinson's diseases). Alzheimer's disease is also frequently associated with a deficit of selective spatial attention early in the evolution of the disease.
ERIC Educational Resources Information Center
Lera-Miguel, Sara; Rosa, Mireia; Puig, Olga; Kaland, Nils; Lázaro, Luisa; Castro-Formieles, Josefina; Calvo, Rosa
2016-01-01
Individuals with autism spectrum disorders often fail tasks of theory of mind (ToM). However, those with normal intellectual functioning, known as high functioning ASD (HF-ASD), sometimes succeed in mentalizing inferences. Some tools have been developed to more accurately test their ToM abilities. The aims of this study were to examine the…
ERIC Educational Resources Information Center
Peterson, Candida C.
2002-01-01
Three studies examined theory-of-mind concepts among children ages 6-13 years with deafness or autism, and 4-year-olds with normal development. Findings indicated that while the children with deafness or autism scored significantly lower on standard tests of false belief understanding, they scored higher on even the most challenging drawing-based…
Fliss, Rafika; Lemerre, Marion; Mollard, Audrey
2016-06-01
Compromised theory of mind (ToM) can be explained either by a failure to implement specific representational capacities (mental state representations) or by more general executive selection demands. In older adult populations, evidence of affected executive functioning and cognitive ToM in normal aging is reported. However, links between these two functions remain unclear. In the present paper, we address these shortcomings by using a specific task of ToM and classical executive tasks. We studied, using an original cognitive ToM task, the effect of age on ToM performance, in link with progressive executive decline. 96 elderly participants were recruited. They were asked to perform a cognitive ToM task and 5 executive tests (the Stroop test and the Hayling Sentence Completion Test to assess inhibitory processes, the Trail Making Test and Verbal Fluency for shifting assessment, and backward span to estimate working-memory capacity). The results show changes in cognitive ToM performance according to executive demands. Correlational studies indicate a significant relationship between ToM performance and the selected executive measures. Regression analyses demonstrate that level of vocabulary and age are the best predictors of ToM performance. The results are consistent with the hypothesis that ToM deficits are related to age-related domain-general decline rather than to a breakdown in a specialized representational system. The implications of these findings for the nature of social cognition tests in normal aging are also discussed.
Normality of raw data in general linear models: The most widespread myth in statistics
Kery, Marc; Hatfield, Jeff S.
2003-01-01
In years of statistical consulting for ecologists and wildlife biologists, by far the most common misconception we have come across has been the one about normality in general linear models. These comprise a very large part of the statistical models used in ecology and include t tests, simple and multiple linear regression, polynomial regression, and analysis of variance (ANOVA) and covariance (ANCOVA). There is a widely held belief that the normality assumption pertains to the raw data rather than to the model residuals. We suspect that this error may also occur in countless published studies, whenever the normality assumption is tested prior to analysis. This may lead to the use of nonparametric alternatives (if there are any), when parametric tests would indeed be appropriate, or to use of transformations of raw data, which may introduce hidden assumptions such as multiplicative effects on the natural scale in the case of log-transformed data. Our aim here is to dispel this myth. We very briefly describe relevant theory for two cases of general linear models to show that the residuals need to be normally distributed if tests requiring normality are to be used, such as t and F tests. We then give two examples demonstrating that the distribution of the response variable may be nonnormal, and yet the residuals are well behaved. We do not go into the issue of how to test normality; instead we display the distributions of response variables and residuals graphically.
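The point about residuals versus raw data is easy to demonstrate numerically. In this sketch (simulated data, a two-group ANOVA-like setting), the response is strongly bimodal and hence nonnormal, yet the residuals around the group means are as normal as the errors that generated them:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two-group "ANOVA": normal errors around very different group means.
g1 = rng.normal(loc=0.0, scale=1.0, size=5_000)
g2 = rng.normal(loc=8.0, scale=1.0, size=5_000)
y = np.concatenate([g1, g2])                 # raw response: bimodal

# Model residuals: deviations from each group's fitted mean.
resid = np.concatenate([g1 - g1.mean(), g2 - g2.mean()])

def excess_kurtosis(x):
    """Excess kurtosis; 0 for a normal distribution, strongly
    negative for a well-separated two-component mixture."""
    z = (x - x.mean()) / x.std()
    return (z**4).mean() - 3.0

k_raw, k_resid = excess_kurtosis(y), excess_kurtosis(resid)
```

Testing normality on `y` would wrongly reject the model's assumptions; testing it on `resid` correctly finds nothing amiss, which is the myth the article dispels.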
Advances in Testing the Statistical Significance of Mediation Effects
ERIC Educational Resources Information Center
Mallinckrodt, Brent; Abraham, W. Todd; Wei, Meifen; Russell, Daniel W.
2006-01-01
P. A. Frazier, A. P. Tix, and K. E. Barron (2004) highlighted a normal theory method popularized by R. M. Baron and D. A. Kenny (1986) for testing the statistical significance of indirect effects (i.e., mediator variables) in multiple regression contexts. However, simulation studies suggest that this method lacks statistical power relative to some…
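The normal-theory test referred to is commonly operationalized as the Sobel z test. A minimal sketch with hypothetical path estimates (not values from the article):

```python
import math

def sobel_z(a, se_a, b, se_b):
    """Normal-theory (Sobel) z statistic for the indirect effect a*b,
    using the first-order standard error of the product."""
    se_ab = math.sqrt(a**2 * se_b**2 + b**2 * se_a**2)
    return (a * b) / se_ab

# Hypothetical path estimates and standard errors.
z = sobel_z(0.40, 0.10, 0.30, 0.10)   # = 0.12 / 0.05 = 2.4
significant = abs(z) > 1.96
```

The low power the simulation literature attributes to this approach stems from treating a*b as normal; resampling the product instead, as in the later mediation literature, relaxes that assumption.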
Application of binaural beat phenomenon with aphasic patients.
Barr, D F; Mullin, T A; Herbert, P S
1977-04-01
We investigated whether six aphasics and six normal subjects could binaurally fuse two slightly differing frequencies of constant amplitude. The aphasics were subdivided into two groups: (1) two men who had had mild cerebrovascular accidents (CVAs) during the past 15 months; (2) four men who had had severe CVAs during the last 15 months. Two tones of different frequency levels but equal intensity were presented dichotically to the subjects at 40 dB sensation level. All subjects had normal hearing at 500 Hz (0 to 25 dB). All six normal subjects and the two aphasics who had had mild CVAs could hear the binaural beats. The four aphasics who had had severe CVAs could not hear them. The resulting 2 × 2 design was compared using a chi-square test with Yates' correction and was found to be significantly different (P < .05). Two theories are presented to explain these findings: the "depression theory" and the "temporal time-sequencing theory." Therapeutic implications are also discussed relative to cerebral and/or brain stem involvement in the fusion of binaural stimuli.
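The reported chi-square can be reproduced from the counts in the abstract, assuming the 2 × 2 table pools subjects by whether they heard the beats (8 heard: the 6 normals plus the 2 mild-CVA aphasics; the 4 severe-CVA aphasics did not):

```python
def yates_chi2(table):
    """Chi-square statistic for a 2x2 table with Yates' continuity
    correction: sum of (|observed - expected| - 0.5)^2 / expected."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row = [a + b, c + d]
    col = [a + c, b + d]
    stat = 0.0
    for i, cells in enumerate(table):
        for j, o in enumerate(cells):
            e = row[i] * col[j] / n
            stat += (abs(o - e) - 0.5) ** 2 / e
    return stat

# Rows: heard beats / did not; columns: normal-or-mild / severe CVA.
chi2 = yates_chi2([[8, 0], [0, 4]])
```

The statistic comes to about 7.92, exceeding the .05 critical value of 3.84 on 1 df, consistent with the reported P < .05.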
Linear perturbations of black holes: stability, quasi-normal modes and tails
NASA Astrophysics Data System (ADS)
Zhidenko, Alexander
2009-03-01
Black holes have their proper oscillations, which are called quasi-normal modes. The proper oscillations of astrophysical black holes may be observed in the near future with the help of gravitational wave detectors. Quasi-normal modes are also very important in the context of testing the stability of black objects, the anti-de Sitter/Conformal Field Theory (AdS/CFT) correspondence, and higher dimensional theories such as brane-world scenarios and string theory. This dissertation reviews a number of works, which provide a thorough study of the quasi-normal spectrum of a wide class of black holes in four and higher dimensions for fields of various spin and for gravitational perturbations. We have studied numerically the dependence of the quasi-normal modes on a number of factors, such as the presence of the cosmological constant, the Gauss-Bonnet parameter, or the aether in the space-time, and the dependence of the spectrum on the parameters of the black hole and the fields under consideration. By analysis of the quasi-normal spectrum, we have studied the stability of higher dimensional Reissner-Nordstrom-de Sitter black holes, Kaluza-Klein black holes with squashed horizons, Gauss-Bonnet black holes, and black strings. Special attention is paid to the evolution of massive fields in the background of various black holes. We have considered their quasi-normal ringing and late-time tails. In addition, we present two new numerical techniques: a generalisation of the Nollert improvement of the Frobenius method for higher dimensional problems, and a qualitatively new method that allows one to calculate quasi-normal frequencies for black holes whose metrics are not known analytically.
On Nonequivalence of Several Procedures of Structural Equation Modeling
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Chan, Wai
2005-01-01
The normal theory based maximum likelihood procedure is widely used in structural equation modeling. Three alternatives are: the normal theory based generalized least squares, the normal theory based iteratively reweighted least squares, and the asymptotically distribution-free procedure. When data are normally distributed and the model structure…
Theory-Driven Models for Correcting Fight or Flight Imbalance in Gulf War Illness
2011-09-01
…dynamics of these systems to reset control of the HPA-immune axis to normal. We have completed the negotiation of sub-awards to the CFIDS Association… We propose that severe physical or psychological insult to the endocrine and immune systems can displace these from a normal regulatory equilibrium.
EVALUATION OF A NEW MEAN SCALED AND MOMENT ADJUSTED TEST STATISTIC FOR SEM.
Tong, Xiaoxiao; Bentler, Peter M
2013-01-01
Recently a new mean scaled and skewness adjusted test statistic was developed for evaluating structural equation models in small samples and with potentially nonnormal data, but this statistic has received only limited evaluation. The performance of this statistic is compared to normal theory maximum likelihood and two well-known robust test statistics. A modification to the Satorra-Bentler scaled statistic is developed for the condition that sample size is smaller than degrees of freedom. The behavior of the four test statistics is evaluated with a Monte Carlo confirmatory factor analysis study that varies seven sample sizes and three distributional conditions obtained using Headrick's fifth-order transformation to nonnormality. The new statistic performs badly in most conditions except under the normal distribution. The goodness-of-fit χ² test based on maximum-likelihood estimation performed well under normal distributions as well as under a condition of asymptotic robustness. The Satorra-Bentler scaled test statistic performed best overall, while the mean scaled and variance adjusted test statistic outperformed the others at small and moderate sample sizes under certain distributional conditions.
ASTROPHYSICS. Atom-interferometry constraints on dark energy.
Hamilton, P; Jaffe, M; Haslinger, P; Simmons, Q; Müller, H; Khoury, J
2015-08-21
If dark energy, which drives the accelerated expansion of the universe, consists of a light scalar field, it might be detectable as a "fifth force" between normal-matter objects, in potential conflict with precision tests of gravity. Chameleon fields and other theories with screening mechanisms, however, can evade these tests by suppressing the forces in regions of high density, such as the laboratory. Using a cesium matter-wave interferometer near a spherical mass in an ultrahigh-vacuum chamber, we reduced the screening mechanism by probing the field with individual atoms rather than with bulk matter. We thereby constrained a wide class of dark energy theories, including a range of chameleon and other theories that reproduce the observed cosmic acceleration. Copyright © 2015, American Association for the Advancement of Science.
The saturation of monochromatic lights obliquely incident on the retina.
Alpern, M; Tamaki, R
1983-01-01
Foveal dark-adaptation, undertaken to test the hypothesis that the excitation of rods causes the desaturation of 'yellow' lights in a 1 degree field traversing the margin of the pupil, fails to exclude that possibility. The desaturation is largest for a 1 degree outside diameter annular test, is still measurable with a 0.5 degree circular disk, but disappears for a 0.29 degree disk. The supersaturation of obliquely incident 501.2 nm test light follows the opposite pattern; it disappears with an annulus and is largest for a 0.29 degree circular field. It is unlikely that rods replace short-wave sensitive cones in the trichromatic match of an obliquely incident test with normally incident primaries. If rods as well as all three cone species are involved, the matches might not be trichromatic in the strong sense. Grassmann's law of scalar multiplication was tested and shown not to hold for the match of an obliquely incident test with normally incident primaries, though it remains valid whenever both primaries and test strike the retina at the same angle of incidence (independent of that angle). The result in section 3 (above) cannot be due to rod intrusion. It persists (and becomes more conspicuous) on backgrounds (4.0 log scotopic td) which saturate rods. Moreover obliquely incident 'yellow' lights remain desaturated in intervals in the dark after a full bleach, whilst the test field is below rod threshold. The amount of desaturation does not differ appreciably from that normally found. The assumption of the unified theory of Alpern, Kitahara & Tamaki (1983) that the outer segments of only a single set of three cone species (with acceptance angles wide enough to include the entire exit pupil) contain the visual pigments absorbing both the normally incident primaries and the obliquely incident test is disproved by these results.
Failure of Grassmann's law is most conspicuous under the conditions for which the changes in saturation upon changing from normal to oblique incidence are greatest and least when the saturation changes are the smallest. Either all unified theories of the Stiles-Crawford effects are wrong or all the effects of oblique incidence operate at a stage in the visual process at which the effects of radiation of different wave-lengths are no longer compounded by the simple linear laws. PMID:6875976
Normal theory procedures for calculating upper confidence limits (UCL) on the risk function for continuous responses work well when the data come from a normal distribution. However, if the data come from an alternative distribution, the application of the normal theory procedure...
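As a hedged illustration of a normal-theory UCL, the sketch below computes the simplest case, an upper limit on a mean rather than on the risk function itself; the large-sample z-based form (a t quantile would be preferred for small samples) and the example data are assumptions of this sketch.

```python
import math
from statistics import NormalDist, mean, stdev

def normal_theory_ucl(data, alpha=0.05):
    """One-sided upper confidence limit on the mean under the normality
    assumption: x_bar + z_{1-alpha} * s / sqrt(n) (large-sample z form)."""
    n = len(data)
    z = NormalDist().inv_cdf(1 - alpha)
    return mean(data) + z * stdev(data) / math.sqrt(n)

ucl = normal_theory_ucl([1, 2, 3, 4, 5])
```

When the data are not actually normal, this limit can badly under- or over-cover, which is the concern the procedure comparison above addresses.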
Goals, intentions and mental states: challenges for theories of autism.
Hamilton, Antonia F de C
2009-08-01
The ability to understand the goals and intentions behind other people's actions is central to many social interactions. Given the profound social difficulties seen in autism, we might expect goal understanding to be impaired in these individuals. Two influential theories, the 'broken mirror' theory and the mentalising theory, can both predict this result. However, a review of the current data provides little empirical support for goal understanding difficulties; several studies demonstrate normal performance by autistic children on tasks requiring the understanding of goals or intentions. I suggest that this conclusion forces us to reject the basic broken mirror theory and to re-evaluate the breadth of the mentalising theory. More subtle theories which distinguish between different types of mirroring and different types of mentalising may be able to account for the present data, and further research is required to test and refine these theories.
Chagas, Mauro H.; Magalhães, Fabrício A.; Peixoto, Gustavo H. C.; Pereira, Beatriz M.; Andrade, André G. P.; Menzel, Hans-Joachim K.
2016-01-01
Background: Stretching exercises are able to promote adaptations in the muscle-tendon unit (MTU), which can be tested through physiological and biomechanical variables. Identifying the key variables in MTU adaptations is crucial to improvements in training. Objective: To perform an exploratory factor analysis (EFA) involving the variables often used to evaluate the response of the MTU to stretching exercises. Method: Maximum joint range of motion (ROMMAX), ROM at first sensation of stretching (FSTROM), peak torque (torqueMAX), passive stiffness, normalized stiffness, passive energy, and normalized energy were investigated in 36 participants during passive knee extension on an isokinetic dynamometer. Stiffness and energy values were normalized by the muscle cross-sectional area, and their passive mode was assured by monitoring the EMG activity. Results: EFA revealed two major factors that explained 89.68% of the total variance: 53.13% was explained by the variables torqueMAX, passive stiffness, normalized stiffness, passive energy, and normalized energy, whereas the remaining 36.55% was explained by the variables ROMMAX and FSTROM. Conclusion: This result supports the literature wherein two main hypotheses (mechanical and sensory theories) have been suggested to describe the adaptations of the MTU to stretching exercises. Contrary to some studies, in the present investigation torqueMAX was significantly correlated with the variables of the mechanical theory rather than those of the sensory theory. Therefore, a new approach was proposed to explain the behavior of the torqueMAX during stretching exercises. PMID:27437715
A normal mode treatment of semi-diurnal body tides on an aspherical, rotating and anelastic Earth
NASA Astrophysics Data System (ADS)
Lau, Harriet C. P.; Yang, Hsin-Ying; Tromp, Jeroen; Mitrovica, Jerry X.; Latychev, Konstantin; Al-Attar, David
2015-08-01
Normal mode treatments of the Earth's body tide response were developed in the 1980s to account for the effects of Earth rotation, ellipticity, anelasticity and resonant excitation within the diurnal band. Recent space-geodetic measurements of the Earth's crustal displacement in response to luni-solar tidal forcings have revealed geographical variations that are indicative of aspherical deep mantle structure, thus providing a novel data set for constraining deep mantle elastic and density structure. In light of this, we make use of advances in seismic free oscillation literature to develop a new, generalized normal mode theory for the tidal response within the semi-diurnal and long-period tidal band. Our theory involves a perturbation method that permits an efficient calculation of the impact of aspherical structure on the tidal response. In addition, we introduce a normal mode treatment of anelasticity that is distinct from both earlier work in body tides and the approach adopted in free oscillation seismology. We present several simple numerical applications of the new theory. First, we compute the tidal response of a spherically symmetric, non-rotating, elastic and isotropic Earth model and demonstrate that our predictions match those based on standard Love number theory. Second, we compute perturbations to this response associated with mantle anelasticity and demonstrate that the usual set of seismic modes adopted for this purpose must be augmented by a family of relaxation modes to accurately capture the full effect of anelasticity on the body tide response. Finally, we explore aspherical effects including rotation and we benchmark results from several illustrative case studies of aspherical Earth structure against independent finite-volume numerical calculations of the semi-diurnal body tide response. These tests confirm the accuracy of the normal mode methodology to at least the level of numerical error in the finite-volume predictions. 
They also demonstrate that full coupling of normal modes, rather than group coupling, is necessary for accurate predictions of the body tide response.
Robert, Kylie A; Brunet-Rossinni, Anja; Bronikowski, Anne M
2007-06-01
We test the 'free radical theory of aging' using six species of colubrid snakes (numerous, widely distributed, non-venomous snakes of the family Colubridae) that exhibit long (> 15 years) or short (< 10 years) lifespans. Because the 'rate of living theory' predicts metabolic rates to be correlated with rates of aging and oxidative damage results from normal metabolic processes we sought to answer whether physiological parameters and locomotor performance (which is a good predictor of survival in juvenile snakes) mirrored the evolution of lifespans in these colubrid snakes. We measured whole animal metabolic rate (oxygen consumption Vo2), locomotor performance, cellular metabolic rate (mitochondrial oxygen consumption), and oxidative stress potential (hydrogen peroxide production by mitochondria). Longer-lived colubrid snakes have greater locomotor performance and reduced hydrogen peroxide production than short-lived species, while whole animal metabolic rates and mitochondrial efficiency did not differ with lifespan. We present the first measures testing the 'free radical theory of aging' using reptilian species as model organisms. Using reptiles with different lifespans as model organisms should provide greater insight into mechanisms of aging.
Health Insurance: The Trade-Off Between Risk Pooling and Moral Hazard.
1989-12-01
bias comes about because we suppress the intercept term in estimating VFor the power, the test is against 1, - 1. With this transform, the risk...dealing with the same utility function. As one test of whether families behave in the way economic theory suggests, we have also fitted a probit model of...nonparametric alternative to test our results’ sensitivity to the assumption of a normal error in both the theoretical and empirical models of the
[Specificity hypothesis of a theory of mind deficit in early childhood autism].
Kissgen, R; Schleiffer, R
2002-02-01
In order to test the hypothesis that a theory of mind deficit is specific for autism, the present study presents the first replication of the Sally-Anne test (Baron-Cohen, Leslie & Frith, 1985) in the German-speaking countries. The Sally-Anne test was administered to 16 autistic probands, 24 probands with Down's syndrome, and 20 normal preschool probands. The intelligence of the autistic group and that with Down's syndrome was measured by the CPM/SPM. In addition, the ADI-R was used with the principal caregivers of the autistic and Down's syndrome subjects. With regard to the clinical diagnosis, theory of mind deficit turned out not to be specific for autism. Six of 16 (37.5%) autistic subjects passed the theory of mind tasks. Thus performance in the autistic group surpassed that of both control groups. Out of 16 autistic subjects, autism could be confirmed in only 8 on the basis of the ADI-R diagnostic criteria, only one of whom showed a theory of mind. The autistic individuals with a theory of mind differed significantly in their mean IQ from those without this ability. Spectrum and specificity of a theory of mind deficit in autism remain controversial. For further research it seems important to administer the ADI-R during the diagnostic process. The findings suggest that the clinical diagnosis of autism is not precise enough to distinguish between autism and nonautistic mental handicap.
Influence of the Reynolds number on normal forces of slender bodies of revolution
NASA Technical Reports Server (NTRS)
Hartmann, K.
1982-01-01
Comprehensive force, moment, and pressure distribution measurements as well as flow visualization experiments were carried out to determine the influence of the Reynolds number on nonlinear normal forces of slender bodies of revolution. Experiments were performed in transonic wind tunnels at angles of attack up to 90 deg in the Mach number range 0.5 to 2.2 at variable Reynolds numbers. The results were analysed theoretically and an empirical theory was developed which describes the test results satisfactorily.
Normalized Cut Algorithm for Automated Assignment of Protein Domains
NASA Technical Reports Server (NTRS)
Samanta, M. P.; Liang, S.; Zha, H.; Biegel, Bryan A. (Technical Monitor)
2002-01-01
We present a novel computational method for automatic assignment of protein domains from structural data. At the core of our algorithm lies a recently proposed clustering technique that has been very successful for image-partitioning applications. This graph-theory-based clustering method uses the notion of a normalized cut to partition an undirected graph into its strongly-connected components. Computer implementation of our method tested on the standard comparison set of proteins from the literature shows a high success rate (84%), better than most existing alternatives. In addition, several other features of our algorithm, such as reliance on few adjustable parameters, linear run-time with respect to the size of the protein, and reduced complexity compared to other graph-theory-based algorithms, would make it an attractive tool for structural biologists.
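The normalized-cut idea can be sketched via its standard spectral relaxation: partition a weighted graph by the sign of the Fiedler vector of the normalized Laplacian. The toy graph and weights below are invented for illustration and are not the paper's protein data.

```python
import numpy as np

def normalized_cut_bipartition(W):
    """Bipartition an undirected weighted graph via the spectral
    relaxation of the normalized cut.  W is a symmetric adjacency
    matrix; returns a boolean label per node."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    # Symmetric normalized Laplacian: I - D^{-1/2} W D^{-1/2}
    L_sym = np.eye(len(W)) - D_inv_sqrt @ W @ D_inv_sqrt
    eigvals, eigvecs = np.linalg.eigh(L_sym)        # ascending order
    # Second-smallest eigenvector (Fiedler vector), mapped back
    fiedler = D_inv_sqrt @ eigvecs[:, 1]
    return fiedler >= 0

# Two dense 3-node clusters joined by one weak edge: the minimum
# normalized cut should fall on that weak edge.
W = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    W[i, j] = W[j, i] = 1.0
W[2, 3] = W[3, 2] = 0.1
labels = normalized_cut_bipartition(W)
```

Recursive application of such a bipartition step is the usual way a normalized-cut method decomposes a structure into multiple components, which is consistent with the linear run-time and few tunable parameters noted above.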
A Comparison of the Effects of Non-Normal Distributions on Tests of Equivalence
ERIC Educational Resources Information Center
Ellington, Linda F.
2011-01-01
Statistical theory and its application provide the foundation to modern systematic inquiry in the behavioral, physical and social sciences disciplines (Fisher, 1958; Wilcox, 1996). It provides the tools for scholars and researchers to operationalize constructs, describe populations, and measure and interpret the relations between populations and…
The Variance Normalization Method of Ridge Regression Analysis.
ERIC Educational Resources Information Center
Bulcock, J. W.; And Others
The testing of contemporary sociological theory often calls for the application of structural-equation models to data which are inherently collinear. It is shown that simple ridge regression, which is commonly used for controlling the instability of ordinary least squares regression estimates in ill-conditioned data sets, is not a legitimate…
Normal Science in a Multiverse
NASA Astrophysics Data System (ADS)
Carroll, Sean
2016-06-01
A number of theories in contemporary physics and cosmology place an emphasis on features that are hard, and arguably impossible, to test. These include the cosmological multiverse as well as some approaches to quantum gravity. Worries have been raised that these models attempt to sidestep the purportedly crucial principle of falsifiability. Proponents of these theories sometimes suggest that we are seeing a new approach to science, while opponents fear that we are abandoning science altogether. I will argue that in fact these theories are straightforwardly scientific and can be evaluated in absolutely conventional ways, based on empiricism, abduction (inference to the best explanation), and Bayesian reasoning. The integrity of science remains intact.
Poythress, Norman G.; Edens, John F.; Landfield, Kristin; Lilienfeld, Scott O.; Skeem, Jennifer L.; Douglas, Kevin S.
2008-01-01
In a 1995 monograph, Lykken asserted that an innate fearless temperament underpins the development of primary psychopathy as described by Cleckley (1941). To embed this insight in a larger theory of behavior, Lykken embraced constructs from Gray’s (1982) reinforcement sensitivity theory (RST). Specifically, he hypothesized that in primary psychopaths the behavioral inhibition system (BIS) lacks normal sensitivity to cues of conditioned punishment or non-reward. Subsequent researchers have embraced Carver and White’s (1994) BIS scale as the instrument of choice for testing Lykken’s theory of primary psychopathy, a practice that this review calls into question. We note (a) a dearth of research using the BIS scales in offender samples, where more psychopathic individuals are likely to be found and (b) limited BIS scale coverage of the functions attributed to the behavioral inhibition system in RST. In addition, (c) we review literature suggesting that rather than assessing the fear sensitivity function critical to Lykken’s theory, the BIS scale instead functions primarily as an index of negative emotionality. We recommend a moratorium on the use of the BIS scale to test Lykken’s theory of primary psychopathy. PMID:19727321
Monsen, Karen A; Kelechi, Teresa J; McRae, Marion E; Mathiason, Michelle A; Martin, Karen S
The growth and diversification of nursing theory, nursing terminology, and nursing data enable a convergence of theory- and data-driven discovery in the era of big data research. Existing datasets can be viewed through theoretical and terminology perspectives using visualization techniques in order to reveal new patterns and generate hypotheses. The Omaha System is a standardized terminology and metamodel that makes explicit the theoretical perspective of the nursing discipline and enables terminology-theory testing research. The purpose of this paper is to illustrate the approach by exploring a large research dataset consisting of 95 variables (demographics, temperature measures, anthropometrics, and standardized instruments measuring quality of life and self-efficacy) from a theory-based perspective using the Omaha System. Aims were to (a) examine the Omaha System dataset to understand the sample at baseline relative to Omaha System problem terms and outcome measures, (b) examine relationships within the normalized Omaha System dataset at baseline in predicting adherence, and (c) examine relationships within the normalized Omaha System dataset at baseline in predicting incident venous ulcer. Variables from a randomized clinical trial of a cryotherapy intervention for the prevention of venous ulcers were mapped onto Omaha System terms and measures to derive a theoretical framework for the terminology-theory testing study. The original dataset was recoded using the mapping to create an Omaha System dataset, which was then examined using visualization to generate hypotheses. The hypotheses were tested using standard inferential statistics. Logistic regression was used to predict adherence and incident venous ulcer. 
Findings revealed novel patterns in the psychosocial characteristics of the sample that were discovered to be drivers of both adherence (Mental health Behavior: OR = 1.28, 95% CI [1.02, 1.60]; AUC = .56) and incident venous ulcer (Mental health Behavior: OR = 0.65, 95% CI [0.45, 0.93]; Neuro-musculo-skeletal function Status: OR = 0.69, 95% CI [0.47, 1.00]; male: OR = 3.08, 95% CI [1.15, 8.24]; not married: OR = 2.70, 95% CI [1.00, 7.26]; AUC = .76). The Omaha System was employed as ontology, nursing theory, and terminology to bridge data and theory and may be considered a data-driven theorizing methodology. Novel findings suggest a relationship between psychosocial factors and incident venous ulcer outcomes. There is potential to employ this method in further research, which is needed to generate and test hypotheses from other datasets to extend scientific investigations from existing data.
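The two summary measures reported above can be recovered from a fitted logistic model's outputs. A minimal sketch follows; the coefficient value and predicted scores are invented for illustration, not the study's data.

```python
import math

def odds_ratio(beta):
    """Odds ratio per one-unit increase in a predictor: exp(beta),
    where beta is the fitted logistic regression coefficient."""
    return math.exp(beta)

def auc(scores, outcomes):
    """AUC as the Mann-Whitney probability that a randomly chosen
    positive case receives a higher predicted score than a randomly
    chosen negative case (ties count one half)."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical fitted coefficient and predicted probabilities:
or_example = odds_ratio(0.247)                      # about 1.28
auc_example = auc([0.9, 0.8, 0.4, 0.3], [1, 1, 0, 0])
```

An OR above 1 means higher predictor values raise the odds of the outcome, and an AUC of .76, as reported for the incident-ulcer model, means the model ranks a random case above a random non-case 76% of the time.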
ERIC Educational Resources Information Center
Sinharay, Sandip
2015-01-01
The maximum likelihood estimate (MLE) of the ability parameter of an item response theory model with known item parameters was proved to be asymptotically normally distributed under a set of regularity conditions for tests involving dichotomous items and a unidimensional ability parameter (Klauer, 1990; Lord, 1983). This article first considers…
Lera-Miguel, Sara; Rosa, Mireia; Puig, Olga; Kaland, Nils; Lázaro, Luisa; Castro-Formieles, Josefina; Calvo, Rosa
2016-01-01
Most individuals with autism spectrum disorders often fail in tasks of theory of mind (ToM). However, those with normal intellectual functioning known as high functioning ASD (HF-ASD) sometimes succeed in mentalizing inferences. Some tools have been developed to more accurately test their ToM abilities. The aims of this study were to examine the psychometric properties of the Spanish version of Stories of Everyday Life Test (SEL) in a sample of 29 children and adolescents with HF-ASD and 25 typically developing controls and to compare their performance. The Spanish-SEL demonstrated good internal consistency, strong convergence with clinical severity and another ToM test, and adequate discriminant validity from intellectual capability and age, identifying the condition of 70 % of participants.
Theory of the Bloch oscillating transistor
NASA Astrophysics Data System (ADS)
Hassel, J.; Seppä, H.
2005-01-01
The Bloch oscillating transistor (BOT) is a device in which single electron current through a normal tunnel junction enhances Cooper pair current in a mesoscopic Josephson junction, leading to signal amplification. In this article we develop a theory in which the BOT dynamics is described as a two-level system. The theory is used to predict current-voltage characteristics and small-signal response. The transition from stable operation into the hysteretic regime is studied. By identifying the two-level switching noise as the main source of fluctuations, the expressions for equivalent noise sources and the noise temperature are derived. The validity of the model is tested by comparing the results with simulations and experiments.
Computer simulations of rapid granular flows of spheres interacting with a flat, frictional boundary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louge, M.Y.
This paper employs computer simulations to test the theory of Jenkins [J. Appl. Mech. 59, 120 (1992)] for the interaction between a rapid granular flow of spheres and a flat, frictional wall. This paper examines the boundary conditions that relate the shear stress and energy flux at the wall to the normal stress, slip velocity, and fluctuation energy, and to the parameters that characterize a collision. It is found that while the theory captures the trends of the boundary conditions at low friction, it does not anticipate their behavior at large friction. A critical evaluation of Jenkins' assumptions suggests where his theory may be improved.
Can Gravity Probe B usefully constrain torsion gravity theories?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flanagan, Eanna E.; Rosenthal, Eran
2007-06-15
In most theories of gravity involving torsion, the source for torsion is the intrinsic spin of matter. Since the spins of fermions are normally randomly oriented in macroscopic bodies, the amount of torsion generated by macroscopic bodies is normally negligible. However, in a recent paper, Mao et al. (arXiv:gr-qc/0608121) point out that there is a class of theories, including the Hayashi-Shirafuji (1979) theory, in which the angular momentum of macroscopic spinning bodies generates a significant amount of torsion. They further argue that, by the principle of action equals reaction, one would expect the angular momentum of test bodies to couple to a background torsion field, and therefore the precession of the Gravity Probe B gyroscopes should be affected in these theories by the torsion generated by the Earth. We show that in fact the principle of action equals reaction does not apply to these theories, essentially because the torsion is not an independent dynamical degree of freedom. We examine in detail a generalization of the Hayashi-Shirafuji theory suggested by Mao et al. called Einstein-Hayashi-Shirafuji theory. There are a variety of different versions of this theory, depending on the precise form of the coupling to matter chosen for the torsion. We show that, for any coupling to matter that is compatible with the spin transport equation postulated by Mao et al., the theory has either ghosts or an ill-posed initial-value formulation. These theoretical problems can be avoided by specializing the parameters of the theory and in addition choosing the standard minimal coupling to matter of the torsion tensor. This yields a consistent theory, but one in which the action equals reaction principle is violated, and in which the angular momentum of the gyroscopes does not couple to the Earth's torsion field. Thus, the Einstein-Hayashi-Shirafuji theory does not predict a detectable torsion signal for Gravity Probe B.
There may be other torsion theories which do.
ERIC Educational Resources Information Center
Fouladi, Rachel T.
2000-01-01
Provides an overview of standard and modified normal theory and asymptotically distribution-free covariance and correlation structure analysis techniques and details Monte Carlo simulation results on Type I and Type II error control. Demonstrates through the simulation that robustness and nonrobustness of structure analysis techniques vary as a…
IRT-LR-DIF with Estimation of the Focal-Group Density as an Empirical Histogram
ERIC Educational Resources Information Center
Woods, Carol M.
2008-01-01
Item response theory-likelihood ratio-differential item functioning (IRT-LR-DIF) is used to evaluate the degree to which items on a test or questionnaire have different measurement properties for one group of people versus another, irrespective of group-mean differences on the construct. Usually, the latent distribution is presumed normal for both…
ERIC Educational Resources Information Center
Landi, Nicole; Perfetti, Charles A.
2007-01-01
The most prominent theories of reading consider reading comprehension ability to be a direct consequence of lower-level reading skills. Recently however, research has shown that some children with poor comprehension ability perform normally on tests of lower-level skills (e.g., decoding). One promising line of behavioral research has found…
Comparing Simulated and Theoretical Sampling Distributions of the U3 Person-Fit Statistic.
ERIC Educational Resources Information Center
Emons, Wilco H. M.; Meijer, Rob R.; Sijtsma, Klaas
2002-01-01
Studied whether the theoretical sampling distribution of the U3 person-fit statistic is in agreement with the simulated sampling distribution under different item response theory models and varying item and test characteristics. Simulation results suggest that the use of standard normal deviates for the standardized version of the U3 statistic may…
Measurement of multiaxial ply strength by an off-axis flexure test
NASA Technical Reports Server (NTRS)
Crews, John H., Jr.; Naik, Rajiv A.
1992-01-01
An off-axis flexure (OAF) test was performed to measure ply strength under multiaxial stress states. This test involves unidirectional off-axis specimens loaded in bending, using an apparatus that allows these anisotropic specimens to twist as well as flex without the complications of a resisting torque. A 3D finite element stress analysis verified that simple beam theory could be used to compute the specimen bending stresses at failure. Unidirectional graphite/epoxy specimens with fiber angles ranging from 90 deg to 15 deg have combined normal and shear stresses on their failure planes that are typical of 45 deg plies in structural laminates. Tests for a range of stress states with AS4/3501-6 specimens showed that both normal and shear stresses on the failure plane influenced cracking resistance. This OAF test may prove to be useful for generating data needed to predict ply cracking in composite structures and may also provide an approach for studying fiber-matrix interface failures under stress states typical of structures.
Rippon, T. S.
1928-01-01
(I) Theory Rivers' theory of the “danger instincts” is a key to the problem of moral and prevention of war neuroses. (II) Causes of War Neuroses These are believed to be largely mental, e.g., conflict between the instinct of self-preservation and the sense of duty. (III) Instinct of Self-Preservation This subject presents difficulties, because people react in so many different ways; a man may be impelled to run away, or to become aggressive or even motionless when in danger. (IV) Importance The importance of knowing all the reactions of the normal man to danger is, first—the need to know the normal before considering the abnormal states; second—the chemical warfare of the future will involve increased emotional stress; third—in such war, there will be an additional strain of inactivity during a gas attack. (V) The Danger Instincts as described by Rivers Reaction by flight. Aggression. Manipulative activity. Immobility and collapse. Emotional states associated with reactions. Conflict between different tendencies the reason for collapse when in danger. (VI) Evidence supporting Rivers' Theories Relative severity of war neurosis in pilots, observers, balloon officers, Army officers and submarine crews. Investigation on reactions of pilots to danger and fear. (VII) Rivers' Theory applied to Moral (Mental Hygiene) Knowledge of normal reactions to danger enables the medical officer to help to maintain moral by:—(a) Preparing the mind to meet danger. Explaining that fear is a natural emotion under certain circumstances. Need for self-control but not shame. (b) Prevention of repression. (c) Counter-suggestion and panic. (VIII) Concluding Statement on Cowardice Difficulty in distinguishing cowardice from neurosis. Definition suggested. Medical tests. PMID:19986246
ERIC Educational Resources Information Center
Sass, D. A.; Schmitt, T. A.; Walker, C. M.
2008-01-01
Item response theory (IRT) procedures have been used extensively to study normal latent trait distributions and have been shown to perform well; however, less is known concerning the performance of IRT with non-normal latent trait distributions. This study investigated the degree of latent trait estimation error under normal and non-normal…
Semi-actuator disk theory for compressor choke flutter
NASA Technical Reports Server (NTRS)
Micklow, J.; Jeffers, J.
1981-01-01
A mathematical analysis utilizing semi-actuator disk theory was developed to predict the unsteady aerodynamic environment for a cascade of airfoils harmonically oscillating in choked flow. A normal shock is located in the blade passage, its position depending on the time-dependent geometry and pressure perturbations of the system. In addition to shock dynamics, the model includes the effect of compressibility, interblade phase lag, and an unsteady flow field upstream and downstream of the cascade. Calculated unsteady aerodynamics were compared with isolated airfoil wind tunnel data, and choke flutter onset boundaries were compared with data from testing of an F100 high pressure compressor stage.
Bavarian, Niloofar; Flay, Brian R.; Ketcham, Patricia L.; Smit, Ellen; Kodama, Cathy; Martin, Melissa; Saltz, Robert F.
2014-01-01
Objective: To test a theory-driven model of health behavior to predict the illicit use of prescription stimulants (IUPS) among college students. Participants: A probability sample of 554 students from one university located in California (response rate = 90.52%). Methods: Students completed a paper-based survey developed with guidance from the Theory of Triadic Influence. We first assessed normality of measures and checked for multicollinearity. A single structural equation model of frequency of IUPS in college was then tested using constructs from the theory's three streams of influence (i.e., intrapersonal, social situation/context, and sociocultural environment) and four levels of causation (i.e., ultimate causes, distal influences, proximal predictors, and immediate precursors). Results: Approximately 18% of students reported engaging in IUPS during college, with frequency of use ranging from never to 40 or more times per academic term. The model tested had strong fit, and the majority of paths specified within and across streams were significant at the p<.01 level. Additionally, 46% of the variance in IUPS frequency was explained by the tested model. Conclusions: Results suggest the utility of the TTI as an integrative model of health behavior, specifically in predicting IUPS, and provide insight on the need for multifaceted prevention and intervention efforts. PMID:24647369
NASA Technical Reports Server (NTRS)
Perkins, Edward W; Jorgensen, Leland H
1956-01-01
Effects of Reynolds number and angle of attack on the pressure distribution and normal-force characteristics of a body of revolution consisting of a fineness ratio 3 ogival nose tangent to a cylindrical afterbody 7 diameters long have been determined. The test Mach number was 1.98 and the angle-of-attack range from 0 degree to 20 degrees. The Reynolds numbers, based on body diameter, were 0.15 x 10(6) and 0.45 x 10(6). The experimental results are compared with theory.
Semantic memory and frontal executive function during transient global amnesia.
Hodges, J R
1994-05-01
To assess semantic memory and frontal executive function, two patients underwent neuropsychological testing during transient global amnesia (TGA) and after an interval of 6-8 weeks. In spite of a profound deficit in anterograde verbal and non-verbal memory, semantic memory was normal, as judged by category fluency measures, picture naming, and picture-word and picture-picture matching, and reading ability was normal. Similarly, there were no deficits on a number of tests known to be sensitive to frontal executive dysfunction. A hexamethylpropyleneamine-oxime (HMPAO) single photon emission CT (SPECT) scan, obtained on one patient 24 hours post-TGA, showed focal left temporal lobe hypoperfusion which had resolved three months later. The observed dissociation between episodic and semantic memory is discussed in the light of contemporary cognitive theories of memory organisation.
Functional buckling behavior of silicone rubber shells for biomedical use.
van der Houwen, E B; Kuiper, L H; Burgerhof, J G M; van der Laan, B F A M; Verkerke, G J
2013-12-01
The use of soft elastic biomaterials in medical devices enables substantial function integration. The consequent increased simplification in design can improve reliability at a lower cost in comparison to traditional (hard) biomaterials. Functional bi-stable buckling is one of the many new mechanisms made possible by soft materials. The buckling behavior of shells, however, is typically described from a structural failure point of view: the collapse of arches or rupture of steam vessels, for example. There is little or no literature about the functional elastic buckling of small-sized silicone rubber shells, and it is unknown whether or not theory can predict their behavior. Is functional buckling possible within the scale, material and pressure normally associated with physiological applications? An automatic speech valve is used as an example application. Silicone rubber spherical shells (diameter 30mm) with hinged and double-hinged boundaries were subjected to air pressure loading. Twelve different geometrical configurations were tested for buckling and reverse buckling pressures. Data were compared with the theory. Buckling pressure increases linearly with shell thickness and shell height. Reverse buckling shows these same relations, with pressures always below normal buckling pressure. Secondary hinges change normal/reverse buckling pressure ratios and promote symmetrical buckling. All tested configurations buckled within or closely around physiological pressures. Functional bi-stable buckling of silicone rubber shells is possible with adjustable properties in the physiological pressure range. Results can be predicted using the proposed relations and equations. Copyright © 2013 Elsevier Ltd. All rights reserved.
Explicit solutions of normal form of driven oscillatory systems in entrainment bands
NASA Astrophysics Data System (ADS)
Tsarouhas, George E.; Ross, John
1988-11-01
As in a prior article (Ref. 1), we consider an oscillatory dissipative system driven by external sinusoidal perturbations of given amplitude Q and frequency ω. The kinetic equations are transformed to normal form and solved for small Q near a Hopf bifurcation to oscillations in the autonomous system. Whereas before we chose irrational ratios of the frequency of the autonomous system ωn to ω, with quasiperiodic response of the system to the perturbation, we now choose rational coprime ratios, with periodic response (entrainment). The dissipative system has either two variables or is adequately described by two variables near the bifurcation. We obtain explicit solutions and develop these in detail for ωn:ω = 1:1, 1:2, 2:1, 1:3, and 3:1. We choose a specific dissipative model (Brusselator) and test the theory by comparison with full numerical solutions. The analytic solutions of the theory give an excellent approximation for the autonomous system near the bifurcation. The theoretically predicted and calculated entrainment bands agree very well for small Q in the vicinity of the bifurcation (small μ); deviations increase with increasing Q and μ. The theory is applicable to one or two external periodic perturbations.
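The driven-oscillator setup described above is straightforward to reproduce numerically. The sketch below integrates a sinusoidally forced Brusselator with a fixed-step RK4 scheme; the parameter values (A = 1, B just above the Hopf point B_c = 1 + A² = 2, small drive amplitude Q) are illustrative choices, not those of the paper:

```python
import math

# Forced Brusselator near its Hopf bifurcation (illustrative parameters):
#   dx/dt = A - (B + 1) x + x^2 y + Q sin(omega t)
#   dy/dt = B x - x^2 y
A, B, Q, omega = 1.0, 2.05, 0.01, 1.0

def rhs(t, x, y):
    drive = Q * math.sin(omega * t)
    dx = A - (B + 1.0) * x + x * x * y + drive
    dy = B * x - x * x * y
    return dx, dy

def rk4_step(t, x, y, h):
    """One classical fourth-order Runge-Kutta step."""
    k1 = rhs(t, x, y)
    k2 = rhs(t + h / 2, x + h / 2 * k1[0], y + h / 2 * k1[1])
    k3 = rhs(t + h / 2, x + h / 2 * k2[0], y + h / 2 * k2[1])
    k4 = rhs(t + h, x + h * k3[0], y + h * k3[1])
    x += h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
    y += h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
    return x, y

t, x, y, h = 0.0, 1.0, 2.0, 0.01
for _ in range(20000):
    x, y = rk4_step(t, x, y, h)
    t += h
print(math.isfinite(x) and math.isfinite(y))  # trajectory stays bounded
```

Detecting the entrainment bands themselves would require scanning Q and ω and testing the asymptotic period of the response against the drive period.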
Baron-Cohen, Simon; Richler, Jennifer; Bisarya, Dheraj; Gurunathan, Nhishanth; Wheelwright, Sally
2003-01-01
Systemizing is the drive to analyse systems or construct systems. A recent model of psychological sex differences suggests that this is a major dimension in which the sexes differ, with males being more drawn to systemize than females. Currently, there are no self-report measures to assess this important dimension. A second major dimension of sex differences is empathizing (the drive to identify mental states and respond to these with an appropriate emotion). Previous studies find females score higher on empathy measures. We report a new self-report questionnaire, the Systemizing Quotient (SQ), for use with adults of normal intelligence. It contains 40 systemizing items and 20 control items. On each systemizing item, a person can score 2, 1 or 0, so the SQ has a maximum score of 80 and a minimum of zero. In Study 1, we measured the SQ of n = 278 adults (114 males, 164 females) from a general population, to test for predicted sex differences (male superiority) in systemizing. All subjects were also given the Empathy Quotient (EQ) to test if previous reports of female superiority would be replicated. In Study 2 we employed the SQ and the EQ with n = 47 adults (33 males, 14 females) with Asperger syndrome (AS) or high-functioning autism (HFA), who are predicted to be either normal or superior at systemizing, but impaired at empathizing. Their scores were compared with n = 47 matched adults from the general population in Study 1. In Study 1, as predicted, normal adult males scored significantly higher than females on the SQ and significantly lower on the EQ. In Study 2, again as predicted, adults with AS/HFA scored significantly higher on the SQ than matched controls, and significantly lower on the EQ than matched controls. The SQ reveals both a sex difference in systemizing in the general population and an unusually strong drive to systemize in AS/HFA. 
These results are discussed in relation to two linked theories: the 'empathizing-systemizing' (E-S) theory of sex differences and the extreme male brain (EMB) theory of autism. PMID:12639333
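The SQ scoring rule described above is simple enough to express in code. This is an illustrative sketch, not the authors' materials; the item responses below are hypothetical:

```python
def sq_score(item_scores):
    """Sum per-item scores for the 40 systemizing items.

    Each item scores 2 (strongly systemizing), 1 (mildly), or 0, so the
    total ranges from 0 to 80.  The 20 control items are not scored.
    """
    assert len(item_scores) == 40
    assert all(s in (0, 1, 2) for s in item_scores)
    return sum(item_scores)

# Hypothetical respondent: 2 on ten items, 1 on twenty, 0 on the rest
scores = [2] * 10 + [1] * 20 + [0] * 10
print(sq_score(scores))  # 40
```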
Small bending and stretching of sandwich-type shells
NASA Technical Reports Server (NTRS)
Reissner, Eric
1950-01-01
A theory has been developed for small bending and stretching of sandwich-type shells. This theory is an extension of the known theory of homogeneous thin elastic shells. It was found that two effects are important in the present problem, which are not normally of importance in the theory of curved shells: (1) the effect of transverse shear deformation and (2) the effect of transverse normal stress deformation. The first of these two effects has been known to be of importance in the theory of plates and beams. The second effect was found to occur in a manner which is typical for shells and has no counterpart in flat-plate theory. The general results of this report have been applied to the solution of problems concerning flat plates, circular rings, circular cylindrical shells, and spherical shells. In each case numerical examples have been given, illustrating the magnitude of the effects of transverse shear and normal stress deformation.
Wang, Yong-Guang; Shi, Jian-fei; Roberts, David L; Jiang, Xiao-ying; Shen, Zhi-hua; Wang, Yi-quan; Wang, Kai
2015-09-30
In social interaction, Theory of Mind (ToM) enables us to construct representations of others' mental states, and to use those representations flexibly to explain or predict others' behavior. Although previous literature has documented that schizophrenia is associated with poor ToM ability, little is known about the cognitive mechanisms underlying patients' difficulty in ToM use. This study developed a new methodology to test whether the difficulty in false-belief-use might be related to deficits in perspective-switching or impaired inhibitory control among 23 remitted schizophrenia patients and 18 normal controls. Patients showed a significantly greater error rate in a perspective-switching condition than in a perspective-repeating condition in a false-belief-use task, whereas normal controls did not show a difference between the two conditions. In addition, a larger main effect of inhibition was found in remitted schizophrenia patients than normal controls in both a false-belief-use task and control task. Thus, remitted schizophrenia patients' impairment in ToM use might be accounted for, at least partially, by deficits in perspective-switching and impaired inhibitory control. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
A test of the Hall-MHD model: Application to low-frequency upstream waves at Venus
NASA Technical Reports Server (NTRS)
Orlowski, D. S.; Russell, C. T.; Krauss-Varban, D.; Omidi, N.
1994-01-01
Early studies suggested that in the range of parameter space where the wave angular frequency is less than the proton gyrofrequency and the plasma beta, the ratio of the thermal to magnetic pressure, is less than 1, magnetohydrodynamics provides an adequate description of the propagating modes in a plasma. However, recently, Lacombe et al. (1992) have reported significant differences between basic wave characteristics of the specific propagation modes derived from linear Vlasov and Hall-magnetohydrodynamic (MHD) theories even when the waves are only weakly damped. In this paper we compare the magnetic polarization and normalized magnetic compression ratio of ultra low frequency (ULF) upstream waves at Venus with the same quantities derived from both theories. We find that while the 'kinetic' approach gives magnetic polarization and normalized magnetic compression ratio consistent with the data in the analyzed range of beta (0.5 less than beta less than 5) for the fast magnetosonic mode, the same wave characteristics derived from the Hall-MHD model strongly depend on beta and are consistent with the data only at low beta for the fast mode and at high beta for the intermediate mode.
Haberman, Shelby J; Sinharay, Sandip; Chon, Kyong Hee
2013-07-01
Residual analysis (e.g. Hambleton & Swaminathan, Item response theory: principles and applications, Kluwer Academic, Boston, 1985; Hambleton, Swaminathan, & Rogers, Fundamentals of item response theory, Sage, Newbury Park, 1991) is a popular method to assess fit of item response theory (IRT) models. We suggest a form of residual analysis that may be applied to assess item fit for unidimensional IRT models. The residual analysis consists of a comparison of the maximum-likelihood estimate of the item characteristic curve with an alternative ratio estimate of the item characteristic curve. The large sample distribution of the residual is proved to be standardized normal when the IRT model fits the data. We compare the performance of our suggested residual to the standardized residual of Hambleton et al. (Fundamentals of item response theory, Sage, Newbury Park, 1991) in a detailed simulation study. We then calculate our suggested residuals using data from an operational test. The residuals appear to be useful in assessing the item fit for unidimensional IRT models.
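A minimal sketch of the kind of item-fit residual described above, at a single ability level: the observed proportion correct (a ratio estimate of the item characteristic curve) is compared with the model-implied probability. The counts and the model probability below are hypothetical, and this simple binomial standardization only illustrates the idea, not the authors' exact estimator:

```python
import math

def standardized_residual(n_correct, n_total, p_model):
    """Standardized difference between the observed proportion correct and
    the model-implied item probability at one ability level.  When the IRT
    model fits, residuals of this kind are approximately standard normal."""
    p_obs = n_correct / n_total
    se = math.sqrt(p_model * (1 - p_model) / n_total)
    return (p_obs - p_model) / se

# Hypothetical: 120 of 200 examinees at this ability level answered the item
# correctly, while the fitted model predicts a probability of 0.55
r = standardized_residual(120, 200, 0.55)
print(round(r, 3))
```

Values well outside roughly ±2 would flag item misfit at that ability level.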
The use of normal forms for analysing nonlinear mechanical vibrations
Neild, Simon A.; Champneys, Alan R.; Wagg, David J.; Hill, Thomas L.; Cammarano, Andrea
2015-01-01
A historical introduction is given of the theory of normal forms for simplifying nonlinear dynamical systems close to resonances or bifurcation points. The specific focus is on mechanical vibration problems, described by finite degree-of-freedom second-order-in-time differential equations. A recent variant of the normal form method, that respects the specific structure of such models, is recalled. It is shown how this method can be placed within the context of the general theory of normal forms provided the damping and forcing terms are treated as unfolding parameters. The approach is contrasted to the alternative theory of nonlinear normal modes (NNMs) which is argued to be problematic in the presence of damping. The efficacy of the normal form method is illustrated on a model of the vibration of a taut cable, which is geometrically nonlinear. It is shown how the method is able to accurately predict NNM shapes and their bifurcations. PMID:26303917
A closed form slug test theory for high permeability aquifers.
Ostendorf, David W; DeGroot, Don J; Dunaj, Philip J; Jakubowski, Joseph
2005-01-01
We incorporate a linear estimate of casing friction into the analytical slug test theory of Springer and Gelhar (1991) for high permeability aquifers. The modified theory elucidates the influence of inertia and casing friction on consistent, closed form equations for the free surface, pressure, and velocity fluctuations for overdamped and underdamped conditions. A consistent, but small, correction for kinetic energy is included as well. A characteristic velocity linearizes the turbulent casing shear stress so that an analytical solution for attenuated, phase shifted pressure fluctuations fits a single parameter (damping frequency) to transducer data from any depth in the casing. Underdamped slug tests of 0.3, 0.6, and 1 m amplitudes at five transducer depths in a 5.1 cm diameter PVC well 21 m deep in the Plymouth-Carver Aquifer yield a consistent hydraulic conductivity of 1.5 x 10(-3) m/s. The Springer and Gelhar (1991) model underestimates the hydraulic conductivity for these tests by as much as 25% by improperly ascribing smooth turbulent casing friction to the aquifer. The match point normalization of Butler (1998) agrees with our fitted hydraulic conductivity, however, when friction is included in the damping frequency. Zurbuchen et al. (2002) use a numerical model to establish a similar sensitivity of hydraulic conductivity to nonlinear casing friction.
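The underdamped response described above has the familiar damped-cosine form, with the linearized casing friction absorbed into the damping frequency that is fit to transducer data. The sketch below uses purely hypothetical parameter values to show the functional form:

```python
import math

def underdamped_response(t, w0, beta, omega):
    """Water-level displacement for an underdamped slug test,
    w(t) = w0 * exp(-beta * t) * cos(omega * t),
    where beta is the damping frequency (which in the modified theory
    includes the linearized casing friction) and omega the oscillation
    frequency.  Parameter values here are illustrative only."""
    return w0 * math.exp(-beta * t) * math.cos(omega * t)

# A 0.6 m initial displacement with beta = 0.5 1/s and omega = 2.0 rad/s,
# evaluated 1 s after the slug is introduced
w = underdamped_response(1.0, w0=0.6, beta=0.5, omega=2.0)
print(round(w, 4))
```

Fitting beta to observed pressure fluctuations, rather than ascribing all damping to the aquifer, is what corrects the hydraulic conductivity estimate.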
Friedman, Ori; Neary, Karen R; Burnstein, Corinna L; Leslie, Alan M
2010-05-01
When young children observe pretend-play, do they interpret it simply as a type of behavior, or do they infer the underlying mental state that gives the behavior meaning? This is a long-standing question with deep implications for how "theory of mind" develops. The two leading accounts of shared pretense give opposing answers. The behavioral theory proposes that children represent pretense as a form of behavior (behaving in a way that would be appropriate if P); the metarepresentational theory argues that children instead represent pretense via the early concept PRETEND. A test between these accounts is provided by children's understanding of pretend sounds and speech. We report the first experiments directly investigating this understanding. In three experiments, 2- and 3-year-olds listened to requests that were either spoken normally, or with the pretense that a teddy bear was uttering them. To correctly fulfill the requests, children had to represent the normal utterance as the experimenter's, and the pretend utterances as the bear's. Children succeeded at both ages, suggesting that they can represent pretend speech (the requests) as coming from counterfactual sources (the bear rather than the experimenter). We argue that this is readily explained by the metarepresentational theory, but harder to explain if children are behaviorists about pretense. Copyright 2010 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
1977-01-01
Wind vector change with respect to time at Cape Kennedy, Florida, is examined according to the theory of multivariate normality. The joint distribution of the four variables represented by the components of the wind vector at an initial time and after a specified elapsed time is hypothesized to be quadravariate normal; the fourteen statistics of this distribution, calculated from fifteen years of twice-daily Rawinsonde data, are presented by monthly reference periods for each month from 0 to 27 km. The hypotheses that the wind component change with respect to time is univariate normal, that the joint distribution of wind component changes is bivariate normal, and that the modulus of vector wind change is Rayleigh have been tested by comparison with observed distributions. Statistics of the conditional bivariate normal distributions of vector wind at a future time given the vector wind at an initial time are derived. Wind changes over time periods from one to five hours, calculated from Jimsphere data, are presented.
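The Rayleigh hypothesis for the modulus follows from bivariate normality of the component changes in the special case of zero means, a common variance, and independence. A quick simulation sketch, with a hypothetical component standard deviation, illustrates the connection:

```python
import math
import random

random.seed(0)

# If the two wind-component changes are independent zero-mean normals with
# common standard deviation sigma, their modulus sqrt(du^2 + dv^2) is
# Rayleigh distributed with mean sigma * sqrt(pi / 2).  The sigma below is
# a hypothetical value, not a statistic from the report.
sigma = 5.0  # m/s
n = 100_000
moduli = [math.hypot(random.gauss(0, sigma), random.gauss(0, sigma))
          for _ in range(n)]
sample_mean = sum(moduli) / n
rayleigh_mean = sigma * math.sqrt(math.pi / 2)
print(abs(sample_mean - rayleigh_mean) < 0.1)  # sample mean matches theory
```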
A Multidimensional Ideal Point Item Response Theory Model for Binary Data.
Maydeu-Olivares, Albert; Hernández, Adolfo; McDonald, Roderick P
2006-12-01
We introduce a multidimensional item response theory (IRT) model for binary data based on a proximity response mechanism. Under the model, a respondent at the mode of the item response function (IRF) endorses the item with probability one. The mode of the IRF is the ideal point, or in the multidimensional case, an ideal hyperplane. The model yields closed form expressions for the cell probabilities. We estimate and test the goodness of fit of the model using only information contained in the univariate and bivariate moments of the data. Also, we pit the new model against the multidimensional normal ogive model estimated using NOHARM in four applications involving (a) attitudes toward censorship, (b) satisfaction with life, (c) attitudes of morality and equality, and (d) political efficacy. The normal PDF model is not invariant to simple operations such as reverse scoring. Thus, when there is no natural category to be modeled, as in many personality applications, it should be fit separately with and without reverse scoring for comparisons.
A Numerical Theory for Impedance Eduction in Three-Dimensional Normal Incidence Tubes
NASA Technical Reports Server (NTRS)
Watson, Willie R.; Jones, Michael G.
2016-01-01
A method for educing the locally-reacting acoustic impedance of a test sample mounted in a 3-D normal incidence impedance tube is presented and validated. The unique feature of the method is that the excitation frequency (or duct geometry) may be such that high-order duct modes may exist. The method educes the impedance, iteratively, by minimizing an objective function consisting of the difference between the measured and numerically computed acoustic pressure at preselected measurement points in the duct. The method is validated on planar and high-order mode sources with data synthesized from exact mode theory. These data are then subjected to random jitter to simulate the effects of measurement uncertainties on the educed impedance spectrum. The primary conclusions of the study are 1) without random jitter, the educed impedance is in excellent agreement with that of the known impedance samples, and 2) random jitter comparable to that found in a typical experiment has minimal impact on the accuracy of the educed impedance.
ERIC Educational Resources Information Center
Wood, Phil
2017-01-01
In this article, I begin by outlining some of the barriers which constrain sustainable organizational change in schools and universities. I then go on to introduce a theory which has already started to help explain complex change and innovation processes in health and care contexts, Normalization Process Theory. Finally, I consider what this…
NASA Astrophysics Data System (ADS)
Ingraham, M. D.; Dewers, T. A.; Heath, J. E.
2016-12-01
Utilizing the localization conditions laid out in Rudnicki (2002), the failure of a series of tests performed on Mancos shale has been analyzed. Shale specimens were tested under constant mean stress conditions in an axisymmetric stress state, with specimens cored both parallel and perpendicular to bedding. Failure data indicate that, for the range of pressures tested, the failure surface is well represented by a Mohr-Coulomb failure surface with a friction angle of 34.4° for specimens cored parallel to bedding and 26.5° for specimens cored perpendicular to bedding. There is no evidence of a yield cap up to 200 MPa mean stress. Comparison with the theory shows that the best agreement in terms of band angles comes from assuming normality of the plastic strain increment. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
NASA Technical Reports Server (NTRS)
Slemp, Wesley C. H.; Kapania, Rakesh K.; Tessler, Alexander
2010-01-01
Computation of interlaminar stresses from the higher-order shear and normal deformable beam theory and the refined zigzag theory was performed using the Sinc method based on Interpolation of Highest Derivative. The Sinc method based on Interpolation of Highest Derivative was proposed as an efficient method for determining through-the-thickness variations of interlaminar stresses from one- and two-dimensional analysis by integration of the equilibrium equations of three-dimensional elasticity. However, the use of traditional equivalent single layer theories often results in inaccuracies near the boundaries and when the lamina have extremely large differences in material properties. Interlaminar stresses in symmetric cross-ply laminated beams were obtained by solving the higher-order shear and normal deformable beam theory and the refined zigzag theory with the Sinc method based on Interpolation of Highest Derivative. Interlaminar stresses and bending stresses from the present approach were compared with a detailed finite element solution obtained by ABAQUS/Standard. The results illustrate the ease with which the Sinc method based on Interpolation of Highest Derivative can be used to obtain the through-the-thickness distributions of interlaminar stresses from the beam theories. Moreover, the results indicate that the refined zigzag theory is a substantial improvement over the Timoshenko beam theory due to the piecewise continuous displacement field which more accurately represents interlaminar discontinuities in the strain field. The higher-order shear and normal deformable beam theory more accurately captures the interlaminar stresses at the ends of the beam because it allows transverse normal strain. However, the continuous nature of the displacement field requires a large number of monomial terms before the interlaminar stresses are computed as accurately as the refined zigzag theory.
[Disorders of cognitive activity in schizophrenics].
Follin, S; Perrette, J; Sandretto, M
1979-01-01
Four tests explored the cognitive activity of three groups of subjects, homogeneous in IQ (above 110) and education (secondary school or university): normal controls, mental patients of various types, and schizophrenics. Whereas normal subjects and mental patients give identical patterns of results, except that the latter perform worse, schizophrenics succeed better than other patients in two tests of logico-mathematical reasoning and markedly worse in two tests of experimental logic. These results are interpreted within the framework of Piaget's theory as demonstrating the discordance of the very dynamics of schizophrenic thinking, whose cognitive activity is at once too close to the object, through adherence to the perceived structure (too concrete), and too far from it, through adherence to formal reasoning schemes acquired during development (too abstract). These results are consistent with clinical features showing that autistic thinking is discordant not only in its contents and meaning, but also in the formal dynamic patterns of its modus operandi.
NASA Technical Reports Server (NTRS)
Yip, L. P.; Shubert, G. L.
1976-01-01
A 1- by 3-meter semispan wing of taper ratio 1.0 with NACA 0012 airfoil section contours was tested in the Langley V/STOL tunnel to measure the pressure distribution at five sweep angles, 0 deg, 10 deg, 20 deg, 30 deg, and 40 deg, through an angle-of-attack range from -6 deg to 20 deg. The pressure data are presented as plots of pressure coefficients at each static-pressure tap location on the wing. Flow visualization wing-tuft photographs are also presented for a wing of 40 deg sweep. A comparison between theory and experiment using two inviscid theories and a viscous theory shows good agreement for pressure distributions, normal forces, and pitching moments for the wing at 0 deg sweep.
Decision Processes in Discrimination: Fundamental Misrepresentations of Signal Detection Theory
NASA Technical Reports Server (NTRS)
Balakrishnan, J. D.
1998-01-01
In the first part of this article, I describe a new approach to studying decision making in discrimination tasks that does not depend on the technical assumptions of signal detection theory (e.g., normality of the encoding distributions). Applying these new distribution-free tests to data from three experiments, I show that base rate and payoff manipulations had substantial effects on the participants' encoding distributions but no effect on their decision rules, which were uniformly unbiased in equal and unequal base rate conditions and in symmetric and asymmetric payoff conditions. In the second part of the article, I show that this seemingly paradoxical result is readily explained by the sequential sampling models of discrimination. I then propose a new, "model-free" test for response bias that seems to more properly identify both the nature and direction of the biases induced by the classical bias manipulations.
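For contrast with the distribution-free approach advocated above, the classical equal-variance Gaussian indices can be computed from hit and false-alarm rates; these indices rest on exactly the normality assumption the article's tests avoid. The rates below are hypothetical:

```python
from statistics import NormalDist

def dprime_and_c(hit_rate, fa_rate):
    """Classical equal-variance Gaussian SDT indices: sensitivity d' and
    criterion placement (bias) c.  Both assume normal encoding
    distributions with equal variance."""
    z = NormalDist().inv_cdf  # inverse standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    c = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, c

# Hypothetical hit and false-alarm rates; negative c indicates a liberal
# criterion under the Gaussian model
d, c = dprime_and_c(0.90, 0.20)
print(round(d, 2), round(c, 2))
```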
Rezazadeh, Afsaneh; Solhi, Mahnaz; Azam, Kamal
2015-01-01
Adolescence is a sensitive period for acquiring normal and abnormal habits for all of life. The study investigates determinants of responsibility for health, spiritual health, and interpersonal relations, and predictive factors based on the theory of planned behavior, in high school girl students in Tabriz. In this cross-sectional study, 340 students were selected through multi-stage sampling. An author-made questionnaire based on the standard Health-Promoting Lifestyle Profile II (HPLP-II), the spiritual well-being scale of Paloutzian and Ellison, and components of the theory of planned behavior (attitudes, subjective norms, perceived behavioral control, and behavioral intention) was used for data collection. The questionnaire was validated in a pilot study. Data were analyzed using SPSS v.15 with descriptive and analytical tests (chi-square test, Pearson correlation coefficient, and linear regression with backward elimination). Students' responsibility for health, spiritual health, interpersonal relationships, and concepts of the theory of planned behavior were moderate. We found a significant positive correlation (p<0.001) among all concepts of the theory of planned behavior. Attitude and perceived behavioral control predicted 35% of intention of behavioral change (p<0.001). Attitude, subjective norms, and perceived behavioral control predicted 74% of behavioral change in accountability for health (p<0.0001), 56% for behavioral change in spiritual health (p<0.0001) and 63% for behavioral change in interpersonal relationships (p<0.0001). Status of responsibility for health, spiritual health and interpersonal relationships of students was moderate. Hence, behavioral intention and its determinants, such as perceived behavioral control, should be addressed in health promotion intervention programs.
NASA Astrophysics Data System (ADS)
Saengow, C.; Giacomin, A. J.
2017-12-01
The Oldroyd 8-constant framework for continuum constitutive theory contains a rich diversity of popular special cases for polymeric liquids. In this paper, we use part of our exact solution for shear stress to arrive at unique exact analytical solutions for the normal stress difference responses to large-amplitude oscillatory shear (LAOS) flow. The nonlinearity of the polymeric liquids, triggered by LAOS, causes these responses at even multiples of the test frequency. We call responses at a frequency higher than twice the test frequency higher harmonics. We find the new exact analytical solutions to be compact and intrinsically beautiful. These solutions reduce to those of our previous work on the special case of the corotational Maxwell fluid. Our solutions also agree with our new truncated Goddard integral expansion for the special case of the corotational Jeffreys fluid. The limiting behaviors of these exact solutions also yield new explicit expressions. Finally, we use our exact solutions to see how η∞ affects the normal stress differences in LAOS.
Bonifacci, Paola; Snowling, Margaret J
2008-06-01
English and Italian children with dyslexia were compared with children with reading difficulties associated with low-IQ on tests of simple and choice RT, and in number and symbol scanning tasks. On all four speed-of-processing tasks, children with low-IQ responded more slowly than children with dyslexia and age-controls. In the choice RT task, the performance of children with low-IQ was also less accurate than that of children of normal IQ, consistent with theories linking processing speed limitations with low-IQ. These findings support the hypothesis that dyslexia is a specific cognitive deficit that can arise in the context of normal IQ and normal speed of processing. The same cognitive phenotype was observed in readers of a deep (English) and a shallow (Italian) orthography.
Liu, Ying-Buh; Yang, Stephen S; Hsieh, Cheng-Hsing; Lin, Chia-Da; Chang, Shang-Jen
2014-05-01
To evaluate the inter-observer, intra-observer and intra-individual reliability of uroflowmetry and post-void residual urine (PVR) tests in adult men. Healthy volunteers aged over 40 years were enrolled. Every participant underwent two sets of uroflowmetry and PVR tests with a 2-week interval between the tests. The uroflowmetry tests were interpreted by four urologists independently. Uroflowmetry curves were classified as bell-shaped, bell-shaped with tail, obstructive, restrictive, staccato, interrupted and tower-shaped and scored from 1 (highly abnormal) to 5 (absolutely normal). The agreements between the observers, interpretations and tests within individuals were analyzed using kappa statistics and intraclass correlation coefficients. Generalizability theory with decision analysis was used to determine how many observers, tests, and interpretations were needed to obtain an acceptable reliability (> 0.80). Of 108 volunteers, we randomly selected the uroflowmetry results from 25 participants for the evaluation of reliability. The mean age of the studied adults was 55.3 years. The intra-individual and intra-observer reliability on uroflowmetry tests ranged from good to very good. However, the inter-observer reliability on normalcy and specific type of flow pattern were relatively lower. In generalizability theory, three observers were needed to obtain an acceptable reliability on normalcy of uroflow pattern if the patient underwent uroflowmetry tests twice with one observation. The intra-individual and intra-observer reliability on uroflowmetry tests were good while the inter-observer reliability was relatively lower. To improve inter-observer reliability, the definition of uroflowmetry should be clarified by the International Continence Society. © 2013 Wiley Publishing Asia Pty Ltd.
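Cohen's kappa, the chance-corrected agreement statistic used in studies like the one above, can be sketched as follows; the two raters' normal/abnormal classifications of ten uroflowmetry curves are hypothetical:

```python
def cohens_kappa(a, b, categories):
    """Agreement between two raters beyond chance:
    kappa = (p_observed - p_expected) / (1 - p_expected),
    where p_expected is the agreement expected from the raters'
    marginal category frequencies alone."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    p_exp = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical ratings of 10 curves as normal (1) or abnormal (0)
rater1 = [1, 1, 1, 0, 0, 1, 0, 1, 1, 0]
rater2 = [1, 1, 0, 0, 0, 1, 0, 1, 1, 1]
print(round(cohens_kappa(rater1, rater2, [0, 1]), 3))
```

Generalizability theory extends this idea by partitioning variance across observers, tests, and interpretations to decide how many of each are needed for a target reliability.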
Accurate Thermal Stresses for Beams: Normal Stress
NASA Technical Reports Server (NTRS)
Johnson, Theodore F.; Pilkey, Walter D.
2002-01-01
Formulations for a general theory of thermoelasticity to generate accurate thermal stresses for structural members of aeronautical vehicles were developed in 1954 by Boley. The formulation also provides three normal stresses and a shear stress along the entire length of the beam. The Poisson effect of the lateral and transverse normal stresses on a thermally loaded beam is taken into account in this theory by employing an Airy stress function. The Airy stress function enables the reduction of the three-dimensional thermal stress problem to a two-dimensional one. Numerical results from the general theory of thermoelasticity are compared to those obtained from strength of materials. It is concluded that the theory of thermoelasticity for prismatic beams proposed in this paper can be used instead of strength of materials when precise stress results are desired.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cordero, Nicolas A.; March, Norman H.; Alonso, Julio A.
2007-05-15
Partially correlated ground-state electron densities for some spherical light atoms are calculated, for which nonrelativistic ionization potentials serve as essential input data. The nuclear cusp condition of Kato is satisfied precisely. The basic theoretical starting point, however, is Hartree-Fock (HF) theory for the N electrons under consideration, but with a nonintegral nuclear charge Z′ slightly different from the atomic number Z (=N). This HF density is scaled with a parameter λ, close to unity, to preserve normalization. Finally, some tests are performed on the densities for the atoms Ne and Ar, as well as for Be and Mg.
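The normalization-preserving scaling mentioned in the abstract can be made explicit. The abstract does not spell out the form used; the standard density-scaling identity, in which a uniform coordinate scaling by λ carries a λ³ prefactor, keeps the electron count fixed:

```latex
\rho_\lambda(\mathbf{r}) = \lambda^{3}\,\rho_{\mathrm{HF}}(\lambda\mathbf{r}),
\qquad
\int \rho_\lambda(\mathbf{r})\,d^{3}r
= \lambda^{3}\int \rho_{\mathrm{HF}}(\lambda\mathbf{r})\,d^{3}r
= \int \rho_{\mathrm{HF}}(\mathbf{u})\,d^{3}u = N,
\qquad \mathbf{u} = \lambda\mathbf{r}.
```

The substitution u = λr shows why the λ³ factor is exactly what is needed for the scaled density to integrate to the same N electrons.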
Revealing the Formation Mechanism of Ultra-Diffuse Galaxies
NASA Astrophysics Data System (ADS)
Garmire, Gordon
2017-09-01
Recently a population of large, very low optical surface brightness galaxies, so-called ultra-diffuse galaxies (UDGs), was discovered in the outskirts of the Coma cluster. Stellar line-of-sight velocity dispersions suggest large dark matter halo masses of 10^12 M_sun with very low baryon fractions (~1%). The outstanding question waiting to be answered is: how do UDGs form and evolve? One theory is that UDGs are related to bright galaxies but are prevented from building a normal stellar population by various violent processes, such as gas stripping. We propose to observe Dragonfly 44, the most massive UDG known, for 100 ks with ACIS-I to test some of the formation theories.
Super-delta: a new differential gene expression analysis procedure with robust data normalization.
Liu, Yuhang; Zhang, Jinfeng; Qiu, Xing
2017-12-21
Normalization is an important data preparation step in gene expression analyses, designed to remove various sources of systematic noise. Sample variance is greatly reduced after normalization, hence the power of subsequent statistical analyses is likely to increase. On the other hand, variance reduction is made possible by borrowing information across all genes, including differentially expressed genes (DEGs) and outliers, which will inevitably introduce some bias. This bias typically inflates type I error and can reduce statistical power in certain situations. In this study we propose a new differential expression analysis pipeline, dubbed super-delta, that consists of a multivariate extension of global normalization and a modified t-test. A robust procedure is designed to minimize the bias introduced by DEGs in the normalization step. The modified t-test is derived based on asymptotic theory for hypothesis testing that suitably pairs with the proposed robust normalization. We first compared super-delta with four commonly used normalization methods: global, median-IQR, quantile, and cyclic loess normalization in simulation studies. Super-delta was shown to have better statistical power with tighter control of the type I error rate than its competitors. In many cases, the performance of super-delta is close to that of an oracle test in which datasets without technical noise were used. We then applied all methods to a collection of gene expression datasets on breast cancer patients who received neoadjuvant chemotherapy. While there is a substantial overlap among the DEGs identified by all of them, super-delta was able to identify comparatively more DEGs than its competitors. Downstream gene set enrichment analysis confirmed that all these methods selected largely consistent pathways. Detailed investigation of the relatively small differences showed that pathways identified by super-delta have better connections to breast cancer than those of other methods.
As a new pipeline, super-delta provides new insights into the area of differential gene expression analysis. A solid theoretical foundation supports its asymptotic unbiasedness and technical noise-free properties. Implementation on real and simulated datasets demonstrates its decent performance compared with state-of-the-art procedures. It also has the potential to be extended to other data types and/or more general between-group comparison problems.
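The basic pattern the pipeline builds on, global normalization followed by per-gene t-tests, can be sketched in a few lines. This is a minimal illustration on simulated data, not the super-delta algorithm itself (which replaces both steps with robust, bias-corrected versions); all sizes and parameters are arbitrary:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated log-expression matrix: 100 genes x 6 samples (3 per group).
# Each sample carries a technical offset that normalization should remove.
n_genes, n_per_group = 100, 3
offsets = rng.normal(0.0, 2.0, size=2 * n_per_group)
data = rng.normal(8.0, 1.0, size=(n_genes, 2 * n_per_group)) + offsets

# Global normalization: subtract each sample's median across genes.
normalized = data - np.median(data, axis=0)

# Per-gene two-sample t-test between the groups on the normalized data.
t, p = stats.ttest_ind(normalized[:, :n_per_group],
                       normalized[:, n_per_group:], axis=1)
print(f"{int((p < 0.05).sum())} of {n_genes} genes flagged at alpha = 0.05")
```

The bias the paper targets arises in the median step: when many genes are truly differentially expressed, the per-sample medians absorb part of the signal, which is why super-delta excludes suspected DEGs when estimating the normalization constants.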
Influence of stationary components on unsteady flow in industrial centrifugal compressors
NASA Technical Reports Server (NTRS)
Bonciani, L.; Terrinoni, L.
1984-01-01
An experimental investigation was performed to determine the characteristics of the onset and growth of rotating nonuniform flow in a standard low specific speed stage, normally utilized in high-pressure applications, in relation to changes of stationary component geometry. Four configurations, differing only in the return channel and crossover geometry, were tested on an atmospheric-pressure open-loop test rig. Experimental results clearly show the effect of return channel geometry and demonstrate that the onset of the unstable zone can be shifted by varying that geometry. An attempt was made to interpret the experimental results within the framework of the Emmons-Stenning rotating stall theory.
Testing models of parental investment strategy and offspring size in ants.
Gilboa, Smadar; Nonacs, Peter
2006-01-01
Parental investment strategies can be fixed or flexible. A fixed strategy predicts making all offspring a single 'optimal' size. Dynamic models predict flexible strategies with more than one optimal size of offspring. Patterns in the distribution of offspring sizes may thus reveal the investment strategy. Static strategies should produce normal distributions. Dynamic strategies should often result in non-normal distributions. Furthermore, variance in morphological traits should be positively correlated with the length of developmental time the traits are exposed to environmental influences. Finally, the type of deviation from normality (i.e., skewed left or right, or platykurtic) should be correlated with the average offspring size. To test the latter prediction, we used simulations to detect significant departures from normality and categorize distribution types. Data from three species of ants strongly support the predicted patterns for dynamic parental investment. Offspring size distributions are often significantly non-normal. Traits fixed earlier in development, such as head width, are less variable than final body weight. The type of distribution observed correlates with mean female dry weight. The overall support for a dynamic parental investment model has implications for life history theory. Predicted conflicts over parental effort, sex investment ratios, and reproductive skew in cooperative breeders follow from assumptions of static parental investment strategies and omnipresent resource limitations. By contrast, with flexible investment strategies such conflicts can be either absent or maladaptive.
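The distribution tests the abstract describes, detecting significant departures from normality and characterizing their direction, can be illustrated with standard tools. This is not the authors' simulation procedure; the offspring-weight samples below are invented, and the lognormal simply stands in for a right-skewed brood:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical offspring dry weights: a right-skewed brood (as a dynamic
# investment strategy may produce) vs. a normal one (static strategy).
skewed_brood = rng.lognormal(mean=0.0, sigma=0.6, size=200)
normal_brood = rng.normal(loc=1.0, scale=0.2, size=200)

for label, sample in [("skewed", skewed_brood), ("normal", normal_brood)]:
    w, p = stats.shapiro(sample)          # Shapiro-Wilk test of normality
    print(f"{label}: skew={stats.skew(sample):+.2f}, "
          f"excess kurtosis={stats.kurtosis(sample):+.2f}, Shapiro p={p:.2g}")
```

The sign of the skew statistic distinguishes right- from left-skewed distributions, and negative excess kurtosis flags the platykurtic case the abstract mentions.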
Morphological differences in the lateral geniculate nucleus associated with dyslexia
Giraldo-Chica, Mónica; Hegarty, John P.; Schneider, Keith A.
2015-01-01
Developmental dyslexia is a common learning disability characterized by normal intelligence but difficulty in skills associated with reading, writing and spelling. One of the most prominent, albeit controversial, theories of dyslexia is the magnocellular theory, which suggests that malfunction of the magnocellular system in the brain is responsible for the behavioral deficits. We sought to test the basis of this theory by directly measuring the lateral geniculate nucleus (LGN), the only location in the brain where the magnocellular and parvocellular streams are spatially disjoint. Using high-resolution proton-density weighted MRI scans, we precisely measured the anatomical boundaries of the LGN in 13 subjects with dyslexia (five female) and 13 controls (three female), all 22–26 years old. The left LGN was significantly smaller in volume in subjects with dyslexia and also differed in shape; no differences were observed in the right LGN. The functional significance of this asymmetry is unknown, but these results are consistent with the magnocellular theory and support theories of dyslexia that involve differences in the early visual system. PMID:26082892
Unifying Theories of Psychedelic Drug Effects
Swanson, Link R.
2018-01-01
How do psychedelic drugs produce their characteristic range of acute effects in perception, emotion, cognition, and sense of self? How do these effects relate to the clinical efficacy of psychedelic-assisted therapies? Efforts to understand psychedelic phenomena date back more than a century in Western science. In this article I review theories of psychedelic drug effects and highlight key concepts which have endured over the last 125 years of psychedelic science. First, I describe the subjective phenomenology of acute psychedelic effects using the best available data. Next, I review late 19th-century and early 20th-century theories—model psychoses theory, filtration theory, and psychoanalytic theory—and highlight their shared features. I then briefly review recent findings on the neuropharmacology and neurophysiology of psychedelic drugs in humans. Finally, I describe recent theories of psychedelic drug effects which leverage 21st-century cognitive neuroscience frameworks—entropic brain theory, integrated information theory, and predictive processing—and point out key shared features that link back to earlier theories. I identify an abstract principle which cuts across many theories past and present: psychedelic drugs perturb universal brain processes that normally serve to constrain neural systems central to perception, emotion, cognition, and sense of self. I conclude that making an explicit effort to investigate the principles and mechanisms of psychedelic drug effects is a uniquely powerful way to iteratively develop and test unifying theories of brain function. PMID:29568270
Gravitational waves in Einstein-æther and generalized TeVeS theory after GW170817
NASA Astrophysics Data System (ADS)
Gong, Yungui; Hou, Shaoqi; Liang, Dicong; Papantonopoulos, Eleftherios
2018-04-01
In this work we discuss the polarization contents of Einstein-æther theory and the generalized tensor-vector-scalar (TeVeS) theory, as both theories have a normalized timelike vector field. We derive the linearized equations of motion around the flat spacetime background using the gauge-invariant variables to easily separate physical degrees of freedom. We find the plane wave solutions and identify the polarizations by examining the geodesic deviation equations. We find that there are five polarizations in Einstein-æther theory and six polarizations in the generalized TeVeS theory. In particular, the transverse breathing mode is mixed with the pure longitudinal mode. We also discuss the experimental tests of the extra polarizations in Einstein-æther theory using pulsar timing arrays combined with the gravitational-wave speed bound derived from the observations of GW170817 and GRB 170817A. It turns out that it might be difficult to use pulsar timing arrays to distinguish different polarizations in Einstein-æther theory. The same speed bound also forces one of the propagating modes in the generalized TeVeS theory to travel much faster than the speed of light. Since the strong coupling problem does not exist in some parameter subspaces, the generalized TeVeS theory is excluded in these parameter subspaces.
Cognitive, emotional and social markers of serial murdering.
Angrilli, Alessandro; Sartori, Giuseppe; Donzella, Giovanna
2013-01-01
Although criminal psychopathy is starting to be relatively well described, our knowledge of the characteristics and scientific markers of serial murdering is still very poor. A serial killer who murdered more than five people, KT, was administered a battery of standardized tests aimed at measuring neuropsychological impairment and social/emotional cognition deficits. KT exhibited a striking dissociation between a high level of emotional detachment and a low score on the antisocial behavior scale of the Psychopathy Checklist-Revised (PCL-R). The Minnesota Multiphasic Personality Inventory-2 showed a normal pattern with the psychotic triad at borderline level. KT had a high intelligence score and showed almost no impairment in cognitive tests sensitive to frontal lobe dysfunction (Wisconsin Card Sorting Test, Theory of Mind, Tower of London; the latter revealed a mild impairment in planning performance). In the tests of moral, emotional and social cognition, his patterns of response differed from those of matched controls and from past reports on criminal psychopaths: unlike these individuals, KT exhibited normal recognition of fear and a relatively intact knowledge of moral rules, but he was impaired in the recognition of anger, embarrassment and conventional social rules. The overall picture of KT suggests that serial killing may be closer to normality than to psychopathy defined according to either the DSM-IV or the PCL-R, and would be characterized by relatively spared moral cognition and selective deficits in the social and emotional cognition domains.
An account of the Speech-to-Song Illusion using Node Structure Theory.
Castro, Nichol; Mendoza, Joshua M; Tampke, Elizabeth C; Vitevitch, Michael S
2018-01-01
In the Speech-to-Song Illusion, repetition of a spoken phrase results in it being perceived as if it were sung. Although a number of previous studies have examined which characteristics of the stimulus will produce the illusion, there is, until now, no description of the cognitive mechanism that underlies the illusion. We suggest that the processes found in Node Structure Theory that are used to explain normal language processing as well as other auditory illusions might also account for the Speech-to-Song Illusion. In six experiments we tested whether the satiation of lexical nodes, but continued priming of syllable nodes may lead to the Speech-to-Song Illusion. The results of these experiments provide evidence for the role of priming, activation, and satiation as described in Node Structure Theory as an explanation of the Speech-to-Song Illusion.
Rapidly rotating neutron stars with a massive scalar field—structure and universal relations
NASA Astrophysics Data System (ADS)
Doneva, Daniela D.; Yazadjiev, Stoytcho S.
2016-11-01
We construct rapidly rotating neutron star models in scalar-tensor theories with a massive scalar field. The fact that the scalar field has nonzero mass leads to very interesting results, since the allowed range of values of the coupling parameters is significantly broadened. Deviations from pure general relativity can be very large for values of the parameters that are in agreement with the observations. We found that rapid rotation can magnify the differences several times compared to the static case. The universal relations between the normalized moment of inertia and quadrupole moment are also investigated for both the slowly and rapidly rotating cases. The results show that these relations are still EOS-independent to a large extent and that the deviations from pure general relativity can be large. This places the massive scalar-tensor theories amongst the few alternative theories of gravity that can be tested via the universal I-Love-Q relations.
Piloting a fiber optics and electronic theory curriculum with high school students
NASA Astrophysics Data System (ADS)
Gilchrist, Pamela O.; Carpenter, Eric D.; Gray-Battle, Asia
2014-07-01
Previous participants from a multi-year blended learning intervention focusing on science, technology, engineering and mathematics (STEM) content knowledge and technical, college, and career preparatory skills were recruited to pilot a new module designed by the project staff. Participants met for a total of 22 contact hours, receiving lectures from staff and two guest speakers from industries relevant to photonics, hands-on experimentation with fiber optics, and practice with documenting progress. Activities included constructing a fiber optics communication system, troubleshooting breadboard circuits and diagrammed circuits, as well as hypothesis testing to discover various aspects of fiber optic cables. Participants documented their activities, wrote reflections on the content and learning endeavor, and gave talks about their research experiences to staff, peers, and relatives during the last session. Overall, a paired-samples t-test found a significant gain in content knowledge between the pre-test (Mean=0.54) and post-test (Mean=0.71) for the fiber optics portion of the curriculum, t=-2.72, p<.05. Additionally, the electronic theory test results were not normally distributed, so non-parametric testing was used, specifically a Wilcoxon signed-ranks test. Results indicated a significant increase in content knowledge over time between the pre-test (Mdn=0.35) and post-test (Mdn=0.80), z=-2.49, p<.05, r=-0.59, for the electronic theory portion of the curriculum. An equivalent control group was recruited from the remaining participant pool, allowing for comparison between groups. The program design, findings, and lessons learned will be reported in this paper.
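The two pre/post analyses in the abstract, a paired-samples t-test and its non-parametric fallback when the scores are not normally distributed, can be sketched as follows. The score vectors are hypothetical, not the study's data:

```python
from scipy import stats

# Hypothetical pre/post content-knowledge scores (proportion correct)
pre  = [0.50, 0.45, 0.60, 0.55, 0.40, 0.65, 0.50, 0.58]
post = [0.70, 0.65, 0.75, 0.72, 0.60, 0.80, 0.68, 0.78]

# Paired-samples t-test: appropriate when the paired differences are
# roughly normally distributed.
t, p_t = stats.ttest_rel(pre, post)

# Wilcoxon signed-rank test: the non-parametric alternative used when
# the score distribution departs from normality.
w, p_w = stats.wilcoxon(pre, post)

print(f"t = {t:.2f}, p = {p_t:.4f};  Wilcoxon W = {w:.0f}, p = {p_w:.4f}")
```

Both tests operate on the per-participant differences, which is why the same paired data layout feeds either one; only the distributional assumption changes.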
Affect intensity and processing fluency of deterrents.
Holman, Andrei
2013-01-01
The theory of emotional intensity (Brehm, 1999) suggests that the intensity of affective states depends on the magnitude of their current deterrents. Our study investigated the role that fluency--the subjective experience of ease of information processing--plays in the emotional intensity modulations as reactions to deterrents. Following an induction phase of good mood, we manipulated both the magnitude of deterrents (using sets of photographs with pre-tested potential to instigate an emotion incompatible with the pre-existent affective state--pity) and their processing fluency (normal vs. enhanced through subliminal priming). Current affective state and perception of deterrents were then measured. In the normal processing conditions, the results revealed the cubic effect predicted by the emotional intensity theory, with the initial affective state being replaced by the one appropriate to the deterrent only in participants exposed to the high magnitude deterrence. In the enhanced fluency conditions the emotional intensity pattern was drastically altered; also, the replacement of the initial affective state occurred at a lower level of deterrence magnitude (moderate instead of high), suggesting the strengthening of deterrence emotional impact by enhanced fluency.
Underwood, H R; Peterson, A F; Magin, R L
1992-02-01
A rectangular microstrip antenna radiator is investigated for its near-zone radiation characteristics in water. Calculations of a cavity model theory are compared with the electric-field measurements of a miniature nonperturbing diode-dipole E-field probe whose 3 mm tip was positioned by an automatic three-axis scanning system. These comparisons have implications for the use of microstrip antennas in a multielement microwave hyperthermia applicator. Half-wavelength rectangular microstrip patches were designed to radiate in water at 915 MHz. Both low (epsilon r = 10) and high (epsilon r = 85) dielectric constant substrates were tested. Normal and tangential components of the near-zone radiated electric field were discriminated by appropriate orientation of the E-field probe. Low normal to transverse electric-field ratios at 3.0 cm depth indicate that the radiators may be useful for hyperthermia heating with an intervening water bolus. Electric-field pattern addition from a three-element linear array of these elements in water indicates that phase and amplitude adjustment can achieve some limited control over the distribution of radiated power.
Theory of mind and frontal lobe pathology in schizophrenia: a voxel-based morphometry study.
Hirao, Kazuyuki; Miyata, Jun; Fujiwara, Hironobu; Yamada, Makiko; Namiki, Chihiro; Shimizu, Mitsuaki; Sawamoto, Nobukatsu; Fukuyama, Hidenao; Hayashi, Takuji; Murai, Toshiya
2008-10-01
Impaired ability to infer the mental states of others (theory of mind; ToM) is considered to be a key contributor to the poor social functioning of patients with schizophrenia. Although neuroimaging and lesion studies have provided empirical evidence for the neural basis of ToM ability, including the involvement of several prefrontal and temporal structures, the association between pathology of these structures and ToM impairment in schizophrenia patients is less well understood. To address this issue, we investigated structural brain abnormalities and ToM impairment in patients with schizophrenia, and examined the relationship between them. Twenty schizophrenia patients and 20 age-, sex- and education-matched healthy participants underwent magnetic resonance imaging (MRI) and were examined for ToM ability based on the revised version of the "Reading the Mind in the Eyes" (or Eyes) test [Baron-Cohen, S., Wheelwright, S., Hill, J., Raste, Y., Plumb, I., 2001. The 'Reading the Mind in the Eyes' test revised version: A study with normal adults, and adults with Asperger syndrome or high-functioning autism. J. Child Psychol. Psychiatry 42, 241-251]. Voxel-based morphometry (VBM) was performed to investigate regional brain alterations. Relative to normal controls, schizophrenia patients exhibited gray matter reductions in the dorsomedial prefrontal cortex (DMPFC), left ventrolateral prefrontal cortex (VLPFC), ventromedial prefrontal cortex (VMPFC), anterior cingulate cortex (ACC), right superior temporal gyrus (STG) and right insula. The patients performed poorly on the Eyes test. Importantly, poor performance on the Eyes test was found to be associated with gray matter reduction in the left VLPFC in the patient group. These results suggest that prefrontal cortical reduction, especially in the left VLPFC, is a key pathology underlying the difficulties faced by schizophrenia patients in inferring the mental states of others.
Cointegration as a data normalization tool for structural health monitoring applications
NASA Astrophysics Data System (ADS)
Harvey, Dustin Y.; Todd, Michael D.
2012-04-01
The structural health monitoring literature has shown an abundance of features sensitive to various types of damage in laboratory tests. However, robust feature extraction in the presence of varying operational and environmental conditions has proven to be one of the largest obstacles in the development of practical structural health monitoring systems. Cointegration, a technique adapted from the field of econometrics, has recently been introduced to the SHM field as one solution to the data normalization problem. Response measurements and feature histories often show long-run nonstationarity due to fluctuating temperature, load conditions, or other factors that leads to the occurrence of false positives. Cointegration theory allows nonstationary trends common to two or more time series to be modeled and subsequently removed. Thus, the residual retains sensitivity to damage with dependence on operational and environmental variability removed. This study further explores the use of cointegration as a data normalization tool for structural health monitoring applications.
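The core idea, regressing one feature history on another so that the residual discards their shared nonstationary trend, can be shown in a minimal numpy-only sketch. This is an illustration of the cointegrating-regression step, not the study's method (which uses formal cointegration machinery such as the Johansen procedure); the trend coefficients and noise levels are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two feature histories sharing a common nonstationary trend (e.g. a
# temperature-driven drift), plus independent stationary noise.
n = 500
trend = np.cumsum(rng.normal(0.0, 1.0, n))      # random-walk drift
x = 1.0 * trend + rng.normal(0.0, 0.5, n)
y = 0.7 * trend + rng.normal(0.0, 0.5, n)

# Cointegrating regression: project y onto x; the residual is the
# (approximately) stationary combination with the shared trend removed.
beta, intercept = np.polyfit(x, y, deg=1)
residual = y - (beta * x + intercept)

print(f"std(y) = {y.std():.2f}, std(residual) = {residual.std():.2f}")
```

Damage that affects only one of the two features would break the cointegrating relationship and show up as a shift in this residual, while temperature swings common to both are removed.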
Intra- and interpattern relations in letter recognition.
Sanocki, T
1991-11-01
Strings of 4 unrelated letters were backward masked at varying durations to examine 3 major issues. (a) One issue concerned relational features. Letters with abnormal relations but normal elements were created by interchanging elements between large and small normal letters. Overall accuracy was higher for letters with normal relations, consistent with the idea that relational features are important in recognition. (b) Interpattern relations were examined by mixing large and small letters within strings. Relative to pure strings, accuracy was reduced, but only for small letters and only when in mixed strings. This effect can be attributed to attentional priority for larger forms over smaller forms, which also explains global precedence with hierarchical forms. (c) Forced-choice alternatives were manipulated in Experiments 2 and 3 to test feature integration theory. Relational information was found to be processed at least as early as feature presence or absence.
Challenging the Ideology of Normal in Schools
ERIC Educational Resources Information Center
Annamma, Subini A.; Boelé, Amy L.; Moore, Brooke A.; Klingner, Janette
2013-01-01
In this article, we build on Brantlinger's work to critique the binary of normal and abnormal applied in US schools, which creates inequities in education. Operating from a critical perspective, we draw from Critical Race Theory, Disability Studies in Education, and Cultural/Historical Activity Theory to build a conceptual framework for…
Introduction to "Queering the Writing Center"
ERIC Educational Resources Information Center
Eodice, Michele
2010-01-01
Queer theory challenges what is "normal" and questions the mechanics behind individuals and their institutions' efforts to maintain "normal." Queer theory can help a person get over himself/herself, and, as a result, the words, bodies, spaces, and beliefs that he/she holds dear will be called upon to respond. Harry Denny's article instructs…
[Overview and prospect of syndrome differentiation of hypertension in traditional Chinese medicine].
Yang, Xiao-Chen; Xiong, Xing-Jiang; Wang, Jie
2014-01-01
This article reviews the traditional Chinese medicine literature on syndrome differentiation for hypertension. According to the theory of disease in combination with syndrome, we summarized the syndrome types of hypertension in four respects: national standards, industry standards, teaching standards, and personal experience. Meanwhile, in order to provide new methods and approaches for normalization research, we integrated modern testing methods and statistical methods to analyze syndrome differentiation in the treatment of hypertension.
Time reversal imaging and cross-correlations techniques by normal mode theory
NASA Astrophysics Data System (ADS)
Montagner, J.; Fink, M.; Capdeville, Y.; Phung, H.; Larmat, C.
2007-12-01
Time-reversal methods were successfully applied in the past to acoustic waves in many fields such as medical imaging, underwater acoustics, nondestructive testing and, recently, to seismic waves in seismology for earthquake imaging. The increasing power of computers and numerical methods (such as spectral element methods) enables one to simulate the propagation of seismic waves in heterogeneous media more and more accurately and to develop new applications, in particular time reversal in the three-dimensional Earth. Generalizing the scalar approach of Draeger and Fink (1999), the theoretical understanding of the time-reversal method can be addressed for the 3D elastic Earth by using normal mode theory. It is shown how to relate time-reversal methods, on the one hand, with auto-correlation of seismograms for source imaging and, on the other hand, with cross-correlation between receivers for structural imaging and retrieving the Green function. The loss of information will be discussed. In the case of source imaging, automatic location in time and space of earthquakes and unknown sources is obtained by the time-reversal technique. In the case of big earthquakes such as the Sumatra-Andaman earthquake of December 2004, we were able to reconstruct the spatio-temporal history of the rupture. We present here some new applications of these techniques at the global scale, on synthetic tests and on real data.
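The receiver-to-receiver cross-correlation idea can be shown in miniature: a wavefield recorded at two stations with a travel-time offset yields a correlation peak at that offset. This toy sketch uses white noise and a pure delay (no Earth model, no normal modes), so it illustrates only the basic observable behind Green-function retrieval:

```python
import numpy as np

rng = np.random.default_rng(3)

# A common noise source recorded at two receivers; receiver B records
# the same wavefield 25 samples later than receiver A.
n, true_lag = 2048, 25
source = rng.normal(size=n + true_lag)
rec_a = source[true_lag:]        # arrives first at A
rec_b = source[:n]               # delayed copy at B

# The cross-correlation of the two records peaks at the travel-time
# difference between the receivers.
corr = np.correlate(rec_b, rec_a, mode="full")
lags = np.arange(-(n - 1), n)
print("estimated delay (samples):", lags[np.argmax(corr)])
```

In seismic interferometry the full correlation function, not just its peak, approximates the inter-receiver Green function once the noise sources are sufficiently well distributed.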
Trietsch, Jasper; van Steenkiste, Ben; Hobma, Sjoerd; Frericks, Arnoud; Grol, Richard; Metsemakers, Job; van der Weijden, Trudy
2014-12-01
A quality improvement strategy consisting of comparative feedback and peer review embedded in available local quality improvement collaboratives proved to be effective in changing the test-ordering behaviour of general practitioners. However, implementing this strategy was problematic. We aimed for large-scale implementation of an adapted strategy covering both test ordering and prescribing performance. Because we failed to achieve large-scale implementation, the aim of this study was to describe and analyse the challenges of the transferring process. In a qualitative study 19 regional health officers, pharmacists, laboratory specialists and general practitioners were interviewed within 6 months after the transfer period. The interviews were audiotaped, transcribed and independently coded by two of the authors. The codes were matched to the dimensions of the normalization process theory. The general idea of the strategy was widely supported, but generating the feedback was more complex than expected and the need for external support after transfer of the strategy remained high because participants did not assume responsibility for the work and the distribution of resources that came with it. Evidence on effectiveness, a national infrastructure for these collaboratives and a general positive attitude were not sufficient for normalization. Thinking about managing large databases, responsibility for tasks and distribution of resources should start as early as possible when planning complex quality improvement strategies. Merely exploring the barriers and facilitators experienced in a preceding trial is not sufficient. Although multifaceted implementation strategies to change professional behaviour are attractive, their inherent complexity is also a pitfall for large-scale implementation. © 2014 John Wiley & Sons, Ltd.
Quantized mode of a leaky cavity
NASA Astrophysics Data System (ADS)
Dutra, S. M.; Nienhuis, G.
2000-12-01
We use Thomson's classical concept of mode of a leaky cavity to develop a quantum theory of cavity damping. This theory generalizes the conventional system-reservoir theory of high-Q cavity damping to arbitrary Q. The small system now consists of damped oscillators corresponding to the natural modes of the leaky cavity rather than undamped oscillators associated with the normal modes of a fictitious perfect cavity. The formalism unifies semiclassical Fox-Li modes and the normal modes traditionally used for quantization. It also lays the foundations for a full quantum description of excess noise. The connection with Siegman's semiclassical work is straightforward. In a wider context, this theory constitutes a radical departure from present models of dissipation in quantum mechanics: unlike conventional models, system and reservoir operators no longer commute with each other. This noncommutability is an unavoidable consequence of having to use natural cavity modes rather than normal modes of a fictitious perfect cavity.
Sequential Objective Structured Clinical Examination based on item response theory in Iran.
Hejri, Sara Mortaz; Jalili, Mohammad
2017-01-01
In a sequential objective structured clinical examination (OSCE), all students initially take a short screening OSCE. Examinees who pass are excused from further testing, but an additional OSCE is administered to the remaining examinees. Previous investigations of sequential OSCE were based on classical test theory. We aimed to design and evaluate screening OSCEs based on item response theory (IRT). We carried out a retrospective observational study. At each station of a 10-station OSCE, the students' performance was graded on a Likert-type scale. Since the data were polytomous, the difficulty parameters, discrimination parameters, and students' ability were calculated using a graded response model. To design several screening OSCEs, we identified the 5 most difficult stations and the 5 most discriminative ones. For each test, 5, 4, or 3 stations were selected. Normal and stringent cut-scores were defined for each test. We compared the results of each of the 12 screening OSCEs to the main OSCE and calculated the positive and negative predictive values (PPV and NPV), as well as the exam cost. A total of 253 students (95.1%) passed the main OSCE, while 72.6% to 94.4% of examinees passed the screening tests. The PPV values ranged from 0.98 to 1.00, and the NPV values ranged from 0.18 to 0.59. Two tests effectively predicted the results of the main exam, resulting in financial savings of 34% to 40%. If stations with the highest IRT-based discrimination values and stringent cut-scores are utilized in the screening test, sequential OSCE can be an efficient and convenient way to conduct an OSCE.
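The screening logic above lends itself to a short illustration. The sketch below is not the authors' code and all examinee data in it are hypothetical; it computes PPV and NPV by treating the full-length OSCE result as the reference standard for each screening decision:

```python
# Sketch (not the authors' code): PPV and NPV of a screening OSCE,
# using the full-length exam as the reference standard.
# All names and numbers here are hypothetical illustrations.

def screening_predictive_values(results):
    """results: list of (passed_screening, passed_main_exam) booleans."""
    tp = sum(1 for s, m in results if s and m)        # screen-pass, main-pass
    fp = sum(1 for s, m in results if s and not m)    # screen-pass, main-fail
    fn = sum(1 for s, m in results if not s and m)    # screen-fail, main-pass
    tn = sum(1 for s, m in results if not s and not m)
    ppv = tp / (tp + fp)  # P(pass main exam | passed screening)
    npv = tn / (tn + fn)  # P(fail main exam | failed screening)
    return ppv, npv

# Hypothetical cohort of 10 examinees
data = ([(True, True)] * 6 + [(True, False)] * 1
        + [(False, False)] * 2 + [(False, True)] * 1)
ppv, npv = screening_predictive_values(data)
```

With these hypothetical data the PPV is 6/7 and the NPV is 2/3; the PPV and NPV ranges reported in the abstract correspond to these same quantities evaluated on the real cohort of 253 students.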
Trojan dynamics well approximated by a new Hamiltonian normal form
NASA Astrophysics Data System (ADS)
Páez, Rocío Isabel; Locatelli, Ugo
2015-10-01
We revisit a classical perturbative approach to the Hamiltonian related to the motions of Trojan bodies, in the framework of the planar circular restricted three-body problem, by introducing a number of key new ideas in the formulation. In some sense, we adapt the approach of Garfinkel to the context of normal form theory and its modern techniques. First, we make use of Delaunay variables for a physically accurate representation of the system. We then introduce a novel manipulation of the variables so as to respect the natural behaviour of the model. We develop a normalization procedure over the fast angle which exploits the fact that singularities in this model are essentially related to the slow angle. Thus, we produce a new normal form, i.e. an integrable approximation to the Hamiltonian. We emphasize some practical examples of the applicability of our normalizing scheme, e.g. the estimation of the stable libration region. Finally, we compare the level curves produced by our normal form with surfaces of section provided by the integration of the non-normalized Hamiltonian, finding very good agreement. Further precision tests are also provided. In addition, we give a step-by-step description of the algorithm, allowing for extensions to more complicated models.
Influence of phase inversion on the formation and stability of one-step multiple emulsions.
Morais, Jacqueline M; Rocha-Filho, Pedro A; Burgess, Diane J
2009-07-21
A novel method for the preparation of water-in-oil-in-micelle-containing water (W/O/W(m)) multiple emulsions using the one-step emulsification method is reported. These multiple emulsions were normal (not temporary) and stable over a 60-day test period. Previously reported multiple emulsions prepared by the one-step method were abnormal systems that formed at the inversion point of simple emulsions (where there is an incompatibility between the Ostwald and Bancroft theories; typically these are O/W/O systems). Pseudoternary phase diagrams and bidimensional process-composition (phase inversion) maps were constructed to assist in process and composition optimization. The surfactants used were PEG40 hydrogenated castor oil and sorbitan oleate, and mineral and vegetable oils were investigated. Physicochemical characterization studies showed experimentally, for the first time, the significance of the ultralow surface tension point in multiple emulsion formation by the one-step method via phase inversion processes. Although the significance of ultralow surface tension has been speculated on previously, to the best of our knowledge this is the first experimental confirmation. The multiple emulsion system reported here was dependent not only upon the emulsification temperature but also upon the component ratios; therefore, both the emulsion phase inversion and the phase inversion temperature were considered to fully explain their formation. Accordingly, it is hypothesized that the formation of these normal multiple emulsions is not a result of a temporary incompatibility (at the inversion point) during simple emulsion preparation, as previously reported. Rather, these normal W/O/W(m) emulsions are a result of the simultaneous occurrence of catastrophic and transitional phase inversion processes. The formation of the primary emulsions (W/O) is in accordance with the Ostwald theory, and the formation of the multiple emulsions (W/O/W(m)) is in agreement with the Bancroft theory.
McCarthy, R A
2001-02-01
Clinical and normal psychology have had a long tradition of close interaction in British psychology. The roots of this interplay may predate the development of the British Psychological Society, but the Society has encouraged and supported this line of research since its inception. One fundamental British insight has been to consider the evidence from pathology as a potential constraint on theories of normal function. In turn, theories of normal function have been used to understand and illuminate cognitive pathology. This review discusses some of the areas in which clinical contributions to cognitive theory have been most substantial. As with other contributions to this volume, attempts are also made to read the runes and anticipate future developments.
Analysis of Particle Image Velocimetry (PIV) Data for Acoustic Velocity Measurements
NASA Technical Reports Server (NTRS)
Blackshire, James L.
1997-01-01
Acoustic velocity measurements were taken using Particle Image Velocimetry (PIV) in a Normal Incidence Tube configuration at various frequency, phase, and amplitude levels. This report presents the results of the PIV analysis and data reduction portions of the test and details the processing that was done. Estimates of lower measurement sensitivity levels were determined based on PIV image quality, correlation, and noise level parameters used in the test. A comparison of the measurements with linear acoustic theory is presented. The onset of nonlinear, harmonic-frequency acoustic levels was also studied for various decibel and frequency levels ranging from 90 to 132 dB and 500 to 3000 Hz, respectively.
Increased heart rate after exercise facilitates the processing of fearful but not disgusted faces.
Pezzulo, G; Iodice, P; Barca, L; Chausse, P; Monceau, S; Mermillod, M
2018-01-10
Embodied theories of emotion assume that emotional processing is grounded in bodily and affective processes. Accordingly, the perception of an emotion re-enacts congruent sensory and affective states; and conversely, bodily states congruent with a specific emotion facilitate emotional processing. This study tests whether the ability to process facial expressions (faces having a neutral expression, expressing fear, or disgust) can be influenced by making the participants' body state congruent with the expressed emotion (e.g., high heart rate in the case of faces expressing fear). We designed a task requiring participants to categorize pictures of male and female faces that either had a neutral expression (neutral), or expressed emotions whose linkage with high heart rate is strong (fear) or significantly weaker or absent (disgust). Critically, participants were tested in two conditions: with experimentally induced high heart rate (Exercise) and with normal heart rate (Normal). Participants processed fearful faces (but not disgusted or neutral faces) faster when they were in the Exercise condition than in the Normal condition. These results support the idea that an emotionally congruent body state facilitates the automatic processing of emotionally-charged stimuli and this effect is emotion-specific rather than due to generic factors such as arousal.
Local vibrational modes of the water dimer - Comparison of theory and experiment
NASA Astrophysics Data System (ADS)
Kalescky, R.; Zou, W.; Kraka, E.; Cremer, D.
2012-12-01
Local and normal vibrational modes of the water dimer are calculated at the CCSD(T)/CBS level of theory. The local H-bond stretching frequency is 528 cm-1 compared to a normal mode stretching frequency of just 143 cm-1. The adiabatic connection scheme between local and normal vibrational modes reveals that the lowering is due to mass coupling, a change in the anharmonicity, and coupling with the local HOH bending modes. The local mode stretching force constant is related to the strength of the H-bond whereas the normal mode stretching force constant and frequency lead to an erroneous underestimation of the H-bond strength.
φ^q-field theory for portfolio optimization: “fat tails” and nonlinear correlations
NASA Astrophysics Data System (ADS)
Sornette, D.; Simonetti, P.; Andersen, J. V.
2000-08-01
Physics and finance are both fundamentally based on the theory of random walks (and their generalizations to higher dimensions) and on the collective behavior of large numbers of correlated variables. The archetype exemplifying this situation in finance is the portfolio optimization problem, in which one desires to diversify over a set of possibly dependent assets to optimize the return and minimize the risks. The standard mean-variance solution introduced by Markowitz and its subsequent developments is basically a mean-field Gaussian solution. It has severe limitations for practical applications due to the strongly non-Gaussian structure of distributions and the nonlinear dependence between assets. Here, we present in detail a general analytical characterization of the distribution of returns for a portfolio constituted of assets whose returns are described by an arbitrary joint multivariate distribution. To this end, we introduce a nonlinear transformation that maps the returns onto Gaussian variables whose covariance matrix provides a new measure of dependence between the non-normal returns, generalizing the covariance matrix into a nonlinear covariance matrix. This nonlinear covariance matrix is chiseled to the specific fat-tail structure of the underlying marginal distributions, thus ensuring stability and good conditioning. The portfolio distribution is then obtained as the solution of a mapping to a so-called φ^q field theory in particle physics, of which we offer an extensive treatment using Feynman diagrammatic techniques and large deviation theory, illustrated in detail for multivariate Weibull distributions. The interaction (non-mean-field) structure in this field theory is a direct consequence of the non-Gaussian nature of the distribution of asset price returns. We find that minimizing the portfolio variance (i.e. the relatively “small” risks) may often increase the large risks, as measured by higher normalized cumulants.
Extensive empirical tests on the foreign exchange market are presented that satisfactorily validate the theory. For “fat tail” distributions, we show that an adequate prediction of the risks of a portfolio relies much more on the correct description of the tail structure than on the correlations between assets. For the case of asymmetric return distributions, our theory allows us to generalize the return-risk efficient frontier concept to incorporate the dimensions of large risks embedded in the tails of the asset distributions. We demonstrate that it is often possible to increase the portfolio return while decreasing the large risks as quantified by the fourth- and higher-order cumulants. Exact theoretical formulas are validated by empirical tests.
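The central construction of this work, mapping each return onto a Gaussian variable through its marginal CDF, can be sketched in a few lines. The code below is an illustrative reimplementation under assumed Weibull parameters, not the authors' code:

```python
# Sketch of the Gaussian-mapping idea (not the authors' code): each return x
# with marginal CDF F is mapped to y = Phi^{-1}(F(x)), which is standard-normal
# by construction. The Weibull parameters below are hypothetical.
import math
from statistics import NormalDist

def weibull_cdf(x, scale, shape):
    """CDF of a Weibull distribution on x > 0."""
    return 1.0 - math.exp(-((x / scale) ** shape)) if x > 0 else 0.0

def to_gaussian(x, scale, shape):
    """Map a positive Weibull-distributed return onto a standard Gaussian."""
    u = weibull_cdf(x, scale, shape)
    return NormalDist().inv_cdf(u)

# Sanity check: the Weibull median (where F = 0.5) must map to the Gaussian
# median, 0. Median of Weibull(scale, shape) is scale * ln(2)^(1/shape).
median = 0.02 * math.log(2) ** (1 / 1.5)   # hypothetical scale=0.02, shape=1.5
y = to_gaussian(median, scale=0.02, shape=1.5)
```

The covariances of the mapped variables y then define the nonlinear covariance matrix the abstract describes; the mapping itself is monotone, so tail events of x remain tail events of y.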
Long-wave theory for a new convective instability with exponential growth normal to the wall.
Healey, J J
2005-05-15
A linear stability theory is presented for the boundary-layer flow produced by an infinite disc rotating at constant angular velocity in otherwise undisturbed fluid. The theory is developed in the limit of long waves and when the effects of viscosity on the waves can be neglected. This is the parameter regime recently identified by the author in a numerical stability investigation, where a curious new type of instability was found in which disturbances propagate and grow exponentially in the direction normal to the disc (i.e., the growth takes place in a region of zero mean shear). The theory describes the mechanisms controlling the instability and the role and location of critical points, and presents a saddle-point analysis describing the large-time evolution of a wave packet in frames of reference moving normal to the disc. The theory also shows that the previously obtained numerical solutions for numerically large wavelengths do indeed lie in the asymptotic long-wave regime, and so the behaviour and mechanisms described here may apply to a number of cross-flow instability problems.
Wang, Yong-Guang; Wang, Yi-Qiang; Chen, Shu-Lin; Zhu, Chun-Yan; Wang, Kai
2008-11-30
Previous reports have conceptualized theory of mind (ToM) as comprising two components and questioned whether ToM deficits are associated with psychotic symptoms. We investigated 33 nonpsychotic depressed inpatients, 23 psychotic depressed inpatients, and 53 normal controls with the following measures: Eyes Task, Faux pas Task, Verbal Fluency Test (VFT), Digit Span Test (DST) and WAIS-IQ. The depressed patients were also evaluated with the Beck Depression Inventory-II (BDI-II) and the Brief Psychiatric Rating Scale (BPRS). The nonpsychotic depressed patients and the psychotic depressed individuals were significantly impaired on tasks involving ToM social-perceptual and social-cognitive components, as well as the VFT. The psychotic depressed patients performed significantly worse than nonpsychotic depressed patients on ToM tasks. An association was found between ToM performances and both BPRS total and hostile-suspiciousness scores in the depressed group. Both of the ToM components were impaired in depressed patients. Similar mechanisms and neurobiological substrate may contribute to schizophrenia and major depression.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blume-Kohout, Robin J; Scholten, Travis L.
Quantum state tomography on a d-dimensional system demands resources that grow rapidly with d. They may be reduced by using model selection to tailor the number of parameters in the model (i.e., the size of the density matrix). Most model selection methods typically rely on a test statistic and a null theory that describes its behavior when two models are equally good. Here, we consider the loglikelihood ratio. Because of the positivity constraint ρ ≥ 0, quantum state space does not generally satisfy local asymptotic normality (LAN), meaning the classical null theory for the loglikelihood ratio (the Wilks theorem) should not be used. Thus, understanding and quantifying how positivity affects the null behavior of this test statistic is necessary for its use in model selection for state tomography. We define a new generalization of LAN, metric-projected LAN, show that quantum state space satisfies it, and derive a replacement for the Wilks theorem. In addition to enabling reliable model selection, our results shed more light on the qualitative effects of the positivity constraint on state tomography.
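For contrast with the constrained quantum case, the classical Wilks behavior that this abstract refers to can be illustrated on an unconstrained model. The sketch below is a hypothetical example, not from the paper: for Gaussian data with unit variance, testing H0: μ = 0 against a free mean, the loglikelihood ratio statistic reduces to n·x̄² and follows a χ² distribution with one degree of freedom under the null.

```python
# Minimal illustration (not from the paper) of the classical Wilks behavior
# that the positivity constraint rho >= 0 breaks in state tomography:
# for an unconstrained Gaussian-mean model, 2*log(LR) = n * xbar^2 is
# chi-squared distributed with 1 degree of freedom under the null.
import random
import statistics

def wilks_statistics(n_samples, n_trials, seed=0):
    rng = random.Random(seed)
    stats = []
    for _ in range(n_trials):
        xs = [rng.gauss(0.0, 1.0) for _ in range(n_samples)]  # data under H0
        xbar = statistics.fmean(xs)
        stats.append(n_samples * xbar ** 2)  # 2*log(LR) for H0: mu = 0
    return stats

lam = wilks_statistics(n_samples=50, n_trials=2000)
mean_lam = statistics.fmean(lam)  # chi-squared(1) has mean 1
```

When the estimate is instead projected onto a boundary (as positivity forces in tomography), part of the fluctuation is cut off and the statistic falls below this chi-squared null, which is why the paper derives a replacement for the Wilks theorem.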
Theory of mind and functionality in bipolar patients with symptomatic remission.
Barrera, Angeles; Vázquez, Gustavo; Tannenhaus, Lucila; Lolich, María; Herbst, Luis
2013-01-01
Functional deficits are commonly observed in bipolar disorder after symptomatic remission. Social cognition deficits have also been reported, which could contribute to dysfunction in patients with bipolar disorder in remission. Twelve bipolar disorder patients in symptomatic remission (7 patients with bipolar disorder type I and 5 with bipolar disorder type II) and 12 healthy controls completed the Reading the Mind in the Eyes Test and the Faux Pas Test to evaluate theory of mind (ToM). Both groups also completed the Functional Assessment Short Test (FAST). The performance of the bipolar patients on the cognitive component of ToM was below normal, although the difference from the control group was not statistically significant (P=.078), with a trend toward worse performance associated with a higher number of depressive episodes (P=.082). There were no statistically significant differences between groups for the emotional component of ToM. Global functionality was significantly lower in bipolar patients compared to the control group (P=.001). Significant differences were also observed between the two groups in five of the six dimensions of functionality assessed. No significant correlation was found between functionality and theory of mind. Bipolar patients in symptomatic remission exhibit impairments in several areas of functioning. Cognitive ToM appears more affected than emotional ToM. Deficits in ToM were not related to functional impairment. Copyright © 2012 SEP y SEPB. Published by Elsevier Espana. All rights reserved.
Advanced Theory of Mind in patients at early stage of Parkinson's disease.
Yu, Rwei-Ling; Wu, Ruey-Meei; Chiu, Ming-Jang; Tai, Chun-Hwei; Lin, Chin-Hsien; Hua, Mau-Sun
2012-01-01
Advanced Theory of Mind (ToM) refers to the sophisticated ability to infer other people's thoughts, intentions, or emotions in social situations. With appropriate advanced ToM, one can behave well in social interactions and can understand the intention of others' behavior. Prefrontal cortex plays a vital role in this ability, as shown in functional brain imaging and lesion studies. Considering the primary neuropathology of Parkinson's disease (PD) involving the frontal lobe system, patients with PD are expected to exhibit deficits in advanced ToM. However, few studies on this issue have been explored, and whether advanced ToM is independent of executive functions remains uncertain. Thirty-nine early non-demented PD patients and 40 normal control subjects were included. Both groups were matched in age, level of education, and verbal intelligence quotient. Each participant received advanced ToM, executive functions, and verbal intelligence quotient tests. We discovered that the performance of the PD patients on the Cartoon ToM task was significantly poorer than that of their normal counterparts. Correlation analysis revealed that performance scores of advanced ToM in PD patients were significantly associated with their executive functions scores; however, this is not the case for normal controls. We conclude that dysfunction of advanced ToM develops in early PD patients, who require more cognitive abilities than their normal counterparts to generate advanced ToM. Our findings might be helpful in developing educational and medical care programs for PD patients in the future. Copyright © 2011 Elsevier Ltd. All rights reserved.
Markevych, Vladlena; Asbjørnsen, Arve E; Lind, Ola; Plante, Elena; Cone, Barbara
2011-07-01
The present study investigated a possible connection between speech processing and cochlear function. Twenty-two subjects aged 18 to 39, balanced for gender, with normal hearing and without any known neurological condition, were tested with the dichotic listening (DL) test, in which listeners were asked to identify CV syllables in a nonforced condition as well as in attention-right and attention-left conditions. Transient evoked otoacoustic emissions (TEOAEs) were recorded for both ears, with and without the presentation of contralateral broadband noise. The main finding was a strong negative correlation between language laterality as measured with the dichotic listening task and the laterality of the TEOAE responses. The findings support a hypothesis of shared variance between central and peripheral auditory lateralities, and contribute to the attentional theory of auditory lateralization. The results have implications for the understanding of the cortico-fugal efferent control of cochlear activity. 2011 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Obrien, T. Kevin; Hooper, S. J.
1991-01-01
Quasi-static tension tests were conducted on AS4/3501-6 graphite epoxy laminates. Dye penetrant enhanced x-radiography was used to document the onset of matrix cracking and the onset of local delaminations at the intersection of the matrix cracks and the free edge. Edge micrographs taken after the onset of damage were used to verify the location of the matrix cracks and local delamination through the laminate thickness. A quasi-3D finite element analysis was conducted to calculate the stresses responsible for matrix cracking in the off-axis plies. Laminated plate theory indicated that the transverse normal stresses were compressive. However, the finite element analysis yielded tensile transverse normal stresses near the free edge. Matrix cracks formed in the off-axis plies near the free edge where in-plane transverse stresses were tensile and had their greatest magnitude. The influence of the matrix crack on interlaminar stresses is also discussed.
Ohno, Kaoru; Ono, Shota; Isobe, Tomoharu
2017-02-28
The quasiparticle (QP) energies, i.e., the negatives of the energies required to remove one electron from the system or released by adding one electron to it, corresponding to the photoemission or inverse photoemission (PE/IPE) spectra, are determined together with the QP wave functions, which are not orthonormal and not even linearly independent, but somewhat similar to the normal spin orbitals in the theory of configuration interaction, by self-consistently solving the QP equation coupled with the equation for the self-energy. The electron density, the kinetic energy, and all interaction energies can be calculated using the QP wave functions. We prove in a simple way that PE/IPE spectroscopy, and therefore this QP theory, can be applied to an arbitrary initial excited eigenstate. In this proof, we show that the energy dependence of the self-energy is not an essential difficulty, and the QP picture holds exactly if there is no relaxation mechanism in the system. The validity of the present theory for some initial excited eigenstates is tested using the one-shot GW approximation for several atoms and molecules.
Aerodynamic characteristics of horizontal tail surfaces
NASA Technical Reports Server (NTRS)
Silverstein, Abe; Katzoff, S
1940-01-01
Collected data are presented on the aerodynamic characteristics of 17 horizontal tail surfaces including several with balanced elevators and two with end plates. Curves are given for coefficients of normal force, drag, and elevator hinge moment. A limited analysis of the results has been made. The normal-force coefficients are in better agreement with the lifting-surface theory of Prandtl and Blenk for airfoils of low aspect ratio than with the usual lifting-line theory. Only partial agreement exists between the elevator hinge-moment coefficients and those predicted by Glauert's thin-airfoil theory.
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Lu, Laura
2008-01-01
This article provides the theory and application of the 2-stage maximum likelihood (ML) procedure for structural equation modeling (SEM) with missing data. The validity of this procedure does not require the assumption of a normally distributed population. When the population is normally distributed and all missing data are missing at random…
NASA Astrophysics Data System (ADS)
Hirsch, J. E.
2018-05-01
Since the discovery of the Meissner effect, the superconductor to normal (S-N) phase transition in the presence of a magnetic field is understood to be a first-order phase transformation that is reversible under ideal conditions and obeys the laws of thermodynamics. The reverse (N-S) transition is the Meissner effect. This implies in particular that the kinetic energy of the supercurrent is not dissipated as Joule heat in the process where the superconductor becomes normal and the supercurrent stops. In this paper, we analyze the entropy generation and the momentum transfer between the supercurrent and the body in the S-N transition and the N-S transition as described by the conventional theory of superconductivity. We find that it is not possible to explain the transition in a way that is consistent with the laws of thermodynamics unless the momentum transfer between the supercurrent and the body occurs with zero entropy generation, for which the conventional theory of superconductivity provides no mechanism. Instead, we point out that the alternative theory of hole superconductivity does not encounter such difficulties.
Normal Science Education and its Dangers: The Case of School Chemistry
NASA Astrophysics Data System (ADS)
Van Berkel, Berry; De Vos, Wobbe; Verdonk, Adri H.; Pilot, Albert
We started the Conceptual Structure of School Chemistry research project, a part of which is reported on here, with an attempt to solve the problem of the hidden structure in school chemistry. In order to solve that problem, and informed by previous research, we performed a content analysis of school chemistry textbooks and syllabi. This led us to the hypothesis that school chemistry curricula are based on an underlying, coherent structure of chemical concepts that students are supposed to learn for the purpose of explaining and predicting chemical phenomena. The elicited comments and criticisms of an International Forum of twenty-eight researchers of chemical education, though, refuted the central claims of this hypothesis. This led to a descriptive theory of the currently dominant school chemistry curriculum in terms of a rigid combination of a specific substantive structure, based on corpuscular theory, a specific philosophical structure, educational positivism, and a specific pedagogical structure, involving initiatory and preparatory training of future chemists. Secondly, it led to an explanatory theory of the structure of school chemistry - based on Kuhn's theory of normal science and scientific training - in which dominant school chemistry is interpreted as a form of normal science education. Since the former has almost all characteristics in common with the latter, dominant school chemistry must be regarded as normal chemistry education. Forum members also formulated a number of normative criticisms on dominant school chemistry, which we interpret as specific dangers of normal chemistry education, complementing Popper's discussion of the general dangers of normal science and its teaching. On the basis of these criticisms, it is argued that normal chemistry education is isolated from common sense, everyday life and society, history and philosophy of science, technology, school physics, and from chemical research.
Kletenik-Edelman, Orly; Reichman, David R; Rabani, Eran
2011-01-28
A novel quantum mode coupling theory combined with a kinetic approach is developed for the description of collective density fluctuations in quantum liquids characterized by Boltzmann statistics. Three mode-coupling approximations are presented and applied to study the dynamic response of para-hydrogen near the triple point and normal liquid helium above the λ-transition. The theory is compared with experimental results and to the exact imaginary time data generated by path integral Monte Carlo simulations. While for liquid para-hydrogen the combination of kinetic and quantum mode-coupling theory provides semi-quantitative results for both short and long time dynamics, it fails for normal liquid helium. A discussion of this failure based on the ideal gas limit is presented.
Role of environmental variability in the evolution of life history strategies.
Hastings, A; Caswell, H
1979-09-01
We reexamine the role of environmental variability in the evolution of life history strategies. We show that normally distributed deviations in the quality of the environment should lead to normally distributed deviations in the logarithm of year-to-year survival probabilities, which leads to interesting consequences for the evolution of annual and perennial strategies and reproductive effort. We also examine the effects of using differing criteria to determine the outcome of selection. Some predictions of previous theory are reversed, allowing distinctions between r and K theory and a theory based on variability. However, these distinctions require information about both the environment and the selection process not required by current theory.
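The key step in this abstract, normal deviations in environmental quality producing lognormally distributed year-to-year survival, can be sketched numerically. The parameters below are hypothetical; the point of the sketch is that long-run (geometric-mean) fitness falls below the arithmetic mean of survival, which is what makes environmental variability matter for life-history evolution:

```python
# Sketch of the abstract's claim (hypothetical parameters): if environmental
# quality shifts log survival linearly, e ~ Normal(mu, sigma), then the
# year-to-year survival factor p = exp(e) is lognormal. Values are treated as
# multiplicative growth factors and are not clipped to [0, 1] in this sketch.
import math
import random

def yearly_survival(mu, sigma, years, seed=0):
    """Lognormal year-to-year survival factors from normal log deviations."""
    rng = random.Random(seed)
    return [math.exp(rng.gauss(mu, sigma)) for _ in range(years)]

p = yearly_survival(mu=-0.5, sigma=0.3, years=10000)
arith_mean = sum(p) / len(p)
geom_mean = math.exp(sum(math.log(x) for x in p) / len(p))
# For lognormal survival the geometric mean exp(mu) lies below the
# arithmetic mean exp(mu + sigma^2 / 2): variability penalizes
# long-run multiplicative growth.
```

Because long-run growth is multiplicative, selection in a variable environment acts on the geometric mean; this is the mechanism behind the abstract's claim that variability-based predictions can reverse those of r and K theory.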
NASA Technical Reports Server (NTRS)
Miquel, J.; Binnard, R.; Fleming, J. E.
1983-01-01
The notion that injury to mitochondrial DNA is a cause of intrinsic aging was tested by correlating the different respiration rates of several wild strains of Drosophila melanogaster with their life-spans. Respiration rate and aging in a mutant of D. melanogaster deficient in postreplication repair were also investigated. In agreement with the rate-of-living theory, there was an inverse relation between oxygen consumption and median life-span in flies having normal DNA repair. The mutant showed an abnormally low life-span as compared to the controls and also exhibited a significant deficiency in mating fitness and a depressed metabolic rate. Therefore, the short life-span of the mutant may be due to this congenital condition rather than to accelerated aging.
Howard, Marc W.; Bessette-Symons, Brandy; Zhang, Yaofei; Hoyer, William J.
2006-01-01
Younger and older adults were tested on recognition memory for pictures. The Yonelinas high threshold (YHT) model, a formal implementation of two-process theory, fit the response distribution data of both younger and older adults significantly better than a normal unequal variance signal detection model. Consistent with this finding, non-linear zROC curves were obtained for both groups. Estimates of recollection from the YHT model were significantly higher for younger than older adults. This deficit was not a consequence of a general decline in memory; older adults showed comparable overall accuracy and in fact a non-significant increase in their familiarity scores. Implications of these results for theories of recognition memory and the mnemonic deficit associated with aging are discussed. PMID:16594795
Theory of Mind training in children with autism: a randomized controlled trial.
Begeer, Sander; Gevers, Carolien; Clifford, Pamela; Verhoeve, Manja; Kat, Kirstin; Hoddenbach, Elske; Boer, Frits
2011-08-01
Many children with Autism Spectrum Disorders (ASD) participate in social skills or Theory of Mind (ToM) treatments. However, few studies have shown evidence for their effectiveness. The current study used a randomized controlled design to test the effectiveness of a 16-week ToM treatment in 8-13-year-old children with ASD and normal IQs (n = 40). The results showed that, compared to controls, the treated children with ASD improved in their conceptual ToM skills, but their elementary understanding, self-reported empathic skills, and parent-reported social behaviour did not improve. Despite the effects on conceptual understanding, the current study does not provide strong evidence for the effectiveness of a ToM treatment on daily-life mindreading skills.
Li, Rui; Ye, Hongfei; Zhang, Weisheng; Ma, Guojun; Su, Yewang
2015-10-29
Spring constant calibration of the atomic force microscope (AFM) cantilever is of fundamental importance for quantifying the force between the AFM cantilever tip and the sample. Calibration within the framework of thin plate theory undoubtedly has higher accuracy and broader scope than calibration within the well-established beam theory. However, accurate analytic determination of the constant from thin plate theory has been perceived as an extremely difficult problem. In this paper, we implement thin plate theory-based analytic modeling of the static behavior of rectangular AFM cantilevers, which reveals that the three-dimensional effect and the Poisson effect play important roles in accurate determination of the spring constants. A quantitative scaling law is found: the normalized spring constant depends only on Poisson's ratio, the normalized dimensions, and the normalized load coordinate. Both the literature and our refined finite element model validate the present results. The developed model is expected to serve as a benchmark for accurate calibration of rectangular AFM cantilevers.
NASA Astrophysics Data System (ADS)
Becker, Leif E.; Shelley, Michael J.
2000-11-01
First normal stress differences in shear flow are a fundamental property of Non-Newtonian fluids. Experiments involving dilute suspensions of slender fibers exhibit a sharp transition to non-zero normal stress differences beyond a critical shear rate, but existing continuum theories for rigid rods predict neither this transition nor the corresponding magnitude of this effect. We present the first conclusive evidence that elastic instabilities are predominantly responsible for observed deviations from the dilute suspension theory of rigid rods. Our analysis is based on slender body theory and the equilibrium equations of elastica. A straight slender body executing its Jeffery orbit in Couette flow is subject to axial fluid forcing, alternating between compression and tension. We present a stability analysis showing that elastic instabilities are possible for strong flows. Simulations give the fully non-linear evolution of this shape instability, and show that flexibility of the fibers alone is sufficient to cause both shear-thinning and significant first normal stress differences.
Health as normal function: a weak link in Daniels's theory of just health distribution.
Krag, Erik
2014-10-01
Drawing on Christopher Boorse's Biostatistical Theory (BST), Norman Daniels contends that a genuine health need is one which is necessary to restore normal functioning - a supposedly objective notion which he believes can be read from the natural world without reference to potentially controversial normative categories. But despite his claims to the contrary, this conception of health harbors arbitrary evaluative judgments which make room for intractable disagreement as to which conditions should count as genuine health needs and therefore which needs should be met. I begin by offering a brief summary of Boorse's BST, the theory to which Daniels appeals for providing the conception of health as normal functioning upon which his overall distributive scheme rests. Next, I consider what I call practical objections to Daniels's use of Boorse's theory. Finally I recount Elseljin Kingma's theoretical objection to Boorse's BST and discuss its impact on Daniels's overall theory. Though I conclude that Boorse's view, so weakened, will no longer be able to sustain the judgments which Daniels's theory uses it to reach, in the end, I offer Daniels an olive branch by briefly sketching an alternative strategy for reaching suitably objective conclusions regarding the health and/or disease status of various conditions. © 2012 John Wiley & Sons Ltd.
Low-emittance tuning of storage rings using normal mode beam position monitor calibration
NASA Astrophysics Data System (ADS)
Wolski, A.; Rubin, D.; Sagan, D.; Shanks, J.
2011-07-01
We describe a new technique for low-emittance tuning of electron and positron storage rings. This technique is based on calibration of the beam position monitors (BPMs) using excitation of the normal modes of the beam motion, and has benefits over conventional methods. It is relatively fast and straightforward to apply, it can be as easily applied to a large ring as to a small ring, and the tuning for low emittance becomes completely insensitive to BPM gain and alignment errors that can be difficult to determine accurately. We discuss the theory behind the technique, present some simulation results illustrating that it is highly effective and robust for low-emittance tuning, and describe the results of some initial experimental tests on the CesrTA storage ring.
ERIC Educational Resources Information Center
Weiwei, Huang
2016-01-01
As a theory based on the hypothesis of "happy man" about human nature, happiness management plays a significant guiding role in the optimization of the training model of local Chinese normal university students during the transitional period. Under the guidance of this theory, China should adhere to the people-oriented principle,…
ERIC Educational Resources Information Center
O'Connor, Akira R.; Moulin, Christopher J. A.
2006-01-01
We report the case of a 25-year-old healthy, blind male, MT, who experiences normal patterns of deja vu. The optical pathway delay theory of deja vu formation assumes that neuronal input from the optical pathways is necessary for the formation of the experience. Surprisingly, although the sensation of deja vu is known to be experienced by blind…
Cheng, Dongliang; Zhong, Quanlin; Niklas, Karl J; Ma, Yuzhu; Yang, Yusheng; Zhang, Jianhua
2015-02-01
Empirical studies and allometric partitioning (AP) theory indicate that plant above-ground biomass (MA) scales, on average, one-to-one (isometrically) with below-ground biomass (MR) at the level of individual trees and at the level of entire forest communities. However, the ability of the AP theory to predict the biomass allocation patterns of understorey plants has not been established because most previous empirical tests have focused on canopy tree species or very large shrubs. In order to test the AP theory further, 1586 understorey sub-tropical forest plants from 30 sites in south-east China were harvested and examined. The numerical values of the scaling exponents and normalization constants (i.e. slopes and y-intercepts, respectively) of log-log linear MA vs. MR relationships were determined for all individual plants, for each site, across the entire data set, and for data sorted into a total of 19 sub-sets of forest types and successional stages. Similar comparisons of MA/MR were also made. The data revealed that the mean MA/MR of understorey plants was 2.44 across all 1586 plants and 1.57 across all communities, and MA scaled nearly isometrically with respect to MR, with scaling exponents of 1.01 for all individual plants and 0.99 for all communities. The scaling exponents did not differ significantly among different forest types or successional stages, but the normalization constants did, and were positively correlated with MA/MR and negatively correlated with scaling exponents across all 1586 plants. The results support the AP theory's prediction that MA scales nearly one-to-one with MR (i.e. MA ∝ MR^(≈1.0)) and that plant biomass partitioning for individual plants and at the community level share a strikingly similar pattern, at least for the understorey plants examined in this study.
Furthermore, variation in environmental conditions appears to affect the numerical values of normalization constants, but not the scaling exponents of the MA vs. MR relationship. This feature of the results suggests that plant size is the primary driver of the MA vs. MR biomass allocation pattern for understorey plants in sub-tropical forests. © The Author 2015. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
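The scaling exponent and normalization constant described above are simply the slope and intercept of an ordinary least-squares fit on log-transformed biomass data. A minimal sketch with synthetic data (not the study's measurements), assuming an isometric exponent of 1.0:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical isometric relationship M_A = c * M_R^alpha with alpha = 1
# (illustrative data standing in for the harvested-plant measurements).
alpha_true, c_true = 1.0, 2.0
M_R = np.exp(rng.uniform(np.log(0.01), np.log(10.0), size=1586))
M_A = c_true * M_R**alpha_true * np.exp(rng.normal(0.0, 0.3, size=1586))

# The scaling exponent and normalization constant are the slope and
# intercept of the log-log linear regression, as in the abstract.
slope, log_intercept = np.polyfit(np.log10(M_R), np.log10(M_A), 1)
print(round(slope, 2), round(10**log_intercept, 2))  # near 1.0 and 2.0
```

Differences in environment shift the intercept (normalization constant) while leaving the slope (scaling exponent) essentially unchanged, which is the pattern the study reports.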
Rapidly rotating neutron stars with a massive scalar field—structure and universal relations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doneva, Daniela D.; Yazadjiev, Stoytcho S., E-mail: daniela.doneva@uni-tuebingen.de, E-mail: yazad@phys.uni-sofia.bg
We construct rapidly rotating neutron star models in scalar-tensor theories with a massive scalar field. The fact that the scalar field has nonzero mass leads to very interesting results, since the allowed range of values of the coupling parameters is significantly broadened. Deviations from pure general relativity can be very large for values of the parameters that are in agreement with the observations. We found that rapid rotation can magnify the differences several times compared to the static case. The universal relations between the normalized moment of inertia and quadrupole moment are also investigated for both the slowly and rapidly rotating cases. The results show that these relations remain EOS independent to a large extent and that the deviations from pure general relativity can be large. This places the massive scalar-tensor theories among the few alternative theories of gravity that can be tested via the universal I-Love-Q relations.
The role of responsibility and fear of guilt in hypothesis-testing.
Mancini, Francesco; Gangemi, Amelia
2006-12-01
Recent theories argue that both perceived responsibility and fear of guilt increase obsessive-like behaviours. We propose that hypothesis-testing might account for this effect. Both perceived responsibility and fear of guilt would influence subjects' hypothesis-testing by inducing a prudential style. This style implies focusing on and confirming the worst hypothesis, and reiterating the testing process. In our experiment, we manipulated the responsibility and fear of guilt of 236 normal volunteers who executed a deductive task. The results show that perceived responsibility is the main factor that influenced individuals' hypothesis-testing. Fear of guilt, however, has a significant additive effect. Guilt-fearing participants preferred to carry on with the diagnostic process even when faced with initially favourable evidence, whereas participants in the responsibility condition only did so when confronted with unfavourable evidence. Implications for the understanding of obsessive-compulsive disorder (OCD) are discussed.
Elevated temperature biaxial fatigue
NASA Technical Reports Server (NTRS)
Jordan, E. H.
1984-01-01
A three-year experimental program for studying elevated temperature biaxial fatigue of a nickel based alloy Hastelloy-X has been completed. A new high temperature fatigue test facility with unique capabilities has been developed. Effort was directed toward understanding multiaxial fatigue and correlating the experimental data to the existing theories of fatigue failure. The difficult task of predicting fatigue lives for non-proportional loading was used as an ultimate test for various life prediction methods being considered. The primary means of reaching improved understanding were through several critical non-proportional loading experiments. It was discovered that the cracking mode switched from primarily cracking on the maximum shear planes at room temperature to cracking on the maximum normal strain planes at 649 C.
Franzen, Jessica; Brinkmann, Kerstin
2016-12-01
Theories and research on depression point to reduced responsiveness during reward anticipation and in part also during punishment anticipation. They also suggest weaker affective responses to reward consumption and unchanged affective responses to punishment consumption. However, studies investigating incentive anticipation using effort mobilization and incentive consumption using facial expressions are scarce. The present studies tested reward and punishment responsiveness in a subclinically depressed sample, manipulating a monetary reward (Study 1) and a monetary punishment (Study 2). Effort mobilization was operationalized as cardiovascular reactivity, while facial expressions were measured by facial electromyographic reactivity. Compared to nondysphorics, dysphorics showed reduced pre-ejection period (PEP) reactivity and blunted self-reported wanting during reward anticipation but reduced PEP reactivity and normal self-reported wanting during punishment anticipation. Compared to nondysphorics, dysphorics showed reduced zygomaticus major muscle reactivity and blunted self-reported liking during reward consumption but normal corrugator supercilii muscle reactivity and normal self-reported disliking during punishment consumption. Copyright © 2016. Published by Elsevier B.V.
Thermal Theory of Combustion and Explosion. 3: Theory of Normal Flame Propagation
NASA Technical Reports Server (NTRS)
Semenov, N. N.
1942-01-01
The technical memorandum covers experimental data on flame propagation, the velocity of flame propagation, analysis of the old theoretical views of flame propagation, confirmation of the theory for simple reactions (theory of combustion of explosive substances and in particular nitroglycol), and check of the theory by example of a chain oxidizing reaction (theory of flame propagation in carbon monoxide, air and carbon monoxide - oxygen mixtures).
On the Stem Cell Origin of Cancer
Sell, Stewart
2010-01-01
In each major theory of the origin of cancer—field theory, chemical carcinogenesis, infection, mutation, or epigenetic change—the tissue stem cell is involved in the generation of cancer. Although the cancer type is identified by the more highly differentiated cells in the cancer cell lineage or hierarchy (transit-amplifying cells), the property of malignancy and the molecular lesion of the cancer exist in the cancer stem cell. In the case of teratocarcinomas, normal germinal stem cells have the potential to become cancers if placed in an environment that allows expression of the cancer phenotype (field theory). In cancers due to chemically induced mutations, viral infections, somatic and inherited mutations, or epigenetic changes, the molecular lesion or infection usually first occurs in the tissue stem cells. Cancer stem cells then give rise to transit-amplifying cells and terminally differentiated cells, similar to what happens in normal tissue renewal. However, the major difference between cancer growth and normal tissue renewal is that whereas normal transit amplifying cells usually differentiate and die, at various levels of differentiation, the cancer transit-amplifying cells fail to differentiate normally and instead accumulate (ie, they undergo maturation arrest), resulting in cancer growth. PMID:20431026
Order, topology and preference
NASA Technical Reports Server (NTRS)
Sertel, M. R.
1971-01-01
Some standard order-related and topological notions, facts, and methods are brought to bear on central topics in the theory of preference and the theory of optimization. Consequences of connectivity are considered, especially from the viewpoint of normally preordered spaces. Examples are given showing how the theory of preference, or utility theory, can be applied to social analysis.
Can dual processing theory explain physics students' performance on the Force Concept Inventory?
NASA Astrophysics Data System (ADS)
Wood, Anna K.; Galloway, Ross K.; Hardy, Judy
2016-12-01
According to dual processing theory there are two types, or modes, of thinking: system 1, which involves intuitive and nonreflective thinking, and system 2, which is more deliberate and requires conscious effort and thought. The Cognitive Reflection Test (CRT) is a widely used and robust three-item instrument that measures the tendency to override system 1 thinking and to engage in reflective, system 2 thinking. Each item on the CRT has an intuitive (but wrong) answer that must be rejected in order to answer the item correctly. We therefore hypothesized that performance on the CRT may give useful insights into the cognitive processes involved in learning physics, where success involves rejecting common, intuitive ideas about the world (often called misconceptions) and instead carefully applying physical concepts. This paper presents initial results from an ongoing study examining the relationship between students' CRT scores and their performance on the Force Concept Inventory (FCI), which tests students' understanding of Newtonian mechanics. We find that a higher CRT score predicts a higher FCI score for both precourse and postcourse tests. However, we also find that the FCI normalized gain is independent of CRT score. The implications of these results are discussed.
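The normalized gain mentioned in the final result is conventionally Hake's g: the improvement achieved as a fraction of the maximum improvement available. A small sketch (the scores are hypothetical):

```python
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Hake's normalized gain: g = (post - pre) / (max - pre),
    the fraction of the available improvement actually achieved."""
    return (post - pre) / (max_score - pre)

# Two hypothetical students with different pre-test scores can show
# the same normalized gain, which is why g is preferred to raw gain
# when comparing groups with different starting points.
print(normalized_gain(40, 70))  # 0.5
print(normalized_gain(80, 90))  # 0.5
```

Because g already controls for the pre-test score, its independence from CRT score is a statement about learning during the course, not about incoming knowledge.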
Elevated temperature biaxial fatigue
NASA Technical Reports Server (NTRS)
Jordan, E. H.
1985-01-01
A 3-year experimental program for studying elevated temperature biaxial fatigue of a nickel based alloy Hastelloy-X has been completed. A new high temperature fatigue test facility with unique capabilities has been developed. Effort was directed toward understanding multiaxial fatigue and correlating the experimental data to the existing theories of fatigue failure. The difficult task of predicting fatigue lives for nonproportional loading was used as an ultimate test for various life prediction methods being considered. The primary means of reaching improved understanding were through several critical nonproportional loading experiments. The direction of cracking observed on failed specimens was also recorded and used to guide the development of the theory. Cyclic deformation responses were permanently recorded digitally during each test. It was discovered that the cracking mode switched from primarily cracking on the maximum shear planes at room temperature to cracking on the maximum normal strain planes at 649 C. In contrast to some other metals, loading path in nonproportional loading had little effect on fatigue lives. Strain rate had a small effect on fatigue lives at 649 C. Of the various correlating parameters, the modified plastic work and octahedral shear stress were the most successful.
NASA Technical Reports Server (NTRS)
Hooke, F. H.
1972-01-01
Both the conventional and reliability analyses for determining safe fatigue life are predicated on a population having a specified (usually log normal) distribution of life to collapse under a fatigue test load. Under a random service load spectrum, random occurrences of load larger than the fatigue test load may confront and cause collapse of structures which are weakened, though not yet to the fatigue test load. These collapses are included in reliability analysis but excluded in conventional analysis. The theory of risk determination by each method is given, and several reasonably typical examples have been worked out, in which it transpires that if one excludes collapse through exceedance of the uncracked strength, the reliability and conventional analyses give virtually identical probabilities of failure or survival.
Gifford, Katherine A; Liu, Dandan; Romano, Raymond; Jones, Richard N; Jefferson, Angela L
2015-12-01
Subjective cognitive decline (SCD) may indicate unhealthy cognitive changes, but no standardized SCD measurement exists. This pilot study aims to identify reliable SCD questions. 112 cognitively normal (NC; 76±8 years, 63% female), 43 mild cognitive impairment (MCI; 77±7 years, 51% female), and 33 diagnostically ambiguous participants (79±9 years, 58% female) were recruited from a research registry and completed 57 self-report SCD questions. Psychometric methods were used for item reduction. Factor analytic models assessed unidimensionality of the latent trait (SCD); 19 items with extreme response distributions or poor trait fit were removed. Item response theory (IRT) provided information about question utility; 17 items with low information were dropped. Post-hoc simulation using computerized adaptive test (CAT) modeling selected the most commonly used items (n=9 of 21 items) that represented the latent trait well (r=0.94) and differentiated NC from MCI participants (F(1,146)=8.9, p=0.003). Item response theory and computerized adaptive test modeling identified nine reliable SCD items. This pilot study is a first step toward refining SCD assessment in older adults. Replication of these findings and validation with Alzheimer's disease biomarkers will be an important next step for the creation of an SCD screener.
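The item response theory step that drops low-information items can be sketched with the two-parameter logistic (2PL) model. The item parameters below are illustrative, not the study's estimates:

```python
import math

def p_2pl(theta: float, a: float, b: float) -> float:
    """2PL item response function: probability of endorsing the item
    for a respondent at latent trait level theta, given item
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta: float, a: float, b: float) -> float:
    """Fisher information of a 2PL item: I(theta) = a^2 * P * (1 - P).
    Items with uniformly low information contribute little to trait
    estimation and are the kind dropped in an item-reduction step."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

# Hypothetical items: a discriminating item versus a weak one,
# both evaluated at their difficulty (theta = b), where P = 0.5.
strong = item_information(0.0, a=2.0, b=0.0)  # 2^2 * 0.25 = 1.0
weak = item_information(0.0, a=0.4, b=0.0)    # 0.4^2 * 0.25 = 0.04
print(strong, weak)
```

A computerized adaptive test then administers, at each step, the item with the highest information at the current trait estimate, which is why the CAT simulation converges on a small subset of the pool.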
Effects of Stress on Judgment and Decision Making in Dynamic Tasks
1991-06-01
their normal working conditions, (2) to ascertain whether the results from lens model theory and research in static tasks generalize to these... normal work environment. A further generalization from lens model theory is that those precursors (secondary cues) that are more conceptual in...potential microburst cases. Although this sample of cases is admittedly smaller than desirable, many hours of technical work were required to remove
Oliva, Jesús; Serrano, J Ignacio; del Castillo, M Dolores; Iglesias, Angel
2014-06-01
The diagnosis of mental disorders is in most cases very difficult because of the high heterogeneity and overlap between associated cognitive impairments. Furthermore, early and individualized diagnosis is crucial. In this paper, we propose a methodology to support the individualized characterization and diagnosis of cognitive impairments. The methodology can also be used as a test platform for existing theories on the causes of the impairments. We use computational cognitive modeling to gather information on the cognitive mechanisms underlying normal and impaired behavior. We then use this information to feed machine-learning algorithms to individually characterize the impairment and to differentiate between normal and impaired behavior. We apply the methodology to the particular case of specific language impairment (SLI) in Spanish-speaking children. The proposed methodology begins by defining a task in which normal individuals and individuals with impairment show behavioral differences. Next we build a computational cognitive model of that task and individualize it: we build a cognitive model for each participant and optimize its parameter values to fit the behavior of each participant. Finally, we use the optimized parameter values to feed different machine learning algorithms. The methodology was applied to an existing database of 48 Spanish-speaking children (24 normal and 24 SLI children) using clustering techniques for the characterization, and different classifier techniques for the diagnosis. The characterization results show three well-differentiated groups that can be associated with the three main theories on SLI. Using a leave-one-subject-out testing methodology, all the classifiers except the DT produced sensitivity, specificity and area under curve values above 90%, reaching 100% in some cases.
The results show that our methodology is able to find relevant information on the underlying cognitive mechanisms and to use it appropriately to provide better diagnosis than existing techniques. It is also worth noting that the individualized characterization obtained using our methodology could be extremely helpful in designing individualized therapies. Moreover, the proposed methodology could be easily extended to other languages and even to other cognitive impairments not necessarily related to language. Copyright © 2014 Elsevier B.V. All rights reserved.
Acoustic-gravity waves in atmospheric and oceanic waveguides.
Godin, Oleg A
2012-08-01
A theory of guided propagation of sound in layered, moving fluids is extended to include acoustic-gravity waves (AGWs) in waveguides with piecewise continuous parameters. The orthogonality of AGW normal modes is established in moving and motionless media. A perturbation theory is developed to quantify the relative significance of the gravity and fluid compressibility as well as sensitivity of the normal modes to variations in sound speed, flow velocity, and density profiles and in boundary conditions. Phase and group speeds of the normal modes are found to have certain universal properties which are valid for waveguides with arbitrary stratification. The Lamb wave is shown to be the only AGW normal mode that can propagate without dispersion in a layered medium.
Toward a Unified Consciousness Theory
ERIC Educational Resources Information Center
Johnson, Richard H.
1977-01-01
The beginning of a holistic theory that can treat paranormal phenomena as normal human development is presented. Implications for counseling, counselor education, and counselor supervision are discussed. (Author)
Robert Dicke and the naissance of experimental gravity physics, 1957-1967
NASA Astrophysics Data System (ADS)
Peebles, Phillip James Edwin
2017-06-01
The experimental study of gravity became much more active in the late 1950s, a change pronounced enough to be termed the birth, or naissance, of experimental gravity physics. I present a review of developments in this subject since 1915, through the broad range of new approaches that commenced in the late 1950s, and up to the transition of experimental gravity physics to what might be termed a normal and accepted part of physical science in the late 1960s. This review shows the importance of advances in technology, here as in all branches of natural science. The role of contingency is illustrated by Robert Dicke's decision in the mid-1950s to change directions in mid-career, to lead a research group dedicated to the experimental study of gravity. The review also shows the power of nonempirical evidence. Some in the 1950s felt that general relativity theory is so logically sound as to be scarcely worth the testing. But Dicke and others argued that a poorly tested theory is only that, and that other nonempirical arguments, based on Mach's Principle and Dirac's Large Numbers hypothesis, suggested it would be worth looking for a better theory of gravity. I conclude by offering lessons from this history, some peculiar to the study of gravity physics during the naissance, some of more general relevance. The central lesson, which is familiar but not always well advertised, is that physical theories can be empirically established, sometimes with surprising results.
Left hemisphere regions are critical for language in the face of early left focal brain injury.
Raja Beharelle, Anjali; Dick, Anthony Steven; Josse, Goulven; Solodkin, Ana; Huttenlocher, Peter R; Levine, Susan C; Small, Steven L
2010-06-01
A predominant theory regarding early stroke and its effect on language development is that early left hemisphere lesions trigger compensatory processes that allow the right hemisphere to assume dominant language functions, and this is thought to underlie the near-normal language development observed after early stroke. To test this theory, we used functional magnetic resonance imaging to examine brain activity during category fluency in participants who had sustained pre- or perinatal left hemisphere stroke (n = 25) and in neurologically normal siblings (n = 27). In typically developing children, performance of a category fluency task elicits strong involvement of left frontal and lateral temporal regions and a lesser involvement of right hemisphere structures. In our cohort of atypically developing participants with early stroke, expressive and receptive language skills correlated with activity in the same left inferior frontal regions that support language processing in neurologically normal children. This was true independent of either the amount of brain injury or the extent to which the injury was located in classical cortical language processing areas. Participants with bilateral activation in left and right superior temporal-inferior parietal regions had better language function than those with either predominantly left- or right-sided unilateral activation. The advantage conferred by left inferior frontal and bilateral temporal involvement demonstrated in our study supports a strong predisposition for typical neural language organization, despite an intervening injury, and argues against models suggesting that the right hemisphere fully accommodates language function following early injury.
fMRI responses to Jung's Word Association Test: implications for theory, treatment and research.
Petchkovsky, Leon; Petchkovsky, Michael; Morris, Philip; Dickson, Paul; Montgomery, Danielle; Dwyer, Jonathan; Burnett, Patrick
2013-06-01
Jung's Word Association Test was performed under fMRI conditions by 12 normal subjects. Pooled complexed responses were contrasted against pooled neutral ones. The fMRI activation pattern of this generic 'complexed response' was very strong (corrected Z scores ranging from 4.90 to 5.69). The activation pattern in each hemisphere includes mirror neurone areas that track 'otherness' (perspectival empathy), anterior insula (both self-awareness and emotional empathy), and cingulate gyrus (self-awareness and conflict-monitoring). These are the sites described by Siegel and colleagues as the 'resonance circuitry' in the brain, which is central to mindfulness (awareness of self) and empathy (sense of the other), negotiations between self-awareness and the 'internal other'. But there is also an interhemispheric dialogue. Within 3 seconds, the left hemisphere overrides the right (at least in our normal subjects). Mindfulness and empathy are central to good psychotherapy, and complexes can be windows of opportunity if left-brain hegemony is resisted. This study sets foundations for further research: (i) QEEG studies (with their finer temporal resolution) of complexed responses in normal subjects; (ii) QEEG and fMRI studies of complexed responses in other conditions, like schizophrenia, PTSD, and disorders of self organization. © 2013, The Society of Analytical Psychology.
MacFarlane, Anne; O'Donnell, Catherine; Mair, Frances; O'Reilly-de Brún, Mary; de Brún, Tomas; Spiegel, Wolfgang; van den Muijsenbergh, Maria; van Weel-Baumgarten, Evelyn; Lionis, Christos; Burns, Nicola; Gravenhorst, Katja; Princz, Christine; Teunissen, Erik; van den Driessen Mareeuw, Francine; Saridaki, Aristoula; Papadakaki, Maria; Vlahadi, Maria; Dowrick, Christopher
2012-11-20
The implementation of guidelines and training initiatives to support communication in cross-cultural primary care consultations is ad hoc across a range of international settings, with negative consequences particularly for migrants. This situation reflects a well-documented translational gap between evidence and practice and is part of the wider problem of implementing guidelines and the broader range of professional educational and quality interventions in routine practice. In this paper, we describe our use of a contemporary social theory (Normalization Process Theory) and a participatory research methodology (Participatory Learning and Action) to investigate and support implementation of such guidelines and training initiatives in routine practice. This is a qualitative case study, using multiple primary care sites across Europe. Purposive and maximum variation sampling approaches will be used to identify and recruit stakeholders: migrant service users, general practitioners, primary care nurses, practice managers and administrative staff, interpreters, cultural mediators, service planners, and policy makers. We are conducting a mapping exercise to identify relevant guidelines and training initiatives. We will then initiate a PLA-brokered dialogue with stakeholders around Normalization Process Theory's four constructs: coherence, cognitive participation, collective action, and reflexive monitoring. Through this, we will enable stakeholders in each setting to select a single guideline or training initiative for implementation in their local setting. We will prospectively investigate and support the implementation journeys for the five selected interventions. Data will be generated using a Participatory Learning and Action approach to interviews and focus groups.
Data analysis will follow the principles of thematic analysis, will occur in iterative cycles throughout the project and will involve participatory co-analysis with key stakeholders to enhance the authenticity and veracity of findings. This research employs a unique combination of Normalization Process Theory and Participatory Learning and Action, which will provide a novel approach to the analysis of implementation journeys. The findings will advance knowledge in the field of implementation science because we are using and testing theoretical and methodological approaches so that we can critically appraise their scope to mediate barriers and improve the implementation processes.
Seth, Anil K
2014-01-01
Normal perception involves experiencing objects within perceptual scenes as real, as existing in the world. This property of "perceptual presence" has motivated "sensorimotor theories" which understand perception to involve the mastery of sensorimotor contingencies. However, the mechanistic basis of sensorimotor contingencies and their mastery has remained unclear. Sensorimotor theory also struggles to explain instances of perception, such as synesthesia, that appear to lack perceptual presence and for which relevant sensorimotor contingencies are difficult to identify. On alternative "predictive processing" theories, perceptual content emerges from probabilistic inference on the external causes of sensory signals; however, this view has addressed neither the problem of perceptual presence nor synesthesia. Here, I describe a theory of predictive perception of sensorimotor contingencies which (1) accounts for perceptual presence in normal perception, as well as its absence in synesthesia, and (2) operationalizes the notion of sensorimotor contingencies and their mastery. The core idea is that generative models underlying perception incorporate explicitly counterfactual elements related to how sensory inputs would change on the basis of a broad repertoire of possible actions, even if those actions are not performed. These "counterfactually-rich" generative models encode sensorimotor contingencies related to repertoires of sensorimotor dependencies, with counterfactual richness determining the degree of perceptual presence associated with a stimulus. While the generative models underlying normal perception are typically counterfactually rich (reflecting a large repertoire of possible sensorimotor dependencies), those underlying synesthetic concurrents are hypothesized to be counterfactually poor.
In addition to accounting for the phenomenology of synesthesia, the theory naturally accommodates phenomenological differences between a range of experiential states including dreaming, hallucination, and the like. It may also lead to a new view of the (in)determinacy of normal perception.
The Development of Genetics in the Light of Thomas Kuhn's Theory of Scientific Revolutions.
Portin, Petter
2015-01-01
The concept of a paradigm occupies the key position in Thomas Kuhn's theory of scientific revolutions. A paradigm is the framework within which the results, concepts, hypotheses and theories of scientific research work are understood. According to Kuhn, a paradigm guides the working and efforts of scientists during the time period which he calls the period of normal science. Sooner or later, however, normal science encounters matters it cannot explain, a situation that then drives the development of the scientific discipline in question toward a paradigm shift--a scientific revolution. When a new theory is born, it has either gradually emerged as an extension of the past theory, or the old theory has become a borderline case in the new theory. In the former case, one can speak of a paradigm extension. According to the present author, the development of modern genetics has, until very recent years, been guided by a single paradigm, the Mendelian paradigm which Gregor Mendel launched 150 years ago, and under the guidance of this paradigm the development of genetics has proceeded in a normal fashion in the spirit of logical positivism. Modern discoveries in genetics have, however, created a situation which seems to be leading toward a paradigm shift. The most significant of these discoveries are the findings of adaptive mutations, the phenomenon of transgenerational epigenetic inheritance, and, above all, the present deeply critical state of the concept of the gene.
As the wheel turns: a centennial reflection on Freud's Three Essays on the Theory of Sexuality.
Person, Ethel Spector
2005-01-01
Freud's theories of psychosexual development, while highly original, were anchored in the explosion of scientific studies of sex in the nineteenth century. Most of these studies were based on masturbation, homosexuality, and deviance, with little attention given to normal sexuality. Around the turn of the century, the narrow interest in pathological sexuality and sexual physiology gradually gave way to a broader interest in normal sexuality. It was in the context of these expanding studies of sexuality that Freud proposed the first psychological view of sexuality, a theory that defined sex as being at the interface between soma and psyche. Libido theory, which Freud developed, is a theory of drives and conflicts. For Freud, libido was the major force in personality development, and he posited sexual conflicts as the heart of neuroses, sexual fixations as the essence of perversions. This article traces the way Freud's libido theory has served as one of the mainsprings in the development of psychoanalytic theory. It also addresses the major revisions that have taken place in libido theory, with a focus primarily on object relations theory, and the impact of culture on the way sex and sexual mores are parsed.
Cancer Theory from Systems Biology Point of View
NASA Astrophysics Data System (ADS)
Wang, Gaowei; Tang, Ying; Yuan, Ruoshi; Ao, Ping
In our previous work, we proposed a novel cancer theory, the endogenous network theory, to understand the mechanisms underlying cancer genesis and development. Recently, we applied this theory to hepatocellular carcinoma (HCC). A core endogenous network of the hepatocyte was established by integrating the current molecular-level understanding of the hepatocyte. The quantitative description of the endogenous network consists of a set of stochastic differential equations that generate many local attractors with obvious or non-obvious biological functions. Comparison with clinical observations and experimental data showed that two robust attractors from the model reproduced the main known features of the normal hepatocyte and the cancerous hepatocyte, respectively, at both the modular and the molecular level. In light of our theory, the genesis and progression of cancer is viewed as a transition from the normal attractor to the HCC attractor. This view yields a set of new insights into cancer genesis and progression, and into strategies for cancer prevention, cure, and care.
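The attractor picture above can be illustrated with a deliberately minimal bistable stochastic differential equation, integrated by Euler-Maruyama. This is only a one-dimensional caricature of the idea, not the paper's endogenous-network model: the drift, noise level, and attractor locations below are all illustrative assumptions.

```python
import numpy as np

# Toy bistable dynamics dx = (x - x^3) dt + sigma dW with stable attractors
# at x = -1 and x = +1, standing in for "cancer" and "normal" states.
def simulate(x0, sigma, steps=5000, dt=0.01, seed=0):
    rng = np.random.default_rng(seed)
    x = x0
    for _ in range(steps):
        x += (x - x**3) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

# Without noise, each trajectory settles into the attractor on its side of x = 0.
normal_like = simulate(0.5, sigma=0.0)
cancer_like = simulate(-0.5, sigma=0.0)
print(normal_like, cancer_like)

# With noise, transitions between attractors become possible, which is the
# analogue of cancer genesis as an attractor transition in the theory.
perturbed = simulate(0.5, sigma=0.8, seed=2)
```

The two deterministic runs converge to +1 and -1 respectively; the noisy run shows how fluctuations can move the state between basins.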
Exploring the relation between people’s theories of intelligence and beliefs about brain development
Thomas, Ashley J.; Sarnecka, Barbara W.
2015-01-01
A person’s belief about whether intelligence can change (called their implicit theory of intelligence) predicts something about that person’s thinking and behavior. People who believe intelligence is fixed (called entity theorists) attribute failure to traits (e.g., “I failed the test because I’m not smart.”) and tend to be less motivated in school; those who believe intelligence is malleable (called incremental theorists) tend to attribute failure to behavior (e.g., “I failed the test because I didn’t study.”) and are more motivated in school. In previous studies, researchers have characterized participants as either entity or incremental theorists based on their agreement or disagreement with three statements. The present study further explored the theories-of-intelligence (TOI) construct in two ways: first, we asked whether these theories are coherent, in the sense that they show up not only in participants’ responses to the three standard assessment items, but on a broad range of questions about intelligence and the brain. Second, we asked whether these theories are discrete or continuous. In other words, we asked whether people believe one thing or the other (i.e., that intelligence is malleable or fixed), or if there is a continuous range of beliefs (i.e., people believe in malleability to a greater or lesser degree). Study 1 asked participants a range of general questions about the malleability of intelligence and the brain. Study 2 asked participants more specific questions about the brains of a pair of identical twins who were separated at birth. Results showed that TOI are coherent: participants’ responses to the three standard survey items are correlated with their responses to questions about the brain. But the theories are not discrete: although responses to the three standard survey items fell into a bimodal distribution, responses to the broader range of questions fell into a normal distribution, suggesting the theories are continuous.
PMID:26191027
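The bimodal-versus-normal contrast at the heart of this study can be quantified with a simple summary statistic. The sketch below uses Sarle's bimodality coefficient on synthetic data; this statistic and the made-up samples are my illustration, not the method or data of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def bimodality_coefficient(x):
    """Sarle's bimodality coefficient: (skewness^2 + 1) / kurtosis.
    Values above ~0.555 (the uniform-distribution value) suggest bimodality."""
    z = (x - x.mean()) / x.std()
    skew = np.mean(z**3)
    kurt = np.mean(z**4)   # non-excess kurtosis; 3.0 for a normal distribution
    return (skew**2 + 1.0) / kurt

# Hypothetical responses: a bimodal mixture (discrete-looking "entity" vs.
# "incremental" camps) versus a single normal distribution (a continuum).
bimodal = np.concatenate([rng.normal(-2, 0.5, 10000), rng.normal(2, 0.5, 10000)])
unimodal = rng.normal(0, 1, 20000)
print(bimodality_coefficient(bimodal), bimodality_coefficient(unimodal))
```

The well-separated mixture scores well above the 0.555 threshold, while the normal sample sits near 1/3, mirroring the paper's contrast between the three standard items and the broader question set.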
Reception of Theory: Film-Television Studies and the Frankfurt School.
ERIC Educational Resources Information Center
Steinman, Clay
1988-01-01
Discusses the Critical Theory of the Frankfurt School and how it offers a way of seeing normally obscured relations of social power in the details of modern capitalist culture. Concentrates on claims about critical theory that have functioned as strategies of denial. (MS)
Taplin, Francis; O'Donnell, Deanna; Kubic, Thomas; Leona, Marco; Lombardi, John
2013-10-01
We evaluated the normal Raman (NR) and surface-enhanced Raman scattering (SERS) spectra of three sympathomimetic amines: phenethylamine, ephedrine, and 3,4-methylenedioxymethamphetamine (MDMA). In addition, quantum mechanical calculations (geometry optimization and harmonic vibrational frequencies) were performed using the density functional theory (DFT) approach. Vibrational assignments were made by comparing the experimental and calculated spectra. The study found that both NR and SERS provided excellent spectra for the drugs tested. Certain conditions, such as the response to various laser wavelengths and background fluorescence of the analyte, could be easily managed using SERS techniques. The DFT-calculated spectra could be correlated with the experimental spectra without the aid of a scaling factor. We also present a set of discriminant bands, useful for distinguishing the three compounds despite their structural similarities.
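The abstract's point about not needing a scaling factor can be made concrete: a frequency scaling factor is conventionally obtained by least squares between calculated and experimental wavenumbers. The sketch below shows that fit on invented frequencies (the numbers are placeholders, not the paper's data); "no scaling factor needed" corresponds to the fitted factor coming out near 1.

```python
import numpy as np

# Hypothetical calculated vs. experimental vibrational frequencies (cm^-1);
# for illustration the "experiment" is built at a uniform 0.97 of the DFT values.
calc = np.array([520.0, 748.0, 1003.0, 1340.0, 1602.0, 2950.0])
expt = 0.97 * calc

# Least-squares scaling factor minimizing sum((s*calc - expt)^2):
s = np.sum(calc * expt) / np.sum(calc * calc)
rmsd = np.sqrt(np.mean((s * calc - expt) ** 2))
print(s, rmsd)
```

Here the fit recovers the built-in factor of 0.97 with zero residual; on real spectra the residual RMSD measures how well a single factor explains the systematic offset.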
Cellular automatons applied to gas dynamic problems
NASA Technical Reports Server (NTRS)
Long, Lyle N.; Coopersmith, Robert M.; Mclachlan, B. G.
1987-01-01
This paper compares the results of a relatively new computational fluid dynamics method, cellular automatons, with experimental data and analytical results. This technique has been shown to qualitatively predict fluidlike behavior; however, there have been few published comparisons with experiment or other theories. Comparisons are made for a one-dimensional supersonic piston problem, Stokes first problem, and the flow past a normal flat plate. These comparisons are used to assess the ability of the method to accurately model fluid dynamic behavior and to point out its limitations. Reasonable results were obtained for all three test cases, but the fundamental limitations of cellular automatons are numerous. It may be misleading, at this time, to say that cellular automatons are a computationally efficient technique. Other methods, based on continuum or kinetic theory, would also be very efficient if as little of the physics were included.
Efficient High-Fidelity, Geometrically Exact, Multiphysics Structural Models
2011-10-14
[Record text consists of fragmented reference excerpts, e.g.: a study of a functionally graded core (International Journal for Numerical Methods in Engineering, 68:940-966, 2006); F. Shang, Z. Wang, and Z. Li; analysis of functionally graded plates using a higher-order shear and normal deformable plate theory and the MLPG method with radial basis functions (Composite Structures, 80:539-552, 2007); W. Zhen and W. Chen, a higher-order theory.]
A Higher-Order Bending Theory for Laminated Composite and Sandwich Beams
NASA Technical Reports Server (NTRS)
Cook, Geoffrey M.
1997-01-01
A higher-order bending theory is derived for laminated composite and sandwich beams. This is accomplished by assuming a special form for the axial and transverse displacement expansions. An independent expansion is also assumed for the transverse normal stress. Appropriate shear correction factors based on energy considerations are used to adjust the shear stiffness. A set of transverse normal correction factors is introduced, leading to significant improvements in the transverse normal strain and stress for laminated composite and sandwich beams. A closed-form solution is derived and compared with cylindrical elasticity solutions for a wide range of beam aspect ratios and commonly used material systems. Accurate shear stresses for a wide range of laminates, including the challenging unsymmetric composite and sandwich laminates, are obtained using an original corrected integration scheme. For application of the theory to a wider range of problems, guidelines for finite element approximations are presented.
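The energy argument behind shear correction factors can be shown in its simplest special case: a homogeneous rectangular section, where matching the strain energy of the parabolic shear-stress profile to that of the average stress yields the textbook factor k = 5/6. This sketch only illustrates that energy equivalence; the laminated and sandwich factors in the paper are more involved.

```python
import numpy as np

# Homogeneous rectangular section of depth h, unit width, shear force V = 1.
h = 1.0
y = np.linspace(-h / 2, h / 2, 200001)
dy = y[1] - y[0]
tau = 1.5 * (1.0 - (2.0 * y / h) ** 2)   # parabolic shear stress, V/A = 1

# Energy equivalence gives k = V^2 / (A * integral of tau^2 over the section).
k = 1.0 / (h * np.sum(tau**2) * dy)
print(k)   # approximately 5/6
```

For layered sections the stress profile is piecewise and the same integral, taken layer by layer with each layer's shear modulus, produces the laminate-dependent correction factors the abstract refers to.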
Roff, D A; Crnokrak, P; Fairbairn, D J
2003-07-01
Quantitative genetic theory assumes that trade-offs are best represented by bivariate normal distributions. This theory predicts that selection will shift the trade-off function itself and not just move the mean trait values along a fixed trade-off line, as is generally assumed in optimality models. As a consequence, quantitative genetic theory predicts that the trade-off function will vary among populations in which at least one of the component traits itself varies. This prediction is tested using the trade-off between call duration and flight capability, as indexed by the mass of the dorsolateral flight muscles, in the macropterous morph of the sand cricket. We use four different populations of crickets that vary in the proportion of macropterous males (Lab = 33%, Florida = 29%, Bermuda = 72%, South Carolina = 80%). We find, as predicted, that there is significant variation in the intercept of the trade-off function but not the slope, supporting the hypothesis that trade-off functions are better represented as bivariate normal distributions rather than single lines. We also test the prediction from a quantitative genetic model of the evolution of wing dimorphism that the mean call duration of macropterous males will increase with the percentage of macropterous males in the population. This prediction is also supported. Finally, we estimate the probability of a macropterous male attracting a female, P, as a function of the relative time spent calling (P = time spent calling by the macropterous male / total time spent calling by both the micropterous and macropterous males). We find that in the Lab and Florida populations the probability of a female selecting the macropterous male is equal to P, indicating that preference is due simply to relative call duration.
But in the Bermuda and South Carolina populations the probability of a female selecting a macropterous male is less than P, indicating a preference for the micropterous male even after differences in call duration are accounted for.
Normal mode study of the earth's rigid body motions
NASA Technical Reports Server (NTRS)
Chao, B. F.
1983-01-01
In this paper it is shown that the earth's rigid body (rb) motions can be represented by an analytical set of eigensolutions to the equation of motion for elastic-gravitational free oscillations. Thus each degree of freedom in the rb motion is associated with a rb normal mode. Cases of both nonrotating and rotating earth models are studied, and it is shown that the rb modes do incorporate neatly into the earth's system of normal modes of free oscillation. The excitation formulas for the rb modes are also obtained, based on normal mode theory. Physical implications of the results are summarized and the fundamental differences between rb modes and seismic modes are emphasized. In particular, it is ascertained that the Chandler wobble, being one of the rb modes belonging to the rotating earth, can be studied using the established theory of normal modes.
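The idea that a rigid-body motion is itself a normal mode has a familiar finite-dimensional analogue: a free-free mass-spring system has a zero-frequency eigenmode whose shape is uniform translation. The two-mass toy problem below (my illustration, not the paper's elastic-gravitational formulation) makes that explicit.

```python
import numpy as np

# Two equal masses connected by a single spring, free at both ends.
k_spring, m = 4.0, 1.0
K = np.array([[k_spring, -k_spring], [-k_spring, k_spring]])

# Eigenproblem K v = w^2 m v (mass matrix M = m*I), solved symmetrically.
w2, vecs = np.linalg.eigh(K / m)
rigid = vecs[:, 0]            # mode paired with w^2 = 0
print(w2, rigid)              # w^2 = [0, 2k/m]; rigid mode ~ +/-(1, 1)/sqrt(2)
```

The zero eigenvalue is the rigid-body mode: both masses move together with no restoring force, exactly the kind of degree of freedom the paper incorporates into the earth's normal-mode system.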
Cheng, Dongliang; Zhong, Quanlin; Niklas, Karl J.; Ma, Yuzhu; Yang, Yusheng; Zhang, Jianhua
2015-01-01
Background and Aims Empirical studies and allometric partitioning (AP) theory indicate that plant above-ground biomass (MA) scales, on average, one-to-one (isometrically) with below-ground biomass (MR) at the level of individual trees and at the level of entire forest communities. However, the ability of the AP theory to predict the biomass allocation patterns of understorey plants has not been established because most previous empirical tests have focused on canopy tree species or very large shrubs. Methods In order to test the AP theory further, 1586 understorey sub-tropical forest plants from 30 sites in south-east China were harvested and examined. The numerical values of the scaling exponents and normalization constants (i.e. slopes and y-intercepts, respectively) of log-log linear MA vs. MR relationships were determined for all individual plants, for each site, across the entire data set, and for data sorted into a total of 19 sub-sets of forest types and successional stages. Similar comparisons of MA/MR were also made. Key Results The data revealed that the mean MA/MR of understorey plants was 2.44 and 1.57 across all 1586 plants and all communities, respectively, and MA scaled nearly isometrically with respect to MR, with scaling exponents of 1.01 for all individual plants and 0.99 for all communities. The scaling exponents did not differ significantly among different forest types or successional stages, but the normalization constants did, and were positively correlated with MA/MR and negatively correlated with scaling exponents across all 1586 plants. Conclusions The results support the AP theory's prediction that MA scales nearly one-to-one with MR (i.e. MA ∝ MR^~1.0) and that plant biomass partitioning for individual plants and at the community level share a strikingly similar pattern, at least for the understorey plants examined in this study.
Furthermore, variation in environmental conditions appears to affect the numerical values of normalization constants, but not the scaling exponents of the MA vs. MR relationship. This feature of the results suggests that plant size is the primary driver of the MA vs. MR biomass allocation pattern for understorey plants in sub-tropical forests. PMID:25564468
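The scaling exponent and normalization constant described above are simply the slope and intercept of a log-log regression of MA on MR. The sketch below runs that fit on synthetic biomass data built to be isometric (exponent 1.0) with a made-up normalization constant; all parameter values are illustrative, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic data obeying M_A = c * M_R^a with a = 1.0 (isometry),
# lognormal scatter, and sizes spanning three orders of magnitude (kg).
a_true, c_true = 1.0, 2.4
MR = np.exp(rng.uniform(np.log(0.01), np.log(10.0), 1500))
MA = c_true * MR**a_true * np.exp(rng.normal(0.0, 0.2, 1500))

# Scaling exponent = slope, normalization constant = 10^intercept of the
# log-log linear fit, as in allometric-partitioning analyses.
slope, intercept = np.polyfit(np.log10(MR), np.log10(MA), 1)
print(slope, 10**intercept)
```

Groups can then be compared exactly as in the paper: shared slopes near 1.0 with group-specific intercepts correspond to a common exponent but varying normalization constants.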
Counihan, T.D.; Miller, Allen I.; Parsley, M.J.
1999-01-01
The development of recruitment monitoring programs for age-0 white sturgeons Acipenser transmontanus is complicated by the statistical properties of catch-per-unit-effort (CPUE) data. We found that age-0 CPUE distributions from bottom trawl surveys violated assumptions of statistical procedures based on normal probability theory. Further, no single data transformation uniformly satisfied these assumptions because CPUE distribution properties varied with the sample mean (μ(CPUE)). Given these analytic problems, we propose that an additional index of age-0 white sturgeon relative abundance, the proportion of positive tows (Ep), be used to estimate sample sizes before conducting age-0 recruitment surveys and to evaluate statistical hypothesis tests comparing the relative abundance of age-0 white sturgeons among years. Monte Carlo simulations indicated that Ep was consistently more precise than μ(CPUE), and because Ep is binomially rather than normally distributed, surveys can be planned and analyzed without violating the assumptions of procedures based on normal probability theory. However, we show that Ep may underestimate changes in relative abundance at high levels and confound our ability to quantify responses to management actions if relative abundance is consistently high. If data suggest that most samples will contain age-0 white sturgeons, estimators of relative abundance other than Ep should be considered. Because Ep may also obscure correlations to climatic and hydrologic variables if high abundance levels are present in time series data, we recommend μ(CPUE) be used to describe relations to environmental variables. The use of both Ep and μ(CPUE) will facilitate the evaluation of hypothesis tests comparing relative abundance levels and correlations to variables affecting age-0 recruitment.
Estimated sample sizes for surveys should therefore be based on detecting predetermined differences in Ep, but data necessary to calculate μ(CPUE) should also be collected.
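The planning advantage of Ep comes from its binomial distribution: its standard error follows the closed form sqrt(p(1-p)/n) regardless of how skewed the underlying catch counts are. The Monte Carlo sketch below checks that on a zero-inflated, overdispersed catch distribution; the distribution and its parameters are illustrative, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated trawl catches: 40% of tows catch any fish, and positive catches
# are overdispersed (negative-binomial-like), loosely mimicking CPUE data.
def survey(n_tows):
    occupied = rng.random(n_tows) < 0.4
    counts = rng.negative_binomial(1, 0.2, n_tows) + 1
    return np.where(occupied, counts, 0)

# Ep = proportion of positive tows; compare its simulated standard error
# with the analytic binomial value sqrt(p*(1-p)/n).
n, reps = 200, 5000
ep = np.array([(survey(n) > 0).mean() for _ in range(reps)])
analytic_se = np.sqrt(0.4 * 0.6 / n)
print(ep.std(), analytic_se)
```

Because the analytic formula holds, the sample size needed to detect a given difference in Ep can be computed before the survey, which is exactly the planning use the abstract recommends.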
Testing for EMC (electromagnetic compatibility) in the clinical environment.
Paperman, D; David, Y; Martinez, M
1996-01-01
Testing for electromagnetic compatibility (EMC) in the clinical environment introduces a host of complex conditions not normally encountered under laboratory conditions. In the clinical environment, various radio-frequency (RF) sources of electromagnetic interference (EMI) may be present throughout the entire spectrum of interest. Isolating and analyzing the impact of these sources of interference on medical devices requires a multidisciplinary approach based on training in, and knowledge of, the following: operation of medical devices and their susceptibility to EMI; RF propagation modalities and interaction theory; spectrum analysis systems and techniques (preferably with signature analysis capabilities) and calibrated antennas; the investigation methodology of suspected EMC problems; and testing protocols and standards. Combining standard test procedures adapted for the clinical environment with personnel who understand radio-frequency behavior increases the probability of proactively controlling EMI in the clinical environment, thus providing a safer and more effective patient care environment.
Behavior of the maximum likelihood in quantum state tomography
NASA Astrophysics Data System (ADS)
Scholten, Travis L.; Blume-Kohout, Robin
2018-02-01
Quantum state tomography on a d-dimensional system demands resources that grow rapidly with d. They may be reduced by using model selection to tailor the number of parameters in the model (i.e., the size of the density matrix). Most model selection methods typically rely on a test statistic and a null theory that describes its behavior when two models are equally good. Here, we consider the loglikelihood ratio. Because of the positivity constraint ρ ≥ 0, quantum state space does not generally satisfy local asymptotic normality (LAN), meaning the classical null theory for the loglikelihood ratio (the Wilks theorem) should not be used. Thus, understanding and quantifying how positivity affects the null behavior of this test statistic is necessary for its use in model selection for state tomography. We define a new generalization of LAN, metric-projected LAN, show that quantum state space satisfies it, and derive a replacement for the Wilks theorem. In addition to enabling reliable model selection, our results shed more light on the qualitative effects of the positivity constraint on state tomography.
Quasi-Normal Modes of Stars and Black Holes.
Kokkotas, Kostas D; Schmidt, Bernd G
1999-01-01
Perturbations of stars and black holes have been one of the main topics of relativistic astrophysics for the last few decades. They are of particular importance today, because of their relevance to gravitational wave astronomy. In this review we present the theory of quasi-normal modes of compact objects from both the mathematical and astrophysical points of view. The discussion includes perturbations of black holes (Schwarzschild, Reissner-Nordström, Kerr and Kerr-Newman) and relativistic stars (non-rotating and slowly-rotating). The properties of the various families of quasi-normal modes are described, and numerical techniques for calculating quasi-normal modes are reviewed. The successes, as well as the limits, of perturbation theory are presented, and its role in the emerging era of numerical relativity and supercomputers is discussed.
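The defining feature of quasi-normal modes, in contrast to the normal modes of a closed system, is that their eigenfrequencies are complex: one part sets the ringing frequency, the other the damping rate. The damped oscillator below is only a toy analogue (not a relativistic perturbation calculation), but it shows the same structure.

```python
import numpy as np

# Damped oscillator x'' + 2*gamma*x' + w0^2 x = 0 in first-order form;
# its eigenvalues play the role of quasi-normal frequencies.
gamma, w0 = 0.1, 1.0
A = np.array([[0.0, 1.0], [-w0**2, -2.0 * gamma]])
lam = np.linalg.eigvals(A)
print(lam)   # -gamma +/- i*sqrt(w0^2 - gamma^2)
```

For black holes the damping arises from radiation escaping to infinity and through the horizon rather than from friction, which is why the spectrum is quasi-normal rather than normal.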
Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R
2012-05-17
Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals, drawing on two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts.
To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.
Carhart-Harris, Robin L.; Leech, Robert; Hellyer, Peter J.; Shanahan, Murray; Feilding, Amanda; Tagliazucchi, Enzo; Chialvo, Dante R.; Nutt, David
2014-01-01
Entropy is a dimensionless quantity that is used for measuring uncertainty about the state of a system but it can also imply physical qualities, where high entropy is synonymous with high disorder. Entropy is applied here in the context of states of consciousness and their associated neurodynamics, with a particular focus on the psychedelic state. The psychedelic state is considered an exemplar of a primitive or primary state of consciousness that preceded the development of modern, adult, human, normal waking consciousness. Based on neuroimaging data with psilocybin, a classic psychedelic drug, it is argued that the defining feature of “primary states” is elevated entropy in certain aspects of brain function, such as the repertoire of functional connectivity motifs that form and fragment across time. Indeed, since there is a greater repertoire of connectivity motifs in the psychedelic state than in normal waking consciousness, this implies that primary states may exhibit “criticality,” i.e., the property of being poised at a “critical” point in a transition zone between order and disorder where certain phenomena such as power-law scaling appear. Moreover, if primary states are critical, then this suggests that entropy is suppressed in normal waking consciousness, meaning that the brain operates just below criticality. It is argued that this entropy suppression furnishes normal waking consciousness with a constrained quality and associated metacognitive functions, including reality-testing and self-awareness. It is also proposed that entry into primary states depends on a collapse of the normally highly organized activity within the default-mode network (DMN) and a decoupling between the DMN and the medial temporal lobes (which are normally significantly coupled). 
These hypotheses can be tested by examining brain activity and associated cognition in other candidate primary states such as rapid eye movement (REM) sleep and early psychosis and comparing these with non-primary states such as normal waking consciousness and the anaesthetized state. PMID:24550805
Carhart-Harris, Robin L; Leech, Robert; Hellyer, Peter J; Shanahan, Murray; Feilding, Amanda; Tagliazucchi, Enzo; Chialvo, Dante R; Nutt, David
2014-01-01
Entropy is a dimensionless quantity that is used for measuring uncertainty about the state of a system but it can also imply physical qualities, where high entropy is synonymous with high disorder. Entropy is applied here in the context of states of consciousness and their associated neurodynamics, with a particular focus on the psychedelic state. The psychedelic state is considered an exemplar of a primitive or primary state of consciousness that preceded the development of modern, adult, human, normal waking consciousness. Based on neuroimaging data with psilocybin, a classic psychedelic drug, it is argued that the defining feature of "primary states" is elevated entropy in certain aspects of brain function, such as the repertoire of functional connectivity motifs that form and fragment across time. Indeed, since there is a greater repertoire of connectivity motifs in the psychedelic state than in normal waking consciousness, this implies that primary states may exhibit "criticality," i.e., the property of being poised at a "critical" point in a transition zone between order and disorder where certain phenomena such as power-law scaling appear. Moreover, if primary states are critical, then this suggests that entropy is suppressed in normal waking consciousness, meaning that the brain operates just below criticality. It is argued that this entropy suppression furnishes normal waking consciousness with a constrained quality and associated metacognitive functions, including reality-testing and self-awareness. It is also proposed that entry into primary states depends on a collapse of the normally highly organized activity within the default-mode network (DMN) and a decoupling between the DMN and the medial temporal lobes (which are normally significantly coupled). 
These hypotheses can be tested by examining brain activity and associated cognition in other candidate primary states such as rapid eye movement (REM) sleep and early psychosis and comparing these with non-primary states such as normal waking consciousness and the anaesthetized state.
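The abstract's operational notion of entropy over a repertoire of connectivity motifs can be illustrated with a minimal Shannon-entropy sketch (the motif labels and counts below are invented for illustration, not taken from the psilocybin data):

```python
import math
from collections import Counter

def shannon_entropy(observations):
    """Shannon entropy (bits) of the empirical distribution of observed states."""
    counts = Counter(observations)
    n = len(observations)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A wider repertoire of connectivity motifs (more distinct states, visited
# more evenly) yields higher entropy -- the operational sense of "primary"
# states used in the abstract above.
narrow = ["A"] * 8 + ["B"] * 2          # few motifs dominate (constrained)
wide = ["A", "B", "C", "D", "E"] * 2    # uniform repertoire (unconstrained)

print(shannon_entropy(narrow))  # ~0.722 bits
print(shannon_entropy(wide))    # log2(5), ~2.322 bits
```

The entropy of the uniform repertoire equals log2 of the number of motifs, the maximum attainable for that repertoire size.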
Local subsystems in gauge theory and gravity
Donnelly, William; Freidel, Laurent
2016-09-16
We consider the problem of defining localized subsystems in gauge theory and gravity. Such systems are associated to spacelike hypersurfaces with boundaries and provide the natural setting for studying entanglement entropy of regions of space. We present a general formalism to associate a gauge-invariant classical phase space to a spatial slice with boundary by introducing new degrees of freedom on the boundary. In Yang-Mills theory the new degrees of freedom are a choice of gauge on the boundary, transformations of which are generated by the normal component of the nonabelian electric field. In general relativity the new degrees of freedom are the location of a codimension-2 surface and a choice of conformal normal frame. These degrees of freedom transform under a group of surface symmetries, consisting of diffeomorphisms of the codimension-2 boundary, and position-dependent linear deformations of its normal plane. We find the observables which generate these symmetries, consisting of the conformal normal metric and curvature of the normal connection. We discuss the implications for the problem of defining entanglement entropy in quantum gravity. Finally, our work suggests that the Bekenstein-Hawking entropy may arise from the different ways of gluing together two partial Cauchy surfaces at a cross-section of the horizon.
Olderbak, Sally; Wilhelm, Oliver; Olaru, Gabriel; Geiger, Mattis; Brenneman, Meghan W.; Roberts, Richard D.
2015-01-01
The Reading the Mind in the Eyes Test is a popular measure of individual differences in Theory of Mind that is often applied in the assessment of particular clinical populations (primarily, individuals on the autism spectrum). However, little is known about the test's psychometric properties, including factor structure, internal consistency, and convergent validity evidence. We present a psychometric analysis of the test followed by an evaluation of other empirically proposed and statistically identified structures. We identified, and cross-validated in a second sample, an adequate short-form solution that is homogeneous, has adequate internal consistency, is moderately related to Cognitive Empathy and Emotion Perception, and is strongly related to Vocabulary. We recommend the use of this short-form solution in normal adults as a more precise measure than the original version. Future revisions of the test should seek to reduce the test's reliance on vocabulary and evaluate the short-form structure in clinical populations.
The Microscope Space Mission and the In-Orbit Calibration Plan for its Instrument
NASA Astrophysics Data System (ADS)
Levy, Agnès; Touboul, Pierre; Rodrigues, Manuel; Hardy, Émilie; Métris, Gilles; Robert, Alain
2015-01-01
The MICROSCOPE space mission aims at testing the Equivalence Principle (EP) with an accuracy of 10⁻¹⁵. This principle is one of the foundations of the theory of General Relativity; it states the equivalence between gravitational and inertial mass. The test is based on the precise measurement of a gravitational signal by a differential electrostatic accelerometer which includes two cylindrical test masses made of different materials. The accelerometers constitute the payload accommodated on board a drag-free micro-satellite whose attitude is controlled to be either inertial or rotating about the normal to the orbital plane. The acceleration estimates used for the EP test are disturbed by the instrument's physical parameters and by the instrument environment conditions on board the satellite. These parameters are partially measured with ground tests or during the integration of the instrument in the satellite (alignment). Nevertheless, the ground evaluations are not sufficient with respect to the EP test accuracy objectives. An in-orbit calibration is therefore needed to characterize them precisely. The calibration process for each parameter has been defined.
Schlain, Brian; Amaravadi, Lakshmi; Donley, Jean; Wickramasekera, Ananda; Bennett, Donald; Subramanyam, Meena
2010-01-31
In recent years there has been growing recognition of the impact of anti-drug or anti-therapeutic antibodies (ADAs, ATAs) on the pharmacokinetic and pharmacodynamic behavior of the drug, which ultimately affects drug exposure and activity. These anti-drug antibodies can also impact the safety of the therapeutic by inducing a range of reactions from hypersensitivity to neutralization of the activity of an endogenous protein. Assessments of immunogenicity, therefore, are critically dependent on the bioanalytical method used to test samples, in which a positive versus negative reactivity is determined by a statistically derived cut point based on the distribution of drug-naïve samples. For non-normally distributed data, a novel gamma-fitting method for obtaining assay cut points is presented. Non-normal immunogenicity data distributions, which tend to be unimodal and positively skewed, can often be modeled by 3-parameter gamma fits. Under a gamma regime, gamma-based cut points were found to be more accurate (closer to their targeted false positive rates) than normal or log-normal methods, and more precise (smaller standard errors of cut point estimators) than the nonparametric percentile method. Under a gamma regime, normal theory based methods for estimating cut points targeting a 5% false positive rate were found in computer simulation experiments to have, on average, false positive rates ranging from 6.2 to 8.3% (or positive biases between +1.2 and +3.3%), with bias decreasing with the magnitude of the gamma shape parameter. The log-normal fits tended, on average, to underestimate false positive rates, with negative biases as large as -2.3%, with absolute bias decreasing with the shape parameter. These results were consistent with the well-known fact that gamma distributions become less skewed and closer to a normal distribution as their shape parameters increase.
Inflated false positive rates, especially in a screening assay, shift the emphasis to confirming test results in a subsequent test (confirmatory assay). On the other hand, deflated false positive rates in the case of screening immunogenicity assays will not meet the minimum 5% false positive target proposed in the immunogenicity assay guidance white papers.
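A minimal sketch of the gamma-based cut-point idea, using `scipy.stats.gamma.fit` for the 3-parameter (shape, location, scale) fit; the simulated drug-naïve data and the comparison threshold are illustrative, not the paper's validation sets:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated drug-naive screening responses: unimodal, positively skewed,
# shifted by an assumed assay floor of 10 (illustrative numbers).
data = rng.gamma(shape=3.0, scale=1.5, size=500) + 10.0

# 3-parameter gamma fit, then the 95th percentile as the screening cut
# point targeting a 5% false-positive rate.
shape, loc, scale = stats.gamma.fit(data)
cut_gamma = stats.gamma.ppf(0.95, shape, loc=loc, scale=scale)

# Normal-theory cut point for comparison: mean + 1.645 * SD.
cut_normal = data.mean() + 1.645 * data.std(ddof=1)

print(f"gamma cut point:  {cut_gamma:.2f}")
print(f"normal cut point: {cut_normal:.2f}")
print(f"empirical FPR at gamma cut: {(data > cut_gamma).mean():.3f}")
```

For skewed data the gamma quantile lands the empirical false-positive rate closer to the 5% target than the symmetric normal-theory rule does.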
Principals' Engagement of Low Ability Students in Singapore Secondary Schools
ERIC Educational Resources Information Center
Ong, Chye Hin; Dimmock, Clive
2013-01-01
This article describes a grounded theory constructed from a study of Singapore neighbourhood secondary school principals' engagement of their lowest stream, the Normal Technical students, in their schools. This substantive theory is labelled the "theory of selective engagement". It implies that how principals engage their lowest streamed…
Norm Theory: Comparing Reality to Its Alternatives.
ERIC Educational Resources Information Center
Kahneman, Daniel; Miller, Dale T.
1986-01-01
A theory of norms and normality is applied to some phenomena of emotional responses, social judgment, and conversations about causes. Norm theory is applied in analyses of enhanced emotional response to events that have abnormal causes, of generation of prediction from observations of behavior, and of the role of norms. (Author/LMO)
Lun, Aaron T L; Chen, Yunshun; Smyth, Gordon K
2016-01-01
RNA sequencing (RNA-seq) is widely used to profile transcriptional activity in biological systems. Here we present an analysis pipeline for differential expression analysis of RNA-seq experiments using the Rsubread and edgeR software packages. The basic pipeline includes read alignment and counting, filtering and normalization, modelling of biological variability and hypothesis testing. For hypothesis testing, we describe particularly the quasi-likelihood features of edgeR. Some more advanced downstream analysis steps are also covered, including complex comparisons, gene ontology enrichment analyses and gene set testing. The code required to run each step is described, along with an outline of the underlying theory. The chapter includes a case study in which the pipeline is used to study the expression profiles of mammary gland cells in virgin, pregnant and lactating mice.
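The filtering and normalization steps of such a pipeline can be sketched in Python (edgeR itself is an R package; the CPM threshold and the toy count matrix below are illustrative, not edgeR's implementation):

```python
import numpy as np

def cpm(counts, lib_sizes):
    """Counts per million, given a genes x samples count matrix."""
    return counts / lib_sizes * 1e6

# Toy count matrix: 4 genes x 3 samples.
counts = np.array([
    [100, 120,  90],   # consistently expressed gene
    [  0,   1,   0],   # too lowly expressed to test reliably
    [500, 480, 510],
    [ 30,  25,  40],
], dtype=float)

lib_sizes = counts.sum(axis=0)

# Filtering step of the pipeline: keep genes whose CPM exceeds a threshold
# in enough samples. (The threshold is scaled to this tiny toy matrix;
# real analyses use much lower CPM cutoffs over millions of reads.)
keep = (cpm(counts, lib_sizes) > 10_000).sum(axis=1) >= 2
filtered = counts[keep]
print(keep)  # -> [ True False  True  True]
```

Downstream, edgeR would compute TMM normalization factors, estimate dispersions, and apply quasi-likelihood F-tests to the filtered matrix.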
A common optimization principle for motor execution in healthy subjects and parkinsonian patients.
Baraduc, Pierre; Thobois, Stéphane; Gan, Jing; Broussolle, Emmanuel; Desmurget, Michel
2013-01-09
Recent research on Parkinson's disease (PD) has emphasized that parkinsonian movement, although bradykinetic, shares many attributes with healthy behavior. This observation led to the suggestion that bradykinesia in PD could be due to a reduction in motor motivation. This hypothesis can be tested in the framework of optimal control theory, which accounts for many characteristics of healthy human movement while providing a link between the motor behavior and a cost/benefit trade-off. This approach offers the opportunity to interpret movement deficits of PD patients in the light of a computational theory of normal motor control. We studied 14 PD patients with bilateral subthalamic nucleus (STN) stimulation and 16 age-matched healthy controls, and tested whether reaching movements were governed by similar rules in these two groups. A single optimal control model accounted for the reaching movements of healthy subjects and PD patients, whatever the condition of STN stimulation (on or off). The choice of movement speed was explained in all subjects by the existence of a preset dynamic range for the motor signals. This range was idiosyncratic and applied to all movements regardless of their amplitude. In PD patients this dynamic range was abnormally narrow and correlated with bradykinesia. STN stimulation reduced bradykinesia and widened this range in all patients, but did not restore it to a normal value. These results, consistent with the motor motivation hypothesis, suggest that constrained optimization of motor effort is the main determinant of movement planning (choice of speed) and movement production, in both healthy and PD subjects.
Rational functional representation of flap noise spectra including correction for reflection effects
NASA Technical Reports Server (NTRS)
Miles, J. H.
1974-01-01
A rational function is presented for the acoustic spectra generated by deflection of engine exhaust jets for under-the-wing and over-the-wing versions of externally blown flaps. The functional representation is intended to provide a means for compact storage of data and for data analysis. The expressions are based on Fourier transform functions for the Strouhal normalized pressure spectral density, and on a correction for reflection effects based on Thomas' (1969) N-independent-source model extended by use of a reflected ray transfer function. Curve fit comparisons are presented for blown-flap data taken from turbofan engine tests and from large-scale cold-flow model tests. Application of the rational function to scrubbing noise theory is also indicated.
Tests of Mediation: Paradoxical Decline in Statistical Power as a Function of Mediator Collinearity
Beasley, T. Mark
2013-01-01
Increasing the correlation between the independent variable and the mediator (a coefficient) increases the effect size (ab) for mediation analysis; however, increasing a by definition increases collinearity in mediation models. As a result, the standard errors of product tests increase. The variance inflation due to increases in a at some point outweighs the increase in the effect size (ab) and results in a loss of statistical power. This phenomenon also occurs with nonparametric bootstrapping approaches because the variance of the bootstrap distribution of ab approximates the variance expected from normal theory. Both variances increase dramatically when a exceeds the b coefficient, thus explaining the power decline with increases in a. Implications for statistical analysis and applied researchers are discussed.
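The paradox can be reproduced with a small Monte Carlo sketch of the Sobel product test (sample size, coefficients, and replication counts below are arbitrary illustrative choices):

```python
import numpy as np

def sobel_z(x, m, y):
    """Sobel z = ab / sqrt(a^2 se_b^2 + b^2 se_a^2) from two OLS fits."""
    n = len(x)

    def ols(X, t):
        beta, *_ = np.linalg.lstsq(X, t, rcond=None)
        resid = t - X @ beta
        cov = (resid @ resid / (n - X.shape[1])) * np.linalg.inv(X.T @ X)
        return beta, np.sqrt(np.diag(cov))

    (_, a), (_, se_a) = ols(np.column_stack([np.ones(n), x]), m)       # a-path
    (_, _, b), (_, _, se_b) = ols(np.column_stack([np.ones(n), x, m]), y)  # b-path
    return a * b / np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)

def mean_z(a_true, b_true=0.3, n=200, reps=300, seed=2):
    rng = np.random.default_rng(seed)
    zs = []
    for _ in range(reps):
        x = rng.standard_normal(n)
        m = a_true * x + np.sqrt(1 - a_true**2) * rng.standard_normal(n)
        y = b_true * m + rng.standard_normal(n)
        zs.append(sobel_z(x, m, y))
    return np.mean(zs)

# Once a exceeds b, X-M collinearity inflates se_b faster than ab grows,
# and the average test statistic (hence power) falls.
print(mean_z(0.3), mean_z(0.9))
```

With b fixed at 0.3, the mean Sobel z is larger at a = 0.3 than at a = 0.9, even though ab has tripled: the variance inflation dominates.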
Developing Visualization Support System for Teaching/Learning Database Normalization
ERIC Educational Resources Information Center
Folorunso, Olusegun; Akinwale, AdioTaofeek
2010-01-01
Purpose: In tertiary institutions, some students find it hard to learn database design theory, in particular, database normalization. The purpose of this paper is to develop a visualization tool to give students an interactive hands-on experience in the database normalization process. Design/methodology/approach: The model-view-controller architecture…
Sikdar, Debabrata; Kornyshev, Alexei A
2016-09-22
Two-dimensional arrays of plasmonic nanoparticles at interfaces are promising candidates for novel optical metamaterials. Such systems materialise from 'top-down' patterning or 'bottom-up' self-assembly of nanoparticles at liquid/liquid or liquid/solid interfaces. Here, we present a comprehensive analysis of an extended effective quasi-static four-layer-stack model for the description of plasmon-resonance-enhanced optical responses of such systems. We investigate in detail the effects of the size of nanoparticles, average interparticle separation, dielectric constants of the media constituting the interface, and the nanoparticle position relative to the interface. Interesting interplays of these different factors are explored first for normally incident light. For off-normal incidence, strong effects of the polarisation of light are found at large incident angles, which makes it possible to tune the reflectance spectra dynamically. All the predictions of the theory are tested against full-wave simulations, showing this simple model to be adequate within the quasi-static limit. The model takes seconds to calculate the system's optical response and makes it easy to unravel the effect of each system parameter. This helps rapid rationalization of experimental data and understanding of the optical signals from these novel 'metamaterials', optimised for light reflection or harvesting.
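The optical response of a four-layer stack of this general kind can be sketched with a standard normal-incidence transfer-matrix calculation; this generic thin-film formalism and the placeholder layer indices below stand in for, and are not, the paper's quasi-static effective-layer model:

```python
import cmath
import math

def stack_reflectance(n_list, d_list, wavelength):
    """Normal-incidence reflectance of a multilayer via 2x2 characteristic
    matrices. n_list: refractive indices including the semi-infinite end
    media; d_list: thicknesses of the interior layers only."""
    k0 = 2 * math.pi / wavelength
    M = [[1, 0], [0, 1]]
    for n, d in zip(n_list[1:-1], d_list):
        delta = k0 * n * d  # phase thickness of the layer
        m = [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
             [1j * n * cmath.sin(delta), cmath.cos(delta)]]
        M = [[M[0][0]*m[0][0] + M[0][1]*m[1][0], M[0][0]*m[0][1] + M[0][1]*m[1][1]],
             [M[1][0]*m[0][0] + M[1][1]*m[1][0], M[1][0]*m[0][1] + M[1][1]*m[1][1]]]
    n_in, n_out = n_list[0], n_list[-1]
    r = ((n_in * (M[0][0] + M[0][1] * n_out) - (M[1][0] + M[1][1] * n_out)) /
         (n_in * (M[0][0] + M[0][1] * n_out) + (M[1][0] + M[1][1] * n_out)))
    return abs(r) ** 2

# Four-layer geometry: water / effective nanoparticle layer / spacer / glass.
# The complex index of the effective layer is an illustrative placeholder,
# not the paper's quasi-static result for a nanoparticle array.
R = stack_reflectance([1.33, 2.0 + 0.5j, 1.4, 1.5], [20e-9, 10e-9], 600e-9)
print(f"reflectance: {R:.3f}")
```

Sanity checks for the formalism: with no interior layers it reduces to the Fresnel reflectance, and a quarter-wave layer of index sqrt(n_in * n_out) gives zero reflectance.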
From Order to Chaos in Earth Satellite Orbits
NASA Astrophysics Data System (ADS)
Gkolias, Ioannis; Daquin, Jérôme; Gachet, Fabien; Rosengren, Aaron J.
2016-11-01
We consider Earth satellite orbits in the range of semimajor axes where the perturbing effects of Earth’s oblateness and lunisolar gravity are of comparable order. This range covers the medium-Earth orbits (MEO) of the Global Navigation Satellite Systems and the geosynchronous orbits (GEO) of the communication satellites. We recall a secular and quadrupolar model, based on the Milankovitch vector formulation of perturbation theory, which governs the long-term orbital evolution subject to the predominant gravitational interactions. We study the global dynamics of this two-and-a-half degrees-of-freedom Hamiltonian system by means of the fast Lyapunov indicator (FLI), used in a statistical sense. Specifically, we characterize the degree of chaoticity of the action space using angle-averaged normalized FLI maps, thereby overcoming the angle dependencies of the conventional stability maps. Emphasis is placed upon the phase-space structures near secular resonances, which are of primary importance to the space debris community. We confirm and quantify the transition from order to chaos in MEO, stemming from the critical inclinations and find that highly inclined GEO orbits are particularly unstable. Despite their reputed normality, Earth satellite orbits can possess an extraordinarily rich spectrum of dynamical behaviors and, from a mathematical perspective, have all the complications that make them very interesting candidates for testing the modern tools of chaos theory.
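A minimal FLI computation for the Chirikov standard map, used here as a stand-in for the paper's Milankovitch-based secular model (the K value and fixed-point initial conditions are illustrative):

```python
import math

def fli_standard_map(theta, p, K, n_iter=500):
    """Fast Lyapunov Indicator: the largest log-growth of a tangent
    vector along an orbit of the Chirikov standard map."""
    v = (1.0, 0.0)           # initial tangent vector
    log_norm, fli = 0.0, 0.0
    for _ in range(n_iter):
        c = K * math.cos(theta)              # Jacobian entry at current point
        p = p + K * math.sin(theta)          # map step
        theta = (theta + p) % (2 * math.pi)
        # tangent (variational) map: d(theta', p') / d(theta, p)
        v = ((1 + c) * v[0] + v[1], c * v[0] + v[1])
        norm = math.hypot(v[0], v[1])
        log_norm += math.log(norm)           # renormalise to avoid overflow
        v = (v[0] / norm, v[1] / norm)
        fli = max(fli, log_norm)
    return fli

# Stable (elliptic) fixed point: bounded tangent growth, small FLI.
# Unstable (hyperbolic) fixed point: exponential growth, large FLI,
# as for orbits inside a chaotic layer.
print(fli_standard_map(math.pi, 0.0, K=1.5))
print(fli_standard_map(0.0, 0.0, K=1.5))
```

Thresholding such FLI values over a grid of initial conditions produces the kind of stability map the abstract describes, with the angle-averaging step added to remove phase dependence.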
Shankle, William R; Pooley, James P; Steyvers, Mark; Hara, Junko; Mangrola, Tushar; Reisberg, Barry; Lee, Michael D
2013-01-01
Determining how cognition affects functional abilities is important in Alzheimer disease and related disorders. A total of 280 patients (normal or Alzheimer disease and related disorders) received a total of 1514 assessments using the functional assessment staging test (FAST) procedure and the MCI Screen. A hierarchical Bayesian cognitive processing model was created by embedding a signal detection theory model of the MCI Screen-delayed recognition memory task into a hierarchical Bayesian framework. The signal detection theory model used latent parameters of discriminability (memory process) and response bias (executive function) to predict, simultaneously, recognition memory performance for each patient and each FAST severity group. The observed recognition memory data did not distinguish the 6 FAST severity stages, but the latent parameters completely separated them. The latent parameters were also used successfully to transform the ordinal FAST measure into a continuous measure reflecting the underlying continuum of functional severity. Hierarchical Bayesian cognitive processing models applied to recognition memory data from clinical practice settings accurately translated a latent measure of cognition into a continuous measure of functional severity for both individuals and FAST groups. Such a translation links 2 levels of brain information processing and may enable more accurate correlations with other levels, such as those characterized by biomarkers.
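The signal-detection decomposition used in such models can be illustrated with the standard equal-variance d'/criterion formulas (the confusion tables below are invented; the paper's model is hierarchical Bayesian, not this closed-form version):

```python
from statistics import NormalDist

def sdt_params(hits, misses, false_alarms, correct_rejections):
    """Equal-variance signal detection theory: discriminability d' and
    response bias c from a recognition-memory confusion table."""
    z = NormalDist().inv_cdf
    # Log-linear correction so extreme (0 or 1) rates stay finite.
    hr = (hits + 0.5) / (hits + misses + 1)
    far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = z(hr) - z(far)
    bias = -0.5 * (z(hr) + z(far))
    return d_prime, bias

# Two hypothetical patients with identical overall accuracy (75%) whose
# latent memory (d') is the same but whose response bias (c) differs:
d1, c1 = sdt_params(hits=45, misses=5, false_alarms=20, correct_rejections=30)
d2, c2 = sdt_params(hits=30, misses=20, false_alarms=5, correct_rejections=45)
print(f"liberal responder:      d'={d1:.2f}, c={c1:.2f}")
print(f"conservative responder: d'={d2:.2f}, c={c2:.2f}")
```

This separation of memory (d') from executive response bias (c) is what lets the latent parameters distinguish severity groups that raw accuracy cannot.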
Prediction of vein connectivity using the percolation approach: model test with field data
NASA Astrophysics Data System (ADS)
Belayneh, M.; Masihi, M.; Matthäi, S. K.; King, P. R.
2006-09-01
Evaluating the uncertainty in fracture connectivity and its effect on the flow behaviour of natural fracture networks formed under in situ conditions is an extremely difficult task. One widely used probabilistic approach is to use percolation theory, which is well adapted to estimate the connectivity and conductivity of geometrical objects near the percolation threshold. In this paper, we apply scaling laws from percolation theory to predict the connectivity of vein sets exposed on the southern margin of the Bristol Channel Basin. Two vein sets in a limestone bed interbedded with shales on the limb of a rollover fold were analysed for length, spacing and aperture distributions. Eight scan lines, low-level aerial photographs and mosaics of photographs taken with a tripod were used. The analysed veins formed contemporaneously with the rollover fold during basin subsidence on the hanging wall of a listric normal fault. The first vein set, V1, is fold axis-parallel (i.e. striking ~100°) and normal to bedding. The second vein set, V2, strikes 140° and crosscuts V1. We find a close agreement in connectivity between our predictions using the percolation approach and the field data. The implication is that reasonable predictions of vein connectivity can be made from sparse data obtained from boreholes or (limited) sporadic outcrop.
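The stick-percolation connectivity estimate behind such predictions can be sketched with a Monte Carlo union-find calculation (stick counts, lengths, and the left-right spanning criterion are illustrative, not fitted to the Bristol Channel vein data):

```python
import math
import random

class DSU:
    """Union-find over stick indices plus two virtual boundary nodes."""
    def __init__(self, n):
        self.p = list(range(n))
    def find(self, a):
        while self.p[a] != a:
            self.p[a] = self.p[self.p[a]]  # path halving
            a = self.p[a]
        return a
    def union(self, a, b):
        self.p[self.find(a)] = self.find(b)

def segments_cross(s1, s2):
    """Proper intersection of two line segments via orientation tests."""
    (x1, y1), (x2, y2) = s1
    (x3, y3), (x4, y4) = s2
    def orient(ax, ay, bx, by, cx, cy):
        return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
    d1 = orient(x3, y3, x4, y4, x1, y1)
    d2 = orient(x3, y3, x4, y4, x2, y2)
    d3 = orient(x1, y1, x2, y2, x3, y3)
    d4 = orient(x1, y1, x2, y2, x4, y4)
    return d1 * d2 < 0 and d3 * d4 < 0

def spanning_probability(n_sticks, length, trials=30, seed=0):
    """Monte Carlo estimate that a random vein (stick) network connects
    the left and right edges of the unit square."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        segs = []
        for _ in range(n_sticks):
            x, y = rng.random(), rng.random()
            ang = rng.uniform(0, math.pi)
            dx, dy = 0.5 * length * math.cos(ang), 0.5 * length * math.sin(ang)
            segs.append(((x - dx, y - dy), (x + dx, y + dy)))
        dsu = DSU(n_sticks + 2)   # nodes n_sticks and n_sticks+1 = edges
        for i, s in enumerate(segs):
            if min(s[0][0], s[1][0]) <= 0:
                dsu.union(i, n_sticks)
            if max(s[0][0], s[1][0]) >= 1:
                dsu.union(i, n_sticks + 1)
            for j in range(i):
                if segments_cross(s, segs[j]):
                    dsu.union(i, j)
        hits += dsu.find(n_sticks) == dsu.find(n_sticks + 1)
    return hits / trials

# Connectivity rises sharply with stick density across the percolation
# threshold, which is the behaviour the scaling laws exploit.
print(spanning_probability(50, 0.3), spanning_probability(300, 0.3))
```

Density, length, and orientation distributions measured from scan lines play the role of the inputs here, which is why sparse borehole or outcrop data can still constrain connectivity.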
Theory of superconductivity in oxides. Final technical report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, P.W.
1988-05-18
Progress was made towards a final theory of high-Tc superconductivity. The key elements are the work on normal-state properties and the actual mechanism for Tc. With the understanding (ZA) of the large anisotropy and other transport properties in the normal state, the model is uniquely determined: one must have one version or another of a holon-spinon quantum-fluid state, which is not a normal Fermi liquid. And with the recognition (HWA) of the large-repulsion holon-holon interactions, the author has the first way of thinking quantitatively about the superconducting state. Work on the pure Heisenberg system, which is related but not necessarily crucial to understanding the superconducting properties, is described.
McVay, Jennifer C; Kane, Michael J
2012-05-01
Some people are better readers than others, and this variation in comprehension ability is predicted by measures of working memory capacity (WMC). The primary goal of this study was to investigate the mediating role of mind-wandering experiences in the association between WMC and normal individual differences in reading comprehension, as predicted by the executive-attention theory of WMC (e.g., Engle & Kane, 2004). We used a latent-variable, structural-equation-model approach, testing skilled adult readers on 3 WMC span tasks, 7 varied reading-comprehension tasks, and 3 attention-control tasks. Mind wandering was assessed using experimenter-scheduled thought probes during 4 different tasks (2 reading, 2 attention-control). The results support the executive-attention theory of WMC. Mind wandering across the 4 tasks loaded onto a single latent factor, reflecting a stable individual difference. Most important, mind wandering was a significant mediator in the relationship between WMC and reading comprehension, suggesting that the WMC-comprehension correlation is driven, in part, by attention control over intruding thoughts. We discuss implications for theories of WMC, attention control, and reading comprehension.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Artemyev, A. V.; Mourenas, D.; Krasnoselskikh, V. V.
2015-06-15
In this paper, we study relativistic electron scattering by fast magnetosonic waves. We compare results of test particle simulations and the quasi-linear theory for different spectra of waves to investigate how a fine structure of the wave emission can influence electron resonant scattering. We show that for a realistically wide distribution of wave normal angles θ (i.e., when the dispersion δθ ≥ 0.5°), relativistic electron scattering is similar for a wide wave spectrum and for a spectrum consisting of well-separated ion cyclotron harmonics. Comparisons of test particle simulations with quasi-linear theory show that for δθ > 0.5°, the quasi-linear approximation describes resonant scattering correctly for a large enough plasma frequency. For a very narrow θ distribution (when δθ ∼ 0.05°), however, the effect of a fine structure in the wave spectrum becomes important. In this case, quasi-linear theory clearly fails to describe accurately electron scattering by fast magnetosonic waves. We also study the effect of high wave amplitudes on relativistic electron scattering. For typical conditions in the Earth's radiation belts, the quasi-linear approximation cannot accurately describe electron scattering for waves with averaged amplitudes >300 pT. We discuss various applications of the obtained results for modeling electron dynamics in the radiation belts and in the Earth's magnetotail.
NASA Astrophysics Data System (ADS)
Wang, Ding; Ding, Pin-bo; Ba, Jing
2018-03-01
In Part I, a dynamic fracture compliance model (DFCM) was derived based on poroelastic theory. The normal compliance of fractures is frequency-dependent and closely associated with the connectivity of the porous medium. In this paper, we first compare the DFCM with previous fractured-media theories in the literature over the full frequency range. Furthermore, experimental tests are performed on synthetic rock specimens, and the DFCM is compared with the experimental data in the ultrasonic frequency band. The water-saturated synthetic rock specimens have mineral compositions and pore structures closer to those of natural reservoir rocks than the specimens used in previous works. The fracture/pore geometrical and physical parameters can be controlled to replicate approximately those of natural rocks. P- and S-wave anisotropy characteristics with different fracture and pore properties are calculated and the numerical results are compared with experimental data. Although the measurement frequency is relatively high, the DFCM results are appropriate for explaining the experimental data. The characteristic frequency of fluid pressure equilibration calculated from the specimen parameters is not substantially less than the measurement frequency. In the dynamic fracture model, the wave-induced fluid flow behavior is an important factor in the fracture-wave interaction process, which distinguishes the model from those valid only at the high-frequency limit, for instance, Hudson's unrelaxed model.
Adaptive psychological structure in childhood hearing impairment: audiological correlations.
Serra, A; Spinato, G; Cocuzza, S; Licciardello, L; Pavone, P; Maiolino, L
2017-06-01
The present research deals with the clinical and social problems present during the linguistic and cognitive development of deaf children. Currently, the development of Theory of Mind represents an important research field in deafness studies. International studies have highlighted a significant alteration in the development of Theory of Mind in deaf children compared to normal-hearing children, especially in cases of congenital or preverbal hearing loss. In particular, the research focuses on the skills of deaf children in recognising emotions and desires, through both perceptive and cognitive methods, by evaluating the psycho-cognitive skills of children with severe hearing loss using a set of questions administered to hearing-loss patients. The experiment was performed on a group of 10 children (5 males and 5 females) aged 4 to 9 years (54 to 108 months), affected by bilateral congenital hearing loss (severe to total), or hearing loss that developed in preverbal children, in the year before entering elementary school or during the fourth year of elementary school. The selection criteria were based on: audiologic evaluation; neuro-psychological tests administered to assess general cognitive as well as praxic and perceptive abilities; and clinical observations performed to assess psychopathology, using tests that assess development of both visual-perceptive (Coloured Progressive Matrices) and graphic representational abilities (Test of Human Figure Drawings and the Family Drawing Test). The cognitive instrument was the "Deaf Children Series", arranged by us, which consists of a mental status examination (MSE) that evaluates level of cognitive (knowledge-related) ability, emotional mood, and speech and thought patterns at the time of evaluation. Deaf children show reduced responsiveness to expressions of sadness on the perceptive side. Through the test, we observed a psychodynamic defense mechanism in perceptive understanding performance.
By contrast, in normal-hearing children the emotion 'fear' is the most difficult to identify. Deaf children seem to be more receptive to recognition of visual emotions. Furthermore, deaf children present significant problem-solving and emotional recognition skills, possibly as a result of their hearing impairment.
Testing continuum descriptions of low-Mach-number shock structures
NASA Technical Reports Server (NTRS)
Pham-Van-diep, Gerald C.; Erwin, Daniel A.; Muntz, E. P.
1991-01-01
Numerical experiments have been performed on normal shock waves with Monte Carlo direct simulations (MCDS) to investigate the validity of continuum theories at very low Mach numbers. Results from the Navier-Stokes and the Burnett equations are compared to MCDS for both hard-sphere and Maxwell gases. It is found that the maximum-slope shock thicknesses are described equally well (within the MCDS computational scatter) by either of the continuum formulations for Mach numbers smaller than about 1.2. For Mach numbers greater than 1.2, the Burnett predictions are more accurate than the Navier-Stokes results. Temperature-density profile separations are best described by the Burnett equations for Mach numbers greater than about 1.3. At lower Mach numbers the MCDS scatter is too great to differentiate between the two continuum theories. For all Mach numbers above one, the shock shapes are more accurately described by the Burnett equations.
Inter-individual cognitive variability in children with Asperger's syndrome
Gonzalez-Gadea, Maria Luz; Tripicchio, Paula; Rattazzi, Alexia; Baez, Sandra; Marino, Julian; Roca, Maria; Manes, Facundo; Ibanez, Agustin
2014-01-01
Multiple studies have tried to establish the distinctive profile of individuals with Asperger's syndrome (AS). However, recent reports suggest that adults with AS feature heterogeneous cognitive profiles. The present study explores inter-individual variability in children with AS through group comparison and multiple case series analysis. All participants completed an extended battery including measures of fluid and crystallized intelligence, executive functions, theory of mind, and classical neuropsychological tests. Significant group differences were found in theory of mind and other domains related to global information processing. However, the AS group showed high inter-individual variability (both sub- and supra-normal performance) on most cognitive tasks. Furthermore, high fluid intelligence correlated with less general cognitive impairment, high cognitive flexibility, and speed of motor processing. In light of these findings, we propose that children with AS are characterized by a distinct, uneven pattern of cognitive strengths and weaknesses.
Huang-Pollock, Cynthia L; Nigg, Joel T; Carr, Thomas H
2005-11-01
Whether selective attention is a primary deficit in childhood Attention Deficit Hyperactivity Disorder (ADHD) remains in active debate. We used the perceptual load paradigm to examine both early and late selective attention in children with the Primarily Inattentive (ADHD-I) and Combined subtypes (ADHD-C) of ADHD. No evidence emerged for selective attention deficits in either of the subtypes, but sluggish cognitive tempo was associated with abnormal early selection. At least some, and possibly most, children with DSM-IV ADHD have normal selective attention. Results support the move away from theories of attention dysfunction as primary in ADHD-C. In ADHD-I, this was one of the first formal tests of posterior attention network dysfunction, and results did not support that theory. However, ADHD children with sluggish cognitive tempo (SCT) warrant more study for possible early selective attention deficits.
Shock loading predictions from application of indicial theory to shock-turbulence interactions
NASA Technical Reports Server (NTRS)
Keefe, Laurence R.; Nixon, David
1991-01-01
A sequence of steps that permits prediction of some of the characteristics of the pressure field beneath a fluctuating shock wave from knowledge of the oncoming turbulent boundary layer is presented. The theory first predicts the power spectrum and pdf of the position and velocity of the shock wave, which are then used to obtain the shock frequency distribution, and the pdf of the pressure field, as a function of position within the interaction region. To test the validity of the crucial assumption of linearity, the indicial response of a normal shock is calculated from numerical simulation. This indicial response, after being fit by a simple relaxation model, is used to predict the shock position and velocity spectra, along with the shock passage frequency distribution. The low frequency portion of the shock spectra, where most of the energy is concentrated, is satisfactorily predicted by this method.
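The crucial linearity assumption can be illustrated with a small sketch: if the shock's indicial response is fit by a simple first-order relaxation model (as the abstract describes), the shock behaves as a low-pass filter on the incoming turbulent forcing, which concentrates the position spectrum at low frequency. This is a generic illustration with an arbitrary time constant, not the authors' code.

```python
import math

def relaxation_gain_sq(f_hz, tau_s):
    """|H(f)|^2 for a first-order relaxation model h(t) = 1 - exp(-t/tau):
    response energy concentrates at low frequency, as in the measured shock spectra."""
    w = 2.0 * math.pi * f_hz
    return 1.0 / (1.0 + (w * tau_s) ** 2)

def shock_position_psd(forcing_psd, f_hz, tau_s):
    """Linear prediction: shock-position PSD = forcing PSD x relaxation gain."""
    return forcing_psd * relaxation_gain_sq(f_hz, tau_s)
```

The half-power point sits at f = 1/(2*pi*tau), so a slower relaxation (larger tau) pushes more of the shock motion energy toward low frequency.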
Hawks, Steven R; Madanat, Hala; Smith, Terisue; De La Cruz, Natalie
2008-01-01
In this exploratory study, the authors evaluated the impact of an elective college course on dieting levels, eating styles, and body image among college women. Participants were a convenience sample of 29 self-selected female students at a western university who were mostly white, normal-weight seniors with significant dieting experience. The authors used valid and reliable instruments to collect data both before and after testing. An instructor conducted the program in an undergraduate course that met twice weekly for 15 weeks. Theory-based lessons focused on resisting media pressure, modifying dietary restraint, eating in response to hunger (intrinsic eating), and achieving healthy body image. Dependent variables included intrinsic eating, dieting involvement, emotional eating, body image, and self-esteem. A comparison of pretest and posttest scores identified significant improvements for most measures. A theory-driven elective course implemented within a college setting may improve women's eating styles and body image.
NASTRAN flutter analysis of advanced turbopropellers
NASA Technical Reports Server (NTRS)
Elchuri, V.; Smith, G. C. C.
1982-01-01
An existing capability developed to conduct modal flutter analysis of tuned bladed-shrouded discs in NASTRAN was modified and applied to investigate the subsonic unstalled flutter characteristics of advanced turbopropellers. The modifications pertain to the inclusion of oscillatory modal aerodynamic loads of blades with large (backward and forward) variable sweep. The two dimensional subsonic cascade unsteady aerodynamic theory was applied in a strip theory manner with appropriate modifications for the sweep effects. Each strip is associated with a chord selected normal to any spanwise reference curve such as the blade leading edge. The stability of three operating conditions of a 10-bladed propeller is analyzed. Each of these operating conditions is iterated once to determine the flutter boundary. A 5-bladed propeller is also analyzed at one operating condition to investigate stability. Analytical results obtained are in very good agreement with those from wind tunnel tests.
Noar, Seth M; Mehrotra, Purnima
2011-03-01
Traditional theory testing commonly applies cross-sectional (and occasionally longitudinal) survey research to test health behavior theory. Since such correlational research cannot demonstrate causality, a number of researchers have called for the increased use of experimental methods for theory testing. We introduce the multi-methodological theory-testing (MMTT) framework for testing health behavior theory. The MMTT framework introduces a set of principles that broaden the perspective of how we view evidence for health behavior theory. It suggests that while correlational survey research designs represent one method of testing theory, the weaknesses of this approach demand that complementary approaches be applied. Such approaches include randomized lab and field experiments, mediation analysis of theory-based interventions, and meta-analysis. These alternative approaches to theory testing can demonstrate causality in a much more robust way than is possible with correlational survey research methods. Such approaches should thus be increasingly applied in order to more completely and rigorously test health behavior theory. Greater application of research derived from the MMTT may lead researchers to refine and modify theory and ultimately make theory more valuable to practitioners. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
The basis of musical consonance as revealed by congenital amusia
Cousineau, Marion; McDermott, Josh H.; Peretz, Isabelle
2012-01-01
Some combinations of musical notes sound pleasing and are termed “consonant,” but others sound unpleasant and are termed “dissonant.” The distinction between consonance and dissonance plays a central role in Western music, and its origins have posed one of the oldest and most debated problems in perception. In modern times, dissonance has been widely believed to be the product of “beating”: interference between frequency components in the cochlea that is more pronounced in dissonant than in consonant sounds. However, harmonic frequency relations, a higher-order sound attribute closely related to pitch perception, have also been proposed to account for consonance. To tease apart theories of musical consonance, we tested sound preferences in individuals with congenital amusia, a neurogenetic disorder characterized by abnormal pitch perception. We assessed amusics’ preferences for musical chords as well as for the isolated acoustic properties of beating and harmonicity. In contrast to control subjects, amusic listeners showed no preference for consonance, rating the pleasantness of consonant chords no higher than that of dissonant chords. Amusics also failed to exhibit the normally observed preference for harmonic over inharmonic tones, nor could they discriminate such tones from each other. Despite these abnormalities, amusics exhibited normal preferences and discrimination for stimuli with and without beating. This dissociation indicates that, contrary to classic theories, beating is unlikely to underlie consonance. Our results instead suggest the need to integrate harmonicity as a foundation of music preferences, and illustrate how amusia may be used to investigate normal auditory function. PMID:23150582
Krstacic, Goran; Krstacic, Antonija; Smalcelj, Anton; Milicic, Davor; Jembrek-Gostovic, Mirjana
2007-04-01
Dynamic analysis techniques may quantify abnormalities in heart rate variability (HRV) based on nonlinear and fractal analysis (chaos theory). The article emphasizes the clinical and prognostic significance of dynamic changes in short-time series applied to patients with coronary heart disease (CHD) during the exercise electrocardiogram (ECG) test. The subjects were included in the series after complete cardiovascular diagnostic workup. Series of R-R and ST-T intervals were obtained from digitally sampled exercise ECG data. The rescaled range analysis method determined the fractal dimension of the intervals. To quantify the long-range fractal correlation properties of heart rate variability, the detrended fluctuation analysis technique was used. Approximate entropy (ApEn) was applied to quantify the regularity and complexity of the time series, as well as the unpredictability of their fluctuations. It was found that the short-term fractal scaling exponent (alpha(1)) is significantly lower in patients with CHD (0.93 +/- 0.07 vs 1.09 +/- 0.04; P < 0.001). The patients with CHD had a higher fractal dimension in each exercise test program separately, as well as in the exercise program as a whole. ApEn was significantly lower in the CHD group in both R-R and ST-T ECG intervals (P < 0.001). The nonlinear dynamic methods could thus have clinical and prognostic applicability in short-time ECG series as well. Dynamic analysis based on chaos theory during the exercise ECG test points to multifractal time series in CHD patients, who lose normal fractal characteristics and regularity in HRV. Nonlinear analysis techniques may complement traditional ECG analysis.
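The short-term scaling exponent alpha(1) reported above is the slope of log F(n) versus log n in detrended fluctuation analysis over small window sizes. A minimal generic DFA sketch (first-order detrending, no overlap), not the study's code:

```python
import numpy as np

def dfa_alpha(x, scales):
    """DFA scaling exponent: slope of log F(n) vs log n.
    x: 1-D series (e.g. R-R intervals); scales: window sizes n."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())            # integrated (profile) series
    F = []
    for n in scales:
        n_seg = len(y) // n
        segs = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        ms = []
        for seg in segs:                   # detrend each window with a LS line
            coef = np.polyfit(t, seg, 1)
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))     # RMS fluctuation at scale n
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope
```

For uncorrelated (white) noise the exponent is near 0.5; healthy HRV typically shows alpha(1) near 1, and the abstract reports a drop toward 0.93 in CHD.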
Seth, Anil K.
2014-01-01
Normal perception involves experiencing objects within perceptual scenes as real, as existing in the world. This property of “perceptual presence” has motivated “sensorimotor theories” which understand perception to involve the mastery of sensorimotor contingencies. However, the mechanistic basis of sensorimotor contingencies and their mastery has remained unclear. Sensorimotor theory also struggles to explain instances of perception, such as synesthesia, that appear to lack perceptual presence and for which relevant sensorimotor contingencies are difficult to identify. On alternative “predictive processing” theories, perceptual content emerges from probabilistic inference on the external causes of sensory signals, however, this view has addressed neither the problem of perceptual presence nor synesthesia. Here, I describe a theory of predictive perception of sensorimotor contingencies which (1) accounts for perceptual presence in normal perception, as well as its absence in synesthesia, and (2) operationalizes the notion of sensorimotor contingencies and their mastery. The core idea is that generative models underlying perception incorporate explicitly counterfactual elements related to how sensory inputs would change on the basis of a broad repertoire of possible actions, even if those actions are not performed. These “counterfactually-rich” generative models encode sensorimotor contingencies related to repertoires of sensorimotor dependencies, with counterfactual richness determining the degree of perceptual presence associated with a stimulus. While the generative models underlying normal perception are typically counterfactually rich (reflecting a large repertoire of possible sensorimotor dependencies), those underlying synesthetic concurrents are hypothesized to be counterfactually poor. 
In addition to accounting for the phenomenology of synesthesia, the theory naturally accommodates phenomenological differences between a range of experiential states including dreaming, hallucination, and the like. It may also lead to a new view of the (in)determinacy of normal perception. PMID:24446823
Publication Bias in Meta-Analysis: Confidence Intervals for Rosenthal's Fail-Safe Number.
Fragkos, Konstantinos C; Tsagris, Michail; Frangos, Christos C
2014-01-01
The purpose of the present paper is to assess the efficacy of confidence intervals for Rosenthal's fail-safe number. Although Rosenthal's estimator is widely used by researchers, its statistical properties are largely unexplored. First of all, we developed statistical theory which allowed us to produce confidence intervals for Rosenthal's fail-safe number. This was done by distinguishing whether the number of studies analysed in a meta-analysis is fixed or random. Each case produces different variance estimators. For a given number of studies and a given distribution, we provided five variance estimators. Confidence intervals are examined with a normal approximation and a nonparametric bootstrap. The accuracy of the different confidence interval estimates was then tested by methods of simulation under different distributional assumptions. The half normal distribution variance estimator has the best probability coverage. Finally, we provide a table of lower confidence intervals for Rosenthal's estimator.
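The point estimator itself (not the paper's interval estimators) has a standard closed form under Stouffer's combination of study z-scores: N = (sum of z)^2 / z_alpha^2 - k. A sketch under that standard definition:

```python
def rosenthal_failsafe(z_values, z_alpha=1.645):
    """Rosenthal's fail-safe number: how many unpublished null (z = 0)
    studies would drag the combined one-tailed p-value above alpha.
    z_alpha = 1.645 corresponds to one-tailed alpha = 0.05."""
    k = len(z_values)
    s = sum(z_values)
    return max(0.0, s * s / (z_alpha ** 2) - k)
```

For example, ten studies each with z = 2.0 give a fail-safe number of about 138: nearly fourteen suppressed null studies per observed study would be needed to overturn significance.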
Predicting financial market crashes using ghost singularities.
Smug, Damian; Ashwin, Peter; Sornette, Didier
2018-01-01
We analyse the behaviour of a non-linear model of coupled stock and bond prices exhibiting periodically collapsing bubbles. By using the formalism of dynamical system theory, we explain what drives the bubbles and how foreshocks or aftershocks are generated. A dynamical phase space representation of that system coupled with standard multiplicative noise rationalises the log-periodic power law singularity pattern documented in many historical financial bubbles. The notion of 'ghosts of finite-time singularities' is introduced and used to estimate the end of an evolving bubble, using finite-time singularities of an approximate normal form near the bifurcation point. We test the forecasting skill of this method on different stochastic price realisations and compare with Monte Carlo simulations of the full system. Remarkably, the approximate normal form is significantly more precise and less biased. Moreover, the method of ghosts of singularities is less sensitive to the noise realisation, thus providing more robust forecasts.
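The log-periodic power law singularity pattern mentioned above has a standard parametric form for the expected log-price before a crash at critical time tc. A sketch of that functional form only (the parameter values in the test are illustrative, not fitted to any data):

```python
import math

def lppls_log_price(t, tc, m, omega, A, B, C, phi):
    """Log-periodic power law singularity: a power-law run-up toward the
    critical time tc (B < 0 for a rising bubble), decorated with
    log-periodic oscillations of angular log-frequency omega.
    Valid for t < tc."""
    dt = tc - t
    return A + B * dt ** m * (1.0 + C * math.cos(omega * math.log(dt) + phi))
```

Fitting tc from pre-crash prices is what turns this pattern into a forecast of the end of the bubble; the abstract's point is that the approximate normal form near the bifurcation does this more precisely and robustly.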
A linear induction motor with a coated conductor superconducting secondary
NASA Astrophysics Data System (ADS)
Chen, Xin; Zheng, Shijun; Li, Jing; Ma, Guang Tong; Yen, Fei
2018-07-01
A linear induction motor system composed of a high-Tc superconducting secondary with close-ended coils made of REBCO coated conductor wire was designed and tested experimentally. The measured thrust, normal force and power loss are presented and explained by combining the flux dynamics inside superconductors with existing linear drive theory. It is found that an inherent capacitive component associated with the flux motion of vortices in the Type-II superconductor reduces the impedance of the coils; as a result, the associated Lorentz forces are drastically increased. The resulting breakout thrust of the designed linear motor system was found to be extremely high (up to 4.7 kN/m2), while the associated normal forces were only a fraction of the thrust. Compared to their conventional counterparts, high-Tc superconducting secondaries appear to be more feasible for use in maglev propulsion and electromagnetic launchers.
Time- & Load-Dependence of Triboelectric Effect.
Pan, Shuaihang; Yin, Nian; Zhang, Zhinan
2018-02-06
Time- and load-dependent friction behavior has long been considered important, owing to its time-evolution and force-driven characteristics. However, its electronic counterpart, manifested mainly in the triboelectric effect, has rarely been analyzed from this point of view. In this paper, by experimenting with fcc-latticed aluminum and copper friction pairs, the mechanical and electronic behaviors of friction contacts are correlated through time and load analyses, and the underlying physical picture is provided. Most importantly, the difference in "response lag" between force and electricity is discussed; the extreme points of the coefficient of friction under increasing normal loads are observed and explained in terms of surface properties and dynamical behaviors (i.e., wear); and micro- and macro-scale theories linking tribo-electricity to normal load and wear (i.e., a physical explanation of the coupled electrical and mechanical phenomena) are developed and tested.
A physiologically-based model for simulation of color vision deficiency.
Machado, Gustavo M; Oliveira, Manuel M; Fernandes, Leandro A F
2009-01-01
Color vision deficiency (CVD) affects approximately 200 million people worldwide, compromising the ability of these individuals to effectively perform color and visualization-related tasks. This has a significant impact on their private and professional lives. We present a physiologically-based model for simulating color vision. Our model is based on the stage theory of human color vision and is derived from data reported in electrophysiological studies. It is the first model to consistently handle normal color vision, anomalous trichromacy, and dichromacy in a unified way. We have validated the proposed model through an experimental evaluation involving groups of color vision deficient individuals and normal color vision ones. Our model can provide insights and feedback on how to improve visualization experiences for individuals with CVD. It also provides a framework for testing hypotheses about some aspects of the retinal photoreceptors in color vision deficient individuals.
Applications of non-parametric statistics and analysis of variance on sample variances
NASA Technical Reports Server (NTRS)
Myers, R. H.
1981-01-01
Nonparametric methods that are available for NASA-type applications are discussed. An attempt is made here to survey what can be used, to offer recommendations as to when each method is applicable, and to compare the methods, where possible, with the usual normal-theory procedures available for the Gaussian analog. It is important here to point out the hypotheses being tested, the assumptions being made, and the limitations of the nonparametric procedures. The appropriateness of performing analysis of variance on sample variances is also discussed and studied. This procedure is followed in several NASA simulation projects, and on the surface it would appear to be a reasonably sound procedure. However, the difficulties involved center around the normality problem and the basic homogeneous-variance assumption that is made in usual analysis of variance problems. These difficulties are discussed and guidelines are given for using the methods.
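As one concrete example of the kind of nonparametric analog to normal-theory one-way ANOVA surveyed here (the report does not single out a specific procedure), the Kruskal-Wallis test replaces observations by ranks and so avoids the normality assumption. A minimal sketch (ties receive average ranks; the usual tie-correction factor is omitted):

```python
def kruskal_wallis_H(groups):
    """Kruskal-Wallis H statistic for a list of samples.
    Under H0 (identical distributions), H is approximately
    chi-square with len(groups) - 1 degrees of freedom."""
    pooled = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    N = len(pooled)
    ranks = [0.0] * N
    i = 0
    while i < N:                      # assign average ranks to tied runs
        j = i
        while j + 1 < N and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[k] = avg
        i = j + 1
    rank_sum = [0.0] * len(groups)    # rank sums per group
    for r, (_, gi) in zip(ranks, pooled):
        rank_sum[gi] += r
    return 12.0 / (N * (N + 1)) * sum(
        rank_sum[gi] ** 2 / len(g) for gi, g in enumerate(groups)
    ) - 3.0 * (N + 1)
```

Because only ranks enter the statistic, it is unaffected by monotone transformations of the data, which is exactly the robustness the normal-theory F test lacks.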
NASA Astrophysics Data System (ADS)
Collignon, Clément; Fauqué, Benoît; Cavanna, Antonella; Gennser, Ulf; Mailly, Dominique; Behnia, Kamran
2017-12-01
We present a study of the lower critical field, Hc1, of SrTi1-xNbxO3 as a function of carrier concentration with the aim of quantifying the superfluid density. At low carrier concentration (i.e., the underdoped side), the superfluid density and the carrier concentration in the normal state are equal within the experimental margin. A significant deviation between the two numbers starts at optimal doping and gradually increases with doping. The inverse of the penetration depth and the critical temperature follow parallel evolutions as in the case of cuprate superconductors. In the overdoped regime, the zero-temperature superfluid density becomes much lower than the normal-state carrier density before vanishing altogether. We show that the density mismatch and the clean-to-dirty crossover are concomitant. Our results imply that the discrepancy between normal and superconducting densities is expected whenever the superconducting gap becomes small enough to put the system in the dirty limit. A quantitative test of the dirty BCS theory is not straightforward, due to the multiplicity of the bands in superconducting strontium titanate.
Fracture mechanics analysis for various fiber/matrix interface loadings
NASA Technical Reports Server (NTRS)
Naik, Rajiv A.; Crews, John H., Jr.
1992-01-01
Fiber/matrix (F/M) cracking was analyzed to provide better understanding and guidance in developing F/M interface fracture toughness tests. Two configurations, corresponding to F/M cracking at a broken fiber and at the free edge, were investigated. The effects of mechanical loading, thermal cooldown, and friction were investigated. Each configuration was analyzed for two loadings: longitudinal and normal to the fiber. A nonlinear finite element analysis was performed to model friction and slip at the F/M interface. A new procedure for fitting a square-root singularity to calculated stresses was developed to determine stress intensity factors (K(sub I) and K(sub II)) for a bimaterial interface crack. For the case of F/M cracking at a broken fiber with longitudinal loading, crack tip conditions were strongly influenced by interface friction. As a result, an F/M interface toughness test based on this case was not recommended because nonlinear data analysis methods would be required. For the free edge crack configuration, both mechanical and thermal loading caused crack opening, thereby avoiding frictional effects. An F/M interface toughness test based on this configuration would provide data for K(sub I)/K(sub II) ratios of about 0.7 and 1.6 for fiber and radial normal loading, respectively. However, thermal effects must be accounted for in the data analysis.
NASA Astrophysics Data System (ADS)
Pu, Yang; Chen, Jun; Wang, Wubao
2014-02-01
The scattering coefficient, μs, the anisotropy factor, g, the scattering phase function, p(θ), and the angular scattering intensity distributions of human cancerous and normal prostate tissues were systematically investigated as functions of wavelength, scattering angle and scattering particle size using Mie theory and experimental parameters. Matlab-based codes implementing Mie theory for both spherical and cylindrical models were developed and applied to study light propagation and the key scattering properties of the prostate tissues. The optical and structural parameters of tissue, such as the index of refraction of cytoplasm, the size of nuclei, and the diameter of nucleoli for cancerous and normal human prostate tissues, obtained from previous biological, biomedical and bio-optic studies, were used for the Mie theory simulation and calculation. The wavelength dependence of the scattering coefficient and anisotropy factor was investigated in the wide spectral range from 300 nm to 1200 nm. The scattering particle size dependence of μs, g, and the scattering angular distributions was studied for cancerous and normal prostate tissues. The results show that cancerous prostate tissue, which contains larger scattering particles, contributes more to forward scattering than normal prostate tissue. In addition to the conventional simulation model that approximates the scattering particle as a sphere, the cylinder model, which is more suitable for fiber-like tissue frame components such as collagen and elastin, was used in a computation code to study the angular dependence of scattering in prostate tissues. To the best of our knowledge, this is the first study to deal with both spherical and cylindrical scattering particles in prostate tissues.
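The quantity that governs how forward-peaked Mie scattering becomes is the dimensionless size parameter x = 2*pi*r*n_medium/lambda: larger particles (such as enlarged cancer-cell nuclei) at a given wavelength give larger x and stronger forward scattering. A sketch of just this relation; the medium refractive index default is an assumed typical tissue value, not a parameter from the paper:

```python
import math

def size_parameter(radius_nm, wavelength_nm, n_medium=1.36):
    """Mie size parameter x = 2*pi*r*n_medium/lambda (both lengths in nm).
    Larger x means more forward-peaked scattering (higher anisotropy g)."""
    return 2.0 * math.pi * radius_nm * n_medium / wavelength_nm
```

This is only the entry point to a full Mie computation of μs, g and p(θ), but it already captures the qualitative finding above: at fixed wavelength, the larger scatterers in cancerous tissue push the scattering toward the forward direction.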
On the tensile strength of soil grains in Hertzian response
NASA Astrophysics Data System (ADS)
Nadimi, Sadegh; Fonseca, Joana
2017-06-01
The breakage initiation of soil grains is controlled by their tensile capacity. Despite the importance of tensile strength, it is often disregarded due to difficulties in measurement. This paper presents an experimental and numerical investigation of the effect of tensile strength on the Hertzian response of a single soil grain. Hertz theory is commonly used in numerical simulation to represent the contact constitutive behaviour of a purely elastic grain under normal loading. This normal force-displacement relationship arises from the stress distribution and concentration inside the grain; when the stress reaches the tensile capacity, a crack initiates. A series of numerical tests was conducted to determine the sensitivity of the Hertzian response to the tensile strength selected as input data. An elastic-damage constitutive model was employed for spherical grains in a combined finite-discrete element framework. The interpretation of results was enriched by considering previous theoretical work. In addition, systematic experimental tests were carried out on both spherical glass beads and grains of two different sands, i.e. Leighton Buzzard silica sand and a coarse carbonate sand from the Persian Gulf. The preliminary results suggest that lower tensile strength leads to a softer response under normal loading. The wider range of responses obtained for the carbonate sand is believed to be related to the large variety of grain shapes associated with the bioclastic origin of the constituent grains.
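The Hertzian normal force-displacement law referenced above, for the simplest case of an elastic sphere pressed against a rigid flat, is F = (4/3) E* sqrt(R) delta^(3/2). A sketch of that textbook relation (the material values in the test are illustrative, not the paper's calibration):

```python
import math

def hertz_force(delta, R, E, nu):
    """Hertz contact force for an elastic sphere (radius R, Young's
    modulus E, Poisson's ratio nu) indented by delta against a rigid
    flat: F = (4/3) * E_star * sqrt(R) * delta**1.5,
    with E_star = E / (1 - nu**2)."""
    e_star = E / (1.0 - nu ** 2)
    return (4.0 / 3.0) * e_star * math.sqrt(R) * delta ** 1.5
```

The 3/2-power stiffening is the signature of a purely elastic grain; the paper's point is that introducing a finite tensile strength (damage) softens this response once internal stresses reach the tensile capacity.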
1991-03-04
term that describes inextensional motion. The first equation represents the normal stress at the midsurface of the shell, which is equal to the...that the normal velocity at the midsurface of the shell is proportional to the normal derivative of the total pressure. The scattered pressure ps can
Experimental and theoretical sound transmission. [reduction of interior noise in aircraft
NASA Technical Reports Server (NTRS)
Roskam, J.; Muirhead, V. U.; Smith, H. W.; Durenberger, D. W.
1978-01-01
The capabilities of the Kansas University Flight Research Center for investigating panel sound transmission as a step toward the reduction of interior noise in general aviation aircraft were discussed. Data obtained on panels with holes, on honeycomb panels, and on various panel treatments at normal incidence were documented. The design of equipment for panel transmission loss tests at non-normal (slanted) sound incidence was described. A comprehensive theory-based prediction method was developed and shows good agreement with experimental observations in the stiffness-controlled region, the resonance-controlled region, and the mass-law region of panel vibration.
A product Pearson-type VII density distribution
NASA Astrophysics Data System (ADS)
Nadarajah, Saralees; Kotz, Samuel
2008-01-01
The Pearson-type VII distributions (containing the Student's t distributions) are becoming increasingly prominent and are being considered as competitors to the normal distribution. Motivated by real examples in decision sciences, Bayesian statistics, probability theory and physics, a new Pearson-type VII distribution is introduced by taking the product of two Pearson-type VII pdfs. Various structural properties of this distribution are derived, including its cdf, moments, mean deviation about the mean, mean deviation about the median, entropy, asymptotic distribution of the extreme order statistics, maximum likelihood estimates and the Fisher information matrix. Finally, an application to a Bayesian testing problem is illustrated.
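The building block of the construction is the Pearson-type VII density c * (1 + (x/a)^2)^(-m); the product of two such densities is again bell-shaped and needs only a new normalizing constant, which can be checked numerically. A sketch with illustrative parameter choices (not the paper's derivations):

```python
import math

def pearson7_pdf(x, m, a=1.0):
    """Pearson type VII density, m > 1/2.  With m = (nu + 1)/2 and
    a = sqrt(nu) this reduces to Student's t with nu degrees of freedom."""
    c = math.gamma(m) / (math.gamma(m - 0.5) * math.sqrt(math.pi) * a)
    return c * (1.0 + (x / a) ** 2) ** (-m)

def product_norm_const(m1, m2, a1=1.0, a2=1.0, lim=100.0, n=100000):
    """Normalizing constant of the product of two Pearson VII pdfs,
    by trapezoidal quadrature on [-lim, lim] (tails are negligible)."""
    h = 2.0 * lim / n
    total = 0.0
    for i in range(n + 1):
        x = -lim + i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * pearson7_pdf(x, m1, a1) * pearson7_pdf(x, m2, a2)
    return 1.0 / (total * h)
```

For m1 = m2 = 2 and a1 = a2 = 1 the product is proportional to (1 + x^2)^(-4), whose normalizing constant works out analytically to 4*pi/5, so the quadrature can be checked against a closed form.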
NASA Technical Reports Server (NTRS)
Budd, P. A.
1981-01-01
The secondary electron emission coefficient was measured for a charged polymer (FEP-Teflon) with normally and obliquely incident primary electrons. Theories of secondary emission are reviewed and the experimental data is compared to these theories. Results were obtained for angles of incidence up to 60 deg in normal electric fields of 1500 V/mm. Additional measurements in the range from 50 to 70 deg were made in regions where the normal and tangential fields were approximately equal. The initial input angles and measured output point of the electron beam could be analyzed with computer simulations in order to determine the field within the chamber. When the field is known, the trajectories can be calculated for impacting electrons having various energies and angles of incidence. There was close agreement between the experimental results and the commonly assumed theoretical model in the presence of normal electric fields for angles of incidence up to 60 deg. High angle results obtained in the presence of tangential electric fields did not agree with the theoretical models.
[Normal aging of frontal lobe functions].
Calso, Cristina; Besnard, Jérémy; Allain, Philippe
2016-03-01
Normal aging in individuals is often associated with morphological, metabolic and cognitive changes, which particularly concern the cerebral frontal regions. Starting from the "frontal lobe hypothesis of cognitive aging" (West, 1996), the present review is based on the neuroanatomical model developed by Stuss (2008), introducing four categories of frontal lobe functions: executive control, behavioural and emotional self-regulation and decision-making, energization and meta-cognitive functions. The selected studies each address changes in at least one of these functions. The results suggest a deterioration of several frontal cognitive abilities in normal aging: flexibility, inhibition, planning, verbal fluency, implicit decision-making, and second-order and affective theory of mind. Normal aging also seems to be characterised by a general reduction in processing speed observed during neuropsychological assessment (Salthouse, 1996). Nevertheless, many cognitive functions remain preserved, such as automatic or non-conscious inhibition, specific capacities of flexibility and first-order theory of mind. Therefore, normal aging does not seem to be associated with a global cognitive decline but rather with a selective change in some frontal systems, a conclusion that should be taken into account when designing care programs in normal aging.
Winter, D A
1989-12-01
The biomechanical (kinetic) analysis of human gait reveals the integrated and detailed motor patterns that are essential in pinpointing the abnormal patterns in pathological gait. In a similar manner, these motor patterns (moments, powers, and EMGs) can be used to identify synergies and to validate theories of CNS control. Based on kinetic and EMG patterns for a wide range of normal subjects and cadences, evidence is presented that both supports and negates the central pattern generator theory of locomotion. Adaptive motor patterns that are evident in peripheral gait pathologies reinforce a strong peripheral rather than a central control. Finally, a three-component subtask theory of human gait is presented and is supported by reference to the motor patterns seen in a normal gait. The identified subtasks are (a) support (against collapse during stance); (b) dynamic balance of the upper body, also during stance; and (c) feedforward control of the foot trajectory to achieve safe ground clearance and a gentle heel contact.
Vibration Control in Turbomachinery Using Active Magnetic Journal Bearings
NASA Technical Reports Server (NTRS)
Knight, Josiah D.
1996-01-01
The effective use of active magnetic bearings for vibration control in turbomachinery depends on an understanding of the forces available from a magnetic bearing actuator. The purpose of this project was to characterize the forces as functions of shaft position. Both numerical and experimental studies were done to determine the characteristics of the forces exerted on a stationary shaft by a magnetic bearing actuator. The numerical studies were based on finite element computations and included both linear and nonlinear magnetization functions. Measurements of the force versus position of a nonrotating shaft were made using two separate measurement rigs, one based on strain gage measurement of forces, the other based on deflections of a calibrated beam. The general trends of the measured principal forces agree with the predictions of the theory, while the magnitudes of the forces are somewhat smaller than those predicted. Other aspects of the theory are not confirmed by the measurements. The measured forces in the normal direction are larger than those predicted by theory when the rotor has a normal eccentricity. Over the ranges of position examined, the data indicate an approximately linear relationship between the normal eccentricity of the shaft and the ratio of normal to principal force. The constant of proportionality seems to be larger at lower currents, but for all cases examined its value is between 0.14 and 0.17. The nonlinear theory predicts the existence of normal forces, but has not predicted such a large constant of proportionality for the ratio. The type of coupling illustrated by these measurements would not tend to cause whirl, because the coupling coefficients have the same sign, unlike the case of a fluid film bearing, where the normal stiffness coefficients often have opposite signs. They might, however, tend to cause other self-excited behavior.
This possibility must be considered when designing magnetic bearings for flexible rotor applications, such as gas turbines and other turbomachinery.
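The reported linear relation between normal eccentricity and the ratio of normal to principal force amounts to a one-parameter fit through the origin. A minimal sketch, using hypothetical measurements chosen only to land in the reported 0.14-0.17 range:

```python
def proportionality_constant(eccentricity, force_ratio):
    """Least-squares slope through the origin for ratio ~= k * eccentricity."""
    numerator = sum(e * r for e, r in zip(eccentricity, force_ratio))
    denominator = sum(e * e for e in eccentricity)
    return numerator / denominator

# Hypothetical data (not from the report), consistent with the stated range:
eccentricities = [0.1, 0.2, 0.3, 0.4]
ratios = [0.015, 0.030, 0.045, 0.060]  # normal force / principal force
k = proportionality_constant(eccentricities, ratios)  # ~0.15
```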
2007-05-01
sufficient for explaining how theory of mind emerges in normally developing children. As confirmation of its plausibility, our theory explains the... autism. While there are a number of different substrate elements that we believe are operative during theory-of-mind computations, three elements in... Subject terms: PMESII, multiple representations, integrated reasoning, hybrid systems, social cognition, theory of mind.
Wang, Yimin; Bowman, Joel M
2013-10-21
We present a theory of mode-specific tunneling that makes use of the general tunneling path along the imaginary-frequency normal mode of the saddle point, Qim, and the associated relaxed potential, V(Qim) [Y. Wang and J. M. Bowman, J. Chem. Phys. 129, 121103 (2008)]. The novel aspect of the theory is the projection of the normal modes of a minimum onto the Qim path and the determination of turning points on V(Qim). From that projection, the change in tunneling upon mode excitation can be calculated. If the projection is zero, no enhancement of tunneling is predicted. In that case vibrationally adiabatic (VA) theory could apply. However, if the projection is large then VA theory is not applicable. The approach is applied to mode-specific tunneling in full-dimensional malonaldehyde, using an accurate full-dimensional potential energy surface. Results are in semi-quantitative agreement with experiment for modes that show large enhancement of the tunneling, relative to the ground state tunneling splitting. For the six out-of-plane modes, which have zero projection on the planar Qim path, VA theory does apply, and results from that theory agree qualitatively and even semi-quantitatively with experiment. We also verify the failure of simple VA theory for modes that show large enhancement of tunneling.
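The central quantity of the theory is the projection of each normal mode of the minimum onto the Qim path: zero projection means no predicted tunneling enhancement. A minimal sketch with toy 3-component vectors (hypothetical, not malonaldehyde coordinates), where an out-of-plane mode is orthogonal to the planar Qim direction:

```python
import math

def mode_projection(mode, qim):
    """Normalized overlap of a minimum's normal-mode displacement vector
    with the imaginary-frequency normal mode Qim of the saddle point."""
    dot = sum(m * q for m, q in zip(mode, qim))
    norm_mode = math.sqrt(sum(m * m for m in mode))
    norm_qim = math.sqrt(sum(q * q for q in qim))
    return dot / (norm_mode * norm_qim)

# Toy vectors: in-plane mode has finite overlap; out-of-plane mode has none,
# so vibrationally adiabatic theory would apply to it.
qim = [1.0, 1.0, 0.0]
in_plane = [1.0, 0.0, 0.0]
out_of_plane = [0.0, 0.0, 1.0]
```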
An analytical and experimental study of crack extension in center-notched composites
NASA Technical Reports Server (NTRS)
Beuth, Jack L., Jr.; Herakovich, Carl T.
1987-01-01
The normal stress ratio theory for crack extension in anisotropic materials is studied analytically and experimentally. The theory is applied within a microscopic-level analysis of a single center notch of arbitrary orientation in a unidirectional composite material. The bulk of the analytical work of this study applies an elasticity solution for an infinite plate with a center line crack to obtain critical stress and crack growth direction predictions. An elasticity solution for an infinite plate with a center elliptical flaw is also used to obtain qualitative predictions of the location of crack initiation on the border of a rounded notch tip. The analytical portion of the study includes the formulation of a new crack growth theory that includes local shear stress. Normal stress ratio theory predictions are obtained for notched unidirectional tensile coupons and unidirectional Iosipescu shear specimens. These predictions are subsequently compared to experimental results.
NASA Astrophysics Data System (ADS)
Auluck, S. K. H.
2017-11-01
This paper continues earlier discussion [S. K. H. Auluck, Phys. Plasmas 21, 102515 (2014)] concerning the formulation of conservation laws of mass, momentum, and energy in a local curvilinear coordinate system in the dense plasma focus. This formulation makes use of the revised Gratton-Vargas snowplow model [S. K. H. Auluck, Phys. Plasmas 20, 112501 (2013)], which provides an analytically defined imaginary surface in three dimensions which resembles the experimentally determined shape of the plasma. Unit vectors along the local tangent to this surface, along the azimuth, and along the local normal define a right-handed orthogonal local curvilinear coordinate system. The simplifying assumption that physical quantities have significant variation only along the normal enables writing laws of conservation of mass, momentum, and energy in the form of effectively one-dimensional hyperbolic conservation law equations using expressions for various differential operators derived for this coordinate system. This formulation demonstrates the highly non-trivial result that the axial magnetic field and toroidally streaming fast ions, experimentally observed by multiple prestigious laboratories, are natural consequences of conservation of mass, momentum, and energy in the curved geometry of the dense plasma focus current sheath. The present paper continues the discussion in the context of a 3-region shock structure similar to the one experimentally observed: an unperturbed region followed by a hydrodynamic shock containing some current followed by a magnetic piston. Rankine-Hugoniot conditions are derived, and expressions are obtained for the specific volumes and pressures using the mass-flux between the hydrodynamic shock and the magnetic piston and current fraction in the hydrodynamic shock as unknown parameters. 
For the special case of a magnetic piston that remains continuously in contact with the fluid being pushed, the theory gives closed form algebraic results for the fraction of current flowing in the hydrodynamic shock, specific volume, pressure, and fluid velocity of the hydrodynamic shock region, the tangential, normal, and azimuthal components of velocity in the magnetized plasma, the density of the magnetized plasma, the normal and tangential components of the magnetic field, and the tangential, normal, and azimuthal components of the electric field. This explains the occurrence of azimuthally streaming high energy deuterons experimentally observed by Frascati and Stuttgart. The expression derived for the azimuthal component of vector potential can serve as the basis for a proposed experimental test of the theory.
Theory and Experiment Analysis of Two-Dimensional Acousto-Optic Interaction.
1995-01-03
The universal coupled-wave equation of the two-dimensional acousto-optic effect has been deduced, and the solution for normal Raman-Nath acousto-optic diffraction...was derived from it. The theory was compared with the experimental results of a two-dimensional acousto-optic device consisting of two one-dimensional modulators. The experimental results agree with the theory.
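The paper's two-dimensional coupled-wave solution is not reproduced in the abstract, but the one-dimensional normal Raman-Nath result it generalizes is standard: the relative intensity of diffraction order n at phase depth v is J_n(v)^2. A minimal sketch, computing the Bessel function from its integral representation so no external libraries are needed:

```python
import math

def bessel_j(n, x, steps=2000):
    """J_n(x) via the integral (1/pi) * int_0^pi cos(n*t - x*sin(t)) dt
    (trapezoidal rule)."""
    h = math.pi / steps
    total = 0.0
    for k in range(steps + 1):
        t = k * h
        w = 0.5 if k in (0, steps) else 1.0
        total += w * math.cos(n * t - x * math.sin(t))
    return total * h / math.pi

def raman_nath_intensity(n, v):
    """Relative intensity of diffraction order n at Raman-Nath phase depth v."""
    return bessel_j(n, v) ** 2

# Energy conservation across orders: the intensities sum to 1.
total = sum(raman_nath_intensity(n, 1.0) for n in range(-6, 7))
```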
A Complete Multimode Equivalent-Circuit Theory for Electrical Design
Williams, Dylan F.; Hayden, Leonard A.; Marks, Roger B.
1997-01-01
This work presents a complete equivalent-circuit theory for lossy multimode transmission lines. Its voltages and currents are based on general linear combinations of standard normalized modal voltages and currents. The theory includes new expressions for transmission line impedance matrices, symmetry and lossless conditions, source representations, and the thermal noise of passive multiports. PMID:27805153
Therapeutic Treatment of Early Disturbances in the Mother-Child Interaction.
ERIC Educational Resources Information Center
Broden, Margareta Berg
A theory of the normal mother-infant relationship, based on Margaret Mahler's work, underlies a treatment program for disturbed mother-infant relationships. This theory includes the concept of symbiosis, which for the child is an undifferentiated condition, a fusion with the mother in which the two have a common outward border, thereby protecting…
ERIC Educational Resources Information Center
Yelboga, Atilla; Tavsancil, Ezel
2010-01-01
In this research, classical test theory and generalizability theory analyses were carried out on data obtained with a job performance scale for the years 2005 and 2006. The reliability coefficients estimated from the classical test theory and generalizability theory analyses were compared. In classical test theory, test-retest…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sweezy, Jeremy Ed
A photon next-event fluence estimator at a point has been implemented in the Monte Carlo Application Toolkit (MCATK). The next-event estimator provides an expected value estimator for the flux at a point due to all source and collision events. An advantage of the next-event estimator over track-length estimators, which are normally employed in MCATK, is that flux estimates can be made in locations that have no random-walk particle tracks. The next-event estimator allows users to calculate radiographs and estimate response for detectors outside of the modeled geometry. The next-event estimator is not yet accessible through the MCATK FlatAPI for C and Fortran. The next-event estimator in MCATK has been tested against MCNP6 using 5 suites of test problems. No issues were found in the MCATK implementation. One issue was found in the exclusion radius approximation in MCNP6. The theory, implementation, and testing are described in this document.
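MCATK's internals are not given in the abstract, but the next-event (point-detector) contribution it describes has a standard textbook form: each source or collision event contributes its statistical weight, attenuated along the line of flight and spread over the sphere at the detector distance. A minimal sketch assuming isotropic emission in a homogeneous medium:

```python
import math

def next_event_contribution(weight, distance, sigma_t):
    """Expected flux at a point detector from one isotropic source or
    collision event: phi = w * exp(-sigma_t * r) / (4 * pi * r^2),
    in a homogeneous medium with total cross section sigma_t."""
    return weight * math.exp(-sigma_t * distance) / (4.0 * math.pi * distance ** 2)

# Tally over a few hypothetical emission events (weight, distance in cm);
# the detector needs no random-walk tracks passing through it.
events = [(1.0, 5.0), (0.5, 7.0), (0.25, 3.0)]
flux = sum(next_event_contribution(w, r, sigma_t=0.1) for w, r in events)
```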
Flowing gas, non-nuclear experiments on the gas core reactor
NASA Technical Reports Server (NTRS)
Kunze, J. F.; Suckling, D. H.; Copper, C. G.
1972-01-01
Flow tests were conducted on models of the gas core (cavity) reactor. Variations in cavity wall and injection configurations were aimed at establishing flow patterns that give a maximum of the nuclear criticality eigenvalue. Correlation with the nuclear effect was made using multigroup diffusion theory normalized by previous benchmark critical experiments. Air was used to simulate the hydrogen propellant in the flow tests, and smoked air, argon, or freon to simulate the central nuclear fuel gas. All tests were run in the down-firing direction so that gravitational effects simulated the acceleration effect of a rocket. Results show that acceptable flow patterns with high volume fraction for the simulated nuclear fuel gas and high flow rate ratios of propellant to fuel can be obtained. Using a point injector for the fuel, good flow patterns are obtained by directing the outer gas at high velocity along the cavity wall, using louvered or oblique-angle-honeycomb injection schemes.
Lateral spread of sonic boom measurements from US Air Force boomfile flight tests
NASA Technical Reports Server (NTRS)
Downing, J. Micah
1992-01-01
A series of sonic boom flight tests was conducted by the US Air Force at Edwards AFB in 1987 with current supersonic DOD aircraft. These tests involved 43 flights by various aircraft at different Mach number and altitude combinations. The measured peak overpressures are compared to predicted values as a function of lateral distance. Some of the flights are combined into five groups because of the varying profiles and the limited number of sonic booms obtained during this study. The peak overpressures and the lateral distances are normalized with respect to the Carlson-method predicted centerline overpressures and lateral cutoff distances, respectively, to facilitate comparisons between sonic boom data from similar flight profiles. It is demonstrated that the data agree with sonic boom theory and previous studies and add to the existing sonic boom database by including sonic boom signatures, tracking, and weather data in a digital format.
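The normalization step described above is a simple rescaling of each measurement by the Carlson-method predictions for its flight profile. A minimal sketch with hypothetical numbers (the actual predicted values are not given in the abstract):

```python
def normalize_boom_data(records, p_centerline, d_cutoff):
    """Scale (peak overpressure, lateral distance) pairs by the predicted
    centerline overpressure and lateral cutoff distance, so data from
    different flight profiles can be overlaid."""
    return [(dp / p_centerline, d / d_cutoff) for dp, d in records]

# Hypothetical measurements: 2.0 psf on-track, 1.0 psf at 9 km off-track,
# with a predicted 2.0 psf centerline overpressure and 18 km cutoff distance.
normalized = normalize_boom_data([(2.0, 0.0), (1.0, 9.0)],
                                 p_centerline=2.0, d_cutoff=18.0)
```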
Geometrically Nonlinear Transient Analysis of Laminated Composite Plates.
1982-03-01
theory (CPT), in which normals to the midsurface before deformation are assumed to remain straight and normal to the midsurface after deformation (i.e. ...). ...the plate are negligible when compared to the inplane stresses, and normals to the plate midsurface before deformation remain straight but not necessarily normal to the midsurface after deformation. Equations of motion: the plate under consideration is composed of a finite number of orthotropic
Boeschen Hospers, J Mirjam; Smits, Niels; Smits, Cas; Stam, Mariska; Terwee, Caroline B; Kramer, Sophia E
2016-04-01
We reevaluated the psychometric properties of the Amsterdam Inventory for Auditory Disability and Handicap (AIADH; Kramer, Kapteyn, Festen, & Tobi, 1995) using item response theory. Item response theory describes item functioning along an ability continuum. Cross-sectional data from 2,352 adults with and without hearing impairment, ages 18-70 years, were analyzed. They completed the AIADH in the web-based prospective cohort study "Netherlands Longitudinal Study on Hearing." A graded response model was fitted to the AIADH data. Category response curves, item information curves, and the standard error as a function of self-reported hearing ability were plotted. The graded response model showed a good fit. Item information curves were most reliable for adults who reported having hearing disability and less reliable for adults with normal hearing. The standard error plot showed that self-reported hearing ability is most reliably measured for adults reporting mild up to moderate hearing disability. This is one of the few item response theory studies on audiological self-reports. All AIADH items could be hierarchically placed on the self-reported hearing ability continuum, meaning they measure the same construct. This provides a promising basis for developing a clinically useful computerized adaptive test, where item selection adapts to the hearing ability of individuals, resulting in efficient assessment of hearing disability.
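The graded response model fitted to the AIADH is Samejima's standard IRT model: each item has ordered threshold parameters, the boundary curves are logistic in ability, and the category response curves are successive differences of the boundaries. A minimal sketch with hypothetical item parameters (not the AIADH estimates):

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Samejima graded response model: boundary curves
    P*_k(theta) = 1 / (1 + exp(-a * (theta - b_k))) for ordered b_k;
    category probabilities are successive differences of the boundaries."""
    boundaries = ([1.0]
                  + [1.0 / (1.0 + math.exp(-a * (theta - b))) for b in thresholds]
                  + [0.0])
    return [boundaries[k] - boundaries[k + 1] for k in range(len(boundaries) - 1)]

# One hypothetical 4-category item: discrimination a, three ordered thresholds.
probs = grm_category_probs(theta=0.0, a=1.5, thresholds=[-1.0, 0.0, 1.0])
```

Plotting these category response curves across theta, and summing item information, reproduces the kind of reliability-by-ability analysis the study reports.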
Graph theory network function in Parkinson's disease assessed with electroencephalography.
Utianski, Rene L; Caviness, John N; van Straaten, Elisabeth C W; Beach, Thomas G; Dugger, Brittany N; Shill, Holly A; Driver-Dunckley, Erika D; Sabbagh, Marwan N; Mehta, Shyamal; Adler, Charles H; Hentz, Joseph G
2016-05-01
To determine what differences exist in graph theory network measures derived from electroencephalography (EEG) between Parkinson's disease (PD) patients who are cognitively normal (PD-CN) and matched healthy controls, and between PD-CN and PD dementia (PD-D). EEG recordings were analyzed via graph theory network analysis to quantify changes in global efficiency and local integration. This included minimal spanning tree analysis. T-tests and correlations were used to assess differences between groups and assess the relationship with cognitive performance. Network measures showed increased local integration across all frequency bands between control and PD-CN; in contrast, decreased local integration occurred in PD-D when compared to PD-CN in the alpha1 frequency band. Differences found in PD with mild cognitive impairment (PD-MCI) mirrored PD-D. Correlations were found between network measures and assessments of global cognitive performance in PD. Our results reveal distinct patterns of band and network measure type alteration and breakdown for PD, as well as with cognitive decline in PD. These patterns suggest specific ways that interaction between cortical areas becomes abnormal and contributes to PD symptoms at various stages. Graph theory analysis by EEG suggests that network alteration and breakdown are robust attributes of PD cortical dysfunction pathophysiology. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
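Global efficiency, one of the graph measures used above, has a standard definition: the mean inverse shortest-path length over all node pairs. A minimal sketch on an unweighted graph (the study's actual pipeline works on weighted EEG connectivity matrices, which is not reproduced here):

```python
from collections import deque

def global_efficiency(adj):
    """Mean inverse shortest-path length over ordered node pairs of an
    unweighted graph given as {node: set_of_neighbours}."""
    nodes = list(adj)
    n = len(nodes)
    total = 0.0
    for src in nodes:
        dist = {src: 0}
        queue = deque([src])
        while queue:  # breadth-first search from src
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(1.0 / d for node, d in dist.items() if node != src)
    return total / (n * (n - 1))

# A 3-node path graph 0-1-2: four pairs at distance 1, two at distance 2.
path_graph = {0: {1}, 1: {0, 2}, 2: {1}}
```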
NASA Astrophysics Data System (ADS)
Laqua, Henryk; Kussmann, Jörg; Ochsenfeld, Christian
2018-03-01
The correct description of multi-reference electronic ground states within Kohn-Sham density functional theory (DFT) requires an ensemble-state representation, employing fractionally occupied orbitals. However, the use of fractional orbital occupation leads to non-normalized exact-exchange holes, resulting in large fractional-spin errors for conventional approximative density functionals. In this communication, we present a simple approach to directly include the exact-exchange-hole normalization into DFT. Compared to conventional functionals, our model strongly improves the description for multi-reference systems, while preserving the accuracy in the single-reference case. We analyze the performance of our proposed method at the example of spin-averaged atoms and spin-restricted bond dissociation energy surfaces.
Normal stress differences and beyond-Navier-Stokes hydrodynamics
NASA Astrophysics Data System (ADS)
Alam, Meheboob; Saha, Saikat
2017-06-01
A recently proposed beyond-Navier-Stokes order hydrodynamic theory for dry granular fluids is revisited by focusing on the behaviour of the stress tensor and the scaling of related transport coefficients in the dense limit. For the homogeneous shear flow, it is shown that the eigen-directions of the second-moment tensor and those of the shear tensor become co-axial, thus driving the first normal stress difference (N1) to zero in the same limit. In contrast, the origin of the second normal stress difference (N2) is tied to the `excess' temperature along the mean-vorticity direction and the imposed shear field, respectively, in the dilute and dense flows. Scaling relations for the transport coefficients are suggested based on the present theory.
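The normal stress differences in the title have simple definitions in terms of the diagonal of the stress tensor. A minimal sketch with a hypothetical dense-limit tensor whose shear-plane eigendirections are coaxial (so N1 vanishes) but with excess temperature along the vorticity direction (so N2 does not):

```python
def normal_stress_differences(sigma):
    """First and second normal stress differences of a 3x3 stress tensor,
    with x the flow, y the gradient, and z the mean-vorticity direction:
    N1 = sigma_xx - sigma_yy, N2 = sigma_yy - sigma_zz."""
    n1 = sigma[0][0] - sigma[1][1]
    n2 = sigma[1][1] - sigma[2][2]
    return n1, n2

# Hypothetical stress tensor: equal xx and yy components (N1 = 0),
# reduced zz component (N2 > 0).
sigma = [[2.0, 0.5, 0.0],
         [0.5, 2.0, 0.0],
         [0.0, 0.0, 1.6]]
n1, n2 = normal_stress_differences(sigma)  # N1 = 0, N2 ~ 0.4
```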
Time-dependent reliability analysis of ceramic engine components
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.
1993-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.
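Two of the ingredients named above, the two-parameter Weibull cumulative distribution and the principle of independent action (PIA), can be sketched directly; this is a generic illustration, not CARES/LIFE's implementation, and the parameters below are hypothetical:

```python
import math

def weibull_failure_probability(stress, sigma0, m):
    """Two-parameter Weibull CDF for strength variability:
    Pf = 1 - exp(-(sigma / sigma0)**m), with scale sigma0 and modulus m."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

def pia_failure_probability(principal_stresses, sigma0, m):
    """Principle of independent action: each tensile principal stress acts
    as an independent failure mode, so survival probabilities multiply."""
    survival = 1.0
    for s in principal_stresses:
        if s > 0.0:  # compressive principal stresses do not contribute
            survival *= 1.0 - weibull_failure_probability(s, sigma0, m)
    return 1.0 - survival

# Hypothetical multiaxial state: one tensile and one compressive principal stress.
pf = pia_failure_probability([0.8, -0.2], sigma0=1.0, m=10.0)
```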
Gerling, J; de Paz, H; Schroth, V; Bach, M; Kommerell, G
2000-06-01
The theory of the "Measuring and Correction Methods of H.-J. Haase" (MCH) states that a small misalignment of one eye, called fixation disparity, indicates a difficulty in overcoming a "vergence position of rest" that is different from ortho position. According to the theory, this difficulty can cause asthenopic complaints, such as headaches, and these complaints can be relieved by prisms. The theory further claims that fixation disparity can be ascertained by a series of tests which depend on the subject's perception. The tests most decisive for the diagnosis of a so-called fixation disparity type 2 consist of stereo displays. The magnitude of the prism that allows the subject to see the test configurations in symmetry is thought to be the one that corrects the "vergence position of rest". Nine subjects with healthy eyes in whom a "fixation disparity type 2" had been diagnosed were selected for the study. Misalignment of the eyes was determined according to the principle of the unilateral cover test. Targets identical for both eyes were presented on the screen of the Polatest E. Then, the target was deleted for one eye and the ensuing position change of the other eye was measured, using the search coil technique. This test was performed both with and without the MCH prism. In all 9 subjects the misalignment was less than 10 minutes of arc, i.e. in the range of normal fixation instability. Averaged across the 9 subjects, the deviation of the eye (misaligned according to MCH) was 0.79 +/- 3.45 minutes of arc in the direction opposite to that predicted by the MCH, a value not significantly different from zero. The MCH prism elicited a fusional vergence movement whose magnitude corresponded to the magnitude of the MCH prism. Ascertaining fixation disparity with the MCH is unreliable. Accordingly, it appears dubious to correct a "vergence position of rest" on the basis of the MCH.
aCLIMAX 4.0.1, The new version of the software for analyzing and interpreting INS spectra
NASA Astrophysics Data System (ADS)
Ramirez-Cuesta, A. J.
2004-03-01
In Inelastic Neutron Scattering Spectroscopy, the neutron scattering intensity is plotted versus neutron energy loss, giving a spectrum that looks like an infrared or a Raman spectrum. Unlike IR or Raman, INS has no selection rules, i.e., all transitions are in principle observable. This characteristic makes INS a test bed for Density Functional Theory calculations of vibrational modes. aCLIMAX is the first user-friendly program, within the Windows environment, that uses the output of normal modes to generate the calculated INS spectrum of the model molecule, making it much easier to establish a connection between theory and experiment.
Program summary
Title of program: aCLIMAX 4.0.1
Catalogue identifier: ADSW
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADSW
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Operating systems: Windows 95 onwards, except Windows ME where it does not work
Programming language used: Visual Basic
Memory requirements: 64 MB
No. of processors: 1
Has the code been parallelized: No
No. of bytes in distributed program, including test data, etc.: 2 432 775
No. of lines in distributed program, including test data, etc.: 17 998
Distribution format: tar gzip file
Nature of physical problem: Calculation of the Inelastic Neutron Scattering spectra from DFT calculations of the vibrational density of states for molecules.
Method of solution: INS spectral intensity calculated from normal modes analysis. Isolated molecule approximation.
Typical time of running: From a few seconds to a few minutes depending on the size of the molecule.
Unusual features of the program: Special care has to be taken on computers whose regional options differ from those of English-speaking countries: the decimal separator has to be set to "." (dot) instead of the "," (comma) that most countries use.
Normal Psychosexual Development
ERIC Educational Resources Information Center
Rutter, Michael
1971-01-01
Normal sexual development is reviewed with respect to physical maturation, sexual interests, sex drive, psychosexual competence and maturity, gender role, object choice, children's concepts of sexual differences, sex role preference and standards, and psychosexual stages. Biologic, psychoanalytic and psychosocial theories are briefly considered.…
In vivo diagnosis of skin cancer using polarized and multiple scattered light spectroscopy
NASA Astrophysics Data System (ADS)
Bartlett, Matthew Allen
This thesis research presents the development of a non-invasive diagnostic technique for distinguishing between skin cancer, moles, and normal skin using polarized and multiple scattered light spectroscopy. Polarized light incident on the skin is single scattered by the epidermal layer and multiple scattered by the dermal layer. The epidermal light maintains its initial polarization while the light from the dermal layer becomes randomized and multiple scattered. Mie theory was used to model the epidermal light as the scattering from the intercellular organelles. The dermal signal was modeled as the diffusion of light through a localized semi-homogeneous volume. These models were confirmed using skin phantom experiments, studied with in vitro cell cultures, and applied to human skin for in vivo testing. A CCD-based spectroscopy system was developed to perform all these experiments. The probe and the theory were tested on skin phantoms of latex spheres on top of a solid phantom. We next extended our phantom study to include in vitro cells on top of the solid phantom. Optical fluorescent microscope images revealed at least four distinct scatterers including mitochondria, nucleoli, nuclei, and cell membranes. Single scattering measurements on the mammalian cells consistently produced PSDs in the size range of the mitochondria. The clinical portion of the study consisted of in vivo measurements on cancer, mole, and normal skin spots. The clinical study combined the single scattering model from the phantom and in vitro cell studies with the diffusion model for multiple scattered light. When parameters from both layers were combined, we found that a sensitivity of 100% and 77% can be obtained for detecting cancers and moles, respectively, given the number of lesions examined.
Impaired face detection may explain some but not all cases of developmental prosopagnosia.
Dalrymple, Kirsten A; Duchaine, Brad
2016-05-01
Developmental prosopagnosia (DP) is defined by severe face recognition difficulties due to the failure to develop the visual mechanisms for processing faces. The two-process theory of face recognition (Morton & Johnson, 1991) implies that DP could result from a failure of an innate face detection system; this failure could prevent an individual from then tuning higher-level processes for face recognition (Johnson, 2005). Work with adults indicates that some individuals with DP have normal face detection whereas others are impaired. However, face detection has not been addressed in children with DP, even though their results may be especially informative because they have had less opportunity to develop strategies that could mask detection deficits. We tested the face detection abilities of seven children with DP. Four were impaired at face detection to some degree (i.e. abnormally slow, or failed to find faces) while the remaining three children had normal face detection. Hence, the cases with impaired detection are consistent with the two-process account suggesting that DP could result from a failure of face detection. However, the cases with normal detection implicate a higher-level origin. The dissociation between normal face detection and impaired identity perception also indicates that these abilities depend on different neurocognitive processes. © 2015 John Wiley & Sons Ltd.
Multivariate meta-analysis: a robust approach based on the theory of U-statistic.
Ma, Yan; Mazumdar, Madhu
2011-10-30
Meta-analysis is the methodology for combining findings from similar research studies asking the same question. When the question of interest involves multiple outcomes, multivariate meta-analysis is used to synthesize the outcomes simultaneously taking into account the correlation between the outcomes. Likelihood-based approaches, in particular restricted maximum likelihood (REML) method, are commonly utilized in this context. REML assumes a multivariate normal distribution for the random-effects model. This assumption is difficult to verify, especially for meta-analysis with small number of component studies. The use of REML also requires iterative estimation between parameters, needing moderately high computation time, especially when the dimension of outcomes is large. A multivariate method of moments (MMM) is available and is shown to perform equally well to REML. However, there is a lack of information on the performance of these two methods when the true data distribution is far from normality. In this paper, we propose a new nonparametric and non-iterative method for multivariate meta-analysis on the basis of the theory of U-statistic and compare the properties of these three procedures under both normal and skewed data through simulation studies. It is shown that the effect on estimates from REML because of non-normal data distribution is marginal and that the estimates from MMM and U-statistic-based approaches are very similar. Therefore, we conclude that for performing multivariate meta-analysis, the U-statistic estimation procedure is a viable alternative to REML and MMM. Easy implementation of all three methods are illustrated by their application to data from two published meta-analysis from the fields of hip fracture and periodontal disease. We discuss ideas for future research based on U-statistic for testing significance of between-study heterogeneity and for extending the work to meta-regression setting. Copyright © 2011 John Wiley & Sons, Ltd.
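For context, the baseline that both REML and the proposed U-statistic approach extend is simple inverse-variance pooling. The sketch below is that generic fixed-effect baseline only, not the paper's U-statistic estimator, and the study numbers are hypothetical:

```python
def inverse_variance_pool(estimates, variances):
    """Fixed-effect pooled estimate: weight each study by 1/variance;
    the pooled variance is the reciprocal of the total weight."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, estimates)) / sum(weights)
    return pooled, 1.0 / sum(weights)

# Two hypothetical studies with equal precision: the pooled estimate is
# their mean and the pooled variance is halved.
pooled, var = inverse_variance_pool([1.0, 3.0], [1.0, 1.0])
```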
Failure Study of Composite Materials by the Yeh-Stratton Criterion
NASA Technical Reports Server (NTRS)
Yeh, Hsien-Yang; Richards, W. Lance
1997-01-01
The newly developed Yeh-Stratton (Y-S) strength criterion was used to study the failure of composite materials with central holes and normal cracks. Evaluating the interaction parameters of the Y-S failure theory requires several biaxial loading tests, and the inhomogeneous and anisotropic nature of composite materials further complicates the biaxial testing problem. To avoid the difficulties of performing many biaxial tests while still accounting for the effects of the interaction term in the Y-S criterion, a simple modification of the Y-S criterion was developed. The preliminary predictions by the modified Y-S criterion were relatively conservative compared to the test data, so the modified Y-S criterion could be used as a design tool. To further understand the composite failure problem, an investigation of the damage zone in front of the crack tip, coupled with the Y-S criterion, is imperative.
On the reversibility of the Meissner effect and the angular momentum puzzle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirsch, J.E., E-mail: jhirsch@ucsd.edu
It is generally believed that the laws of thermodynamics govern superconductivity as an equilibrium state of matter, and hence that the normal-superconductor transition in a magnetic field is reversible under ideal conditions. Because eddy currents are generated during the transition as the magnetic flux changes, the transition has to proceed infinitely slowly to generate no entropy. Experiments showed that to a high degree of accuracy no entropy was generated in these transitions. However, in this paper we point out that for the length of times over which these experiments extended, a much higher degree of irreversibility due to decay ofmore » eddy currents should have been detected than was actually observed. We also point out that within the conventional theory of superconductivity no explanation exists for why no Joule heat is generated in the superconductor to normal transition when the supercurrent stops. In addition we point out that within the conventional theory of superconductivity no mechanism exists for the transfer of momentum between the supercurrent and the body as a whole, which is necessary to ensure that the transition in the presence of a magnetic field respects momentum conservation. We propose a solution to all these questions based on the alternative theory of hole superconductivity. The theory proposes that in the normal-superconductor transition there is a flow and backflow of charge in direction perpendicular to the phase boundary when the phase boundary moves. We show that this flow and backflow explains the absence of Joule heat generated by Faraday eddy currents, the absence of Joule heat generated in the process of the supercurrent stopping, and the reversible transfer of momentum between the supercurrent and the body, provided the current carriers in the normal state are holes. - Highlights: • The normal-superconductor phase transition is reversible. • Within the conventional theory, Foucault currents give rise to irreversibility. 
• To suppress Foucault currents, charge has to flow in direction perpendicular to the phase boundary. • The charge carriers have to be holes. • This also solves the angular momentum puzzle associated with the Meissner effect.
Lai, Anita; Haligua, Alexis; Dylan Bould, M; Everett, Tobias; Gale, Mark; Pigford, Ashlee-Ann; Boet, Sylvain
2016-08-01
Simulation training has been shown to be an effective way to teach crisis resource management (CRM) skills. Deliberate practice theory states that learners need to actively practice so that learning is effective. However, many residency programs have limited opportunities for learners to be "active" participants in simulation exercises. This study compares the effectiveness of learning CRM skills when being an active participant versus being an observer participant in simulation followed by a debriefing. Participants were randomized to two groups: active or observer. Active participants managed a simulated crisis scenario (pre-test) while paired observer participants viewed the scenario via video transmission. Then, a trained instructor debriefed participants on CRM principles. On the same day, each participant individually managed another simulated crisis scenario (post-test) and completed a post-test questionnaire. Two independent, blinded raters evaluated all videos using the Ottawa Global Rating Scale (GRS). Thirty-nine residents were included in the analysis. Normally distributed data were analyzed using paired and unpaired t-tests. Inter-rater reliability was 0.64. Active participants significantly improved from pre-test to post-test (P=0.015). There was no significant difference between the post-test performance of active participants compared to observer participants (P=0.12). We found that learning CRM principles was not superior when learners were active participants compared to being observers followed by debriefing. These findings challenge the deliberate practice theory claiming that learning requires active practice. Assigning residents as observers in simulation training and involving them in debriefing is still beneficial. Copyright © 2016 Société française d'anesthésie et de réanimation (Sfar). Published by Elsevier Masson SAS. All rights reserved.
NASA Technical Reports Server (NTRS)
Tessler, A.; Annett, M. S.; Gendron, G.
2001-01-01
A {1,2}-order theory for laminated composite and sandwich plates is extended to include thermoelastic effects. The theory incorporates all three-dimensional strains and stresses. Mixed-field assumptions are introduced which include linear in-plane displacements, parabolic transverse displacement and shear strains, and a cubic distribution of the transverse normal stress. Least squares strain compatibility conditions and exact traction boundary conditions are enforced to yield higher polynomial degree distributions for the transverse shear strains and transverse normal stress through the plate thickness. The principle of virtual work is used to derive a 10th-order system of equilibrium equations and associated Poisson boundary conditions. The predictive capability of the theory is demonstrated using a closed-form analytic solution for a simply-supported rectangular plate subjected to a linearly varying temperature field across the thickness. Several thin and moderately thick laminated composite and sandwich plates are analyzed. Numerical comparisons are made with corresponding solutions of the first-order shear deformation theory and three-dimensional elasticity theory. These results, which closely approximate the three-dimensional elasticity solutions, demonstrate that through-the-thickness deformations can be significant even in relatively thin, and especially in thick, composite and sandwich laminates under severe thermal gradients. The {1,2}-order kinematic assumptions ensure an overall accurate theory that is in general superior and, in some cases, equivalent to the first-order theory.
Audiometric Predictions Using SFOAE and Middle-Ear Measurements
Ellison, John C.; Keefe, Douglas H.
2006-01-01
Objective The goals of the study are to determine how well stimulus-frequency otoacoustic emissions (SFOAEs) identify hearing loss, classify hearing loss as mild or moderate-severe, and correlate with pure-tone thresholds in a population of adults with normal middle-ear function. Other goals are to determine if middle-ear function as assessed by wideband acoustic transfer function (ATF) measurements in the ear canal accounts for the variability in normal thresholds, and if the inclusion of ATFs improves the ability of SFOAEs to identify hearing loss and predict pure-tone thresholds. Design The total suppressed SFOAE signal and its corresponding noise were recorded in 85 ears (22 normal ears and 63 ears with sensorineural hearing loss) at octave frequencies from 0.5–8 kHz using a nonlinear residual method. SFOAEs were recorded a second time in three impaired ears to assess repeatability. Ambient-pressure ATFs were obtained in all but one of these 85 ears, and were also obtained from an additional 31 normal-hearing subjects in whom SFOAE data were not obtained. Pure-tone air- and bone-conduction thresholds and 226-Hz tympanograms were obtained on all subjects. Normal tympanometry and the absence of air-bone gaps were used to screen subjects for normal middle-ear function. Clinical decision theory was used to assess the performance of SFOAE and ATF predictors in classifying ears as normal or impaired, and linear regression analysis was used to test the ability of SFOAE and ATF variables to predict the air-conduction audiogram. Results The ability of SFOAEs to classify ears as normal or hearing impaired was significant at all test frequencies. The ability of SFOAEs to classify impaired ears as either mild or moderate-severe was significant at test frequencies from 0.5 to 4 kHz. SFOAEs were present in cases of severe hearing loss. SFOAEs were also significantly correlated with air-conduction thresholds from 0.5 to 8 kHz.
The best performance occurred using the SFOAE signal-to-noise ratio (S/N) as the predictor, and the overall best performance was at 2 kHz. The SFOAE S/N measures were repeatable to within 3.5 dB in impaired ears. The ATF measures explained up to 25% of the variance in the normal audiogram; however, ATF measures did not improve SFOAE predictions of hearing loss except at 4 kHz. Conclusions In common with other OAE types, SFOAEs are capable of identifying the presence of hearing loss. In particular, SFOAEs performed better than distortion-product and click-evoked OAEs in predicting auditory status at 0.5 kHz; SFOAE performance was similar to that of other OAE types at higher frequencies except for a slight performance reduction at 4 kHz. Because SFOAEs were detected in ears with mild to severe cases of hearing loss, they may also provide an estimate of the classification of hearing loss. Although SFOAEs were significantly correlated with hearing threshold, they do not appear to have clinical utility in predicting a specific behavioral threshold. Information on middle-ear status as assessed by ATF measures offered minimal improvement in SFOAE predictions of auditory status in a population of normal and impaired ears with normal middle-ear function. However, ATF variables did explain a significant fraction of the variability in the audiograms of normal ears, suggesting that audiometric thresholds in normal ears are partially constrained by middle-ear function as assessed by ATF tests. PMID:16230898
Relativistic scattered-wave theory. II: Normalization and symmetrization of Dirac wavefunctions
NASA Technical Reports Server (NTRS)
Yang, C. Y.
1978-01-01
Formalisms for normalization and symmetrization of one-electron Dirac scattered-wave wavefunctions are presented. The normalization integral consists of one-dimensional radial integrals for the spherical regions and an analytic expression for the intersphere region. Symmetrization drastically reduces the size of the secular matrix to be solved. Examples for planar Pb2Se2 and tetrahedral Pd4 are discussed.
Davis, Joe M
2011-10-28
General equations are derived for the distribution of minimum resolution between two chromatographic peaks, when peak heights in a multi-component chromatogram follow a continuous statistical distribution. The derivation draws on published theory by relating the area under the distribution of minimum resolution to the area under the distribution of the ratio of peak heights, which in turn is derived from the peak-height distribution. Two procedures are proposed for the equations' numerical solution. The procedures are applied to the log-normal distribution, which recently was reported to describe the distribution of component concentrations in three complex natural mixtures. For published statistical parameters of these mixtures, the distribution of minimum resolution is similar to that for the commonly assumed exponential distribution of peak heights used in statistical-overlap theory. However, these two distributions of minimum resolution can differ markedly, depending on the scale parameter of the log-normal distribution. Theory for the computation of the distribution of minimum resolution is extended to other cases of interest. With the log-normal distribution of peak heights as an example, the distribution of minimum resolution is computed when small peaks are lost due to noise or detection limits, and when the height of at least one peak is less than an upper limit. The distribution of minimum resolution shifts slightly to lower resolution values in the first case and to markedly larger resolution values in the second one. The theory and numerical procedure are confirmed by Monte Carlo simulation. Copyright © 2011 Elsevier B.V. All rights reserved.
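The link between the peak-height distribution and the distribution of the ratio of peak heights can be illustrated with a small Monte Carlo sketch; this is not the paper's numerical procedure, and the log-normal parameters and sample size here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed log-normal parameters for peak heights; sigma plays the role
# of the "scale parameter" on which the resolution distribution depends.
mu, sigma = 0.0, 1.0
n_pairs = 100_000

# Draw heights for adjacent peak pairs and form the ratio of the
# smaller to the larger height, which is bounded in (0, 1].
h1 = rng.lognormal(mu, sigma, n_pairs)
h2 = rng.lognormal(mu, sigma, n_pairs)
ratio = np.minimum(h1, h2) / np.maximum(h1, h2)

# The distribution of minimum resolution follows from this ratio
# distribution; here we only summarize the simulated ratios.
print(ratio.mean(), np.median(ratio))
```

Increasing `sigma` pushes the ratio distribution toward very unequal pairs, which is why the resulting minimum-resolution distribution can differ markedly from the exponential-height case noted in the abstract.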
2010-01-01
Background Patient-Reported Outcomes (PRO) are increasingly used in clinical and epidemiological research. Two main types of analytical strategies can be found for these data: classical test theory (CTT) based on the observed scores and models coming from Item Response Theory (IRT). However, whether IRT or CTT would be the most appropriate method to analyse PRO data remains unknown. The statistical properties of CTT and IRT, regarding power and corresponding effect sizes, were compared. Methods Two-group cross-sectional studies were simulated for the comparison of PRO data using IRT or CTT-based analysis. For IRT, different scenarios were investigated according to whether items or person parameters were assumed to be known, to a certain extent for item parameters, from good to poor precision, or unknown and therefore had to be estimated. The powers obtained with IRT or CTT were compared and the parameters having the strongest impact on them were identified. Results When person parameters were assumed to be unknown and item parameters to be either known or not, the power achieved using IRT or CTT was similar and always lower than the expected power using the well-known sample size formula for normally distributed endpoints. The number of items had a substantial impact on power for both methods. Conclusion Without any missing data, IRT and CTT seem to provide comparable power. The classical sample size formula for CTT seems to be adequate under some conditions but is not appropriate for IRT. In IRT, it seems important to take account of the number of items to obtain an accurate formula. PMID:20338031
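A minimal sketch of the kind of simulation described, for the CTT arm only: two groups differing in the latent trait generate Rasch-type binary item responses, sum scores are compared with a t-test, and power is the rejection rate. The item difficulties, sample size, effect size, and simulation count are all illustrative assumptions, not the paper's settings:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Assumed design: two groups of n subjects, k binary items, latent
# group means differing by `effect` (in latent SD units).
n, k, effect, n_sim, alpha = 50, 10, 0.5, 500, 0.05
items = rng.normal(0.0, 1.0, k)  # hypothetical item difficulties

def sum_scores(theta):
    """Rasch-model responses, summed into a CTT observed score."""
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - items[None, :])))
    return (rng.random((len(theta), k)) < p).sum(axis=1)

hits = 0
for _ in range(n_sim):
    s0 = sum_scores(rng.normal(0.0, 1.0, n))
    s1 = sum_scores(rng.normal(effect, 1.0, n))
    if stats.ttest_ind(s0, s1).pvalue < alpha:
        hits += 1
power = hits / n_sim
print(power)
```

Because the sum score is a noisy measure of the latent trait, the estimated power falls below the textbook normal-endpoint formula evaluated at the latent effect size, consistent with the abstract's finding; raising `k` narrows the gap.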
Nonlinear adaptive networks: A little theory, a few applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, R.D.; Qian, S.; Barnes, C.W.
1990-01-01
We present the theory of nonlinear adaptive networks and discuss a few applications. In particular, we review the theory of feedforward backpropagation networks. We then present the theory of the Connectionist Normalized Linear Spline network in both its feedforward and iterated modes. Also, we briefly discuss the theory of stochastic cellular automata. We then discuss applications to chaotic time series, tidal prediction in Venice Lagoon, sonar transient detection, control of nonlinear processes, balancing a double inverted pendulum, and design advice for free electron lasers. 26 refs., 23 figs.
NASA Astrophysics Data System (ADS)
Kumar, Rajneesh; Singh, Kulwinder; Pathania, Devinder Singh
2017-07-01
The purpose of this paper is to study the variations in temperature, radial and normal displacements, normal stress, shear stress and couple stress in a micropolar thermoelastic solid in the context of the fractional order theory of thermoelasticity. An eigenvalue approach together with Laplace and Hankel transforms is employed to obtain the general solution of the problem. The field variables corresponding to different fractional order theories of thermoelasticity have been obtained in the transformed domain. The general solution is applied to an infinite space subjected to a concentrated load at the origin. To obtain the solution in the physical domain, a numerical inversion technique has been applied, and the numerically computed results are depicted graphically to analyze the effects of the fractional order parameter on the field variables.
Hultman, Lill; Forinder, Ulla; Pergert, Pernilla
2016-01-01
The purpose of the study was to explore how adolescents with disabilities experience everyday life with personal assistants. In this qualitative study, individual interviews were conducted on 35 occasions with 16 Swedish adolescents with disabilities, aged 16-21. Data were analyzed using grounded theory methodology. The adolescents' main concern was to achieve normality, which is about doing rather than being normal. They try to resolve this through assisted normality, utilizing personal assistance. Assisted normality can be obtained through the existing relationship, the cooperation between the assistant and the adolescent, and the situational placement of the assistant. Normality is obstructed by physical, social and psychological barriers. This study is from the adolescents' perspective and has implications for understanding the value of having access to personal assistance in order to achieve assisted normality and enable social interaction in everyday life. Access to personal assistance is important to enable social interaction in everyday life. A good and functional relationship is enabled through the existing relation, co-operation and situational placement of the assistant. If the assistant is not properly sensitized, young people risk turning into objects of care. Access to personal assistants cannot compensate for disabling barriers in society, such as lack of acceptance.
Calculations of current-induced forces on moored tankers, using the theory of manoeuvring ships
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mirza, S.
1996-12-31
The knowledge of current-induced loads on moored tankers is important in the design of mooring lines. Normally, these current loads are determined from controlled laboratory experiments and field tests or from the Oil Companies International Marine Forum (OCIMF) data (1977). Chakrabarti (1995) mentions that the validity of some of this data is doubtful, and he conducted some tank tests. To save the time involved in preparation of elaborate tank tests, it is useful to have analytical tools to calculate the current-induced loads. In this paper, an attempt has been made to calculate the lateral forces in current-only conditions, using the theory of manoeuvring ships. The manoeuvring model was developed by Wellicome (1981). The sway forces on the hull are modelled by conformal transformation of the hull into a circle plane and applying the flow field. The forces on the bilge keel are modelled by a vortex panel method. The results of the simulation are compared with the test results of Chakrabarti (1995). There is good correlation between the experimental and theoretical results for the case of the hull with bilge keels. This is true for streaming flow velocities up to an angle of 45° to the longitudinal direction of the hull. For the case of the bare hull, the computational model grossly underpredicts the sway forces. This may be due to the dominance of viscous forces over the potential ones.
FROM ORDER TO CHAOS IN EARTH SATELLITE ORBITS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gkolias, Ioannis; Gachet, Fabien; Daquin, Jérôme
We consider Earth satellite orbits in the range of semimajor axes where the perturbing effects of Earth’s oblateness and lunisolar gravity are of comparable order. This range covers the medium-Earth orbits (MEO) of the Global Navigation Satellite Systems and the geosynchronous orbits (GEO) of the communication satellites. We recall a secular and quadrupolar model, based on the Milankovitch vector formulation of perturbation theory, which governs the long-term orbital evolution subject to the predominant gravitational interactions. We study the global dynamics of this two-and-a-half degrees-of-freedom Hamiltonian system by means of the fast Lyapunov indicator (FLI), used in a statistical sense. Specifically, we characterize the degree of chaoticity of the action space using angle-averaged normalized FLI maps, thereby overcoming the angle dependencies of the conventional stability maps. Emphasis is placed upon the phase-space structures near secular resonances, which are of primary importance to the space debris community. We confirm and quantify the transition from order to chaos in MEO, stemming from the critical inclinations, and find that highly inclined GEO orbits are particularly unstable. Despite their reputed normality, Earth satellite orbits can possess an extraordinarily rich spectrum of dynamical behaviors and, from a mathematical perspective, have all the complications that make them very interesting candidates for testing the modern tools of chaos theory.
[Analysis of the heart sound with arrhythmia based on nonlinear chaos theory].
Ding, Xiaorong; Guo, Xingming; Zhong, Lisha; Xiao, Shouzhong
2012-10-01
In this paper, a new method based on nonlinear chaos theory was proposed to study arrhythmia, combining the correlation dimension and the largest Lyapunov exponent, by computing and analyzing these two parameters for 30 cases of normal heart sounds and 30 cases with arrhythmia. The results showed that both parameters were higher for heart sounds with arrhythmia than for normal heart sounds, and the difference between the two kinds of heart sounds was significant. This is probably due to the irregularity of the arrhythmia, which decreases predictability and makes the signal more complex than a normal heart sound. Therefore, the correlation dimension and the largest Lyapunov exponent can be used to analyze arrhythmia and for its feature extraction.
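A rough sketch of a Grassberger-Procaccia correlation-dimension estimate, one standard way to compute the correlation dimension used above. The embedding parameters, radius choices, and the synthetic quasi-periodic test signal are illustrative assumptions, not the paper's heart-sound data or exact algorithm:

```python
import numpy as np

def correlation_dimension(x, m=3, tau=5, r_scales=None):
    """Grassberger-Procaccia estimate on a delay-embedded scalar series."""
    # Delay embedding into m-dimensional vectors.
    n = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(m)])
    # Pairwise distances over the upper triangle (each pair once).
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    d = d[np.triu_indices(n, k=1)]
    if r_scales is None:
        r_scales = np.quantile(d[d > 0], [0.05, 0.1, 0.2, 0.4])
    # Correlation sum C(r) = fraction of pairs closer than r.
    c = np.array([(d < r).mean() for r in r_scales])
    # Slope of log C(r) vs log r approximates the correlation dimension.
    slope, _ = np.polyfit(np.log(r_scales), np.log(c), 1)
    return slope

# Synthetic quasi-periodic signal standing in for a heart-sound record.
t = np.linspace(0, 40 * np.pi, 800)
sig = np.sin(t) + 0.5 * np.sin(0.31 * t)
print(correlation_dimension(sig))
```

For real heart-sound data one would also need to choose the delay and embedding dimension systematically (e.g. via mutual information and false nearest neighbours) rather than use the fixed values assumed here.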
Kellermann, Tanja S; Bonilha, Leonardo; Eskandari, Ramin; Garcia-Ramos, Camille; Lin, Jack J; Hermann, Bruce P
2016-10-01
Normal cognitive function is defined by harmonious interaction among multiple neuropsychological domains. Epilepsy has a disruptive effect on cognition, but how diverse cognitive abilities differentially interact with one another compared with healthy controls (HC) is unclear. This study used graph theory to analyze the community structure of cognitive networks in adults with temporal lobe epilepsy (TLE) compared with that in HC. Neuropsychological assessment was performed in 100 patients with TLE and 82 HC. For each group, an adjacency matrix was constructed representing pair-wise correlation coefficients between raw scores obtained in each possible test combination. For each cognitive network, each node corresponded to a cognitive test; each link corresponded to the correlation coefficient between tests. Global network structure, community structure, and node-wise graph theory properties were qualitatively assessed. The community structure in patients with TLE was composed of fewer, larger, more mixed modules, characterizing three main modules representing close relationships between the following: 1) aspects of executive function (EF), verbal and visual memory, 2) speed and fluency, and 3) speed, EF, perception, language, intelligence, and nonverbal memory. Conversely, controls exhibited a relative division between cognitive functions, segregating into more numerous, smaller modules consisting of the following: 1) verbal memory, 2) language, perception, and intelligence, 3) speed and fluency, and 4) visual memory and EF. Overall node-wise clustering coefficient and efficiency were increased in TLE. Adults with TLE demonstrate a less clear and poorly structured segregation between multiple cognitive domains. This panorama suggests a higher degree of interdependency across multiple cognitive domains in TLE, possibly indicating compensatory mechanisms to overcome functional impairments. Copyright © 2016 Elsevier Inc. All rights reserved.
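The graph-theoretical pipeline described (a correlation matrix over test scores, community detection, clustering coefficient) might be sketched as follows; the random stand-in data, the correlation threshold, and the modularity algorithm are illustrative assumptions, not the study's exact choices:

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(2)

# Stand-in data: 100 "subjects" on 12 "tests"; real use would substitute
# the neuropsychological raw scores described in the abstract.
scores = rng.normal(size=(100, 12))
scores[:, :6] += rng.normal(size=(100, 1))  # induce one correlated block

corr = np.corrcoef(scores, rowvar=False)

# Cognitive network: nodes are tests, weighted edges are pairwise
# correlations above an (assumed) threshold.
G = nx.Graph()
for i in range(12):
    for j in range(i + 1, 12):
        if corr[i, j] > 0.2:
            G.add_edge(i, j, weight=corr[i, j])

# Community structure (modules) and node clustering, as in the study.
modules = greedy_modularity_communities(G, weight="weight")
clustering = nx.average_clustering(G, weight="weight")
print(len(modules), clustering)
```

Comparing the number and composition of `modules` and the average clustering between a patient-derived and a control-derived matrix is the kind of contrast the abstract reports (fewer, larger, more mixed modules and higher clustering in TLE).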
Robust controller design for flexible structures using normalized coprime factor plant descriptions
NASA Technical Reports Server (NTRS)
Armstrong, Ernest S.
1993-01-01
Stabilization is a fundamental requirement in the design of feedback compensators for flexible structures. The search for the largest neighborhood around a given design plant for which a single controller produces closed-loop stability can be formulated as an H(sub infinity) control problem. The use of normalized coprime factor plant descriptions, in which the plant perturbations are defined as additive modifications to the coprime factors, leads to a closed-form expression for the maximum neighborhood boundary, allowing optimal and suboptimal H(sub infinity) compensators to be computed directly without the usual gamma iteration. A summary of the theory on robust stabilization using normalized coprime factor plant descriptions is presented, and the application of the theory to the computation of robustly stable compensators for the phase version of the Control-Structures Interaction (CSI) Evolutionary Model is described. Results from the application indicate that the suboptimal version of the theory has the potential of providing the basis for the computation of low-authority compensators that are robustly stable to expected variations in design model parameters and additive unmodeled dynamics.
Vibrational Modes of Oblate Clouds of Charge
NASA Astrophysics Data System (ADS)
Jenkins, Thomas; Spencer, Ross L.
2000-10-01
When a nonneutral plasma confined in a Penning trap is allowed time to expand, its shape at global thermal equilibrium is that of a thin oblate spheroid [D. L. Paulson et al., Phys. Plasmas 5, 345 (1998)]. Oscillations similar to those of a drumhead can be externally induced in such a plasma. Although a theory developed by Dubin predicts the frequencies of the various normal modes of oscillation [Phys. Rev. Lett. 66, 2076 (1991)], this theory assumes that the plasma has zero temperature and is confined by an ideal quadrupole electric field. Neither of these conditions is strictly true in experiments [C. S. Weimer et al., Phys. Rev. A 49, 3842 (1994)] where physical properties of the plasma are deduced from measurements of these frequencies, causing the measurements and ideal theory to differ by about 20%. We reformulate the problem of the normal oscillatory modes as a principal-value integral eigenvalue equation, including finite-temperature and non-ideal confinement effects. The equation is solved numerically to obtain the plasma's normal mode frequencies and shapes; reasonable agreement with experiment is obtained.
Notes on power of normality tests of error terms in regression models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Střelec, Luboš
2015-03-10
Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results of usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. As a consequence, normally distributed stochastic errors are necessary in order to draw inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.
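As an illustration of normality testing of regression error terms (using two common tests, not the RT class introduced in the contribution), a sketch that fits OLS and tests the residuals, with deliberately skewed errors so the tests have something to detect; the model and parameters are assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Toy linear model y = 2 + 3x + e, with e drawn (as an example of
# non-normality) from a centered, skewed distribution.
n = 200
x = rng.uniform(0, 10, n)
e = rng.exponential(1.0, n) - 1.0
y = 2.0 + 3.0 * x + e

# Fit by ordinary least squares and test the residuals for normality.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

shapiro_p = stats.shapiro(resid).pvalue
jb_p = stats.jarque_bera(resid).pvalue
print(shapiro_p, jb_p)
```

Small p-values here would flag non-normal disturbances, warning that t- and F-based inference on `beta` may be unreliable, which is exactly the failure mode the contribution discusses.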
A unified theory of bone healing and nonunion: BHN theory.
Elliott, D S; Newman, K J H; Forward, D P; Hahn, D M; Ollivere, B; Kojima, K; Handley, R; Rossiter, N D; Wixted, J J; Smith, R M; Moran, C G
2016-07-01
This article presents a unified clinical theory that links established facts about the physiology of bone and homeostasis, with those involved in the healing of fractures and the development of nonunion. The key to this theory is the concept that the tissue that forms in and around a fracture should be considered a specific functional entity. This 'bone-healing unit' produces a physiological response to its biological and mechanical environment, which leads to the normal healing of bone. This tissue responds to mechanical forces and functions according to Wolff's law, Perren's strain theory and Frost's concept of the "mechanostat". In response to the local mechanical environment, the bone-healing unit normally changes with time, producing different tissues that can tolerate various levels of strain. The normal result is the formation of bone that bridges the fracture - healing by callus. Nonunion occurs when the bone-healing unit fails either due to mechanical or biological problems or a combination of both. In clinical practice, the majority of nonunions are due to mechanical problems with instability, resulting in too much strain at the fracture site. In most nonunions, there is an intact bone-healing unit. We suggest that this maintains its biological potential to heal, but fails to function due to the mechanical conditions. The theory predicts the healing pattern of multifragmentary fractures and the observed morphological characteristics of different nonunions. It suggests that the majority of nonunions will heal if the correct mechanical environment is produced by surgery, without the need for biological adjuncts such as autologous bone graft. Cite this article: Bone Joint J 2016;98-B:884-91. ©2016 The British Editorial Society of Bone & Joint Surgery.
Earthquakes triggered by fluid extraction
Segall, P.
1989-01-01
Seismicity is correlated in space and time with production from some oil and gas fields where pore pressures have declined by several tens of megapascals. Reverse faulting has occurred both above and below petroleum reservoirs, and normal faulting has occurred on the flanks of at least one reservoir. The theory of poroelasticity requires that fluid extraction locally alter the state of stress. Calculations with simple geometries predict stress perturbations that are consistent with observed earthquake locations and focal mechanisms. Measurements of surface displacement and strain, pore pressure, stress, and poroelastic rock properties in such areas could be used to test theoretical predictions and improve our understanding of earthquake mechanics. -Author
Evidence for perceptual deficits in associative visual (prosop)agnosia: a single-case study.
Delvenne, Jean François; Seron, Xavier; Coyette, Françoise; Rossion, Bruno
2004-01-01
Associative visual agnosia is classically defined as normal visual perception stripped of its meaning [Archiv für Psychiatrie und Nervenkrankheiten 21 (1890) 22/English translation: Cognitive Neuropsychol. 5 (1988) 155]: these patients cannot access their stored visual memories to categorize objects that are nonetheless perceived correctly. However, according to an influential theory of visual agnosia [Farah, Visual Agnosia: Disorders of Object Recognition and What They Tell Us about Normal Vision, MIT Press, Cambridge, MA, 1990], visual associative agnosics necessarily present perceptual deficits that are the cause of their impairment at object recognition. Here we report a detailed investigation of a patient with bilateral occipito-temporal lesions strongly impaired at object and face recognition. NS presents normal drawing copy, and normal performance at object and face matching tasks as used in classical neuropsychological tests. However, when tested with several computer tasks using carefully controlled visual stimuli and taking both his accuracy rate and response times into account, NS was found to have abnormal performance at high-level visual processing of objects and faces. Albeit presenting a different pattern of deficits than previously described in integrative agnosic patients such as HJA and LH, his deficits were characterized by an inability to integrate individual parts into a whole percept, as suggested by his failure at processing structurally impossible three-dimensional (3D) objects, an absence of face inversion effects, and an advantage at detecting and matching single parts. Taken together, these observations question the idea of separate visual representations for object/face perception and object/face knowledge derived from investigations of visual associative (prosop)agnosia, and they raise some methodological issues in the analysis of single-case studies of (prosop)agnosic patients.
Scalar utility theory and proportional processing: what does it actually imply?
Rosenström, Tom; Wiesner, Karoline; Houston, Alasdair I
2017-01-01
Scalar Utility Theory (SUT) is a model used to predict animal and human choice behaviour in the context of reward amount, delay to reward, and variability in these quantities (risk preferences). This article reviews and extends SUT, deriving novel predictions. We show that, contrary to what has been implied in the literature, (1) SUT can predict both risk averse and risk prone behaviour for both reward amounts and delays to reward depending on experimental parameters, (2) SUT implies violations of several concepts of rational behaviour (e.g. it violates strong stochastic transitivity and its equivalents, and leads to probability matching) and (3) SUT can predict, but does not always predict, a linear relationship between risk sensitivity in choices and coefficient of variation in the decision-making experiment. SUT derives from Scalar Expectancy Theory which models uncertainty in behavioural timing using a normal distribution. We show that the above conclusions also hold for other distributions, such as the inverse Gaussian distribution derived from drift-diffusion models. A straightforward way to test the key assumptions of SUT is suggested and possible extensions, future prospects and mechanistic underpinnings are discussed. PMID:27288541
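The scalar property at the heart of SUT can be sketched as follows; the Weber fraction, reward values, and single-sample choice rule are illustrative assumptions. Under these particular parameters the simulated agent comes out risk averse for amounts, one of the behaviours the article shows SUT can predict (other parameters can reverse this):

```python
import numpy as np

rng = np.random.default_rng(4)

# Scalar property: the perceived magnitude of a reward m is drawn from
# a normal distribution with SD proportional to m (Weber fraction g).
g = 0.3
n_trials = 50_000

def perceived(m):
    return rng.normal(m, g * m, n_trials)

# Choice between a fixed option (always 5) and a risky option
# (1 or 9 with equal probability): pick the larger perceived sample.
fixed = perceived(5.0)
risky = np.where(rng.random(n_trials) < 0.5, perceived(1.0), perceived(9.0))
p_risky = (risky > fixed).mean()
print(p_risky)
```

Because the SD scales with the mean, the high outcome of the risky option is perceived much more noisily than the low one, biasing single-sample comparisons against the risky option even though both options have equal expected value.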
McVay, Jennifer C.; Kane, Michael J.
2012-01-01
Some people are better readers than others, and this variation in comprehension ability is predicted by measures of working memory capacity (WMC). The primary goal of this study was to investigate the mediating role of mind wandering experiences in the association between WMC and normal individual differences in reading comprehension, as predicted by the executive-attention theory of WMC (e.g., Engle & Kane, 2004). We used a latent-variable, structural-equation-model approach, testing skilled adult readers on three WMC span tasks, seven varied reading comprehension tasks, and three attention-control tasks. Mind wandering was assessed using experimenter-scheduled thought probes during four different tasks (two reading, two attention-control tasks). The results support the executive-attention theory of WMC. Mind wandering across the four tasks loaded onto a single latent factor, reflecting a stable individual difference. Most importantly, mind wandering was a significant mediator in the relationship between WMC and reading comprehension, suggesting that the WMC-comprehension correlation is driven, in part, by attention control over intruding thoughts. We discuss implications for theories of WMC, attention control, and reading comprehension. PMID:21875246
Scalar utility theory and proportional processing: What does it actually imply?
Rosenström, Tom; Wiesner, Karoline; Houston, Alasdair I
2016-09-07
Scalar Utility Theory (SUT) is a model used to predict animal and human choice behaviour in the context of reward amount, delay to reward, and variability in these quantities (risk preferences). This article reviews and extends SUT, deriving novel predictions. We show that, contrary to what has been implied in the literature, (1) SUT can predict both risk averse and risk prone behaviour for both reward amounts and delays to reward depending on experimental parameters, (2) SUT implies violations of several concepts of rational behaviour (e.g. it violates strong stochastic transitivity and its equivalents, and leads to probability matching) and (3) SUT can predict, but does not always predict, a linear relationship between risk sensitivity in choices and coefficient of variation in the decision-making experiment. SUT derives from Scalar Expectancy Theory which models uncertainty in behavioural timing using a normal distribution. We show that the above conclusions also hold for other distributions, such as the inverse Gaussian distribution derived from drift-diffusion models. A straightforward way to test the key assumptions of SUT is suggested and possible extensions, future prospects and mechanistic underpinnings are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.
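The scalar-noise assumption at the heart of SUT can be sketched in a few lines: each option's remembered magnitude is drawn from a normal distribution whose standard deviation is proportional to its mean (constant coefficient of variation). The cv value and reward means below are illustrative assumptions, not parameters from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
cv = 0.3        # assumed coefficient of variation (Weber fraction)
n = 100_000     # Monte Carlo samples

def p_choose_a(mean_a, mean_b):
    """Estimate P(memory sample of option A exceeds that of option B)."""
    a = rng.normal(mean_a, cv * mean_a, n)
    b = rng.normal(mean_b, cv * mean_b, n)
    return np.mean(a > b)

# Scalar property: choice probability depends only on the ratio of the means,
# so a 2-vs-1 and a 20-vs-10 discrimination are equally imperfect; this
# graded, ratio-dependent preference is the probability-matching-like
# behaviour the article discusses.
p_small = p_choose_a(2.0, 1.0)
p_large = p_choose_a(20.0, 10.0)
```

Because the noise scales with the mean, neither discrimination is ever perfect: the model predicts stochastic preference rather than deterministic maximizing.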
Power of tests of normality for detecting contaminated normal samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thode, H.C. Jr.; Smith, L.A.; Finch, S.J.
1981-01-01
Seventeen tests of normality or goodness of fit were evaluated for power at detecting a contaminated normal sample. This study used 1000 replications each of samples of size 12, 17, 25, 33, 50, and 100 from six different contaminated normal distributions. The kurtosis test was the most powerful over all sample sizes and contaminations. The Hogg and weighted Kolmogorov-Smirnov tests were second. The Kolmogorov-Smirnov, chi-squared, Anderson-Darling, and Cramer-von-Mises tests had very low power at detecting contaminated normal random variables. Tables of the power of the tests and the power curves of certain tests are given.
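A study of this kind reduces to a small Monte Carlo power comparison. The sketch below assumes a 10% contamination mixture and compares only two of the tests via their SciPy implementations; the mixture parameters and replication count are illustrative, not the paper's exact design.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def contaminated_normal(n, p=0.1, scale=3.0):
    """Mixture: N(0,1) with probability 1-p, N(0, scale^2) with probability p."""
    widths = np.where(rng.random(n) < p, scale, 1.0)
    return rng.normal(0.0, 1.0, n) * widths

def power(pvalue_fn, reps=400, n=50, alpha=0.05):
    """Fraction of contaminated samples the test rejects at level alpha."""
    return np.mean([pvalue_fn(contaminated_normal(n)) < alpha
                    for _ in range(reps)])

kurtosis_power = power(lambda x: stats.kurtosistest(x).pvalue)
ks_power = power(lambda x: stats.kstest(x, 'norm').pvalue)
# Expect the kurtosis test to reject far more often, matching the study's
# ranking: contamination inflates the tails, which kurtosis targets directly.
```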
DOE Office of Scientific and Technical Information (OSTI.GOV)
Comandi, G.L.; Chiofalo, M.L.; Toncelli, R.
Recent theoretical work suggests that violation of the equivalence principle might be revealed in a measurement of the fractional differential acceleration η between two test bodies of different compositions, falling in the gravitational field of a source mass, if the measurement is made to the level of η ≈ 10⁻¹³ or better. This being within the reach of ground based experiments gives them a new impetus. However, while slowly rotating torsion balances in ground laboratories are close to reaching this level, only an experiment performed in a low orbit around the Earth is likely to provide a much better accuracy. We report on the progress made with the 'Galileo Galilei on the ground' (GGG) experiment, which aims to compete with torsion balances using an instrument design also capable of being converted into a much higher sensitivity space test. In the present and following articles (Part I and Part II), we demonstrate that the dynamical response of the GGG differential accelerometer set into supercritical rotation, in particular its normal modes (Part I) and rejection of common mode effects (Part II), can be predicted by means of a simple but effective model that embodies all the relevant physics. Analytical solutions are obtained under special limits, which provide the theoretical understanding. A simulation environment is set up, obtaining a quantitative agreement with the available experimental data on the frequencies of the normal modes and on the whirling behavior. This is a needed and reliable tool for controlling and separating perturbative effects from the expected signal, as well as for planning the optimization of the apparatus.
NASA Astrophysics Data System (ADS)
Bernat, Amir S.; Bar-Am, Kfir; Cataldo, Leigh; Bolton, Frank J.; Kahn, Bruce S.; Levitz, David
2018-02-01
Cervical cancer is a leading cause of death for women in low resource settings. In order to better detect cervical dysplasia, a low cost multi-spectral colposcope was developed utilizing low cost LEDs and an area scan camera. The device is capable of both traditional colposcopic imaging and multi-spectral image capture. Following initial bench testing, the device was deployed to a gynecology clinic where it was used to image patients in a colposcopy setting. Both traditional colposcopic images and spectral data from patients were uploaded to a cloud server for remote analysis. Multi-spectral imaging (30 second capture) took place before any clinical procedure; the standard of care was followed thereafter. If acetic acid was used in the standard of care, a post-acetowhitening colposcopic image was also captured. In analyzing the data, normal and abnormal regions were identified in the colposcopic images by an expert clinician. Spectral data were fit to a theoretical model based on diffusion theory, yielding information on scattering and absorption parameters. Data were grouped according to clinician labeling of the tissue, as well as any additional clinical test results available (Pap, HPV, biopsy). Altogether, N=20 patients were imaged in this study, 9 of them abnormal. In comparing normal and abnormal regions of interest from patients, substantial differences were measured in blood content, while differences in oxygen saturation parameters were more subtle. These results suggest that optical measurements made using low cost spectral imaging systems can distinguish between normal and pathological tissues.
1981-06-15
Normalized energy in ambiguity function for i = 0...SACLANTCEN SR-50, A RESUME OF STOCHASTIC, TIME-VARYING, LINEAR SYSTEM THEORY WITH...the order in which systems are concatenated is unimportant. These results are exactly analogous to the results of time-invariant linear system theory in...REFERENCES 1. MEIER, L. A résumé of deterministic time-varying linear system theory with application to active sonar signal processing problems, SACLANTCEN
2012-06-09
employed theories are the Euler-Bernoulli beam theory (EBT) and the Timoshenko beam theory (TBT). The major deficiency associated with the EBT is failure to...account for deformations associated with shearing. The TBT relaxes the normality assumption of the EBT and admits a constant state of shear strain...on a given cross-section. As a result, the TBT necessitates the use of shear correction coefficients in order to accurately predict transverse
Justification of Paternalism in Education.
ERIC Educational Resources Information Center
Nordenbo, Sven Erik
1986-01-01
A systematic presentation is given of the theories of justification normally applied to paternalistic acts: (1) pseudo-paternalism, (2) consequentialism, and (3) consent-based theories. The validity of four common arguments for educational paternalism is discussed: education is necessary, children are ignorant, children are unable to choose, and…
Are Prospective English Teachers Linguistically Intelligent?
ERIC Educational Resources Information Center
Tezel, Kadir Vefa
2017-01-01
Language is normally associated with linguistic capabilities of individuals. In the theory of multiple intelligences, language is considered to be related primarily to linguistic intelligence. Using the theory of Multiple Intelligences as its starting point, this descriptive survey study investigated to what extent prospective English teachers'…
Quasi-normal modes of holographic system with Weyl correction and momentum dissipation
NASA Astrophysics Data System (ADS)
Wu, Jian-Pin; Liu, Peng
2018-05-01
We study the charge response in the complex frequency plane and the quasi-normal modes (QNMs) of the boundary quantum field theory with momentum dissipation dual to a probe generalized Maxwell system with Weyl correction. When the strength of the momentum dissipation α̂ is small, the pole structure of the conductivity is similar to the case without momentum dissipation. The qualitative correspondence between the poles of the real part of the conductivity of the original theory and those of its electromagnetic (EM) dual theory approximately holds under γ → - γ, with γ being the Weyl coupling parameter. Strong momentum dissipation, however, alters the pole structure such that most of the poles lie on the purely imaginary axis. In this regime, the correspondence between the poles of the original theory and its EM dual is violated under γ → - γ. In addition, for the dominant pole, the EM duality almost holds under γ → - γ for all α̂ except for a small region of α̂.
1992-03-14
overdoped La1.66Sr0.34CuO4. 1. Introduction Understanding the normal state charge and spin dynamics of cuprates is closely tied to an explanation of high...frequency of the tank circuit of 160 MHz. As predicted by theory [19], the SQUID noise is reduced significantly when using the higher frequency. This...emphasized that the spin excitation gap is not decreasing with temperature as expected in the classical BCS theory. Another astonishing result is
Renormalization group, normal form theory and the Ising model
NASA Astrophysics Data System (ADS)
Raju, Archishman; Hayden, Lorien; Clement, Colin; Liarte, Danilo; Sethna, James
The results of the renormalization group are commonly advertised as the existence of power law singularities at critical points. Logarithmic and exponential corrections are seen as special cases and dealt with on a case-by-case basis. We propose to systematize computing the singularities in the renormalization group using perturbative normal form theory. This gives us a way to classify all such singularities in a unified framework and to generate a systematic machinery to do scaling collapses. We show that this procedure leads to some new results even in classic cases like the Ising model and has general applicability.
Spectral statistics of the acoustic stadium
NASA Astrophysics Data System (ADS)
Méndez-Sánchez, R. A.; Báez, G.; Leyvraz, F.; Seligman, T. H.
2014-01-01
We calculate the normal-mode frequencies and wave amplitudes of the two-dimensional acoustical stadium. We also obtain the statistical properties of the acoustical spectrum and show that they agree with the results given by random matrix theory. Some normal-mode wave amplitudes showing scarring are presented.
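The random-matrix comparison the authors make rests on the nearest-neighbour spacing statistics of the Gaussian orthogonal ensemble (GOE). A minimal sketch of that baseline (random matrices only, not the acoustic data; the crude per-sample unfolding is an assumption):

```python
import numpy as np

rng = np.random.default_rng(2)

def goe_spacings(dim=200, reps=100):
    """Nearest-neighbour spacings from the bulk of GOE random-matrix spectra."""
    spacings = []
    for _ in range(reps):
        a = rng.normal(size=(dim, dim))
        h = (a + a.T) / 2.0                    # real symmetric (GOE) matrix
        e = np.sort(np.linalg.eigvalsh(h))
        bulk = e[dim // 4: 3 * dim // 4]       # avoid the spectrum edges
        s = np.diff(bulk)
        spacings.append(s / s.mean())          # crude local unfolding
    return np.concatenate(spacings)

s = goe_spacings()
# Level repulsion: P(s) ~ s for small s (Wigner surmise), so tiny spacings are
# rare; an uncorrelated (Poisson) spectrum would put roughly 10% below s = 0.1.
frac_tiny = np.mean(s < 0.1)
```

A chaotic (stadium-shaped) cavity spectrum is expected to reproduce this repulsion, while an integrable cavity would follow the Poisson baseline.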
Ganiyu-Dada, Z; Bowcock, S
2011-12-01
Repeating normal laboratory tests can waste resources. This study aimed to quantify unnecessary repeat haematinic tests taken from the elderly in a district general hospital. Haematinic tests (ferritin, B12, serum folate) from patients aged ≥ 70 years were reviewed for repeat tests during an 8-week period. Questionnaires were given to doctors to establish when they considered repeating a 'borderline low normal' result to be clinically justifiable. 7.7% of all haematinic tests were repeat tests and of these, the majority (83%) were performed following a previously normal result. Thirteen of 24 doctors believed repeating a normal result at the bottom of the normal range ('borderline low normal') was justifiable. After excluding 'borderline low normal' results, 6.0% (at minimum) of repeat tests were done following a previous normal result and were unnecessary. This audit showed that a significant number of unnecessary repeat haematinic tests are being performed. © 2011 Blackwell Publishing Ltd.
Viscous pressure correction in the irrotational flow outside Prandtl's boundary layer
NASA Astrophysics Data System (ADS)
Joseph, Daniel; Wang, Jing
2004-11-01
We argue that boundary layers on a solid with irrotational motion outside are like a gas bubble because the shear stress vanishes at the edge of the boundary layer but the irrotational shear stress does not. This discrepancy induces a pressure correction and an additional drag which can be advertised as due to the viscous dissipation of the irrotational flow. Typically, this extra correction to the drag would be relatively small. A much more interesting implication of the extra pressure theory arises from the consideration of the effects of viscosity on the normal stress on a solid boundary, which are entirely neglected in Prandtl's theory. It is very well known and easily demonstrated that as a consequence of the continuity equation the viscous normal stress must vanish on a rigid solid. It follows that all the greatly important effects of viscosity on the normal stress are buried in the pressure, and the leading order effects of viscosity on the normal stress can be obtained from the viscous correction of viscous potential flow.
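The claim that the viscous normal stress vanishes on a rigid solid follows in two lines from no-slip and continuity. In wall coordinates (tangential $x$ with velocity $u$, normal $y$ with velocity $v$):

```latex
\left.\tau_{yy}\right|_{y=0} = 2\mu\,\left.\frac{\partial v}{\partial y}\right|_{y=0},
\qquad
\frac{\partial u}{\partial x} + \frac{\partial v}{\partial y} = 0 .
```

No-slip makes $u \equiv 0$ everywhere along the wall, so $\partial u/\partial x|_{y=0} = 0$; continuity then forces $\partial v/\partial y|_{y=0} = 0$ and hence $\tau_{yy}|_{y=0} = 0$. Every viscous effect on the wall-normal traction must therefore enter through the pressure, which is the point of the abstract.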
Lee, Hae Won; Kang, Jin Du; Yeo, Chang Woo; Yoon, Sung Woon; Lee, Kwang Jae; Choi, Mun Ki
2016-08-01
Wilson's disease typically presents symptoms associated with liver damage or neuropsychiatric disturbances, while endocrinologic abnormalities are rare. We report an unprecedented case of hypopituitarism in a patient with Wilson's disease. A 40-year-old woman presented with depression, general weakness and anorexia. Laboratory tests and imaging studies were compatible with liver cirrhosis due to Wilson's disease. Basal hormone levels and pituitary function tests indicated secondary hypothyroidism and adrenal insufficiency due to hypopituitarism. Brain MRI showed T2 hyperintense signals in both basal ganglia and midbrain but the pituitary imaging was normal. She is currently receiving chelation therapy along with thyroid hormone and steroid replacement. There may be a relationship between Wilson's disease and hypopituitarism. Copper deposition or secondary neuronal damage in the pituitary may be a possible explanation for this theory.
NASA Technical Reports Server (NTRS)
Miles, J. H.
1974-01-01
A rational function is presented for the acoustic spectra generated by deflection of engine exhaust jets for under-the-wing and over-the-wing versions of externally blown flaps. The functional representation is intended to provide a means for compact storage of data and for data analysis. The expressions are based on Fourier transform functions for the Strouhal normalized pressure spectral density, and on a correction for reflection effects based on the N-independent-source model of P. Thomas extended by use of a reflected ray transfer function. Curve fit comparisons are presented for blown flap data taken from turbofan engine tests and from large scale cold-flow model tests. Application of the rational function to scrubbing noise theory is also indicated.
Nishiyama, Takeshi; Suzuki, Masako; Adachi, Katsunori; Sumi, Satoshi; Okada, Kensuke; Kishino, Hirohisa; Sakai, Saeko; Kamio, Yoko; Kojima, Masayo; Suzuki, Sadao; Kanne, Stephen M
2014-05-01
We comprehensively compared all available questionnaires for measuring quantitative autistic traits (QATs) in terms of reliability and construct validity in 3,147 non-clinical and 60 clinical subjects with normal intelligence. We examined four full-length forms, the Subthreshold Autism Trait Questionnaire (SATQ), the Broader Autism Phenotype Questionnaire, the Social Responsiveness Scale2-Adult Self report (SRS2-AS), and the Autism-Spectrum Quotient (AQ). The SRS2-AS and the AQ each had several short forms that we also examined, bringing the total to 11 forms. Though all QAT questionnaires showed acceptable levels of test-retest reliability, the AQ and SRS2-AS, including their short forms, exhibited poor internal consistency and discriminant validity, respectively. The SATQ excelled in terms of classical test theory and due to its short length.
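The internal consistency criterion used in these comparisons is a standard classical-test-theory quantity (Cronbach's alpha). A minimal sketch on simulated data; the single-latent-trait model, item count, and noise level below are assumptions for illustration, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(3)

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Simulated 10-item scale driven by one latent trait plus item noise
n_resp, n_items = 500, 10
trait = rng.normal(size=(n_resp, 1))
scores = trait + 0.8 * rng.normal(size=(n_resp, n_items))
alpha = cronbach_alpha(scores)   # high: the items share one latent trait
```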
ERIC Educational Resources Information Center
Magno, Carlo
2009-01-01
The present report demonstrates the difference between classical test theory (CTT) and item response theory (IRT) approach using an actual test data for chemistry junior high school students. The CTT and IRT were compared across two samples and two forms of test on their item difficulty, internal consistency, and measurement errors. The specific…
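The CTT side of such a comparison reduces to simple item statistics: difficulty as proportion correct, discrimination as the item-total point-biserial correlation. A sketch on simulated responses; the Rasch-like response model and all parameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated right/wrong responses: P(correct) = logistic(ability - difficulty)
n_students, n_items = 300, 8
ability = rng.normal(size=(n_students, 1))
difficulty = np.linspace(-1.5, 1.5, n_items)
p_correct = 1.0 / (1.0 + np.exp(difficulty - ability))
responses = (rng.random((n_students, n_items)) < p_correct).astype(int)

# Classical test theory item statistics
ctt_difficulty = responses.mean(axis=0)       # proportion correct per item
total_score = responses.sum(axis=1)
discrimination = np.array([np.corrcoef(responses[:, j], total_score)[0, 1]
                           for j in range(n_items)])   # point-biserial
```

Unlike IRT parameters, these statistics are sample-dependent, which is exactly the contrast such CTT-versus-IRT reports set out to demonstrate.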
Zhang, Zhongrui; Zhong, Quanlin; Niklas, Karl J; Cai, Liang; Yang, Yusheng; Cheng, Dongliang
2016-08-24
Metabolic scaling theory (MST) posits that the scaling exponents among plant height H, diameter D, and biomass M will covary across phyletically diverse species. However, the relationships between scaling exponents and normalization constants remain unclear. Therefore, we developed a predictive model for the covariation of H, D, and stem volume V scaling relationships and used data from Chinese fir (Cunninghamia lanceolata) in Jiangxi province, China to test it. As predicted by the model and supported by the data, normalization constants are positively correlated with their associated scaling exponents for D vs. V and H vs. V, whereas normalization constants are negatively correlated with the scaling exponents of H vs. D. The prediction model also yielded reliable estimations of V (mean absolute percentage error = 10.5 ± 0.32 SE across 12 model calibrated sites). These results (1) support a totally new covariation scaling model, (2) indicate that differences in stem volume scaling relationships at the intra-specific level are driven by anatomical or ecophysiological responses to site quality and/or management practices, and (3) provide an accurate non-destructive method for predicting Chinese fir stem volume.
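The scaling exponents and normalization constants discussed here are typically estimated by least squares on log-log axes. A minimal sketch on synthetic stems; the 8/3 exponent, the constant, and the lognormal scatter are assumptions for illustration, not the Chinese fir data:

```python
import numpy as np

rng = np.random.default_rng(5)

def fit_power_law(x, y):
    """Fit y = b * x**a by least squares on log-log axes; return (a, b)."""
    a, log_b = np.polyfit(np.log(x), np.log(y), 1)
    return a, np.exp(log_b)

# Synthetic stems: V proportional to D**(8/3) with lognormal scatter
D = rng.uniform(5.0, 50.0, 200)                          # diameter
V = 1e-4 * D ** (8.0 / 3.0) * np.exp(rng.normal(0.0, 0.1, 200))
exponent, constant = fit_power_law(D, V)
```

Fitting such pairs site by site yields one (exponent, constant) point per site, and the correlations between those points across sites are what the covariation model predicts.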
When speaker identity is unavoidable: Neural processing of speaker identity cues in natural speech.
Tuninetti, Alba; Chládková, Kateřina; Peter, Varghese; Schiller, Niels O; Escudero, Paola
2017-11-01
Speech sound acoustic properties vary largely across speakers and accents. When perceiving speech, adult listeners normally disregard non-linguistic variation caused by speaker or accent differences, in order to comprehend the linguistic message, e.g. to correctly identify a speech sound or a word. Here we tested whether the process of normalizing speaker and accent differences, facilitating the recognition of linguistic information, is found at the level of neural processing, and whether it is modulated by the listeners' native language. In a multi-deviant oddball paradigm, native and nonnative speakers of Dutch were exposed to naturally-produced Dutch vowels varying in speaker, sex, accent, and phoneme identity. Unexpectedly, the analysis of mismatch negativity (MMN) amplitudes elicited by each type of change shows a large degree of early perceptual sensitivity to non-linguistic cues. This finding on perception of naturally-produced stimuli contrasts with previous studies examining the perception of synthetic stimuli wherein adult listeners automatically disregard acoustic cues to speaker identity. The present finding bears relevance to speech normalization theories, suggesting that at an unattended level of processing, listeners are indeed sensitive to changes in fundamental frequency in natural speech tokens. Copyright © 2017 Elsevier Inc. All rights reserved.
String-Coupled Pendulum Oscillators: Theory and Experiment.
ERIC Educational Resources Information Center
Moloney, Michael J.
1978-01-01
A coupled-oscillator system is given which is readily set up, using only household materials. The normal-mode analysis of this system is worked out, and an experiment or demonstration is recommended in which one verifies the theory by measuring two times and four lengths. (Author/GA)
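The normal-mode analysis referred to here can be sketched as a 2x2 eigenvalue problem; modelling the string coupling as an effective spring of stiffness k is an assumption made for illustration, and all parameter values are invented:

```python
import numpy as np

# Two identical pendulums (length L, mass m) with a weak elastic coupling k
g, L, m, k = 9.81, 1.0, 0.1, 0.05

# Linearized equations of motion: theta'' = -K @ theta, with stiffness matrix
K = np.array([[g / L + k / m, -k / m],
              [-k / m, g / L + k / m]])
omega = np.sqrt(np.linalg.eigvalsh(K))   # the two normal-mode frequencies
# In-phase mode: sqrt(g/L); out-of-phase mode: sqrt(g/L + 2k/m)
```

Measuring the two mode periods and the pendulum lengths, as the demonstration suggests, lets one check both predicted frequencies against theory.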
Integrated modeling and robust control for full-envelope flight of robotic helicopters
NASA Astrophysics Data System (ADS)
La Civita, Marco
Robotic helicopters have attracted a great deal of interest from the university, the industry, and the military world. They are versatile machines and there is a large number of important missions that they could accomplish. Nonetheless, there are only a handful of documented examples of robotic-helicopter applications in real-world scenarios. This situation is mainly due to the poor flight performance that can be achieved and, more importantly, guaranteed under automatic control. Given the maturity of control theory, and given the large body of knowledge in helicopter dynamics, it seems that the lack of success in flying high-performance controllers for robotic helicopters, especially by academic groups and by small industries, has nothing to do with helicopters or control theory as such. The problem lies instead in the large amount of time and resources needed to synthesize, test, and implement new control systems with the approach normally followed in the aeronautical industry. This thesis attempts to provide a solution by presenting a modeling and control framework that minimizes the time, cost, and both human and physical resources necessary to design high-performance flight controllers. The work is divided in two main parts. The first consists of the development of a modeling technique that allows the designer to obtain a high-fidelity model adequate for both real-time simulation and controller design, with few flight, ground, and wind-tunnel tests and a modest level of complexity in the dynamic equations. The second consists of the exploitation of the predictive capabilities of the model and of the robust stability and performance guarantees of the Hinfinity loop-shaping control theory to reduce the number of iterations of the design/simulated-evaluation/flight-test-evaluation procedure.
The effectiveness of this strategy is demonstrated by designing and flight testing a wide-envelope high-performance controller for the Carnegie Mellon University robotic helicopter.
NASA Astrophysics Data System (ADS)
Anderson, Philip W.; Casey, Philip A.
2010-04-01
We present a formalism for dealing directly with the effects of the Gutzwiller projection implicit in the t-J model which is widely believed to underlie the phenomenology of the high-Tc cuprates. We suggest that a true Bardeen-Cooper-Schrieffer condensation from a Fermi liquid state takes place, but in the unphysical space prior to projection. At low doping, however, instead of a hidden Fermi liquid one gets a 'hidden' non-superconducting resonating valence bond state which develops hole pockets upon doping. The theory which results upon projection does not follow conventional rules of diagram theory and in fact in the normal state is a Z = 0 non-Fermi liquid. Anomalous properties of the 'strange metal' normal state are predicted and compared against experimental findings.
Application of data fusion technology based on D-S evidence theory in fire detection
NASA Astrophysics Data System (ADS)
Cai, Zhishan; Chen, Musheng
2015-12-01
Judgment and identification based on a single fire characteristic parameter in fire detection is subject to environmental disturbances, and its detection performance is accordingly limited by increased false positive and false negative rates. The compound fire detector employs information fusion technology to judge and identify multiple fire characteristic parameters in order to improve the reliability and accuracy of fire detection. The D-S evidence theory is applied to the multi-sensor data fusion: first, normalize the data from all sensors to obtain the normalized basic probability assignment of the fire occurrence; then conduct the fusion processing using the D-S evidence theory; finally, give the judgment results. The results show that the method meets the goal of accurate fire signal identification and increases the accuracy of fire alarms, and is therefore simple and effective.
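The fusion step described (normalize each sensor's evidence into a basic probability assignment, then combine with Dempster's rule) can be sketched directly. The sensor names and mass values below are invented for illustration, not taken from the paper:

```python
def dempster_combine(m1, m2):
    """Dempster's rule for two basic probability assignments.

    Each BPA is a dict mapping frozenset (subset of the frame of
    discernment) to a mass; masses over conflicting (disjoint) pairs are
    discarded and the rest renormalized.
    """
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: the sources fully contradict")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

F = frozenset({'fire'})
N = frozenset({'no_fire'})
FN = F | N                                  # ignorance: the whole frame
smoke_sensor = {F: 0.6, N: 0.1, FN: 0.3}    # illustrative BPAs
temp_sensor = {F: 0.5, N: 0.2, FN: 0.3}
fused = dempster_combine(smoke_sensor, temp_sensor)
```

Fusing the two weakly confident sensors concentrates mass on 'fire' more strongly than either sensor alone, which is the reliability gain the paper describes.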
Tensor products of process matrices with indefinite causal structure
NASA Astrophysics Data System (ADS)
Jia, Ding; Sakharwade, Nitica
2018-03-01
Theories with indefinite causal structure have been studied from both the fundamental perspective of quantum gravity and the practical perspective of information processing. In this paper we point out a restriction in forming tensor products of objects with indefinite causal structure in certain models: there exist both classical and quantum objects the tensor products of which violate the normalization condition of probabilities, if all local operations are allowed. We obtain a necessary and sufficient condition for when such unrestricted tensor products of multipartite objects are (in)valid. This poses a challenge to extending communication theory to indefinite causal structures, as the tensor product is the fundamental ingredient in the asymptotic setting of communication theory. We discuss a few options to evade this issue. In particular, we show that the sequential asymptotic setting does not suffer the violation of normalization.
The E-prints and The Popper: Falsifying Some Recent Cosmological Models with Pencil and Paper
NASA Astrophysics Data System (ADS)
Sandora, McCullen
Various recent experiments indicate that the pace of our universe's present expansion is accelerating. This comes as a surprise, since this is not possible for normal matter obeying Einstein's equations of general relativity. Various mechanisms that alter the behavior of gravity on very large distance scales have since been proposed to explain this observation, to the point where new ideas appear in the literature faster than the old ones may be fully appraised. This dissertation aims to find new ways to test some of these proposed explanations, using a variety of methods. The first strategy is to look for signatures the models would imprint in arenas where the behavior of gravity is well understood. We use this to place strong constraints on nondynamical negative energy fields, as well as extra degrees of freedom that would be able to screen a large vacuum energy. We also develop ways to check the mathematical consistency of massive gravity theories, and rule out partially nonlinear massless theories.
Probing the interface theory of perception: Reply to commentaries.
Hoffman, Donald D; Singh, Manish; Prakash, Chetan
2015-12-01
We propose that selection favors nonveridical perceptions that are tuned to fitness. Current textbooks assert, to the contrary, that perception is useful because, in the normal case, it is veridical. Intuition, both lay and expert, clearly sides with the textbooks. We thus expected that some commentators would reject our proposal and provide counterarguments that could stimulate a productive debate. We are pleased that several commentators did indeed rise to the occasion and have argued against our proposal. We are also pleased that several others found our proposal worth exploring and have offered ways to test it, develop it, and link it more deeply to the history of ideas in the science and philosophy of perception. To both groups of commentators: thank you. Point and counterpoint, backed by data and theory, is the essence of science. We hope that the exchange recorded here will advance the scientific understanding of perception and its evolution. In what follows, we respond to the commentaries in alphabetical order.
Stepp, Stephanie D; Yu, Lan; Miller, Joshua D; Hallquist, Michael N; Trull, Timothy J; Pilkonis, Paul A
2012-04-01
Mounting evidence suggests that several inventories assessing both normal personality and personality disorders measure common dimensional personality traits (i.e., Antagonism, Constraint, Emotional Instability, Extraversion, and Unconventionality), albeit providing unique information along the underlying trait continuum. We used Widiger and Simonsen's (2005) pantheoretical integrative model of dimensional personality assessment as a guide to create item pools. We then used Item Response Theory (IRT) to compare the assessment of these five personality traits across three established dimensional measures of personality: the Schedule for Nonadaptive and Adaptive Personality (SNAP), the Temperament and Character Inventory (TCI), and the Revised NEO Personality Inventory (NEO PI-R). We found that items from each inventory map onto these five common personality traits in predictable ways. The IRT analyses, however, documented considerable variability in the item and test information derived from each inventory. Our findings support the notion that the integration of multiple perspectives will provide greater information about personality while minimizing the weaknesses of any single instrument.
Detecting labor using graph theory on connectivity matrices of uterine EMG.
Al-Omar, S; Diab, A; Nader, N; Khalil, M; Karlsson, B; Marque, C
2015-08-01
Premature labor is one of the most serious health problems in the developed world. One of the main reasons for this is that no good way exists to distinguish true labor from normal pregnancy contractions. The aim of this paper is to investigate whether the application of graph theory techniques to multi-electrode uterine EMG signals can improve the discrimination between pregnancy contractions and labor. To test our methods we first applied them to synthetic graphs, where we detected differences in the parameter results and changes in the graph model from pregnancy-like graphs to labor-like graphs. Then, we applied the same methods to real signals. We obtained the best differentiation between pregnancy and labor through the same parameters. Major improvements in differentiating between pregnancy and labor were obtained using a low-pass windowing preprocessing step. Results show that real graphs generally became more organized when moving from pregnancy, where the graph showed random characteristics, to labor, where the graph became a more small-world-like graph.
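Graph parameters of the kind used for such discrimination (density, clustering, small-world organization) can be computed straight from a thresholded connectivity matrix. A minimal sketch on an invented two-cluster matrix meant to mimic a more organized, labor-like state; the threshold, size, and correlation values are all assumptions:

```python
import numpy as np

def graph_metrics(corr, thresh=0.5):
    """Binarize a connectivity matrix; return (density, global clustering)."""
    A = (np.abs(corr) > thresh).astype(int)
    np.fill_diagonal(A, 0)
    n = A.shape[0]
    density = A.sum() / (n * (n - 1))
    # global clustering: 3 * triangles / connected triplets
    triangles = np.trace(np.linalg.matrix_power(A, 3)) / 6
    deg = A.sum(axis=1)
    triplets = (deg * (deg - 1)).sum() / 2
    clustering = 3 * triangles / triplets if triplets else 0.0
    return density, clustering

# Toy 16-electrode matrix with two strongly synchronized clusters
corr = np.full((16, 16), 0.1)
corr[:8, :8] = 0.8
corr[8:, 8:] = 0.8
np.fill_diagonal(corr, 1.0)
density, clustering = graph_metrics(corr)
```

On a random, pregnancy-like matrix the clustering coefficient would sit near the density; the gap between the two is one way of quantifying the increased organization reported for labor.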
Innovations in Basic Flight Training for the Indonesian Air Force
1990-12-01
microeconomic theory that could approximate the optimum mix of training hours between an aircraft and simulator, and therefore improve cost effectiveness...The microeconomic theory being used is normally employed when showing production with two variable inputs. An example of variable inputs would be labor...NAS Corpus Christi, Texas, Aerodynamics of the T-34C, 1989. 26. Naval Air Training Command, NAS Corpus Christi, Texas, Meteorological Theory Workbook
Extensions of the Theory of the Electron-Phonon Interaction in Metals: A Collection.
1983-11-03
The measured zero-field susceptibility is given...Generalization of the Theory of the Electron-Phonon Interaction: Thermodynamic Formulation of Superconducting- and Normal-State Properties...A microscopic treatment of the consequences for superconductivity of a nonconstant electronic density of states is presented.
Mass Media Theory, Leveraging Relationships, and Reliable Strategic Communication Effects
2008-03-19
other people who are in the same social and cultural groups. Families respond to patriarchs and matriarchs, congregations respond to pastors, and teens...media to self-correct behavior in order to make society seem more "normal." Verbal and Written Message-Centric Theories (premise of each theory): Magic...Effects; Harmony and Balance: people gravitate toward information they already believe; Structural Functionalism: when society begins to seem
[Mourning and depression, from the attachment theory perspective].
Wolfberg, Elsa; Ekboir, Alberto; Faiman, Graciela; Finzi, Josefina; Freedman, Margarita; Heath, Adela; Martínez de Cipolatti, María C
2011-01-01
Since depression, according to the WHO, is such a widespread condition worldwide, it is necessary to be able to distinguish a normal mourning from a pathological mourning and a depression, so as to enable patients and health professionals to support a normal mourning without medicating or hurrying it, as well as to treat a depression adequately when it appears as a complication. Attachment theory focuses on mourning after loss with notions such as: 1- acceptance of the search for the lost person as a normal fact; 2- that mourning in children may have non-pathological outcomes; 3- that a non-processed mourning may be transmitted in an intergenerational way; and 4- it also defines which elements may determine a pathological mourning or a depression. A clinical case is presented with an analysis of these notions.
Kinematical Test Theories for Special Relativity
NASA Astrophysics Data System (ADS)
Lämmerzahl, Claus; Braxmaier, Claus; Dittus, Hansjörg; Müller, Holger; Peters, Achim; Schiller, Stephan
A comparison of certain kinematical test theories for Special Relativity including the Robertson and Mansouri-Sext test theories is presented and the accuracy of the experimental results testing Special Relativity are expressed in terms of the parameters appearing in these test theories. The theoretical results are applied to the most precise experimental results obtained recently for the isotropy of light propagation and the constancy of the speed of light.
NASA Astrophysics Data System (ADS)
Kitt, R.; Kalda, J.
2006-03-01
The question of the optimal portfolio is addressed. The conventional Markowitz portfolio optimisation is discussed and the shortcomings due to non-Gaussian security returns are outlined. A method is proposed to minimise the likelihood of extreme non-Gaussian drawdowns of the portfolio value. The theory is called leptokurtic because it minimises the effects of the "fat tails" of returns. The leptokurtic portfolio theory provides an optimal portfolio for investors who define their risk-aversion as unwillingness to experience sharp drawdowns in asset prices. Two types of risks in asset returns are defined: a fluctuation risk, which has a Gaussian distribution, and a drawdown risk, which deals with the distribution tails. These risks are quantitatively measured by defining the "noise kernel", an ellipsoidal cloud of points in the space of asset returns. The size of the ellipsoid is controlled by a threshold parameter: the larger the threshold parameter, the larger the returns that are accepted as normal fluctuations. The return vectors falling inside the kernel are used for the calculation of fluctuation risk; analogously, the data points falling outside the kernel are used for the calculation of drawdown risk. As a result, the portfolio optimisation problem becomes three-dimensional: in addition to the return, there are two types of risk involved. The optimal portfolio for drawdown-averse investors is the portfolio minimising variance outside the noise kernel. The theory has been tested with MSCI North America, Europe and Pacific total return stock indices.
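The kernel/tail split described in this abstract can be sketched in a few lines. This is an illustrative reconstruction under simplifying assumptions, not the paper's method: it uses an axis-aligned ellipsoid whose semi-axes are the threshold times each asset's standard deviation, and all names are hypothetical:

```python
import math

def split_by_noise_kernel(returns, threshold):
    """Split return vectors into a 'kernel' set (normal fluctuations)
    and a 'tail' set (drawdown risk), using an axis-aligned ellipsoid
    with semi-axes threshold * (per-asset standard deviation).
    returns: list of equal-length tuples of asset returns."""
    n_assets = len(returns[0])
    means = [sum(r[i] for r in returns) / len(returns) for i in range(n_assets)]
    stds = [math.sqrt(sum((r[i] - means[i]) ** 2 for r in returns) / len(returns))
            for i in range(n_assets)]
    kernel, tail = [], []
    for r in returns:
        # ellipsoidal distance: sum of squared standardized coordinates
        d2 = sum(((r[i] - means[i]) / stds[i]) ** 2 for i in range(n_assets))
        (kernel if d2 <= threshold ** 2 else tail).append(r)
    return kernel, tail

# Mostly small fluctuations plus one large drawdown-like observation:
data = [(0.01, 0.00), (-0.01, 0.01), (0.00, -0.01), (0.01, 0.01),
        (-0.01, -0.01), (0.00, 0.01), (-0.30, -0.25)]
kernel, tail = split_by_noise_kernel(data, threshold=2.0)
```

Fluctuation risk would then be estimated from the covariance of the `kernel` points and drawdown risk from the `tail` points, giving the three-dimensional (return, fluctuation risk, drawdown risk) optimisation the abstract describes.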
Analysis of Layered Composite Plates Accounting for Large Deflections and Transverse Shear Strains.
1981-05-01
composite plates than isotropic plates. The classical thin-plate theory (CPT) assumes that normals to the midsurface before deformation remain straight...and normal to the midsurface after deformation, implying that thickness shear deformation effects are negligible. As a result, the natural
On the reversibility of the Meissner effect and the angular momentum puzzle
NASA Astrophysics Data System (ADS)
Hirsch, J. E.
2016-10-01
It is generally believed that the laws of thermodynamics govern superconductivity as an equilibrium state of matter, and hence that the normal-superconductor transition in a magnetic field is reversible under ideal conditions. Because eddy currents are generated during the transition as the magnetic flux changes, the transition has to proceed infinitely slowly to generate no entropy. Experiments showed that to a high degree of accuracy no entropy was generated in these transitions. However, in this paper we point out that for the length of times over which these experiments extended, a much higher degree of irreversibility due to decay of eddy currents should have been detected than was actually observed. We also point out that within the conventional theory of superconductivity no explanation exists for why no Joule heat is generated in the superconductor to normal transition when the supercurrent stops. In addition we point out that within the conventional theory of superconductivity no mechanism exists for the transfer of momentum between the supercurrent and the body as a whole, which is necessary to ensure that the transition in the presence of a magnetic field respects momentum conservation. We propose a solution to all these questions based on the alternative theory of hole superconductivity. The theory proposes that in the normal-superconductor transition there is a flow and backflow of charge in direction perpendicular to the phase boundary when the phase boundary moves. We show that this flow and backflow explains the absence of Joule heat generated by Faraday eddy currents, the absence of Joule heat generated in the process of the supercurrent stopping, and the reversible transfer of momentum between the supercurrent and the body, provided the current carriers in the normal state are holes.
Raykov, Tenko; Marcoulides, George A
2016-04-01
The frequently neglected and often misunderstood relationship between classical test theory and item response theory is discussed for the unidimensional case with binary measures and no guessing. It is pointed out that popular item response models can be directly obtained from classical test theory-based models by accounting for the discrete nature of the observed items. Two distinct observational equivalence approaches are outlined that render the item response models from corresponding classical test theory-based models, and can each be used to obtain the former from the latter models. Similarly, classical test theory models can be furnished using the reverse application of either of those approaches from corresponding item response models.
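One way to picture the observational equivalence this abstract describes is to dichotomize a classical linear true-score model: thresholding a normally distributed underlying response yields a normal-ogive item response function. A minimal sketch (illustrative only, not the authors' derivation; parameter names are hypothetical):

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def irf_from_ctt(theta, loading, threshold, error_sd):
    """Item response function obtained by dichotomizing a classical
    linear model: underlying response y = loading*theta + error, item
    scored 1 when y exceeds threshold. The equivalent normal-ogive
    parameters are a = loading/error_sd and b = threshold/loading."""
    return normal_cdf((loading * theta - threshold) / error_sd)

# With loading=1, threshold=0, error_sd=1 the item is answered
# correctly with probability 0.5 at theta = 0, as the ogive predicts.
p = irf_from_ctt(theta=0.0, loading=1.0, threshold=0.0, error_sd=1.0)
```

Running the same mapping in reverse (recovering loading and threshold from the ogive's a and b) illustrates how classical test theory models can be furnished from item response models.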
Bernstein, Lynne E.; Eberhardt, Silvio P.; Auer, Edward T.
2014-01-01
Training with audiovisual (AV) speech has been shown to promote auditory perceptual learning of vocoded acoustic speech by adults with normal hearing. In Experiment 1, we investigated whether AV speech promotes auditory-only (AO) perceptual learning in prelingually deafened adults with late-acquired cochlear implants. Participants were assigned to learn associations between spoken disyllabic C(=consonant)V(=vowel)CVC non-sense words and non-sense pictures (fribbles), under AV and then AO (AV-AO; or counter-balanced AO then AV, AO-AV, during Periods 1 then 2) training conditions. After training on each list of paired-associates (PA), testing was carried out AO. Across all training, AO PA test scores improved (7.2 percentage points) as did identification of consonants in new untrained CVCVC stimuli (3.5 percentage points). However, there was evidence that AV training impeded immediate AO perceptual learning: During Period-1, training scores across AV and AO conditions were not different, but AO test scores were dramatically lower in the AV-trained participants. During Period-2 AO training, the AV-AO participants obtained significantly higher AO test scores, demonstrating their ability to learn the auditory speech. Across both orders of training, whenever training was AV, AO test scores were significantly lower than training scores. Experiment 2 repeated the procedures with vocoded speech and 43 normal-hearing adults. Following AV training, their AO test scores were as high as or higher than following AO training. Also, their CVCVC identification scores patterned differently than those of the cochlear implant users. In Experiment 1, initial consonants were most accurate, and in Experiment 2, medial consonants were most accurate. We suggest that our results are consistent with a multisensory reverse hierarchy theory, which predicts that, whenever possible, perceivers carry out perceptual tasks immediately based on the experience and biases they bring to the task. 
We point out that while AV training could be an impediment to immediate unisensory perceptual learning in cochlear implant patients, it was also associated with higher scores during training. PMID:25206344
Elaboration of the α-model derived from the BCS theory of superconductivity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, David C.
2013-10-14
The single-band α-model of superconductivity (Padamsee et al 1973 J. Low Temp. Phys. 12 387) is a popular model that was adapted from the single-band Bardeen–Cooper–Schrieffer (BCS) theory of superconductivity mainly to allow fits to electronic heat capacity versus temperature T data that deviate from the BCS prediction. The model assumes that the normalized superconducting order parameter Δ(T)/Δ(0), and therefore the normalized London penetration depth λL(T)/λL(0), are the same as in BCS theory, calculated using the BCS value αBCS ≈ 1.764 of α ≡ Δ(0)/kBTc, where kB is Boltzmann's constant and Tc is the superconducting transition temperature. On the other hand, to calculate the electronic free energy, entropy, heat capacity and thermodynamic critical field versus T, the α-model takes α to be an adjustable parameter. Here we write the BCS equations and limiting behaviors for the superconducting-state thermodynamic properties explicitly in terms of α, as needed for calculations within the α-model, and present plots of the results versus T and α that are compared with the respective BCS predictions. Mechanisms such as gap anisotropy and strong coupling that can cause deviations of the thermodynamics from the BCS predictions, especially the heat capacity jump at Tc, are considered. Extensions of the α-model that have appeared in the literature, such as the two-band model, are also discussed.
Tables of values of Δ(T)/Δ(0), the normalized London parameter Λ(T)/Λ(0) and λL(T)/λL(0) calculated from the BCS theory using α = αBCS are provided, which are the same in the α-model by assumption. Tables of values of the entropy, heat capacity and thermodynamic critical field versus T for seven values of α, including αBCS, are also presented.
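The α-model's defining ratio α ≡ Δ(0)/kBTc turns directly into a one-line calculation of the zero-temperature gap for a given transition temperature. A minimal numeric illustration (not from the paper; the function name is hypothetical):

```python
K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def gap_at_zero_temperature(Tc, alpha=1.764):
    """Zero-temperature gap Delta(0) = alpha * kB * Tc, in eV.
    alpha defaults to the BCS weak-coupling value alpha_BCS ~ 1.764;
    in the alpha-model it is instead an adjustable fit parameter."""
    return alpha * K_B_EV * Tc

# For Tc = 10 K the BCS weak-coupling gap is about 1.52 meV;
# a strong-coupling material would be fit with a larger alpha.
delta_mev = gap_at_zero_temperature(10.0) * 1e3
```

In an α-model fit, only this prefactor changes; the normalized temperature dependence Δ(T)/Δ(0) is kept at its BCS form by assumption.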
Estimating macroporosity in a forest watershed by use of a tension infiltrometer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watson, K.W.; Luxmoore, R.J.
The ability to obtain sufficient field hydrologic data at reasonable cost can be an important limiting factor in applying transport models. A procedure is described for using ponded-flow and tension-infiltration measurements to calculate transport parameters in a forest watershed. Thirty infiltration measurements were taken under ponded-flow conditions and at 3, 6, and 15 cm (H2O) tension. It was assumed from capillarity theory that pores > 0.1-, 0.05-, and 0.02-cm diam., respectively, were excluded from the transport process during the tension infiltration measurements. Under ponded flow, 73% of the flux was conducted through macropores (i.e., pores > 0.1-cm diam.). An estimated 96% of the water flux was transmitted through only 0.32% of the soil volume. In general, the larger the total water flux, the larger the macropore contribution to total water flux. The Shapiro-Wilk normality test indicated that water flux through both matrix pore space and macropores was log-normally distributed in space.
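The tension-to-excluded-pore-diameter assumption in this abstract follows the standard capillary-rise approximation d ≈ 0.3/h for water at room temperature (d in cm, h in cm H2O). A short sketch reproducing the study's thresholds (illustrative only; the function name is hypothetical):

```python
def max_water_filled_pore_diameter(tension_cm):
    """Largest pore diameter (cm) that remains water-filled at a
    given tension h (cm H2O), from the capillary-rise relation
    d = 4*sigma*cos(theta) / (rho*g*h) ~ 0.3/h for water at ~20 C."""
    return 0.3 / tension_cm

# Tensions of 3, 6, and 15 cm H2O exclude pores larger than about
# 0.1, 0.05, and 0.02 cm, matching the study's assumption.
diams = [max_water_filled_pore_diameter(h) for h in (3.0, 6.0, 15.0)]
```

Pores wider than each threshold drain at that tension, so the difference between the ponded-flow flux and the tension-infiltration flux attributes flow to each pore-size class.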
Communicative competence in Alzheimer's disease: metaphor and sarcasm comprehension.
Maki, Yohko; Yamaguchi, Tomoharu; Koeda, Tatsuya; Yamaguchi, Haruyasu
2013-02-01
The purpose of this study was to evaluate the deficits of metaphor and sarcasm comprehension in Alzheimer's disease (AD), as pragmatic interpretation such as metaphor and sarcasm comprehension is required in social communication. A total of 31 young normal controls, 104 aged normal controls (ANC), 42 patients with amnesic mild cognitive impairment (aMCI), and 30 patients with mild AD were evaluated by Metaphoric and Sarcastic Scenario Test, which consists of 5 metaphoric and 5 sarcastic questions with 5 answer choices. Scores were analyzed using the repeated measures analysis of variance (metaphor/sarcasm vs 4 participant groups). Sarcasm comprehension, which requires second-order Theory of Mind (ToM), started to deteriorate in ANC, and metaphor comprehension, which requires first-order ToM, started to deteriorate in aMCI, and both deteriorated as disease progressed. Literal interpretation of pragmatic language is characteristic in patients with mild AD. Such misinterpretation would result in social miscommunication, even if they still retained semantic-lexical competence.
Seeman, Philip
2014-05-01
A surprising finding was made by JG Kidd (1909-1991) that guinea pig serum could make tumours disappear in mice. A later finding made by JD Broome (1939-) showed that asparaginase could suppress or kill tumour cells. However, the major mystery was why only tumour cells, but not normal cells, were affected by the asparaginase. The biology underlying this mechanism was unravelled by a young post-doctoral student, Bertha K Madras (1942-), who hypothesized that cells with low asparagine synthetase are those that die following treatment with asparaginase. To test her theory, Madras developed an assay for asparagine synthetase. The hypothesis was supported by the results: cells with normal asparagine synthetase were protected, while cells with low levels of this enzyme were killed by asparaginase. The findings provide a clinical guide for the use of asparaginase in acute lymphoblastic leukaemia in children and adults. © IMechE 2013 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Investigation of current transfer in built-up superconductors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, J.R.; Dresner, L.; Lue, J.W.
1977-01-01
Superconductors carrying 10 kA or more have been widely suggested for use in fusion research and reactor magnets. Built-up or cable conductors have been proposed in which superconductor is concentrated in part of the conductor or part of the strands while the stabilizer occupies the rest. This scheme leads to substantial savings in manufacturing cost and to reduction of ac losses. Simplified analysis indicates that the current transfer from superconducting wire to normal wire takes place over a characteristic length depending on the resistivity of the contact barrier, the resistivity of the stabilizer, and the geometry of the conductor. Furthermore, the cold-end recovery suffers a reduction. Two types of conductors were constructed for the experimental test. Triplex conductors consisting of either three superconducting wires or two superconducting plus one copper wire were used to simulate cables. Laminated superconductor and copper strips with different soldering bonds were used for build-ups. Normal zone propagation and recovery experiments have been performed and results are compared with the theory.
NASA Astrophysics Data System (ADS)
Lazanja, David; Boozer, Allen
2006-10-01
Given the total magnetic field on a toroidal plasma surface, a method for decomposing the field into a part due to internal currents (often the plasma) and a part due to external currents is presented. The method exploits Laplace theory which is valid in the vacuum region between the plasma surface and the chamber walls. The method is developed for the full three dimensional case which is necessary for studying stellarator plasma configurations. A change in the plasma shape is produced by the total normal field perturbation on the plasma surface. This method allows a separation of the total normal field perturbation into a part produced by external currents and a part produced by the plasma response. There are immediate applications to coil design. The computational procedure is based on Merkel's 1986 work on vacuum field computations. Several test cases are presented for toroidal surfaces which verify the method and computational robustness of the code.
A Sociological Journey into Sexuality.
ERIC Educational Resources Information Center
Reiss, Ira L.
1986-01-01
Proposes that sexuality is universally linked to the social structure in three specific areas: (a) marital jealousy, (b) gender role power, and (c) beliefs about normality. Variations and interrelations of these three linkages are explained by the logical structure of this sociological theory. The relevance of this theory for the applied…
The Evolution of Human Longevity: Toward a Biocultural Theory.
ERIC Educational Resources Information Center
Mayer, Peter J.
Homo sapiens is the only extant species for which there exists a significant post-reproductive period in the normal lifespan. Explanations for the evolution of this species-specific trait are possible through "non-deterministic" theories of aging positing "wear and tear" or the failure of nature to eliminate imperfection, or…
Cultural Descriptions as Political Cultural Acts: An Exploration
ERIC Educational Resources Information Center
Holliday, Adrian
2010-01-01
Interculturality may be something normal which everyone possesses to a degree. However, dominant neo-essentialist theories of culture give the impression that we are too different to easily cross-cultural boundaries. These theories support the development of academic disciplines and the need for professional certainty in intercultural training.…
Standard-Chinese Lexical Neighborhood Test in normal-hearing young children.
Liu, Chang; Liu, Sha; Zhang, Ning; Yang, Yilin; Kong, Ying; Zhang, Luo
2011-06-01
The purposes of the present study were to establish the Standard-Chinese version of Lexical Neighborhood Test (LNT) and to examine the lexical and age effects on spoken-word recognition in normal-hearing children. Six lists of monosyllabic and six lists of disyllabic words (20 words/list) were selected from the database of daily speech materials for normal-hearing (NH) children of ages 3-5 years. The lists were further divided into "easy" and "hard" halves according to the word frequency and neighborhood density in the database based on the theory of Neighborhood Activation Model (NAM). Ninety-six NH children (age ranged between 4.0 and 7.0 years) were divided into three different age groups of 1-year intervals. Speech-perception tests were conducted using the Standard-Chinese monosyllabic and disyllabic LNT. The inter-list performance was found to be equivalent and inter-rater reliability was high with 92.5-95% consistency. Results of word-recognition scores showed that the lexical effects were all significant. Children scored higher with disyllabic words than with monosyllabic words. "Easy" words scored higher than "hard" words. The word-recognition performance also increased with age in each lexical category. A multiple linear regression analysis showed that neighborhood density, age, and word frequency appeared to have increasingly more contributions to Chinese word recognition. The results of the present study indicated that performances of Chinese word recognition were influenced by word frequency, age, and neighborhood density, with word frequency playing a major role. These results were consistent with those in other languages, supporting the application of NAM in the Chinese language. The development of Standard-Chinese version of LNT and the establishment of a database of children of 4-6 years old can provide a reliable means for spoken-word recognition test in children with hearing impairment. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
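Neighborhood density under the Neighborhood Activation Model invoked above counts the words in a lexicon that differ from a target by exactly one phoneme substitution, insertion, or deletion. A hedged sketch over toy symbol strings (not the Standard-Chinese database or the authors' materials; all names are hypothetical):

```python
def is_neighbor(w1, w2):
    """True if w2 differs from w1 by exactly one symbol substitution,
    insertion, or deletion (the one-phoneme rule of the Neighborhood
    Activation Model). Words are sequences of phoneme symbols."""
    if w1 == w2:
        return False
    if len(w1) == len(w2):
        return sum(a != b for a, b in zip(w1, w2)) == 1
    if abs(len(w1) - len(w2)) != 1:
        return False
    short, long_ = sorted((w1, w2), key=len)
    # deleting one symbol from the longer word must yield the shorter
    return any(long_[:i] + long_[i + 1:] == short for i in range(len(long_)))

def neighborhood_density(word, lexicon):
    """Number of lexical neighbors of `word` in `lexicon`."""
    return sum(is_neighbor(word, w) for w in lexicon)

# Toy example: 'cat' has neighbors 'bat', 'cut', 'cats', and 'at'.
lexicon = ["bat", "cut", "cats", "at", "dog", "cat"]
density = neighborhood_density("cat", lexicon)
```

"Hard" words in such a test are high-density, low-frequency items, since many similar-sounding competitors are activated alongside the target.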
ERIC Educational Resources Information Center
Stukuls, Henry I.
Eighteen retarded Ss (mean IQ 50 and mean age 14 years) and 18 normal Ss (mean IQ 100 and mean age 7 years) participated in a study to isolate variables that differentially control discrimination learning and retention processes, and to evaluate contrasting theories on discrimination learning and memory processes of retarded and normal children.…
NASA Astrophysics Data System (ADS)
Gavroglu, Kostas
Practitioners of many (sub)-disciplines in the sciences are, at times, confronted with an apparent bliss which often turns into a nightmare: they are stuck with too good and too fertile a theory. 'Normal' science is surely a rewarding practice-but for that very reason it may, at times, also become boring. Theories or theoretical schemata may make successful predictions, may clarify 'mechanisms', they may show the way to further developments, and they may be amenable to non-controversial approximations. If one is really lucky, they may even-at least in principle-be able to answer all questions. There have-especially in the history of physics-been many such theories. Laplacian physics, ether physics and superstrings have historically defined the frameworks for such utopias where everything could be answerable, at least in principle. But one is truly at a loss when one is confronted with this in principle. In principle but not in practice? In principle but never? Confronted with the deadlocks that are implicit in such utopias, scientists started to collectively display a Procrustean psychopathology. They would prepare the beds and, yet, the theories would manage to trick the tricksters: almost all theories appeared to be fitting to any Procrustean bed. They were short and tall and normal at the same time.
Hagger, Martin S.; Gucciardi, Daniel F.; Chatzisarantis, Nikos L. D.
2017-01-01
Tests of social cognitive theories provide informative data on the factors that relate to health behavior, and the processes and mechanisms involved. In the present article, we contend that tests of social cognitive theories should adhere to the principles of nomological validity, defined as the degree to which predictions in a formal theoretical network are confirmed. We highlight the importance of nomological validity tests to ensure theory predictions can be disconfirmed through observation. We argue that researchers should be explicit on the conditions that lead to theory disconfirmation, and identify any auxiliary assumptions on which theory effects may be conditional. We contend that few researchers formally test the nomological validity of theories, or outline conditions that lead to model rejection and the auxiliary assumptions that may explain findings that run counter to hypotheses, raising potential for ‘falsification evasion.’ We present a brief analysis of studies (k = 122) testing four key social cognitive theories in health behavior to illustrate deficiencies in reporting theory tests and evaluations of nomological validity. Our analysis revealed that few articles report explicit statements suggesting that their findings support or reject the hypotheses of the theories tested, even when findings point to rejection. We illustrate the importance of explicit a priori specification of fundamental theory hypotheses and associated auxiliary assumptions, and identification of the conditions which would lead to rejection of theory predictions. We also demonstrate the value of confirmatory analytic techniques, meta-analytic structural equation modeling, and Bayesian analyses in providing robust converging evidence for nomological validity. We provide a set of guidelines for researchers on how to adopt and apply the nomological validity approach to testing health behavior models. PMID:29163307
NASA Astrophysics Data System (ADS)
Sen, Sangita; Shee, Avijit; Mukherjee, Debashis
2018-02-01
The orbital relaxation attendant on ionization is particularly important for the core electron ionization potential (core IP) of molecules. The Unitary Group Adapted State Universal Coupled Cluster (UGA-SUMRCC) theory, recently formulated and implemented by Sen et al. [J. Chem. Phys. 137, 074104 (2012)], is very effective in capturing orbital relaxation accompanying ionization or excitation of both the core and the valence electrons [S. Sen et al., Mol. Phys. 111, 2625 (2013); A. Shee et al., J. Chem. Theory Comput. 9, 2573 (2013)] while preserving the spin-symmetry of the target states and using the neutral closed-shell spatial orbitals of the ground state. Our Ansatz invokes a normal-ordered exponential representation of spin-free cluster-operators. The orbital relaxation induced by a specific set of cluster operators in our Ansatz is good enough to eliminate the need for different sets of orbitals for the ground and the core-ionized states. We call the single configuration state function (CSF) limit of this theory the Unitary Group Adapted Open-Shell Coupled Cluster (UGA-OSCC) theory. The aim of this paper is to comprehensively explore the efficacy of our Ansatz to describe orbital relaxation, using both theoretical analysis and numerical performance. Whenever warranted, we also make appropriate comparisons with other coupled-cluster theories. A physically motivated truncation of the chains of spin-free T-operators is also made possible by the normal-ordering, and the operational resemblance to single reference coupled-cluster theory allows easy implementation. Our test case is the prediction of the 1s core IP of molecules containing a single light- to medium-heavy nucleus and thus, in addition to demonstrating the orbital relaxation, we have addressed the scalar relativistic effects on the accuracy of the IPs by using a hierarchy of spin-free Hamiltonians in conjunction with our theory. 
Additionally, the contribution of the spin-free component of the two-electron Gaunt term, not usually taken into consideration, has been estimated at the Self-Consistent Field (ΔSCF) level and is found to become increasingly important and eventually quite prominent for molecules with third period atoms and below. The accuracies of the IPs computed using UGA-OSCC are found to be of the same order as the Coupled Cluster Singles Doubles (ΔCCSD) values while being free from spin contamination. Since the UGA-OSCC uses a common set of orbitals for the ground state and the ion, it obviates the need for two N5 AO-to-MO transformations, in contrast to the ΔCCSD method.
Sen, Sangita; Shee, Avijit; Mukherjee, Debashis
2018-02-07
The orbital relaxation attendant on ionization is particularly important for the core electron ionization potential (core IP) of molecules. The Unitary Group Adapted State Universal Coupled Cluster (UGA-SUMRCC) theory, recently formulated and implemented by Sen et al. [J. Chem. Phys. 137, 074104 (2012)], is very effective in capturing orbital relaxation accompanying ionization or excitation of both the core and the valence electrons [S. Sen et al., Mol. Phys. 111, 2625 (2013); A. Shee et al., J. Chem. Theory Comput. 9, 2573 (2013)] while preserving the spin-symmetry of the target states and using the neutral closed-shell spatial orbitals of the ground state. Our Ansatz invokes a normal-ordered exponential representation of spin-free cluster-operators. The orbital relaxation induced by a specific set of cluster operators in our Ansatz is good enough to eliminate the need for different sets of orbitals for the ground and the core-ionized states. We call the single configuration state function (CSF) limit of this theory the Unitary Group Adapted Open-Shell Coupled Cluster (UGA-OSCC) theory. The aim of this paper is to comprehensively explore the efficacy of our Ansatz to describe orbital relaxation, using both theoretical analysis and numerical performance. Whenever warranted, we also make appropriate comparisons with other coupled-cluster theories. A physically motivated truncation of the chains of spin-free T-operators is also made possible by the normal-ordering, and the operational resemblance to single reference coupled-cluster theory allows easy implementation. Our test case is the prediction of the 1s core IP of molecules containing a single light- to medium-heavy nucleus and thus, in addition to demonstrating the orbital relaxation, we have addressed the scalar relativistic effects on the accuracy of the IPs by using a hierarchy of spin-free Hamiltonians in conjunction with our theory. 
Additionally, the contribution of the spin-free component of the two-electron Gaunt term, not usually taken into consideration, has been estimated at the Self-Consistent Field (ΔSCF) level and is found to become increasingly important, and eventually quite prominent, for molecules with third-period atoms and below. The accuracies of the IPs computed using UGA-OSCC are found to be of the same order as the Coupled Cluster Singles Doubles (ΔCCSD) values while being free from spin contamination. Since UGA-OSCC uses a common set of orbitals for the ground state and the ion, it obviates the need for two N⁵ AO-to-MO transformations, in contrast to the ΔCCSD method.
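The normal-ordered exponential Ansatz at the heart of UGA-OSCC can be written schematically as follows. This is a standard shorthand from the unitary-group-adapted coupled-cluster literature, not the paper's full working equations; the symbols are assumed notation:

```latex
|\Psi_{\text{ion}}\rangle \;=\; \{e^{T}\}\,|\phi\rangle ,
\qquad
\mathrm{IP} \;=\; E_{\text{ion}} - E_{\text{ground}} ,
```

where $\{\cdots\}$ denotes normal ordering with respect to the closed-shell ground-state determinant, $T$ is a spin-free cluster operator, and $|\phi\rangle$ is the model configuration state function. Because both energies are evaluated in one common orbital set, the two extra N⁵ AO-to-MO integral transformations required by a ΔCCSD calculation are avoided.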
Theory into Practice: Advancing Normalization for the Child under Three
ERIC Educational Resources Information Center
Conklin-Moore, Alyssa
2017-01-01
Alyssa Conklin-Moore discusses normalization in the child under three from several perspectives. She takes an extensive look at the child, including orienting parents to the Montessori environment, the child's entrance into the environment, addressing the sensitive periods, and fostering independence, contribution, and community. She reminds the…
Local Influence and Robust Procedures for Mediation Analysis
ERIC Educational Resources Information Center
Zu, Jiyun; Yuan, Ke-Hai
2010-01-01
Existing studies of mediation models have been limited to normal-theory maximum likelihood (ML). Because real data in the social and behavioral sciences are seldom normally distributed and often contain outliers, classical methods generally lead to inefficient or biased parameter estimates. Consequently, the conclusions from a mediation analysis…
ERIC Educational Resources Information Center
Raykov, Tenko; Marcoulides, George A.
2016-01-01
The frequently neglected and often misunderstood relationship between classical test theory and item response theory is discussed for the unidimensional case with binary measures and no guessing. It is pointed out that popular item response models can be directly obtained from classical test theory-based models by accounting for the discrete…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donnelly, William; Freidel, Laurent
We consider the problem of defining localized subsystems in gauge theory and gravity. Such systems are associated to spacelike hypersurfaces with boundaries and provide the natural setting for studying entanglement entropy of regions of space. We present a general formalism to associate a gauge-invariant classical phase space to a spatial slice with boundary by introducing new degrees of freedom on the boundary. In Yang-Mills theory the new degrees of freedom are a choice of gauge on the boundary, transformations of which are generated by the normal component of the nonabelian electric field. In general relativity the new degrees of freedom are the location of a codimension-2 surface and a choice of conformal normal frame. These degrees of freedom transform under a group of surface symmetries, consisting of diffeomorphisms of the codimension-2 boundary, and position-dependent linear deformations of its normal plane. We find the observables which generate these symmetries, consisting of the conformal normal metric and curvature of the normal connection. We discuss the implications for the problem of defining entanglement entropy in quantum gravity. Finally, our work suggests that the Bekenstein-Hawking entropy may arise from the different ways of gluing together two partial Cauchy surfaces at a cross-section of the horizon.
NASA Technical Reports Server (NTRS)
Elrod, D. A.; Childs, D. W.
1986-01-01
A brief review of current annular seal theory and a discussion of the predicted effect on stiffness of tapering the seal stator are presented. An outline of Nelson's analytical-computational method for determining rotordynamic coefficients for annular compressible-flow seals is included. Modifications to increase the maximum rotor speed of an existing air-seal test apparatus at Texas A&M University are described. Experimental results, including leakage, entrance-loss coefficients, pressure distributions, and normalized rotordynamic coefficients, are presented for four convergent-tapered, smooth-rotor, smooth-stator seals. A comparison of the test results shows that an inlet-to-exit clearance ratio of 1.5 to 2.0 provides the maximum direct stiffness, a clearance ratio of 2.5 provides the greatest stability, and a clearance ratio of 1.0 provides the least stability. The experimental results are compared to theoretical results from Nelson's analysis with good agreement. Test results for cross-coupled stiffness show less sensitivity to fluid prerotation than predicted.
NASA Technical Reports Server (NTRS)
Swanson, P. L.
1984-01-01
An experimental investigation of tensile rock fracture is presented with an emphasis on characterizing time dependent crack growth using the methods of fracture mechanics. Subcritical fracture experiments were performed in moist air on glass and five different rock types at crack velocities using the double torsion technique. The experimental results suggest that subcritical fracture resistance in polycrystals is dominated by microstructural effects. Evidence for gross violations of the assumptions of linear elastic fracture mechanics and double torsion theory was found in the tests on rocks. In an effort to obtain a better understanding of the physical breakdown processes associated with rock fracture, a series of nondestructive evaluation tests were performed during subcritical fracture experiments on glass and granite. Comparison of the observed process zone shape with that expected on the basis of a critical normal principal tensile stress criterion shows that the zone is much more elongated in the crack propagation direction than predicted by the continuum based microcracking model alone.
Rinehart, Nicole J; Bradshaw, John L; Tonge, Bruce J; Brereton, Avril V; Bellgrove, Mark A
2002-06-01
The repetitive, stereotyped, and obsessive behaviors that characterize autism may in part be attributable to disruption of the region of the fronto-striatal system, which mediates executive abilities. Neuropsychological testing has shown that children with autism exhibit set-shifting deficiencies on tests such as the Wisconsin Card Sorting task but show normal inhibitory ability on variants of the Stroop color-word test. According to Minshew and Goldstein's multiple primary deficit theory, the complexity of the executive functioning task is important in determining the performance of individuals with autism. This study employed a visual-spatial task (with a Stroop-type component) to examine the integrity of executive functioning, in particular inhibition, in autism (n = 12) and Asperger's disorder (n = 12) under increasing levels of cognitive complexity. Whereas the Asperger's disorder group performed similarly to age- and IQ-matched control participants, even at the higher levels of cognitive complexity, the high-functioning autism group displayed inhibitory deficits specifically associated with increasing cognitive load.
NASA Astrophysics Data System (ADS)
Yi, Faliu; Moon, Inkyu; Lee, Yeon H.
2015-01-01
Counting morphologically normal cells in human red blood cells (RBCs) is extremely beneficial in the health care field. We propose a three-dimensional (3-D) classification method of automatically determining the morphologically normal RBCs in the phase image of multiple human RBCs that are obtained by off-axis digital holographic microscopy (DHM). The RBC holograms are first recorded by DHM, and then the phase images of multiple RBCs are reconstructed by a computational numerical algorithm. To design the classifier, the three typical RBC shapes, which are stomatocyte, discocyte, and echinocyte, are used for training and testing. Nonmain or abnormal RBC shapes different from the three normal shapes are defined as the fourth category. Ten features, including projected surface area, average phase value, mean corpuscular hemoglobin, perimeter, mean corpuscular hemoglobin surface density, circularity, mean phase of center part, sphericity coefficient, elongation, and pallor, are extracted from each RBC after segmenting the reconstructed phase images by using a watershed transform algorithm. Moreover, four additional properties, such as projected surface area, perimeter, average phase value, and elongation, are measured from the inner part of each cell, which can give significant information beyond the previous 10 features for the separation of the RBC groups; these are verified in the experiment by the statistical method of Hotelling's T-square test. We also apply the principal component analysis algorithm to reduce the dimension number of variables and establish the Gaussian mixture densities using the projected data with the first eight principal components. Consequently, the Gaussian mixtures are used to design the discriminant functions based on Bayesian decision theory. To improve the performance of the Bayes classifier and the accuracy of estimation of its error rate, the leaving-one-out technique is applied. 
Experimental results show that the proposed method can yield good results for calculating the percentage of each typical normal RBC shape in a reconstructed phase image of multiple RBCs that will be favorable to the analysis of RBC-related diseases. In addition, we show that the discrimination performance for the counting of normal shapes of RBCs can be improved by using 3-D features of an RBC.
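The pipeline described above (feature extraction, PCA dimension reduction, class-conditional densities, Bayesian discriminant functions, and leave-one-out error estimation) can be sketched as follows. This is a minimal illustration on synthetic data using a single Gaussian per class rather than the paper's full Gaussian mixtures; all names and values are assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 10 morphological features per cell for four
# classes (stomatocyte, discocyte, echinocyte, abnormal); the real features
# come from segmented DHM phase images.
n_per_class, n_feat, n_classes = 30, 10, 4
X = np.vstack([rng.normal(loc=k, scale=1.0, size=(n_per_class, n_feat))
               for k in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

def pca_project(X, n_comp):
    """Project onto the leading principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_comp].T

def gaussian_bayes_loo(X, y):
    """Leave-one-out accuracy of a per-class Gaussian Bayes classifier."""
    n = len(y)
    correct = 0
    for i in range(n):
        mask = np.arange(n) != i
        Xtr, ytr = X[mask], y[mask]
        scores = []
        for k in np.unique(ytr):
            Xk = Xtr[ytr == k]
            mu = Xk.mean(axis=0)
            cov = np.cov(Xk, rowvar=False) + 1e-6 * np.eye(X.shape[1])
            diff = X[i] - mu
            logdet = np.linalg.slogdet(cov)[1]
            maha = diff @ np.linalg.solve(cov, diff)
            # log prior + log Gaussian density (up to a constant)
            scores.append(np.log(len(Xk) / len(ytr)) - 0.5 * (logdet + maha))
        correct += (np.argmax(scores) == y[i])
    return correct / n

Z = pca_project(X, 2)  # the paper retains eight components; two suffice here
acc = gaussian_bayes_loo(Z, y)
print(f"LOO accuracy: {acc:.2f}")
```

On well-separated synthetic classes the leave-one-out accuracy is high; on real RBC phase images the separation, and hence the error rate, depends on the extracted features.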
In the Shadow of E. H. Carr: The Evolution of International Politics
2012-06-01
promote the merits of cooperation and look to institutions as a method for ensuring peace. We examine Norman Angell's liberal theory, Robert Keohane...pages, Carr divides the field into its ideational and material sides: utopianism and realism, ethics and politics, theory and practice, intellectualism...Carr believed that the current course of international politics could lead to the ruin of humanity. He did not believe that IR theories and practices
Drew, Sarah; Judge, Andrew; May, Carl; Farmer, Andrew; Cooper, Cyrus; Javaid, M Kassim; Gooberman-Hill, Rachael
2015-04-23
National and international guidance emphasizes the need for hospitals to have effective secondary fracture prevention services, to reduce the risk of future fractures in hip fracture patients. Variation exists in how hospitals organize these services, and there remain significant gaps in care. No research has systematically explored reasons for this to understand how to successfully implement these services. The objective of this study was to use extended Normalization Process Theory to understand how secondary fracture prevention services can be successfully implemented. Forty-three semi-structured interviews were conducted with healthcare professionals involved in delivering secondary fracture prevention within 11 hospitals that receive patients with acute hip fracture in one region in England. These included orthogeriatricians, fracture prevention nurses and service managers. Extended Normalization Process Theory was used to inform study design and analysis. Extended Normalization Process Theory specifies four constructs relating to collective action in service implementation: capacity, potential, capability and contribution. The capacity of healthcare professionals to co-operate and co-ordinate their actions was achieved using dedicated fracture prevention co-ordinators to organize important processes of care. However, participants described effective communication with GPs as challenging. Individual potential and commitment to operationalize services was generally high. Shared commitments were promoted through multi-disciplinary team working, facilitated by fracture prevention co-ordinators. Healthcare professionals had capacity to deliver multiple components of services when co-ordinators 'freed up' time. As key agents in its intervention, fracture prevention coordinators were therefore indispensable to effective implementation. Aside from difficulty of co-ordination with primary care, the intervention was highly workable and easily integrated into practice. 
Nevertheless, implementation was threatened by under-staffed and under-resourced services, lack of capacity to administer scans and poor patient access. To ensure ongoing service delivery, the contributions of healthcare professionals were shaped by planning, in multi-disciplinary team meetings, the use of clinical databases to identify patients and define the composition of clinical work and monitoring to improve clinical practice. Findings identify and describe elements needed to implement secondary fracture prevention services successfully. The study highlights the value of Normalization Process Theory to achieve comprehensive understanding of healthcare professionals' experiences in enacting a complex intervention.
Psychoanalysis and homosexuality: do we need a new theory?
Auchincloss, E L; Vaughan, S C
2001-01-01
No need exists, it is argued, for a new psychoanalytic theory of homosexuality. Certainly psychoanalysis should not be expected to generate such a theory using its own methodology alone. The preoccupation with producing such a theory avoids more important questions about psychoanalytic theory building raised by an examination of the long relationship between psychoanalysis and homosexuality. These questions concern the problems related to using psychoanalytic methodology (1) to construct categories (including the categories normal and abnormal), (2) to construct causal theory (the problems include the limitations of psychoanalytic developmental theory and a long-standing confusion between psychoanalytic developmental theory, psychoanalytic genetic reconstruction, and psychodynamics), and (3) to identify "bedrock." Finally, the question is addressed of what might be needed that is new in the psychoanalytic approach to homosexuality.
Visual recognition of permuted words
NASA Astrophysics Data System (ADS)
Rashid, Sheikh Faisal; Shafait, Faisal; Breuel, Thomas M.
2010-02-01
In the current study we examine how letter permutation affects visual recognition of words in two orthographically dissimilar languages, Urdu and German. We present the hypothesis that recognition of permuted and non-permuted words involves two distinct mental-level processes, and that people use different strategies for handling permuted words than for normal words. A comparison between the reading behavior of people in these languages is also presented. We place our study in the context of dual-route theories of reading, and observe that dual-route theory is consistent with our hypothesis of distinct underlying cognitive behavior for reading permuted and non-permuted words. We conducted three experiments using lexical decision tasks to analyze how reading is degraded or affected by letter permutation. We performed analysis of variance (ANOVA), a distribution-free rank test, and t-tests to determine significant differences in response-time latencies between the two classes of data. Results showed that recognition accuracy for permuted words decreased by 31% for Urdu and by 11% for German. We also found a considerable difference in reading behavior between the cursive and alphabetic scripts: reading Urdu is comparatively slower than reading German due to the characteristics of its cursive script.
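The latency comparison described above (t-tests on response times for permuted versus normal words) can be sketched with Welch's unequal-variance t statistic. The data below are invented for illustration, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical response-time latencies (ms) from a lexical decision task.
rt_normal = rng.normal(650, 80, size=40)     # intact words
rt_permuted = rng.normal(780, 110, size=40)  # letter-permuted words

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom for unequal variances."""
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    t = (a.mean() - b.mean()) / np.sqrt(va + vb)
    df = (va + vb) ** 2 / (va**2 / (len(a) - 1) + vb**2 / (len(b) - 1))
    return t, df

t, df = welch_t(rt_normal, rt_permuted)
print(f"t = {t:.2f}, df = {df:.1f}")  # a large negative t: intact words are read faster
```

Welch's variant is a reasonable default when the two conditions cannot be assumed to have equal variances, as is common with response-time data.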
Zhang, Dengke; Pang, Yanxia; Cai, Weixiong; Fazio, Rachel L; Ge, Jianrong; Su, Qiaorong; Xu, Shuiqin; Pan, Yinan; Chen, Sanmei; Zhang, Hongwei
2016-08-01
Impairment of theory of mind (ToM) is a common phenomenon following traumatic brain injury (TBI) that has clear effects on patients' social functioning. A growing body of research has focused on this area, and several methods have been developed to assess ToM deficiency. Although an informant assessment scale would be useful for examining individuals with TBI, very few studies have adopted this approach. The purpose of the present study was to develop an informant assessment scale of ToM for adults with traumatic brain injury (IASToM-aTBI) and to test its reliability and validity with 196 adults with TBI and 80 normal adults. A 44-item scale was developed following a literature review, interviews with patient informants, consultations with experts, item analysis, and exploratory factor analysis (EFA). The following three common factors were extracted: social interaction, understanding of beliefs, and understanding of emotions. The psychometric analyses indicate that the scale has good internal consistency reliability, split-half reliability, test-retest reliability, inter-rater reliability, structural validity, discriminant validity and criterion validity. These results provide preliminary evidence that supports the reliability and validity of the IASToM-aTBI as a ToM assessment tool for adults with TBI.
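Two of the reliability indices reported above, internal consistency (Cronbach's alpha) and Spearman-Brown corrected split-half reliability, can be computed as follows. The 44-item Likert data here are simulated stand-ins, not the IASToM-aTBI data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical item scores: 100 informants rating 44 items on a 1-5 scale;
# a common latent factor induces the inter-item correlation.
n_resp, n_items = 100, 44
latent = rng.normal(size=(n_resp, 1))
noise = 0.8 * rng.normal(size=(n_resp, n_items))
scores = np.clip(np.round(3 + latent + noise), 1, 5)

def cronbach_alpha(X):
    """Cronbach's alpha: internal-consistency reliability of a scale."""
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def split_half(X):
    """Odd-even split-half reliability with Spearman-Brown correction."""
    a, b = X[:, ::2].sum(axis=1), X[:, 1::2].sum(axis=1)
    r = np.corrcoef(a, b)[0, 1]
    return 2 * r / (1 + r)

alpha = cronbach_alpha(scores)
sh = split_half(scores)
print(f"alpha = {alpha:.2f}, split-half = {sh:.2f}")
```

With 44 moderately correlated items both coefficients come out high, which is the pattern the scale developers report.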
Simplified conditions holding at the gas-liquid interface during evaporation
NASA Astrophysics Data System (ADS)
Morris, S. J. S.
2017-11-01
We show that on the gas side of the interface between a pure liquid and a binary mixture of its vapour with an insoluble gas, the normal derivative of the vapour partial pressure pv satisfies ∂pv/∂n + [αc/(2πpD)](P − pv)(p − pv) = 0. Constants α, c, and D denote the dimensionless accommodation coefficient, a molecular speed, and the diffusivity. Provided the continuum approximation holds within the gas, and α = O(1), this boundary condition implies that evaporation can take one of two forms. (a) If the coexistence pressure P evaluated at the interface is less than the constant total gas pressure p, liquid at the interface is in local thermodynamic equilibrium with its vapour, and the evaporation rate is determined by diffusion through the gas. (b) Conversely, if P > p, gas at the interface consists of pure vapour, and the evaporation rate is determined by processes within the liquid. In the Wayner theory of the heated evaporating meniscus, such as that in a heat pipe, case (b) is assumed. As an application of our result, we show that some of the published experiments intended to test the Wayner theory instead operate under conditions in which case (a) holds. As a result, they do not perform the test intended.
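The dichotomy between cases (a) and (b) reduces to comparing the coexistence (saturation) pressure P at the interface temperature with the total gas pressure p. A minimal sketch, using the common Antoine parameters for water (pressure in mmHg, temperature in Celsius, valid roughly 1-100 °C) purely for illustration:

```python
import math

def coexistence_pressure(T_celsius):
    """Saturation pressure of water in Pa via the Antoine equation."""
    A, B, C = 8.07131, 1730.63, 233.426  # common water parameters (mmHg, C)
    p_mmHg = 10 ** (A - B / (C + T_celsius))
    return p_mmHg * 133.322              # mmHg -> Pa

def evaporation_regime(T_celsius, p_total):
    """Classify the regime implied by the interfacial boundary condition."""
    P = coexistence_pressure(T_celsius)
    if P < p_total:
        return "(a) diffusion-limited: local equilibrium at the interface"
    return "(b) liquid-limited: pure vapour at the interface"

print(evaporation_regime(60.0, 101325.0))   # below boiling -> case (a)
print(evaporation_regime(110.0, 101325.0))  # superheated -> case (b)
```

A heated meniscus at atmospheric pressure thus falls under case (b) only once the interface is superheated above the normal boiling point; cooler interfaces evaporate in the diffusion-limited regime (a).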
Theory and experiment in gravitational physics
NASA Technical Reports Server (NTRS)
Will, C. M.
1981-01-01
New technological advances have made it feasible to conduct measurements with precision levels which are suitable for experimental tests of the theory of general relativity. This book has been designed to fill a new need for a complete treatment of techniques for analyzing gravitation theory and experience. The Einstein equivalence principle and the foundations of gravitation theory are considered, taking into account the Dicke framework, basic criteria for the viability of a gravitation theory, experimental tests of the Einstein equivalence principle, Schiff's conjecture, and a model theory devised by Lightman and Lee (1973). Gravitation as a geometric phenomenon is considered along with the parametrized post-Newtonian formalism, the classical tests, tests of the strong equivalence principle, gravitational radiation as a tool for testing relativistic gravity, the binary pulsar, and cosmological tests.
Mind-wandering, cognition, and performance: a theory-driven meta-analysis of attention regulation.
Randall, Jason G; Oswald, Frederick L; Beier, Margaret E
2014-11-01
The current meta-analysis accumulates empirical findings on the phenomenon of mind-wandering, integrating and interpreting findings in light of psychological theories of cognitive resource allocation. Cognitive resource theory emphasizes both individual differences in attentional resources and task demands together to predict variance in task performance. This theory motivated our conceptual and meta-analysis framework by introducing moderators indicative of task-demand to predict who is more likely to mind-wander under what conditions, and to predict when mind-wandering and task-related thought are more (or less) predictive of task performance. Predictions were tested via a random-effects meta-analysis of correlations obtained from normal adult samples (k = 88) based on measurement of specified episodes of off-task and/or on-task thought frequency and task performance. Results demonstrated that people with fewer cognitive resources tend to engage in more mind-wandering, whereas those with more cognitive resources are more likely to engage in task-related thought. Addressing predictions of resource theory, we found that greater time-on-task-although not greater task complexity-tended to strengthen the negative relation between cognitive resources and mind-wandering. Additionally, increases in mind-wandering were generally associated with decreases in task performance, whereas increases in task-related thought were associated with increased performance. Further supporting resource theory, the negative relation between mind-wandering and performance was more pronounced for more complex tasks, though not longer tasks. Complementarily, the positive association between task-related thought and performance was stronger for more complex tasks and for longer tasks. We conclude by discussing implications and future research directions for mind-wandering as a construct of interest in psychological research. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
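A random-effects meta-analysis of correlations of the kind described above is conventionally run on the Fisher-z scale with a DerSimonian-Laird estimate of between-study variance. The study correlations and sample sizes below are invented for illustration; the actual meta-analysis pooled k = 88 samples.

```python
import numpy as np

# Hypothetical study-level correlations between mind-wandering frequency
# and task performance, with per-study sample sizes.
r = np.array([-0.30, -0.15, -0.25, -0.10, -0.35, -0.20])
n = np.array([50, 120, 80, 200, 60, 150])

z = np.arctanh(r)   # Fisher z transform of each correlation
v = 1.0 / (n - 3)   # within-study sampling variance of z
w = 1.0 / v

# DerSimonian-Laird estimate of between-study variance tau^2
z_fixed = np.sum(w * z) / np.sum(w)
Q = np.sum(w * (z - z_fixed) ** 2)
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - (len(r) - 1)) / c)

# Random-effects pooled estimate, back-transformed to a correlation
w_star = 1.0 / (v + tau2)
z_re = np.sum(w_star * z) / np.sum(w_star)
r_pooled = np.tanh(z_re)
print(f"pooled r = {r_pooled:.3f}, tau^2 = {tau2:.4f}")
```

Moderator effects such as time-on-task or task complexity would then be tested by meta-regressing the study-level z values on coded study characteristics.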
Testing 40 Predictions from the Transtheoretical Model Again, with Confidence
ERIC Educational Resources Information Center
Velicer, Wayne F.; Brick, Leslie Ann D.; Fava, Joseph L.; Prochaska, James O.
2013-01-01
Testing Theory-based Quantitative Predictions (TTQP) represents an alternative to traditional Null Hypothesis Significance Testing (NHST) procedures and is more appropriate for theory testing. The theory generates explicit effect size predictions and these effect size estimates, with related confidence intervals, are used to test the predictions.…
HIV-positive mothers and stigma.
Ingram, D; Hutchinson, S A
1999-01-01
Our purpose in this paper is to demonstrate how stigma pervades the lives of human immunodeficiency virus (HIV)-positive mothers and their children. Data from a grounded theory study on HIV-positive mothers are used to illustrate Goffman's theory of stigma. This research is an example of "emergent fit," where extant theory is discovered by the interpretive researchers to fit much of the data. The sample included 18 HIV-positive mothers who participated in in-depth interviews. The HIV-positive mothers valued being perceived as normal but acknowledged that normalcy was lost for them because of the stigma of HIV. Consequently, they tried to pass as normal by managing information and manipulating their environment. They attempted to cover up their illness by lying and pretending. Health care professionals can provide quality, client-centered care when they understand the power that stigma holds over these women and the strategies that effectively mitigate the stigma.
Experimental and Theoretical Study of a Rectangular Wing in a Vortical Wake at Low Speed
NASA Technical Reports Server (NTRS)
Smith, Willard G.; Lazzeroni, Frank A.
1960-01-01
A systematic study has been made, experimentally and theoretically, of the effects of a vortical wake on the aerodynamic characteristics of a rectangular wing at subsonic speed. The vortex generator and wing were mounted on a reflection plane to avoid body-wing interference. Vortex position, relative to the wing, was varied both in the spanwise direction and normal to the wing. Angle of attack of the wing was varied from -4° to +6°. Both chordwise and spanwise pressure distributions were obtained with the wing in uniform and vortical flow fields. Stream surveys were made to determine the flow characteristics in the vortical wake. The vortex-induced lift was calculated by several theoretical methods including strip theory, reverse-flow theory, and reverse-flow theory including a finite vortex core. In addition, the Prandtl lifting-line theory and the Weissinger theory were used to calculate the spanwise distribution of vortex-induced loads. With reverse-flow theory, predictions of the interference lift were generally good, and with Weissinger's theory the agreement between the theoretical spanwise variation of induced load and the experimental variation was good. Results of the stream survey show that the vortex generated by a lifting surface of rectangular plan form tends to trail back streamwise from the tip and does not approach the theoretical location, or centroid of circulation, given by theory. This discrepancy introduced errors in the prediction of vortex interference, especially when the vortex core passed immediately outboard of the wing tip. The wake produced by the vortex generator in these tests was not fully rolled up into a circular vortex, and so lacked symmetry in the vertical direction of the transverse plane. It was found that the direction of circulation affected the induced loads on the wing either when the wing was at angle of attack or when the vortex was some distance away from the plane of the wing.
A Developmental Test of Mertonian Anomie Theory.
ERIC Educational Resources Information Center
Menard, Scott
1995-01-01
Carefully reviewed Merton's writings on anomie theory to construct a more complete and rigorous test of the theory for respondents in early, middle, and late adolescence. Concluded that misspecified models of strain theory have underestimated the predictive power of strain theory in general and of anomie theory in particular. (JBJ)
More than maths and mindreading: sex differences in empathizing/systemizing covariance.
Valla, Jeffrey M; Ganzel, Barbara L; Yoder, Keith J; Chen, Grace M; Lyman, Laura T; Sidari, Anthony P; Keller, Alex E; Maendel, Jeffrey W; Perlman, Jordan E; Wong, Stephanie K L; Belmonte, Matthew K
2010-08-01
Empathizing-Systemizing theory posits a continuum of cognitive traits extending from autism into normal cognitive variation. Covariance data on empathizing and systemizing traits have alternately suggested inversely dependent, independent, and sex-dependent (one sex dependent, the other independent) structures. A total of 144 normal undergraduates (65 men, 79 women) completed the Reading the Mind in the Eyes, Embedded Figures, and Benton face recognition tests, the Autism Spectrum Quotient, and measures of digit length ratio and field of study; some also completed tests of motion coherence threshold (64) and go/no-go motor inhibition (128). Empathizing and systemizing traits were independent in women, but largely dependent in men. In men, level of systemizing skill required by field of study was directly related to social interactive and mindreading deficits; men's social impairments correlated with prolonged go/no-go response times, and men tended to apply systemizing strategies to solve problems of empathizing or global processing: rapid perceptual disembedding predicted heightened sensitivity to facial emotion. In women, level of systemizing in field was related to male-typical digit ratios and autistic superiorities in detail orientation, but not to autistic social and communicative impairments; and perceptual disembedding was related to social interactive skills but independent of facial emotion and visual motion perception.
Duesberg, Peter; McCormack, Amanda
2013-01-01
Immortality is a common characteristic of cancers, but its origin and purpose are still unclear. Here we advance a karyotypic theory of immortality based on the theory that carcinogenesis is a form of speciation. Accordingly, cancers are generated from normal cells by random karyotypic rearrangements and selection for cancer-specific reproductive autonomy. Since such rearrangements unbalance long-established mitosis genes, cancer karyotypes vary spontaneously but are stabilized perpetually by clonal selections for autonomy. To test this theory we have analyzed neoplastic clones, presumably immortalized by transfection with overexpressed telomerase or with SV40 tumor virus, for the predicted clonal yet flexible karyotypes. The following results were obtained: (1) All immortal tumorigenic lines from cells transfected with overexpressed telomerase had clonal and flexible karyotypes; (2) Searching for the origin of such karyotypes, we found spontaneously increasing, random aneuploidy in human fibroblasts early after transfection with overexpressed telomerase; (3) Late after transfection, new immortal tumorigenic clones with new clonal and flexible karyotypes were found; (4) Testing immortality of one clone during 848 unselected generations showed the chromosome number was stable, but the copy numbers of 36% of chromosomes drifted ± 1; (5) Independent immortal tumorigenic clones with individual, flexible karyotypes arose after individual latencies; (6) Immortal tumorigenic clones with new flexible karyotypes also arose late from cells of a telomerase-deficient mouse rendered aneuploid by SV40 virus. Because immortality and tumorigenicity: (1) correlated exactly with individual clonal but flexible karyotypes; (2) originated simultaneously with such karyotypes; and (3) arose in the absence of telomerase, we conclude that clonal and flexible karyotypes generate the immortality of cancers. PMID:23388461
Personality dimensions of people who suffer from visual stress.
Hollis, J; Allen, P M; Fleischmann, D; Aulak, R
2007-11-01
Personality dimensions of participants who suffer from visual stress were compared with those of normal participants using the Eysenck Personality Inventory. Extraversion-Introversion scores showed no significant differences between the participants who suffered visual stress and those who were classified as normal. By contrast, significant differences were found between the normal participants and those with visual stress in respect of Neuroticism-Stability. These differences accord with Eysenck's personality theory which states that those who score highly on the neuroticism scale do so because they have a neurological system with a low threshold such that their neurological system is easily activated by external stimuli. The findings also relate directly to the theory of visual stress proposed by Wilkins which postulates that visual stress results from an excess of neural activity. The data may indicate that the excess activity is likely to be localised at particular neurological regions or neural processes.
Relaxation approximation in the theory of shear turbulence
NASA Technical Reports Server (NTRS)
Rubinstein, Robert
1995-01-01
Leslie's perturbative treatment of the direct interaction approximation for shear turbulence (Modern Developments in the Theory of Turbulence, 1972) is applied to derive a time dependent model for the Reynolds stresses. The stresses are decomposed into tensor components which satisfy coupled linear relaxation equations; the present theory therefore differs from phenomenological Reynolds stress closures in which the time derivatives of the stresses are expressed in terms of the stresses themselves. The theory accounts naturally for the time dependence of the Reynolds normal stress ratios in simple shear flow. The distortion of wavenumber space by the mean shear plays a crucial role in this theory.
A {3,2}-Order Bending Theory for Laminated Composite and Sandwich Beams
NASA Technical Reports Server (NTRS)
Cook, Geoffrey M.; Tessler, Alexander
1998-01-01
A higher-order bending theory is derived for laminated composite and sandwich beams thus extending the recent {1,2}-order theory to include third-order axial effect without introducing additional kinematic variables. The present theory is of order {3,2} and includes both transverse shear and transverse normal deformations. A closed-form solution to the cylindrical bending problem is derived and compared with the corresponding exact elasticity solution. The numerical comparisons are focused on the most challenging material systems and beam aspect ratios which include moderate-to-thick unsymmetric composite and sandwich laminates. Advantages and limitations of the theory are discussed.
Languages and Lives through a Critical Eye: The Case of Estonia
ERIC Educational Resources Information Center
Skerrett, Delaney Michael
2011-01-01
This article seeks to situate Estonian language use and policy within the emerging field of critical language policy and planning (LPP). Critical LPP draws on poststructuralist theory to deconstruct normalized categories that maintain systems of inequality. It is akin to the queer theory project for gender and sexuality. Since the country regained…
Theory of Mind in Williams Syndrome Assessed Using a Nonverbal Task
ERIC Educational Resources Information Center
Porter, Melanie A.; Coltheart, Max; Langdon, Robyn
2008-01-01
This study examined Theory of Mind in Williams syndrome (WS) and in normal chronological age-matched and mental age-matched control groups, using a picture sequencing task. This task assesses understanding of pretence, intention and false belief, while controlling for social-script knowledge and physical cause-and-effect reasoning. The task was…
Analysis of Multicomponent Adsorption Close to a Dew Point.
Shapiro; Stenby
1998-10-15
We develop the potential theory of multicomponent adsorption close to a dew point. The approach is based on an asymptotic adsorption equation (AAE) which is valid in a vicinity of the dew point. By this equation the thickness of the liquid film is expressed through thermodynamic characteristics of the bulk phase. The AAE makes it possible to study adsorption in the regions of both the normal and the retrograde condensation. A simple correlation of the Kelvin radius for capillary condensation and the thickness of the adsorbed film is established. Numerical testing shows good agreement between the AAE and the direct calculations, even if the mixture is not close to a dew point. Copyright 1998 Academic Press.
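The correlation between the Kelvin radius and film thickness established here is specific to the AAE, but the Kelvin radius itself is the classical one. As a minimal sketch of that classical relation only (illustrative property values roughly for water, not the paper's mixtures; the function name is ours):

```python
import math

def kelvin_radius(gamma, molar_volume, temperature, p_over_psat):
    """Classical Kelvin radius for capillary condensation:
    r_K = 2*gamma*V_m / (R*T*ln(p_sat/p))."""
    R = 8.314  # gas constant, J/(mol*K)
    return 2.0 * gamma * molar_volume / (
        R * temperature * math.log(1.0 / p_over_psat))

# Illustrative values, roughly for water at 293 K and p/p_sat = 0.9.
r_k = kelvin_radius(gamma=0.072, molar_volume=1.8e-5,
                    temperature=293.0, p_over_psat=0.9)
```

At 90% relative saturation this gives a radius on the order of 10 nm, which sets the pore scale at which capillary condensation becomes relevant.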
Rastatter, M; Dell, C W; McGuire, R A; Loren, C
1987-03-01
Previous studies investigating hemispheric organization for processing concrete and abstract nouns have provided conflicting results. Using manual reaction time tasks some studies have shown that the right hemisphere is capable of analyzing concrete words but not abstract. Others, however, have inferred that the left hemisphere is the sole analyzer of both types of lexicon. The present study tested these issues further by measuring vocal reaction times of normal subjects to unilaterally presented concrete and abstract items. Results were consistent with a model of functional localization which suggests that the minor hemisphere is capable of differentially processing both types of lexicon in the presence of a dominant left hemisphere.
Engine Data Interpretation System (EDIS)
NASA Technical Reports Server (NTRS)
Cost, Thomas L.; Hofmann, Martin O.
1990-01-01
A prototype of an expert system was developed which applies qualitative or model-based reasoning to the task of post-test analysis and diagnosis of data resulting from a rocket engine firing. A combined component-based and process theory approach is adopted as the basis for system modeling. Such an approach provides a framework for explaining both normal and deviant system behavior in terms of individual component functionality. The diagnosis function is applied to digitized sensor time-histories generated during engine firings. The generic system is applicable to any liquid rocket engine but was adapted specifically in this work to the Space Shuttle Main Engine (SSME). The system is applied to idealized data resulting from turbomachinery malfunction in the SSME.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yimin; Miller, William H.
2006-02-22
One of the outstanding issues in the quantum instanton (QI) theory (or any transition state-type theory) for thermal rate constants of chemical reactions is the choice of an appropriate "dividing surface" (DS) that separates reactants and products. (In the general version of the QI theory, there are actually two dividing surfaces involved.) This paper shows one simple and general way of choosing DSs for use in QI theory, namely using the family of (hyper)planes normal to the minimum energy path (MEP) on the potential energy surface at various distances s along it. Here the reaction coordinate is not one of the dynamical coordinates of the system (which will in general be the Cartesian coordinates of the atoms), but rather simply a parameter which specifies the DS. It is also shown how this idea can be implemented for an N-atom system in 3D space in a way that preserves overall translational and rotational invariance. Numerical application to a simple system (the collinear H + H₂ reaction) is presented to illustrate the procedure.
Local phase space and edge modes for diffeomorphism-invariant theories
NASA Astrophysics Data System (ADS)
Speranza, Antony J.
2018-02-01
We discuss an approach to characterizing local degrees of freedom of a subregion in diffeomorphism-invariant theories using the extended phase space of Donnelly and Freidel [36]. Such a characterization is important for defining local observables and entanglement entropy in gravitational theories. Traditional phase space constructions for subregions are not invariant with respect to diffeomorphisms that act at the boundary. The extended phase space remedies this problem by introducing edge mode fields at the boundary whose transformations under diffeomorphisms render the extended symplectic structure fully gauge invariant. In this work, we present a general construction for the edge mode symplectic structure. We show that the new fields satisfy a surface symmetry algebra generated by the Noether charges associated with the edge mode fields. For surface-preserving symmetries, the algebra is universal for all diffeomorphism-invariant theories, comprised of diffeomorphisms of the boundary, SL(2, ℝ) transformations of the normal plane, and, in some cases, normal shearing transformations. We also show that if boundary conditions are chosen such that surface translations are symmetries, the algebra acquires a central extension.
Estimating outflow facility through pressure dependent pathways of the human eye
Gardiner, Bruce S.
2017-01-01
We develop and test a new theory for pressure dependent outflow from the eye. The theory comprises three main parameters: (i) a constant hydraulic conductivity, (ii) an exponential decay constant and (iii) a no-flow intraocular pressure, from which the total pressure dependent outflow, average outflow facilities and local outflow facilities for the whole eye may be evaluated. We use a new notation to specify precisely the meaning of model parameters and so model outputs. Drawing on a range of published data, we apply the theory to animal eyes, enucleated eyes and in vivo human eyes, and demonstrate how to evaluate model parameters. It is shown that the theory can fit high quality experimental data remarkably well. The new theory predicts that outflow facilities and total pressure dependent outflow for the whole eye are more than twice as large as estimates based on the Goldman equation and fluorometric analysis of anterior aqueous outflow. It appears likely that this discrepancy can be largely explained by pseudofacility and aqueous flow through the retinal pigmented epithelium, while any residual discrepancy may be due to pathological processes in aged eyes. The model predicts that if the hydraulic conductivity is too small, or the exponential decay constant is too large, then intraocular eye pressure may become unstable when subjected to normal circadian changes in aqueous production. The model also predicts relationships between variables that may be helpful when planning future experiments, and the model generates many novel testable hypotheses. With additional research, the analysis described here may find application in the differential diagnosis, prognosis and monitoring of glaucoma. PMID:29261696
Typography and color: effects of salience and fluency on conscious recollective experience.
Wehr, Thomas; Wippich, Werner
2004-12-01
Within one experiment the central assumptions of the distinctiveness/fluency account of recollective experience were tested and contrasted with predictions of processing theory. To manipulate perceptual salience, the typography of words was varied. Effects of conceptual salience were induced by a variation of word color. In the study phase participants generated different word or object images according to presented words. To manipulate perceptual and conceptual fluency one test group underwent a priming procedure in the test phase, consisting of a recognition test, whereby some primes were identical to the target words typographically or by color and others were not. Additionally, all participants were asked to make judgments of recollective experience (remember, know, guess) after the old/new decisions. The results of the data analyses confirm the distinctiveness/fluency account. Words written in an unusual typography or color were judged significantly more often as "remembered" than normal words. The priming procedure uncovered some effects of fluency on reaction times: old/new decisions took less time if prime and target words were perceptually or conceptually identical.
Developing a Questionnaire for Iranian Women's Attitude on Medical Ethics in Vaginal Childbirth.
Mirzaee Rabor, Firoozeh; Taghipour, Ali; Mirzaee, Moghaddameh; Mirzaii Najmabadi, Khadigeh; Fazilat Pour, Masoud; Fattahi Masoum, Seyed Hosein
2015-12-01
Vaginal delivery is one of the challenging issues in medical ethics. It is important to use an appropriate instrument to assess medical ethics attitudes in normal delivery, but no suitable tool exists for this purpose. The aim of this study was to develop and validate a questionnaire for the assessment of women's attitudes toward medical ethics application in normal vaginal delivery. This methodological study was carried out in Iran in 2013 - 2014. The medical ethics attitude in vaginal delivery questionnaire (MEAVDQ) was developed in the first phase using qualitative findings from a grounded theory study of 20 women who had vaginal childbirth. Then, the validation criteria of this tool were tested by content and face validity in the second phase. Exploratory factor analysis was used for construct validity, and reliability was tested by Cronbach's alpha coefficient in the third phase of this study. SPSS version 13 was used. The sample size for construct validity was 250 females who had normal vaginal childbirth. In the first phase of this study (tool development), using the four categories and nine subcategories obtained from the grounded theory study and the literature review, three parts (98 items) of this tool were obtained (A, B and J). Part A explained the first principle of medical ethics, part B pointed to the second and third principles of medical ethics, and part J explained the fourth principle of medical ethics. After evaluating and confirming its face and content validity, 75 items remained in the questionnaire. In construct validity, using exploratory factor analysis, 3, 7 and 3 factors were formed in parts A, B and J, respectively; and 62.8%, 64% and 51% of the total variances were explained by the obtained factors in parts A, B and J, respectively. The factor names in the three parts were assigned based on the factor loadings and the medical ethics principles.
The subscales of MEAVDQ showed satisfactory reliability. In parts A, B and J, Cronbach's alpha coefficients were 0.76, 0.72 and 0.68, respectively, and for the total questionnaire, it was 0.72. The results of the test-retest were satisfactory for all the items (ICC = 0.60 - 0.95). The present study showed that the 59-item MEAVDQ was a valid and reliable questionnaire for the assessment of women's attitudes toward medical ethics application in vaginal childbirth. This tool might assist specialists in making judgments and planning appropriate care for women in vaginal delivery management.
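The reliability figures quoted above are Cronbach's alpha coefficients. A minimal sketch of how alpha is computed from an item-score matrix (synthetic scores for illustration, not the study's data; the function name is ours):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

# Three perfectly correlated items give alpha = 1.
alpha = cronbach_alpha(np.column_stack([np.arange(1.0, 5.0)] * 3))
```

Values around 0.7, as reported for the MEAVDQ subscales, are conventionally taken as acceptable internal consistency for attitude scales.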
Karatekin, C; Asarnow, R F
1998-10-01
This study tested the hypotheses that visual search impairments in schizophrenia are due to a delay in initiation of search or a slow rate of serial search. We determined the specificity of these impairments by comparing children with schizophrenia to children with attention-deficit hyperactivity disorder (ADHD) and age-matched normal children. The hypotheses were tested within the framework of feature integration theory by administering to the children tasks tapping parallel and serial search. Search rate was estimated from the slope of the search functions, and the duration of the initial stages of search from the time to make the first saccade on each trial. As expected, manual response times were elevated in both clinical groups. Contrary to expectation, ADHD, but not schizophrenic, children were delayed in initiation of serial search. Finally, both groups showed a clear dissociation between intact parallel search rates and slowed serial search rates.
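Estimating a search rate from the slope of the search function, as described above, amounts to a linear fit of reaction time against display set size. A minimal sketch with hypothetical data (not the study's; the helper name is ours):

```python
import numpy as np

def search_rate(set_sizes, reaction_times):
    """Search rate (time per item) as the slope of the reaction-time-
    versus-set-size function; near-zero slopes indicate parallel
    ('pop-out') search, large positive slopes indicate serial search."""
    slope, intercept = np.polyfit(set_sizes, reaction_times, 1)
    return slope, intercept

# Hypothetical serial-search data: ~50 ms/item over a 400 ms baseline.
sizes = np.array([4.0, 8.0, 16.0, 32.0])
rts = 400.0 + 50.0 * sizes
rate, baseline = search_rate(sizes, rts)
```

In this framing, the intercept absorbs the set-size-independent stages (including search initiation), which is why first-saccade latency is measured separately.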
Social vulnerability and bullying in children with Asperger syndrome.
Sofronoff, Kate; Dark, Elizabeth; Stone, Valerie
2011-05-01
Children with Asperger syndrome (AS) have IQ within the normal range but specific impairments in theory of mind, social interaction and communication skills. The majority receive education in mainstream schools and research suggests they are bullied more than typically developing peers. The current study aimed to evaluate factors that predict bullying for such children and also to examine a new measure, the Social Vulnerability Scale (SVS). One hundred and thirty three parents of children with AS completed the SVS and of these 92 parents completed both the SVS and questionnaires measuring anxiety, anger, behaviour problems, social skills and bullying. Regression analyses revealed that these variables together strongly predicted bullying, but that social vulnerability was the strongest predictor. Test-re-test and internal consistency analyses of the SVS demonstrated sound psychometric properties and factor analyses revealed two sub-scales: gullibility and credulity. Limitations of the study are acknowledged and suggestions for future research discussed.
Lifetime Reliability Evaluation of Structural Ceramic Parts with the CARES/LIFE Computer Program
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.
1993-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), Weibull's normal stress averaging method (NSA), or Batdorf's theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating cyclic fatigue parameter estimation and component reliability analysis with proof testing are included.
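A minimal sketch of the two-parameter Weibull distribution used above to characterize the variation in component strength (hypothetical parameter values; this is not the CARES/LIFE implementation, which layers subcritical crack growth and multiaxial-stress models on top):

```python
import math

def weibull_failure_probability(sigma, sigma_0, m):
    """Two-parameter Weibull CDF for brittle-material strength:
    P_f = 1 - exp(-(sigma / sigma_0)**m), where sigma_0 is the
    characteristic strength and m is the Weibull modulus."""
    return 1.0 - math.exp(-((sigma / sigma_0) ** m))

# Hypothetical values: characteristic strength 400 MPa, modulus m = 10.
pf = weibull_failure_probability(300.0, 400.0, 10.0)
```

A larger Weibull modulus m means less scatter in strength; proof testing truncates the low-strength tail of this distribution, which is how CARES/LIFE credits proof loads in its reliability predictions.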
Non-Normality and Testing that a Correlation Equals Zero
ERIC Educational Resources Information Center
Levy, Kenneth J.
1977-01-01
The importance of the assumption of normality for testing that a bivariate normal correlation equals zero is examined. Both empirical and theoretical evidence suggest that such tests are robust with respect to violation of the normality assumption. (Author/JKS)
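The test in question is typically carried out with a t statistic derived under bivariate normality. A minimal sketch with illustrative values (the function name is ours):

```python
import math

def corr_t_test(r, n):
    """t statistic for H0: rho = 0 given a sample correlation r from
    n bivariate-normal pairs: t = r*sqrt(n - 2)/sqrt(1 - r^2),
    with n - 2 degrees of freedom."""
    t = r * math.sqrt(n - 2) / math.sqrt(1.0 - r * r)
    return t, n - 2

# Illustrative values: r = 0.5 observed in n = 30 pairs.
t, df = corr_t_test(0.5, 30)
```

The robustness finding reported above means the t distribution remains a serviceable reference distribution for this statistic even when the bivariate normality assumption is violated.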
PE Metrics: Background, Testing Theory, and Methods
ERIC Educational Resources Information Center
Zhu, Weimo; Rink, Judy; Placek, Judith H.; Graber, Kim C.; Fox, Connie; Fisette, Jennifer L.; Dyson, Ben; Park, Youngsik; Avery, Marybell; Franck, Marian; Raynes, De
2011-01-01
New testing theories, concepts, and psychometric methods (e.g., item response theory, test equating, and item bank) developed during the past several decades have many advantages over previous theories and methods. In spite of their introduction to the field, they have not been fully accepted by physical educators. Further, the manner in which…
Test Theories, Educational Priorities and Reliability of Public Examinations in England
ERIC Educational Resources Information Center
Baird, Jo-Anne; Black, Paul
2013-01-01
Much has already been written on the controversies surrounding the use of different test theories in educational assessment. Other authors have noted the prevalence of classical test theory over item response theory in practice. This Special Issue draws together articles based upon work conducted on the Reliability Programme for England's…
ERIC Educational Resources Information Center
Furnham, Adrian
1984-01-01
Over 200 'normal' adolescents were administered self-report measures of personality (extraversion, neuroticism, and psychoticism), social skills, anomie, and delinquency in order to establish which of three theories best predicted delinquency. Eysenck's personality factors, particularly psychoticism, correlated most highly with delinquency. (RH)
Lesbian Mothers' Bids for Normalcy in Their Children's Schools
ERIC Educational Resources Information Center
Bower, Laura A.; Klecka, Cari L.
2009-01-01
Albeit growing in number, lesbian mothers and their children remain a statistical minority in schools. Lesbian mothers in this study described their families as "normal" or "just like any other family." From the perspective of queer theory, normal is a socially constructed and insidious concept. This study analyzes both the strategies participants…
Specific Language Impairment as a Period of Extended Optional Infinitive.
ERIC Educational Resources Information Center
Rice, Mabel L.; And Others
1995-01-01
This study evaluated an Extended Optional Infinitive theory of specific language impairment (SLI) in children, which suggests that SLI children omit finiteness markers longer than do normally developing children. Comparison of 18 SLI 5-year olds with 2 normally developing groups (ages 5 and 3) found that SLI subjects omitted finiteness markers…
Li, Heheng; Luo, Liangping; Huang, Li
2011-02-01
The present paper aimed to study the fractal spectrum of cerebral computerized tomography in 158 normal infants of different age groups, based on calculations from chaos theory. The distribution range in the neonatal period was 1.88-1.90 (mean = 1.8913 +/- 0.0064); it reached a stable condition at the level of 1.89-1.90 during 1-12 months of age (mean = 1.8927 +/- 0.0045); the normal range for 1-2 year old infants was 1.86-1.90 (mean = 1.8863 +/- 0.0085); and the value remained invariant within 1.88-1.91 (mean = 1.8958 +/- 0.0083) during 2-3 years of age. ANOVA indicated no significant difference between boys and girls (F = 0.243, P > 0.05), but the difference between age groups was significant (F = 8.947, P < 0.001). The fractal dimension of cerebral computerized tomography in normal infants, computed by box methods, was maintained at an efficient stability from 1.86 to 1.91. This indicates that there exist some attractor modes in pediatric brain development.
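A minimal sketch of the box method (box counting) used to compute such fractal dimensions, applied to a synthetic binary image rather than CT data (the function name is ours):

```python
import numpy as np

def box_counting_dimension(image, sizes=(1, 2, 4, 8, 16)):
    """Estimate the box-counting (fractal) dimension of a binary 2-D
    array as the slope of log N(s) versus log(1/s), where N(s) is the
    number of boxes of side s containing foreground pixels."""
    h, w = image.shape
    counts = []
    for s in sizes:
        n = 0
        for i in range(0, h, s):
            for j in range(0, w, s):
                if image[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# A filled square has dimension 2; the reported CT values of 1.86-1.91
# fall between a curve (dimension 1) and a filled plane (dimension 2).
d = box_counting_dimension(np.ones((64, 64), dtype=bool))
```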
NASA Astrophysics Data System (ADS)
Dykeman, Eric C.; Sankey, Otto F.
2010-02-01
We describe a technique for calculating the low-frequency mechanical modes and frequencies of a large symmetric biological molecule where the eigenvectors of the Hessian matrix are determined with full atomic detail. The method, which follows order-N methods used in electronic structure theory, determines the subset of lowest-frequency modes while using group theory to reduce the complexity of the problem. We apply the method to three icosahedral viruses of various T numbers and sizes; the human viruses polio and hepatitis B, and the cowpea chlorotic mottle virus, a plant virus. From the normal-mode eigenvectors, we use a bond polarizability model to predict a low-frequency Raman scattering profile for the viruses. The full atomic detail in the displacement patterns combined with an empirical potential-energy model allows a comparison of the fully atomic normal modes with elastic network models and normal-mode analysis with only dihedral degrees of freedom. We find that coarse-graining normal-mode analysis (particularly the elastic network model) can predict the displacement patterns for the first few (~10) low-frequency modes that are global and cooperative.
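The paper's order-N, group-theoretic method is what makes whole-virus calculations tractable, but the underlying normal-mode step is diagonalization of a mass-weighted Hessian. A minimal dense sketch on a toy system (not the paper's method; names and values are ours):

```python
import numpy as np

def normal_modes(hessian, masses):
    """Normal-mode frequencies and eigenvectors from a Cartesian Hessian:
    diagonalize the mass-weighted Hessian M^(-1/2) H M^(-1/2), then
    take omega_k = sqrt(lambda_k) for each eigenvalue lambda_k."""
    inv_sqrt_m = 1.0 / np.sqrt(np.repeat(masses, 3))  # 3 coords per atom
    mw_hessian = hessian * np.outer(inv_sqrt_m, inv_sqrt_m)
    eigvals, eigvecs = np.linalg.eigh(mw_hessian)  # ascending order
    freqs = np.sqrt(np.clip(eigvals, 0.0, None))
    return freqs, eigvecs

# Toy one-atom system with unit mass and decoupled springs k = 1, 4, 9.
freqs, modes = normal_modes(np.diag([1.0, 4.0, 9.0]), np.array([1.0]))
```

For a virus with millions of atoms the dense Hessian cannot be diagonalized directly, which is why the paper restricts attention to the lowest-frequency subset and exploits icosahedral symmetry.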
NASA Astrophysics Data System (ADS)
Mehrishal, Seyedahmad; Sharifzadeh, Mostafa; Shahriar, Korosh; Song, Jae-Jon
2016-12-01
Among all parameters that affect the friction of rocks, variable normal stress and slip rate are the most important second-order parameters. The shear-rate- and normal-stress-dependent friction behavior of rock discontinuities may significantly influence the dynamic responses of rock mass. In this research, two limestone rock types, which were travertine and onyx marble with slickensided and ground #80 surfaces, were prepared and CNL direct shear tests were performed on the joints under various shear conditions. The shearing rate varied from 0.1 to 50 mm/min under different normal stresses (from 2 to 30 % of UCS) in both dry and wet conditions. Experiments showed that the friction coefficient of slickensided and ground #80 surfaces of limestone increased with increasing shear velocity and decreased with increasing normal stress. Micro-asperity interlocking between ground #80 surfaces showed higher wear and an increase in friction coefficient (µ) compared to slickensided surfaces. Slickensided samples with moist surfaces showed an increase in the coefficient of friction compared to dry surfaces; however, on ground #80 surfaces, the moisture decreased the coefficient of friction to a smaller value. Slickensided limestone surfaces typically slide stably in dry conditions and by stick-slip when moist. The observed shear-rate- and normal-stress-dependent friction behavior can be explained by a framework similar to that of the adhesion theory of friction and a friction mechanism that involves the competition between microscopic dilatant slip and surface asperity deformation. The results have important implications for understanding the behavior of basic and residual friction coefficients of limestone rock surfaces.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu Jialu; Yang Chunnuan; Cai Hao
2007-04-15
After finding the basic solutions of the linearized nonlinear Schroedinger equation by the method of separation of variables, the perturbation theory for the dark soliton solution is constructed by linear Green's function theory. In application to the self-induced Raman scattering, the adiabatic corrections to the soliton's parameters are obtained and the remaining correction term is given as a pure integral with respect to the continuous spectral parameter.
Kovalev, Vadim M; Tse, Wang-Kong
2017-11-22
We develop a microscopic theory for the relaxation dynamics of an optically pumped two-level system (TLS) coupled to a weakly interacting Bose-gas bath. Using Keldysh formalism and diagrammatic perturbation theory, expressions for the relaxation times of the TLS Rabi oscillations are derived when the boson bath is in the normal state and the Bose-Einstein condensate (BEC) state. We apply our general theory to consider an irradiated quantum dot coupled with a boson bath consisting of a two-dimensional dipolar exciton gas. When the bath is in the BEC regime, relaxation of the Rabi oscillations is due to both condensate and non-condensate fractions of the bath bosons for weak TLS-light coupling and predominantly due to the non-condensate fraction for strong TLS-light coupling. Our theory also shows that a phase transition of the bath from the normal to the BEC state strongly influences the relaxation rate of the TLS Rabi oscillations. The TLS relaxation rate is approximately independent of the pump field frequency and monotonically dependent on the field strength when the bath is in the low-temperature regime of the normal phase. Phase transition of the dipolar exciton gas leads to a non-monotonic dependence of the TLS relaxation rate on both the pump field frequency and field strength, providing a characteristic signature for the detection of BEC phase transition of the coupled dipolar exciton gas.
Jungert, Tomas; Hesser, Hugo; Träff, Ulf
2014-10-01
In social cognitive theory, self-efficacy is domain-specific. An alternative model, the cross-domain influence model, would predict that self-efficacy beliefs in one domain might influence performance in other domains. Research has also found that children who receive special instruction are not good at estimating their performance. The aim was to test two models of how self-efficacy beliefs influence achievement, and to contrast children receiving special instruction in mathematics with normally-achieving children. The participants were 73 fifth-grade children who receive special instruction and 70 children who do not receive any special instruction. In years four and five, the children's skills in mathematics and reading were assessed by national curriculum tests, and in their fifth year, self-efficacy in mathematics and reading were measured. Structural equation modeling showed that in domains where children do not receive special instruction in mathematics, self-efficacy is a mediating variable between earlier and later achievement in the same domain. Achievement in mathematics was not mediated by self-efficacy in mathematics for children who receive special instruction. For normally-achieving children, earlier achievement in the language domain had an influence on later self-efficacy in the mathematics domain, and self-efficacy beliefs in different domains were correlated. Self-efficacy is mostly domain-specific, but may play a different role in academic performance depending on whether children receive special instruction. The results of the present study provided some support for the Cross-Domain Influence Model for normally-achieving children. © 2014 Scandinavian Psychological Associations and John Wiley & Sons Ltd.
An improved plate theory of order (1,2) for thick composite laminates
NASA Technical Reports Server (NTRS)
Tessler, A.
1992-01-01
A new (1,2)-order theory is proposed for the linear elasto-static analysis of laminated composite plates. The basic assumptions are those concerning the distribution through the laminate thickness of the displacements, transverse shear strains and the transverse normal stress, with these quantities regarded as some weighted averages of their exact elasticity theory representations. The displacement expansions are linear for the inplane components and quadratic for the transverse component, whereas the transverse shear strains and transverse normal stress are respectively quadratic and cubic through the thickness. The main distinguishing feature of the theory is that all strain and stress components are expressed in terms of the assumed displacements prior to the application of a variational principle. This is accomplished by an a priori least-square compatibility requirement for the transverse strains and by requiring exact stress boundary conditions at the top and bottom plate surfaces. Equations of equilibrium and associated Poisson boundary conditions are derived from the virtual work principle. It is shown that the theory is particularly suited for finite element discretization as it requires simple C^0- and C^(-1)-continuous displacement interpolation fields. Analytic solutions for the problem of cylindrical bending are derived and compared with the exact elasticity solutions and those of our earlier (1,2)-order theory based on the assumed displacements and transverse strains.
NASA Astrophysics Data System (ADS)
Kohno, M.
2018-03-01
Adopting hyperon-nucleon and hyperon-nucleon-nucleon interactions parametrized in chiral effective field theory, single-particle potentials of the Λ and Σ hyperons are evaluated in symmetric nuclear matter and in pure neutron matter within the framework of lowest-order Brueckner theory. The chiral NLO interaction bears strong ΛN-ΣN coupling. Although the Λ potential is repulsive if the coupling is switched off, the ΛN-ΣN correlation brings about the attraction consistent with empirical data. The Σ potential is repulsive, which is also consistent with empirical information. The interesting result is that the Λ potential becomes shallower beyond normal density. This provides the possibility of solving the hyperon puzzle without introducing ad hoc assumptions. The effects of the ΛNN-ΛNN and ΛNN-ΣNN three-baryon forces are considered. These three-baryon forces are first reduced to normal-ordered effective two-baryon interactions in nuclear matter and then incorporated in the G-matrix equation. The repulsion from the ΛNN-ΛNN interaction is of the order of 5 MeV at normal density and becomes larger with increasing density. The effects of the ΛNN-ΣNN coupling compensate for the repulsion at normal density. The net effect of the three-baryon interactions on the Λ single-particle potential is repulsive at higher densities.
Blanton, Hart; Jaccard, James
2006-01-01
Theories that posit multiplicative relationships between variables are common in psychology. A. G. Greenwald et al. recently presented a theory that explicated relationships between group identification, group attitudes, and self-esteem. Their theory posits a multiplicative relationship between concepts when predicting a criterion variable. Greenwald et al. suggested analytic strategies to test their multiplicative model that researchers might assume are appropriate for testing multiplicative models more generally. The theory and analytic strategies of Greenwald et al. are used as a case study to show the strong measurement assumptions that underlie certain tests of multiplicative models. It is shown that the approach used by Greenwald et al. can lead to declarations of theoretical support when the theory is wrong as well as rejection of the theory when the theory is correct. A simple strategy for testing multiplicative models that makes weaker measurement assumptions than the strategy proposed by Greenwald et al. is suggested and discussed.
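For concreteness, a multiplicative model is commonly tested by adding a product term to a linear regression, with the product-term coefficient carrying the multiplicative hypothesis. This sketch shows only that generic approach on synthetic data, not Greenwald et al.'s model or the measurement issues discussed above (names and values are ours):

```python
import numpy as np

def fit_interaction(x, z, y):
    """Ordinary least squares for y = b0 + b1*x + b2*z + b3*(x*z);
    the product-term coefficient b3 carries the multiplicative claim."""
    X = np.column_stack([np.ones_like(x), x, z, x * z])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

rng = np.random.default_rng(0)
x = rng.normal(size=200)
z = rng.normal(size=200)
y = 1.0 + 0.5 * x + 0.2 * z + 0.8 * x * z  # noiseless, true b3 = 0.8
beta = fit_interaction(x, z, y)
```

The measurement point made in the article is that inferences about b3 depend on the scale properties of x and z; with merely ordinal measures, a nonzero product-term estimate need not support a genuinely multiplicative theory.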
Analyzing Test-Taking Behavior: Decision Theory Meets Psychometric Theory.
Budescu, David V; Bo, Yuanchao
2015-12-01
We investigate the implications of penalizing incorrect answers to multiple-choice tests, from the perspective of both test-takers and test-makers. To do so, we use a model that combines a well-known item response theory model with prospect theory (Kahneman and Tversky, Prospect theory: An analysis of decision under risk, Econometrica 47:263-91, 1979). Our results reveal that when test-takers are fully informed of the scoring rule, the use of any penalty has detrimental effects for both test-takers (they are always penalized in excess, particularly those who are risk averse and loss averse) and test-makers (the bias of the estimated scores, as well as the variance and skewness of their distribution, increase as a function of the severity of the penalty).
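A minimal sketch of the expected item score under classical formula scoring, the baseline that the article's IRT-plus-prospect-theory model enriches (a risk-neutral expected value only; function names are ours):

```python
def expected_item_score(p_correct, penalty):
    """Expected score for answering one item under formula scoring:
    +1 if correct, -penalty if wrong; omitting the item scores 0."""
    return p_correct - (1.0 - p_correct) * penalty

def answering_beats_omitting(p_correct, penalty):
    """A risk-neutral test-taker answers rather than omits whenever
    the expected score of answering is positive."""
    return expected_item_score(p_correct, penalty) > 0.0

# The classic 1/(k-1) penalty makes blind guessing among k = 4 options
# break even in expectation.
es = expected_item_score(0.25, 1.0 / 3.0)
```

The article's finding follows from replacing this risk-neutral valuation with prospect-theoretic value and weighting functions: risk-averse and loss-averse test-takers omit items they "should" answer, so any penalty over-punishes them relative to this baseline.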
The Development of Early Pulsation Theory, or, How Cepheids Are Like Steam Engines
NASA Astrophysics Data System (ADS)
Stanley, M.
2012-06-01
The pulsation theory of Cepheid variable stars was a major breakthrough of early twentieth-century astrophysics. At the beginning of that century, the basic physics of normal stars was very poorly understood, and variable stars were even more mysterious. Breaking with accepted explanations in terms of eclipsing binaries, Harlow Shapley and A. S. Eddington pioneered novel theories that considered Cepheids as pulsating spheres of gas. Surprisingly, the pulsation theory not only depended on novel developments in stellar physics, but the theory also drove many of those developments. In particular, models of stars in radiative balance and theories of stellar energy were heavily inspired and shaped by ideas about variable stars. Further, the success of the pulsation theory helped justify the new approaches to astrophysics being developed before World War II.
Abnormal semantic knowledge in a case of developmental amnesia.
Blumenthal, Anna; Duke, Devin; Bowles, Ben; Gilboa, Asaf; Rosenbaum, R Shayna; Köhler, Stefan; McRae, Ken
2017-07-28
An important theory holds that semantic knowledge can develop independently of episodic memory. One strong source of evidence supporting this independence comes from the observation that individuals with early hippocampal damage leading to developmental amnesia generally perform normally on standard tests of semantic memory, despite their profound impairment in episodic memory. However, one aspect of semantic memory that has not been explored is conceptual structure. We built on the theoretically important distinction between intrinsic features of object concepts (e.g., shape, colour, parts) and extrinsic features (e.g., how something is used, where it is typically located). The accrual of extrinsic feature knowledge that is important for concepts such as chair or spoon may depend on binding mechanisms in the hippocampus. We tested HC, an individual with developmental amnesia due to a well-characterized lesion of the hippocampus, on her ability to generate semantic features for object concepts. HC generated fewer extrinsic features than controls, but a similar number of intrinsic features to controls. We also tested her on typicality ratings. Her typicality ratings were abnormal for nonliving things (which more strongly depend on extrinsic features), but normal for living things (which more strongly depend on intrinsic features). In contrast, NB, who has surgical MTL damage that spares the hippocampus, showed no impairments in either task. These results suggest that episodic and semantic memory are not entirely independent, and that the hippocampus is important for learning some aspects of conceptual knowledge. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Peter, Simon; Leine, Remco I.
2017-11-01
Phase resonance testing is one method for the experimental extraction of nonlinear normal modes. This paper proposes a novel method for nonlinear phase resonance testing. Firstly, the issue of appropriate excitation is approached on the basis of excitation power considerations. To this end, power quantities known from nonlinear systems theory in electrical engineering are transferred to nonlinear structural dynamics applications. A new power-based nonlinear mode indicator function is derived, which is generally applicable, reliable and easy to implement in experiments. Secondly, the tuning of the excitation phase is automated by the use of a Phase-Locked-Loop controller. This method provides a fast, user-friendly way of obtaining the backbone curve. Furthermore, the method makes it possible to exploit specific advantages of phase control, such as robustness for lightly damped systems and stabilization of unstable branches of the frequency response. The reduced tuning time for the excitation makes the commonly used free-decay measurements for the extraction of backbone curves unnecessary. Instead, steady-state measurements for every point of the curve are obtained. In conjunction with the new mode indicator function, the correlation of every measured point with the associated nonlinear normal mode of the underlying conservative system can be evaluated. Moreover, it is shown that the analysis of the excitation power helps to locate sources of inaccuracies in the force appropriation process. The method is illustrated by a numerical example and its functionality in experiments is demonstrated on a benchmark beam structure.
Sell, Stewart; Nicolini, Andrea; Ferrari, Paola; Biava, Pier M
2016-01-01
Current medical literature acknowledges that the embryonic micro-environment is able to suppress tumor development. Administering carcinogenic substances during organogenesis in fact leads to embryonic malformations, but not to offspring tumor growth. Once organogenesis has ended, administration of carcinogenic substances causes a rise in offspring tumor development. These data indicate that cancer can be considered a deviation in normal development, which can be regulated by factors of the embryonic microenvironment. Furthermore, it has been demonstrated that teratoma differentiates into normal tissues once it is implanted in the embryo. Recently, it has been shown that implanting a melanoma in the Zebrafish embryo did not result in tumor development; however, it did in the adult specimen. This demonstrates that cancer cells can differentiate into normal tissues when implanted in the embryo. In addition, it was demonstrated that other tumors can revert into a normal phenotype and/or differentiate into normal tissue when implanted in the embryo. These studies led some authors to define cancer as a problem of developmental biology and to anticipate the present concept of the "cancer stem cell" theory. In this review, we survey the most important research on the reprogramming and differentiation treatments of cancer cells to better clarify how substances taken from the developing embryo, or other biological substances, can induce differentiation of malignant cells. Lastly, a model of cancer conceived by one of us is proposed here, a model consistent with reality as demonstrated by a large body of research. This model integrates the theory of the "maturation arrest" of cancer cells as conceived by B. Pierce with the theory that describes cancer as a process of deterministic chaos determined by genetic and/or epigenetic alterations in differentiated cells, alterations that lead a normal cell to become cancerous.
All the research described here demonstrates that cancer can be considered a problem of developmental biology and that one of the most important hallmarks of cancer is the loss of differentiation, as we have already described in other articles.
NASA Astrophysics Data System (ADS)
Chen, Michael; Abdo-Sánchez, Elena; Epstein, Ariel; Eleftheriades, George V.
2018-03-01
Huygens' metasurfaces are electrically thin devices which allow arbitrary field transformations. Beam refraction is among the first demonstrations of realized metasurfaces. As previously shown for extreme-angle refraction, control over only the electric impedance and magnetic admittance of the Huygens' metasurface proved insufficient to produce the desired reflectionless field transformation. To maintain zero reflections for wide refraction angles, magnetoelectric coupling between the electric and magnetic response of the metasurface, leading to bianisotropy, can be introduced. In this paper, we report the theory, design, and experimental characterization of a reflectionless bianisotropic metasurface for extreme-angle refraction of a normally incident plane wave towards 71.8° at 20 GHz. The theory and design of three-layer asymmetric bianisotropic unit cells are discussed. The realized printed circuit board structure was tested via full-wave simulations as well as experimental characterization. To experimentally verify the prototype, two setups were used. A quasi-optical experiment was conducted to assess the specular reflections of the metasurface, while a far-field antenna measurement characterized its refraction nature. The measurements verify that the fabricated metasurface has negligible reflections and the majority of the scattered power is refracted to the desired Floquet mode. This provides an experimental demonstration of a reflectionless wide-angle refracting metasurface using a bianisotropic Huygens' metasurface at microwave frequencies.
ERIC Educational Resources Information Center
Cissna, Kenneth Norman
The purpose of this study was to test a theory of interpersonal communication in non-therapeutic relationships. The theory was derived primarily from the work of Carl Rogers and Robert Carkhuff in psychology and from Evelyn Sieburg's theory of interpersonal confirmation in speech communication. In order to test the three generated hypotheses, a…
Educational Measurement. Third Edition. American Council on Education Series on Higher Education.
ERIC Educational Resources Information Center
Linn, Robert L., Ed.
This collection explores the theory and applications of educational testing. It is divided into sections on theory and general principles of educational measurement, administration of tests and scoring, and applications of testing. The following chapters present information on test theory and use: (1) "Current Perspectives and Future…
Theory of Test Translation Error
ERIC Educational Resources Information Center
Solano-Flores, Guillermo; Backhoff, Eduardo; Contreras-Nino, Luis Angel
2009-01-01
In this article, we present a theory of test translation whose intent is to provide the conceptual foundation for effective, systematic work in the process of test translation and test translation review. According to the theory, translation error is multidimensional; it is not simply the consequence of defective translation but an inevitable fact…
NASA Astrophysics Data System (ADS)
Almrabat, Abdulhadi M.
The thesis presents the results of a study of the characterization and modeling of the stress- and pore-fluid-dependent acoustic properties of fractured porous rocks. A new laboratory High Pressure and High Temperature (HPHT) triaxial testing system was developed to characterize the seismic properties of sandstone under different levels of effective stress confinement and changes in pore-fluid composition. Intact and fractured Berea sandstone core samples were used in the experimental studies. The laboratory test results were used to develop analytical models for the stress-level- and pore-fluid-dependent seismic velocity of sandstones. Models for stress-dependent P- and S-wave seismic velocities of sandstone were then developed based on the assumption that stress dependencies come from the nonlinear elastic response of micro-fractures contained in the sample under normal and shear loading. The contact shear stiffness was assumed to increase linearly with the normal stress across a micro-fracture, while the contact normal stiffness was assumed to vary as a power law with the micro-fracture normal stress. Both nonlinear fracture normal and shear contact models were validated against experimental data available in the literature. To test the dependency of the seismic velocity of sandstone on changes in pore-fluid composition, another series of tests was conducted in which P- and S-wave velocities were monitored during injection of supercritical CO2 into samples of Berea sandstone initially saturated with saline water and under constant confining stress. Changes in seismic wave velocity were measured at different levels of supercritical CO2 saturation as the initial saline pore fluid was displaced by supercritical CO2. It was found that the P-wave velocity significantly decreased while the S-wave velocity remained almost constant as the supercritical CO2 saturation of the sample increased.
The dependency of the seismic velocity on changes in pore-fluid composition during injection of supercritical CO2 into Berea sandstone was modeled using a re-derived Biot-Gassmann substitution theory. In using the Biot-Gassmann substitution theory, it was found necessary to account for the changes in the pore-fluid compressibility in terms of the volumetric proportion and distribution of saline water and supercritical CO2 in the sample pore space. This was done by using the empirical model of Brie et al. to account for the compressibility of mixtures of two-phase immiscible fluids. The combined Biot-Gassmann and Brie et al. models were found to adequately represent the changes in P-wave velocity of Berea sandstone during displacement of saline water by supercritical CO2. The third experimental and modeling study addressed shear-wave splitting due to the presence of fractures in a rock mass. Tests were conducted using the high temperature and high pressure (HPHT) triaxial device on samples of Berea sandstone containing a single induced tensile fracture running along the height of the sample. The fracture was created via a modified Brazilian split test loading in which the edges of cylindrical samples were loaded at two diametrically opposite points by sharp guillotines. The Joint Roughness Coefficient (JRC) values of the fractured core samples were determined by profilometry and tilt tests. The effect of mismatching of the fracture surfaces on shear-wave splitting was investigated by applying different amounts of shear displacement to three core samples. The degree of mismatching of the fracture surfaces in the core samples was evaluated using the Joint Matching Coefficient (JMC).
Shear-wave splitting, as measured by the difference in magnitudes of shear-wave velocities parallel and perpendicular to the fracture, Vs1 and Vs2 respectively, increases with increasing mismatch of the fracture surfaces and decreases with increasing effective stress, and approaches zero in the effective stress range tested. A model for the stress and JMC dependent shear-wave splitting was developed based on the experimental observations. Finally, the magnitude of shear-wave splitting was correlated with the permeability of the fractured porous sandstone for fluid flow parallel to the induced fracture. (Abstract shortened by UMI.)
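The combined Gassmann-Brie calculation described above can be sketched compactly. The moduli and densities below are generic illustrative values for a brine- versus CO2-saturated sandstone, not the calibrated parameters of the thesis:

```python
import math

def brie_fluid_modulus(k_brine, k_co2, s_brine, e=3.0):
    # Brie et al. empirical mixing law for a two-phase immiscible pore fluid
    return (k_brine - k_co2) * s_brine**e + k_co2

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    # Gassmann substitution: saturated bulk modulus from dry-frame properties
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min**2
    return k_dry + num / den

def vp(k_sat, mu, rho):
    # P-wave velocity, in km/s when moduli are in GPa and density in g/cm^3
    return math.sqrt((k_sat + 4.0 * mu / 3.0) / rho)

# Illustrative (assumed) properties, not the thesis's measured values:
k_dry, k_min, mu, phi = 7.0, 37.0, 6.0, 0.20   # GPa, GPa, GPa, porosity
k_brine, k_co2 = 2.6, 0.08                     # fluid bulk moduli, GPa

v_brine = vp(gassmann_ksat(k_dry, k_min, brie_fluid_modulus(k_brine, k_co2, 1.0), phi), mu, 2.25)
v_co2   = vp(gassmann_ksat(k_dry, k_min, brie_fluid_modulus(k_brine, k_co2, 0.0), phi), mu, 2.15)
# v_brine > v_co2: replacing brine with the far more compressible CO2 lowers
# the fluid modulus and hence Vp, while Vs (which depends only on mu and rho)
# is nearly unchanged -- the pattern observed in the experiments.
```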
Prediction and control of chaotic processes using nonlinear adaptive networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, R.D.; Barnes, C.W.; Flake, G.W.
1990-01-01
We present the theory of nonlinear adaptive networks and discuss a few applications. In particular, we review the theory of feedforward backpropagation networks. We then present the theory of the Connectionist Normalized Linear Spline network in both its feedforward and iterated modes. Also, we briefly discuss the theory of stochastic cellular automata. We then discuss applications to chaotic time series, tidal prediction in Venice lagoon, finite differencing, sonar transient detection, control of nonlinear processes, control of a negative ion source, balancing a double inverted pendulum and design advice for free electron lasers and laser fusion targets.
NASA Astrophysics Data System (ADS)
Shimanovskii, A. V.
A method for calculating the plane bending of elastic-plastic filaments of finite stiffness is proposed on the basis of plastic flow theory. The problem considered is shown to reduce to relations similar to Kirchhoff equations for elastic work. Expressions are obtained for determining the normalized stiffness characteristics for the cross section of a filament with plastic regions containing beam theory equations as a particular case. A study is made of the effect of the plastic region size on the position of the elastic deformation-unloading interface and on the normalized stiffness of the filament cross section. Calculation results are presented in graphic form.
ERIC Educational Resources Information Center
DeMars, Christine E.
2012-01-01
In structural equation modeling software, either limited-information (bivariate proportions) or full-information item parameter estimation routines could be used for the 2-parameter item response theory (IRT) model. Limited-information methods assume the continuous variable underlying an item response is normally distributed. For skewed and…
ERIC Educational Resources Information Center
Boutis, Kathy; Pecaric, Martin; Seeto, Brian; Pusic, Martin
2010-01-01
Signal detection theory (SDT) parameters can describe a learner's ability to discriminate (d′) normal from abnormal and the learner's criterion (λ) to under- or overcall abnormalities. To examine the serial changes in SDT parameters with serial exposure to radiological cases. 46 participants were recruited for this study: 20…
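Both SDT parameters are simple functions of the hit and false-alarm rates. A minimal sketch with hypothetical counts (the abstract's λ is a likelihood-ratio criterion; the location criterion c computed here is the related, more common summary of response bias):

```python
from statistics import NormalDist

def sdt_params(hits, misses, false_alarms, correct_rejections):
    # d' = z(hit rate) - z(false-alarm rate); criterion c = -(z(H) + z(F)) / 2
    z = NormalDist().inv_cdf
    h = hits / (hits + misses)
    f = false_alarms / (false_alarms + correct_rejections)
    return z(h) - z(f), -0.5 * (z(h) + z(f))

# Hypothetical learner reading 100 abnormal and 100 normal radiographs
d_prime, criterion = sdt_params(hits=80, misses=20,
                                false_alarms=30, correct_rejections=70)
# d_prime is moderately positive (some discrimination ability); the negative
# criterion indicates a liberal bias, i.e. a tendency to overcall abnormalities.
```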
Ramsay-Curve Item Response Theory for the Three-Parameter Logistic Item Response Model
ERIC Educational Resources Information Center
Woods, Carol M.
2008-01-01
In Ramsay-curve item response theory (RC-IRT), the latent variable distribution is estimated simultaneously with the item parameters of a unidimensional item response model using marginal maximum likelihood estimation. This study evaluates RC-IRT for the three-parameter logistic (3PL) model with comparisons to the normal model and to the empirical…
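RC-IRT estimates the latent density jointly with the item parameters, but the item model itself is the standard 3PL curve, which can be sketched directly. Parameter values here are illustrative, not from the study:

```python
import math

def p_3pl(theta, a, b, c):
    # Three-parameter logistic model: lower asymptote c ("guessing"),
    # discrimination a, difficulty b, latent trait theta
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# An item with a 20% guessing floor, moderate discrimination, average difficulty
p_low  = p_3pl(-3.0, a=1.5, b=0.0, c=0.2)   # low-ability examinee: near the floor c
p_mid  = p_3pl( 0.0, a=1.5, b=0.0, c=0.2)   # at theta = b: (1 + c) / 2 = 0.6
p_high = p_3pl( 3.0, a=1.5, b=0.0, c=0.2)   # high-ability examinee: near 1
```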
ERIC Educational Resources Information Center
Woods, Carol M.; Thissen, David
2006-01-01
The purpose of this paper is to introduce a new method for fitting item response theory models with the latent population distribution estimated from the data using splines. A spline-based density estimation system provides a flexible alternative to existing procedures that use a normal distribution, or a different functional form, for the…
Chaos Theory as a Model for Life Transitions Counseling: Nonlinear Dynamics and Life's Changes
ERIC Educational Resources Information Center
Bussolari, Cori J.; Goodell, Judith A.
2009-01-01
Chaos theory is presented for counselors working with clients experiencing life transitions. It is proposed as a model that considers disorder, unpredictability, and lack of control as normal parts of transition processes. Nonlinear constructs from physics are adapted for use in counseling. The model provides a method clients can use to…
ERIC Educational Resources Information Center
Bulcock, J. W.; And Others
Multicollinearity refers to the presence of highly intercorrelated independent variables in structural equation models, that is, models estimated by using techniques such as least squares regression and maximum likelihood. There is a problem of multicollinearity in both the natural and social sciences where theory formulation and estimation is in…
Ozbay, Ozden; Ozcan, Yusuf Ziya
2006-12-01
Travis Hirschi's social bonding theory has mostly been tested in the West. In this study, the theory is tested on juvenile delinquency in a developing country, Turkey. Data were gathered from 1,710 high school students in Ankara by using two-stage stratified cluster sampling. Factor analysis was employed to determine the dimensions of juvenile delinquency (assault, school delinquency, and public disturbance), and regression analysis was used to test the theory. Similar to some other traditional societies, the social bonding theory plays an important role in the explanation of juvenile delinquency in Turkey.
DOE Office of Scientific and Technical Information (OSTI.GOV)
La Russa, D. J.; Rogers, D. W. O.
EGSnrc calculations of ion chamber response and Spencer-Attix (SA) restricted stopping-power ratios are used to test the assumptions of the SA cavity theory and to assess the accuracy of this theory as it applies to the air kerma formalism for 60Co beams. Consistent with previous reports, the EGSnrc calculations show that the SA cavity theory, as it is normally applied, requires a correction for the perturbation of the charged particle fluence (K_fl) by the presence of the cavity. The need for K_fl corrections arises from the fact that the standard prescription for choosing the low-energy threshold Δ in the SA restricted stopping-power ratio consistently underestimates the values of Δ needed if no perturbation to the fluence is assumed. The use of fluence corrections can be avoided by appropriately choosing Δ, but it is not clear how Δ can be calculated from first principles. Values of Δ required to avoid K_fl corrections were found to be consistently higher than Δ values obtained using the conventional approach and were also observed to depend on the composition of the wall in addition to the cavity size. Values of K_fl have been calculated for many of the graphite-walled ion chambers used by the national metrology institutes around the world and found to be within 0.04% of unity in all cases, with an uncertainty of about 0.02%.
NASA Astrophysics Data System (ADS)
Sahoo, B. K.; Das, B. P.
2018-05-01
Recent relativistic coupled-cluster (RCC) calculations of electric dipole moments (EDMs) of diamagnetic atoms due to parity and time-reversal violating (P ,T -odd) interactions, which are essential ingredients for probing new physics beyond the standard model of particle interactions, differ substantially from the previous theoretical results. It is therefore necessary to perform an independent test of the validity of these results. In view of this, the normal coupled-cluster method has been extended to the relativistic regime [relativistic normal coupled-cluster (RNCC) method] to calculate the EDMs of atoms by simultaneously incorporating the electrostatic and P ,T -odd interactions in order to overcome the shortcomings of the ordinary RCC method. This new relativistic method has been applied to 199Hg, which currently has a lower EDM limit than that of any other system. The results of our RNCC and self-consistent RCC calculations of the EDM of this atom are found to be close. The discrepancies between these two results on the one hand and those of previous calculations on the other are elucidated. Furthermore, the electric dipole polarizability of this atom, which has computational similarities with the EDM, is evaluated and it is in very good agreement with its measured value.
Normalizing memory recall in fibromyalgia with rehearsal: a distraction-counteracting effect.
Leavitt, Frank; Katz, Robert S
2009-06-15
To examine the impact of distraction on the retention of rehearsed information in patients with fibromyalgia syndrome (FMS). Data refer to the neurocognitive examination of 134 patients (91 with FMS and 43 control subjects) presenting with memory loss. Four neurocognitive measures free of distraction, along with 2 measures with added distraction, were completed. Differences in the retention of rehearsed and unrehearsed information with a source of distraction present were calculated. Patients with FMS showed normal cognitive functioning on verbal memory tests free of distraction. Adding a source of distraction caused unrefreshed information to be lost at a disproportionate rate in patients with FMS. Over 87% of patients with FMS scored in the impaired range on a task of unrehearsed verbal memory. Adding a source of distraction to well-rehearsed information produced a normal rate of recall in FMS. Rehearsal mechanisms are intact in patients with FMS and play beneficial roles in managing interference from a source of distraction. In the absence of rehearsal, a source of distraction added to unrefreshed information signals a remarkable level of cognitive deficit in FMS that goes undetected by conventionally relied-upon neurocognitive measures. We present a theory to promote understanding of the cognitive deficit of people with FMS based on reduced speed of lexical activation and poor recall after distraction.
Neurodynamic system theory: scope and limits.
Erdi, P
1993-06-01
This paper proposes that neurodynamic system theory may be used to connect structural and functional aspects of neural organization. The paper claims that generalized causal dynamic models are proper tools for describing the self-organizing mechanism of the nervous system. In particular, it is pointed out that ontogeny, development, normal performance, learning, and plasticity, can be treated by coherent concepts and formalism. Taking into account the self-referential character of the brain, autopoiesis, endophysics and hermeneutics are offered as elements of a poststructuralist brain (-mind-computer) theory.
Yu, Jia-Lu; Yang, Chun-Nuan; Cai, Hao; Huang, Nian-Ning
2007-04-01
After finding the basic solutions of the linearized nonlinear Schrödinger equation by the method of separation of variables, the perturbation theory for the dark soliton solution is constructed by linear Green's function theory. In application to the self-induced Raman scattering, the adiabatic corrections to the soliton's parameters are obtained and the remaining correction term is given as a pure integral with respect to the continuous spectral parameter.
Accelerated degradation of silicon metallization systems
NASA Technical Reports Server (NTRS)
Lathrop, J. W.
1983-01-01
Clemson University has been engaged for the past five years in a program to determine the reliability attributes of solar cells by means of accelerated test procedures. The cells are electrically measured and visually inspected and then subjected for a period of time to stress in excess of that normally encountered in use, and then they are reinspected. Changes are noted and the process repeated. This testing has thus far involved 23 different unencapsulated cell types from 12 different manufacturers, and 10 different encapsulated cell types from 9 different manufacturers. Reliability attributes of metallization systems can be classified as major or minor, depending on the severity of the effects observed. As a result of the accelerated testing conducted under the Clemson program, major effects have been observed related to contact resistance and to mechanical adherence and solderability. This paper does not attempt a generalized survey of accelerated test results, but rather concentrates on one particular attribute of metallization that has been observed to cause electrical degradation - increased contact resistance due to Schottky barrier formation. In this example basic semiconductor theory was able to provide an understanding of the electrical effects observed during accelerated stress testing.
Kim, Jeong Chul; Wang, Li; Shen, Dinggang; Lin, Weili
2016-12-02
The first year of life is the most critical time period for structural and functional development of the human brain. Combining longitudinal MR imaging and finite strain theory, this study aimed to provide new insights into normal brain development through a biomechanical framework. Thirty-three normal infants were longitudinally imaged using MRI from 2 weeks to 1 year of age. Voxel-wise Jacobian determinant was estimated to elucidate volumetric changes while Lagrange strains (both normal and shear strains) were measured to reveal directional growth information every 3 months during the first year of life. Directional normal strain maps revealed that, during the first 6 months, the growth pattern of gray matter is anisotropic and spatially inhomogeneous with higher left-right stretch around the temporal lobe and interhemispheric fissure, anterior-posterior stretch in the frontal and occipital lobes, and superior-inferior stretch in right inferior occipital and right inferior temporal gyri. In contrast, anterior lateral ventricles and insula showed an isotropic stretch pattern. Volumetric and directional growth rates were linearly decreased with age for most of the cortical regions. Our results revealed anisotropic and inhomogeneous brain growth patterns of the human brain during the first year of life using longitudinal MRI and a biomechanical framework.
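The two growth measures used in this framework follow directly from the deformation gradient F of the image registration: the Jacobian determinant det F gives the local volume change, and the Green-Lagrange strain E = (FᵀF − I)/2 gives the directional normal and shear strains. A pure-Python sketch for a single voxel with a hypothetical deformation gradient (not data from the study):

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def transpose(A):
    return [list(r) for r in zip(*A)]

def det3(A):
    return (A[0][0] * (A[1][1] * A[2][2] - A[1][2] * A[2][1])
          - A[0][1] * (A[1][0] * A[2][2] - A[1][2] * A[2][0])
          + A[0][2] * (A[1][0] * A[2][1] - A[1][1] * A[2][0]))

def growth_measures(F):
    # Jacobian determinant: local volume change.
    # Green-Lagrange strain E = (F^T F - I) / 2: directional normal/shear strains.
    C = matmul(transpose(F), F)
    E = [[0.5 * (C[i][j] - (1.0 if i == j else 0.0)) for j in range(3)] for i in range(3)]
    return det3(F), E

# Hypothetical deformation gradient for one voxel: 10% left-right stretch,
# 5% anterior-posterior stretch, no superior-inferior change and no shear
F = [[1.10, 0.00, 0.00],
     [0.00, 1.05, 0.00],
     [0.00, 0.00, 1.00]]
J, E = growth_measures(F)
# J = 1.155 (15.5% local volume gain); the diagonal of E holds the normal
# strains along each axis, and the off-diagonal entries (zero here) the shears.
```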
ERIC Educational Resources Information Center
Kohli, Nidhi; Koran, Jennifer; Henn, Lisa
2015-01-01
There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…
Problems of the theory of superconductivity which involve spatial inhomogeneity
NASA Astrophysics Data System (ADS)
Svidzinskii, A. V.
This book is concerned with questions which are related to equilibrium phenomena in superconductors, giving particular attention to effects determined by a spatial variation of the order parameter. The microscopic theory of superconductivity is developed on the basis of a model which takes into account the direct interaction between electrons. The theory of current relations in superconductors is discussed, taking into consideration the magnetic properties of superconductors in weak fields and the Meissner effect. Aspects regarding the general theory of tunneling are also explored, including the Josephson effect. An investigation is conducted of the theory of current conditions in areas in which the superconductor is in contact with normally conducting metal.
Buckling analysis for anisotropic laminated plates under combined inplane loads
NASA Technical Reports Server (NTRS)
Viswanathan, A. V.; Tamekuni, M.; Baker, L. L.
1974-01-01
The buckling analysis presented considers rectangular flat or curved general laminates subjected to combined inplane normal and shear loads. Linear theory is used in the analysis. All prebuckling deformations and any initial imperfections are ignored. The analysis method can be readily extended to longitudinally stiffened structures subjected to combined inplane normal and shear loads.
ERIC Educational Resources Information Center
CANTOR, GORDON N.; GIRARDEAU, FREDERIC L.
This inquiry investigated discrimination learning processes in trainable mongoloid children as compared with normal preschool children. Its purpose was to contribute to general behavior theory and to the knowledge of mental deficiency by seeing if such variables as transfer of training, acquired distinctiveness of cues, and acquired equivalence of…
ERIC Educational Resources Information Center
Peterson, Candida C.; Siegal, Michael
1997-01-01
Examined reasoning in normal, autistic, and deaf individuals. Found that deaf individuals who grow up in hearing homes without fluent signers show selective impairments in theory of mind similar to those of autistic individuals. Results suggest that conversational differences in the language children hear accounts for distinctive patterns of…
ERIC Educational Resources Information Center
Buium, Nissan; Rynders, John
To demonstrate that the child learning language constructs his theory of language on the basis of the linguistic data available to him, this study investigated 21 linguistic parameters that Down's Syndrome and normal children are exposed to in their maternal linguistic environment. It was found that mothers produced certain levels of linguistic…
From practice to midrange theory and back again: Beck's theory of postpartum depression.
Lasiuk, Gerri C; Ferguson, Linda M
2005-01-01
This article presents a brief overview of theory as background for a more detailed discussion of midrange theory-its origins, the critical role for midrange theory in the development of nursing practice knowledge, and the criteria for evaluating midrange theory. We then chronicle Cheryl Tatano Beck's program of research on postpartum depression (PPD) and advance the thesis that her theory of PPD, titled Teetering on the Edge, is an exemplar of a substantive midrange nursing theory. We demonstrate Beck's progression from identification of a clinical problem to exploratory-descriptive research, to concept analysis and midrange theory development, and finally to the application and testing of the theory in the clinical setting. Through ongoing refinement and testing of her theory, Beck has increased its generalizability across various practice settings and continually identifies new issues for investigation. Beck's program of research on PPD exemplifies using nursing outcomes to build and test nursing practice knowledge.
An Alternative Approach to Identifying a Dimension in Second Language Proficiency.
ERIC Educational Resources Information Center
Griffin, Patrick E.; And Others
Current practice in language testing has not yet integrated classical test theory with assessment of language skills. In addition, language testing needs to be part of theory development. Lack of sound testing procedures can lead to problems in research design and ultimately, inappropriate theory development. The debate over dimensionality of…
Testing approximate theories of first-order phase transitions on the two-dimensional Potts model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dasgupta, C.; Pandit, R.
The two-dimensional, q-state (q > 4) Potts model is used as a testing ground for approximate theories of first-order phase transitions. In particular, the predictions of a theory analogous to the Ramakrishnan-Yussouff theory of freezing are compared with those of ordinary mean-field (Curie-Weiss) theory. It is found that the Curie-Weiss theory is a better approximation than the Ramakrishnan-Yussouff theory, even though the former neglects all fluctuations. It is shown that the Ramakrishnan-Yussouff theory overestimates the effects of fluctuations in this system. The reasons behind the failure of the Ramakrishnan-Yussouff approximation and the suitability of using the two-dimensional Potts model as a testing ground for these theories are discussed.
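The mean-field treatment referenced above reduces, for the q-state Potts model, to a single self-consistency equation for the order parameter. The sketch below uses the textbook Curie-Weiss form (an illustration of the general technique, not the paper's actual calculation) and shows the first-order character for q > 4: the ordered solution disappears discontinuously as the coupling weakens.

```python
import math

def potts_mean_field_m(q, a, iters=200):
    """Solve the mean-field Potts self-consistency equation
        m = (exp(a*m) - 1) / (exp(a*m) + q - 1),  a = beta*J*z,
    by fixed-point iteration starting from the fully ordered side."""
    m = 1.0
    for _ in range(iters):
        e = math.exp(a * m)
        m = (e - 1.0) / (e + q - 1.0)
    return m

# For q = 5 the transition is first order in mean field: at strong
# coupling the ordered fixed point survives, at weak coupling only
# the disordered solution m = 0 remains.
m_ordered = potts_mean_field_m(5, 4.0)     # strong coupling
m_disordered = potts_mean_field_m(5, 1.0)  # weak coupling
```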
Lin, Bih-Jiau; Chiou, Wen-Bin
2010-06-01
English competency has become essential for obtaining a better job or succeeding in higher education in Taiwan. Thus, passing the General English Proficiency Test is important for college students in Taiwan. The current study applied Ajzen's theory of planned behavior and the notions of outcome expectancy and self-efficacy from Bandura's social cognitive theory to investigate college students' intentions to take the General English Proficiency Test. The formal sample consisted of 425 undergraduates (217 women, 208 men; M age = 19.5 yr., SD = 1.3). The theory of planned behavior showed greater predictive ability (R² = 33%) of intention than the social cognitive theory (R² = 7%) in regression analysis and made a unique contribution to prediction of actual test-taking behavior one year later in logistic regression. Within-model analyses indicated that subjective norm in theory of planned behavior and outcome expectancy in social cognitive theory are crucial factors in predicting intention. Implications for enhancing undergraduates' intentions to take the English proficiency test are discussed.
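The kind of explained-variance comparison reported above can be sketched with ordinary least squares. The example below is entirely synthetic (the variable names mirror the two theories' constructs, but the data and coefficients are invented, not the study's measures): intentions are generated mainly from the planned-behavior predictors, so that model should recover the larger R².

```python
import numpy as np

rng = np.random.default_rng(0)
n = 425  # same sample size as the study, but simulated data

# Hypothetical standardized predictor scores for the two models:
attitude, subjective_norm, perceived_control = rng.normal(size=(3, n))
outcome_expectancy, self_efficacy = rng.normal(size=(2, n))

# Generate intention mainly from the planned-behavior constructs:
intention = (0.4 * attitude + 0.5 * subjective_norm
             + 0.2 * perceived_control + rng.normal(scale=0.8, size=n))

def r_squared(predictors, y):
    """R^2 of an ordinary least-squares fit with intercept."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_tpb = r_squared([attitude, subjective_norm, perceived_control], intention)
r2_sct = r_squared([outcome_expectancy, self_efficacy], intention)
```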
Hidden Fermi liquid: Self-consistent theory for the normal state of high-Tc superconductors
NASA Astrophysics Data System (ADS)
Casey, Philip A.
The anomalous "strange metal" properties of the normal, non-superconducting state of the high-Tc cuprate superconductors have been extensively studied for over two decades. The resistivity is robustly T-linear at high temperatures, while at low T it appears to maintain linearity near optimal doping and is T² at higher doping. The inverse Hall angle is strictly T² and hence has a distinct scattering lifetime from the resistivity. The transport scattering lifetime is highly anisotropic as directly measured by angle-dependent magnetoresistance (ADMR) and indirectly in more traditional transport experiments. The IR conductivity exhibits a non-integer power-law in frequency, which we take as a defining characteristic of the "strange metal". A phenomenological theory of the transport and spectroscopic properties at a self-consistent and predictive level has been much sought after, yet elusive. Hidden Fermi liquid theory (HFL) explicitly accounts for the effects of Gutzwiller projection in the t-J Hamiltonian, widely believed to contain the essential physics of the high-Tc superconductors. We show this theory to be the first self-consistent description for the normal state of the cuprates based on transparent, fundamental assumptions. Our well-defined formalism also serves as a guide for further experimental confirmation. Chapter 1 reviews the "strange metal" properties and the relevant aspects of competing models. Chapter 2 presents the theoretical foundations of the formalism. Chapters 3 and 4 derive expressions for the entire normal state relating many of the properties, for example: angle-resolved photoemission, IR conductivity, resistivity, Hall angle, and by generalizing the formalism to include the Fermi surface topology---ADMR. Self-consistency is demonstrated with experimental comparisons, including the most recent laser-ARPES and ADMR.
Chapter 5 discusses entropy transport, as in the thermal conductivity, thermal Hall conductivity, and consequent metrics of non-Fermi liquid behavior such as the Wiedemann-Franz and Kadowaki-Woods ratios.
Hydrogels for engineering: normalization of swelling due to arbitrary stimulus
NASA Astrophysics Data System (ADS)
Ehrenhofer, Adrian; Wallmersperger, Thomas
2017-04-01
In engineering, materials are chosen from databases: Engineers rely on specific parameters such as Young's modulus, yield stress or thermal expansion coefficients for a desired application. For hydrogels, the choice of materials is rather tedious since no generalized material parameters are currently available to quantify the swelling behavior. The normalization of swelling, which we present in the current work, allows an easy comparison of different hydrogel materials. Thus, for a specific application like a sensor or an actuator, an adequate material can be chosen. In the current work, we present the process of normalization and provide a course of action for the data analysis. Special challenges for hydrogels like hysteresis, conditional multi-sensitivity and anisotropic swelling are addressed. Then, the Temperature Expansion Model is briefly described and applied. Using the derived normalized swelling curves, a nonlinear expansion coefficient β(F) is derived. The derived material behavior is used in an analytical model to predict the bending behavior of a beam made of thermo-responsive hydrogel material under an anisotropic temperature load. A bending behavior of the beam can be observed and the impact of other geometry and material parameters can be investigated. To overcome the limitations of the one-dimensional beam theory, the material behavior and geometry can be implemented in Finite Element analysis tools. Thus, novel applications for hydrogels in various fields can be envisioned, designed and tested. This can lead to a wider use of smart materials in sensor or actuator devices even by engineers without chemical background.
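The normalization idea can be sketched in a few lines: map each measured swelling curve onto [0, 1] so gels with different absolute swelling ratios become comparable, then take the local slope with respect to the stimulus as a nonlinear expansion coefficient. This is a minimal sketch of the concept with invented data; the paper's exact normalization scheme may differ.

```python
import numpy as np

def normalize_swelling(swelling):
    """Map a measured swelling curve onto [0, 1] (min-max normalization),
    so different hydrogels can be compared on a common scale."""
    s = np.asarray(swelling, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

def expansion_coefficient(stimulus, norm_swelling):
    """Nonlinear expansion coefficient: local slope of the normalized
    swelling curve with respect to the stimulus value."""
    return np.gradient(np.asarray(norm_swelling, dtype=float),
                       np.asarray(stimulus, dtype=float))

# Example: a fictitious thermo-responsive gel that deswells on heating.
temperature = np.array([25.0, 30.0, 35.0, 40.0, 45.0])  # degC (made up)
swelling_ratio = np.array([4.0, 3.6, 2.4, 1.4, 1.0])    # made-up data
q_norm = normalize_swelling(swelling_ratio)
beta = expansion_coefficient(temperature, q_norm)
```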
Theory of mind in early psychosis.
Langdon, Robyn; Still, Megan; Connors, Michael H; Ward, Philip B; Catts, Stanley V
2014-08-01
A deficit in theory of mind--the ability to infer and reason about the mental states of others - might underpin the poor social functioning of patients with psychosis. Unfortunately, however, there is considerable variation in how such a deficit is assessed. The current study compared three classic tests of theory of mind in terms of their ability to detect impairment in patients in the early stages of psychosis. Twenty-three patients within 2 years of their first psychotic episode and 19 healthy controls received picture-sequencing, joke-appreciation and story-comprehension tests of theory of mind. Whereas the picture-sequencing and joke-appreciation tests successfully detected a selective theory-of-mind deficit in patients, the story-comprehension test did not. The findings suggest that tests that place minimal demands on language processing and involve indirect, rather than explicit, instructions to assess theory of mind might be best suited to detecting theory-of-mind impairment in early stages of psychosis. © 2013 Wiley Publishing Asia Pty Ltd.
NNLO QCD corrections to Higgs boson production at large transverse momentum
NASA Astrophysics Data System (ADS)
Chen, X.; Cruz-Martinez, J.; Gehrmann, T.; Glover, E. W. N.; Jaquier, M.
2016-10-01
We derive the second-order QCD corrections to the production of a Higgs boson recoiling against a parton with finite transverse momentum, working in the effective field theory in which the top quark contributions are integrated out. To account for quark mass effects, we supplement the effective field theory result by the full quark mass dependence at leading order. Our calculation is fully differential in the final state kinematics and includes the decay of the Higgs boson to a photon pair. It allows one to make next-to-next-to-leading order (NNLO)-accurate theory predictions for Higgs-plus-jet final states and for the transverse momentum distribution of the Higgs boson, accounting for the experimental definition of the fiducial cross sections. The NNLO QCD corrections are found to be moderate and positive; they lead to a substantial reduction of the theory uncertainty on the predictions. We compare our results to 8 TeV LHC data from ATLAS and CMS. While the shape of the data is well-described for both experiments, we agree on the normalization only for CMS. By normalizing data and theory to the inclusive fiducial cross section for Higgs production, good agreement is found for both experiments, however at the expense of an increased theory uncertainty. We make predictions for Higgs production observables at the 13 TeV LHC, which are in good agreement with recent ATLAS data. At this energy, the leading order mass corrections to the effective field theory prediction become significant at large transverse momenta, and we discuss the resulting uncertainties on the predictions.
Shany-Ur, Tal; Poorzand, Pardis; Grossman, Scott N; Growdon, Matthew E; Jang, Jung Y; Ketelle, Robin S; Miller, Bruce L; Rankin, Katherine P
2012-01-01
Comprehension of insincere communication is an important aspect of social cognition requiring visual perspective taking, emotion reading, and understanding others' thoughts, opinions, and intentions. Someone who is lying intends to hide their insincerity from the listener, while a sarcastic speaker wants the listener to recognize they are speaking insincerely. We investigated whether face-to-face testing of comprehending insincere communication would effectively discriminate among neurodegenerative disease patients with different patterns of real-life social deficits. We examined ability to comprehend lies and sarcasm from a third-person perspective, using contextual cues, in 102 patients with one of four neurodegenerative diseases (behavioral variant frontotemporal dementia [bvFTD], Alzheimer's disease [AD], progressive supranuclear palsy [PSP], and vascular cognitive impairment) and 77 healthy older adults (normal controls--NCs). Participants answered questions about videos depicting social interactions involving deceptive, sarcastic, or sincere speech using The Awareness of Social Inference Test. All subjects equally understood sincere remarks, but bvFTD patients displayed impaired comprehension of lies and sarcasm compared with NCs. In other groups, impairment was not disease-specific but was proportionate to general cognitive impairment. Analysis of the task components revealed that only bvFTD patients were impaired on perspective taking and emotion reading elements and that both bvFTD and PSP patients had impaired ability to represent others' opinions and intentions (i.e., theory of mind). Test performance correlated with informants' ratings of subjects' empathy, perspective taking and neuropsychiatric symptoms in everyday life. Comprehending insincere communication is complex and requires multiple cognitive and emotional processes vulnerable across neurodegenerative diseases. 
However, bvFTD patients show uniquely focal and severe impairments at every level of theory of mind and emotion reading, leading to an inability to identify obvious examples of deception and sarcasm. This is consistent with studies suggesting this disease targets a specific neural network necessary for perceiving social salience and predicting negative social outcomes. Copyright © 2011 Elsevier Srl. All rights reserved.
ERIC Educational Resources Information Center
Cheung, Nicole W. T.; Cheung, Yuet W.
2008-01-01
The objectives of this study were to test the predictive power of self-control theory for delinquency in a Chinese context, and to explore if social factors as predicted in social bonding theory, differential association theory, general strain theory, and labeling theory have effects on delinquency in the presence of self-control. Self-report data…
McDowell, J J; Calvin, Olivia L; Hackett, Ryan; Klapes, Bryan
2017-07-01
Two competing predictions of matching theory and an evolutionary theory of behavior dynamics, and one additional prediction of the evolutionary theory, were tested in a critical experiment in which human participants worked on concurrent schedules for money (Dallery et al., 2005). The three predictions concerned the descriptive adequacy of matching theory equations, and of equations describing emergent equilibria of the evolutionary theory. Tests of the predictions falsified matching theory and supported the evolutionary theory. Copyright © 2017 Elsevier B.V. All rights reserved.
Representing metarepresentations: is there theory of mind-specific cognition?
Egeth, Marc; Kurzban, Robert
2009-03-01
What cognitive mechanisms underlie Theory of Mind? Some infer domain-specific Theory of Mind cognition based on the pattern of children diagnosed with autism failing the False Belief test but passing the False Photograph test. However, we argue that the False Belief test entails various task demands the False Photograph task does not, including the necessity to represent a higher-order representation (a metarepresentation), thus confounding the inference of domain-specificity. Instead, a general difficulty that affects representations of metarepresentations might account for the seeming domain-specific failure. Here we find that children who fail the False Belief test but pass the False Photograph test also fail the Meta Photograph test, a new photograph-domain test that requires subjects to represent a metarepresentation. We conclude that people who fail the False Belief test but pass the False Photograph test do not necessarily have a content-specific Theory of Mind deficit. Instead, the general ability to represent representations and metarepresentations might underlie Theory of Mind.
Examination of the neighborhood activation theory in normal and hearing-impaired listeners.
Dirks, D D; Takayanagi, S; Moshfegh, A; Noffsinger, P D; Fausti, S A
2001-02-01
Experiments were conducted to examine the effects of lexical information on word recognition among normal hearing listeners and individuals with sensorineural hearing loss. The lexical factors of interest were incorporated in the Neighborhood Activation Model (NAM). Central to this model is the concept that words are recognized relationally in the context of other phonemically similar words. NAM suggests that words in the mental lexicon are organized into similarity neighborhoods and the listener is required to select the target word from competing lexical items. Two structural characteristics of similarity neighborhoods that influence word recognition have been identified: "neighborhood density" or the number of phonemically similar words (neighbors) for a particular target item and "neighborhood frequency" or the average frequency of occurrence of all the items within a neighborhood. A third lexical factor, "word frequency" or the frequency of occurrence of a target word in the language, is assumed to optimize the word recognition process by biasing the system toward choosing a high frequency over a low frequency word. Three experiments were performed. In the initial experiments, word recognition for consonant-vowel-consonant (CVC) monosyllables was assessed in young normal hearing listeners by systematically partitioning the items into the eight possible lexical conditions that could be created by two levels of the three lexical factors, word frequency (high and low), neighborhood density (high and low), and average neighborhood frequency (high and low). Neighborhood structure and word frequency were estimated computationally using a large, on-line lexicon based on Webster's Pocket Dictionary. From this program 400 highly familiar, monosyllables were selected and partitioned into eight orthogonal lexical groups (50 words/group).
The 400 words were presented randomly to normal hearing listeners in speech-shaped noise (Experiment 1) and "in quiet" (Experiment 2) as well as to an elderly group of listeners with sensorineural hearing loss in the speech-shaped noise (Experiment 3). The results of three experiments verified predictions of NAM in both normal hearing and hearing-impaired listeners. In each experiment, words from low density neighborhoods were recognized more accurately than those from high density neighborhoods. The presence of high frequency neighbors (average neighborhood frequency) produced poorer recognition performance than comparable conditions with low frequency neighbors. Word frequency was found to have a highly significant effect on word recognition. Lexical conditions with high word frequencies produced higher performance scores than conditions with low frequency words. The results supported the basic tenets of NAM theory and identified both neighborhood structural properties and word frequency as significant lexical factors affecting word recognition when listening in noise and "in quiet." The results of the third experiment permit extension of NAM theory to individuals with sensorineural hearing loss. Future development of speech recognition tests should allow for the effects of higher level cognitive (lexical) factors on lower level phonemic processing.
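The two neighborhood statistics used above can be sketched computationally. In this toy example, neighbors are items one symbol substitution, insertion, or deletion away; note that real NAM work operates on phonemic transcriptions, whereas the orthographic strings and frequencies below are invented stand-ins.

```python
def edit_distance_one(a, b):
    """True if b differs from a by exactly one substitution,
    insertion, or deletion of a symbol."""
    if a == b or abs(len(a) - len(b)) > 1:
        return False
    if len(a) == len(b):  # same length: exactly one substitution
        return sum(x != y for x, y in zip(a, b)) == 1
    if len(a) > len(b):
        a, b = b, a  # make a the shorter string
    # Deleting one symbol from b must recover a.
    return any(a == b[:i] + b[i + 1:] for i in range(len(b)))

def neighborhood_stats(word, freq):
    """Neighborhood density (number of neighbors) and average
    neighborhood frequency for `word`, given a frequency dictionary."""
    neighbors = [w for w in freq if edit_distance_one(word, w)]
    density = len(neighbors)
    avg_freq = sum(freq[w] for w in neighbors) / density if density else 0.0
    return density, avg_freq

# Hypothetical mini-lexicon with made-up occurrence frequencies:
lexicon = {"cat": 120, "bat": 30, "cab": 5, "cut": 80, "dog": 200}
density, avg_freq = neighborhood_stats("cat", lexicon)
```

Here "cat" has the neighbors "bat", "cab", and "cut", giving a density of 3; a real study would compute these counts over the full on-line lexicon.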
NASA Technical Reports Server (NTRS)
Subrahmanyam, K. B.; Kaza, K. R. V.
1986-01-01
The governing coupled flapwise bending, edgewise bending, and torsional equations are derived including third-degree geometric nonlinear elastic terms by making use of the geometric nonlinear theory of elasticity in which the elongations and shears are negligible compared to unity. These equations are specialized for blades of doubly symmetric cross section with linear variation of pretwist over the blade length. The nonlinear steady state equations and the linearized perturbation equations are solved by using the Galerkin method, and by utilizing the nonrotating normal modes for the shape functions. Parametric results obtained for various cases of rotating blades from the present theoretical formulation are compared to those produced from the finite element code MSC/NASTRAN, and also to those produced from an in-house experimental test rig. It is shown that the spurious instabilities, observed for thin, rotating blades when second degree geometric nonlinearities are used, can be eliminated by including the third-degree elastic nonlinear terms. Furthermore, inclusion of third degree terms improves the correlation between the theory and experiment.
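The Galerkin step described above can be illustrated on a much simpler, nonrotating problem. The sketch below (an assumed setup for illustration, not the paper's coupled flap-lag-torsion equations) expands the simply supported Euler-Bernoulli eigenproblem w'''' = λw on [0, 1] in sine shape functions, which are the normal modes of the nonrotating beam, and recovers the known eigenvalues (nπ)⁴.

```python
import numpy as np

def trapezoid(f, x):
    """Simple trapezoidal quadrature of sampled values f over grid x."""
    return float(np.sum((f[:-1] + f[1:]) * np.diff(x)) / 2.0)

# Galerkin discretization of w'''' = lam * w, simply supported on [0, 1].
N = 4
x = np.linspace(0.0, 1.0, 2001)
phi = [np.sin((n + 1) * np.pi * x) for n in range(N)]   # shape functions
# Fourth derivatives of the sine shape functions are available in closed form:
phi_xxxx = [((n + 1) * np.pi) ** 4 * p for n, p in enumerate(phi)]

# Projected stiffness and mass matrices, K_ij = <phi_i, phi_j''''>,
# M_ij = <phi_i, phi_j>; the generalized eigenvalues approximate lam.
K = np.array([[trapezoid(pi * pj, x) for pj in phi_xxxx] for pi in phi])
M = np.array([[trapezoid(pi * pj, x) for pj in phi] for pi in phi])
lam = np.sort(np.linalg.eigvals(np.linalg.solve(M, K)).real)
```

Because the sine modes are orthogonal, K and M are (numerically) diagonal here; with rotation and nonlinear coupling, as in the paper, the projected system is dense and must be solved iteratively.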
McLeod, Peter; Reed, Nick; Gilson, Stuart; Glennerster, Andrew
2010-01-01
We measured the movements of soccer players heading a football in a fully immersive virtual reality environment. In mid-flight the ball’s trajectory was altered from its normal quasi-parabolic path to a linear one, producing a jump in the rate of change of the angle of elevation of gaze (α) from player to ball. One reaction time later the players adjusted their speed so that the rate of change of α increased when it had been reduced and reduced it when it had been increased. Since the result of the player’s movement was to regain a value of the rate of change close to that before the disturbance, the data suggest that the players have an expectation of, and memory for, the pattern that the rate of change of α will follow during the flight. The results support the general claim that players intercepting balls use servo control strategies and are consistent with the particular claim of Optic Acceleration Cancellation theory that the servo strategy is to allow α to increase at a steadily decreasing rate. PMID:18472123
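The geometry behind this manipulation can be sketched numerically. The script below (illustrative launch numbers, not the study's stimuli) computes the angle of elevation of gaze α from a stationary observer to an approaching ball, with an optional mid-flight switch from a parabolic to a linear path: after the switch, α evolves differently from the normal flight, which is the discrepancy players responded to.

```python
import math

G = 9.8  # gravitational acceleration, m/s^2

def alpha(t, switch_time=None):
    """Angle of elevation of gaze (radians) from a stationary observer at
    the origin to a ball launched toward the observer. Before switch_time
    the flight is parabolic; afterwards gravity is 'switched off' so the
    path becomes linear, mimicking the experimental manipulation."""
    x0, vx = 20.0, -9.0   # starts 20 m away, approaching at 9 m/s
    z0, vz0 = 1.8, 9.8    # released at eye height, thrown upward
    if switch_time is None or t <= switch_time:
        x = x0 + vx * t
        z = z0 + vz0 * t - 0.5 * G * t * t
    else:
        ts = switch_time
        xs, zs = x0 + vx * ts, z0 + vz0 * ts - 0.5 * G * ts * ts
        vzs = vz0 - G * ts                  # velocity at the switch
        x = xs + vx * (t - ts)              # then straight-line motion
        z = zs + vzs * (t - ts)
    return math.atan2(z - 1.8, x)

# Compare gaze elevation late in flight with and without the switch:
a_normal = alpha(1.5)
a_linear = alpha(1.5, switch_time=0.75)
```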
ERIC Educational Resources Information Center
Mislevy, Robert J.
Educational test theory consists of statistical and methodological tools to support inferences about examinees' knowledge, skills, and accomplishments. The evolution of test theory has been shaped by the nature of users' inferences which, until recently, have been framed almost exclusively in terms of trait and behavioral psychology. Progress in…
Bowker, Matthew A.; Maestre, Fernando T.
2012-01-01
Dryland vegetation is inherently patchy. This patchiness goes on to impact ecology, hydrology, and biogeochemistry. Recently, researchers have proposed that dryland vegetation patch sizes follow a power law which is due to local plant facilitation. It is unknown what patch size distribution prevails when competition predominates over facilitation, or if such a pattern could be used to detect competition. We investigated this question in an alternative vegetation type, mosses and lichens of biological soil crusts, which exhibit a smaller scale patch-interpatch configuration. This micro-vegetation is characterized by competition for space. We proposed that multiplicative effects of genetics, environment and competition should result in a log-normal patch size distribution. When testing the prevalence of log-normal versus power law patch size distributions, we found that the log-normal was the better distribution in 53% of cases and a reasonable fit in 83%. In contrast, the power law was better in 39% of cases, and in 8% of instances both distributions fit equally well. We further hypothesized that the log-normal distribution parameters would be predictably influenced by competition strength. There was qualitative agreement between one of the distribution's parameters (μ) and a novel intransitive (lacking a 'best' competitor) competition index, suggesting that as intransitivity increases, patch sizes decrease. The correlation of μ with other competition indicators based on spatial segregation of species (the C-score) depended on aridity. In less arid sites, μ was negatively correlated with the C-score (suggesting smaller patches under stronger competition), while positive correlations (suggesting larger patches under stronger competition) were observed at more arid sites. We propose that this is due to an increasing prevalence of competition transitivity as aridity increases. 
These findings broaden the emerging theory surrounding dryland patch size distributions and, with refinement, may help us infer cryptic ecological processes from easily observed spatial patterns in the field.
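The log-normal versus power-law comparison described above can be sketched with closed-form maximum-likelihood fits. Below, synthetic log-normally distributed "patch sizes" are fitted by both candidate distributions and compared by AIC (a generic sketch of the technique, not the authors' fitting procedure); the log-normal fit should win on such data.

```python
import math
import random

def lognormal_loglik(xs):
    """Maximized log-likelihood of a log-normal fit (MLE for mu, sigma)."""
    n = len(xs)
    logs = [math.log(x) for x in xs]
    mu = sum(logs) / n
    sigma2 = sum((l - mu) ** 2 for l in logs) / n
    return -0.5 * n * math.log(2 * math.pi * sigma2) - sum(logs) - 0.5 * n

def powerlaw_loglik(xs, xmin):
    """Maximized log-likelihood of a power-law (Pareto) fit above xmin,
    using the standard MLE alpha = 1 + n / sum(log(x / xmin))."""
    n = len(xs)
    s = sum(math.log(x / xmin) for x in xs)
    alpha = 1.0 + n / s
    return n * math.log((alpha - 1.0) / xmin) - alpha * s

random.seed(1)
patches = [random.lognormvariate(0.0, 0.5) for _ in range(500)]  # synthetic areas
aic_lognormal = 2 * 2 - 2 * lognormal_loglik(patches)              # 2 parameters
aic_powerlaw = 2 * 1 - 2 * powerlaw_loglik(patches, min(patches))  # 1 parameter
```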
Loffing, Florian; Sölter, Florian; Hagemann, Norbert
2014-01-01
In the elite domain of interactive sports, athletes who demonstrate a left preference (e.g., holding a weapon with the left hand in fencing or boxing in a ‘southpaw’ stance) seem overrepresented. Such excess indicates a performance advantage and was also interpreted as evidence in favour of frequency-dependent selection mechanisms to explain the maintenance of left-handedness in humans. To test for an overrepresentation, the incidence of athletes' lateral preferences is typically compared with an expected ratio of left- to right-handedness in the normal population. However, the normal population reference values did not always relate to the sport-specific tasks of interest, which may limit the validity of reports of an excess of ‘left-oriented’ athletes. Here we sought to determine lateral preferences for various sport-specific tasks (e.g., baseball batting, boxing) in the normal population and to examine the relationship between these preferences and handedness. To this end, we asked 903 participants to indicate their lateral preferences for sport-specific and common tasks using a paper-based questionnaire. Lateral preferences varied considerably across the different sport tasks and we found high variation in the relationship between those preferences and handedness. In contrast to unimanual tasks (e.g., fencing or throwing), for bimanually controlled actions such as baseball batting, shooting in ice hockey or boxing the incidence of left preferences was considerably higher than expected from the proportion of left-handedness in the normal population and the relationship with handedness was relatively low. 
We conclude that (i) task-specific reference values are mandatory for reliably testing for an excess of athletes with a left preference, (ii) the term ‘handedness’ should be more cautiously used within the context of sport-related laterality research and (iii) observation of lateral preferences in sports may be of limited suitability for the verification of evolutionary theories of handedness. PMID:25141020
Mechanical behaviour of the human atria.
Bellini, Chiara; Di Martino, Elena S; Federico, Salvatore
2013-07-01
This work was aimed at providing a local mechanical characterisation of tissues from the healthy human atria. Thirty-two tissue specimens were harvested from nine adult subjects whose death was not directly related to cardiovascular diseases. Tissues were kept in Tyrode's solution and tested using a planar biaxial device. Results showed that tissues from healthy human atria undergo large deformations under in-plane distributed tensions roughly corresponding to an in vivo pressure of 15 mmHg. The material was modelled as hyperelastic and a Fung-type elastic strain energy potential was chosen. This class of potentials is based on a function of a quadratic form in the components of the Green-Lagrange strain tensor, and it has been previously proved that the fourth-order tensor of this quadratic form is proportional to the linear elasticity tensor of the linearised theory. This has three important consequences: (i) the coefficients in Fung-type potentials have a precise physical meaning; (ii) whenever a microstructural description for the linear elasticity tensor is available, this is automatically inherited by the Fung-type potential; (iii) because of the presence of the linear elasticity tensor in the definition of a Fung-type potential, each of the three normal stresses is coupled with all three normal strains. We propose to include information on the microstructure of the atrium by writing the linear elasticity tensor as the volumetric-fraction-weighed sum of the linear elasticity tensors of the three constituents of the tissue: the ground matrix, the main fibre family and the secondary fibre family. To the best of our knowledge, this is the first time that a Fung-type potential is given a precise structural meaning, based on the directions and the material properties of the fibres. 
Because of the coupling between normal strains and normal stresses, this structurally-based Fung-type potential allows for discriminating among all testing protocols in planar biaxial stretch.
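A Fung-type potential of the class described takes the standard form below (generic symbols from the Fung-type literature, not necessarily the authors' exact notation):

```latex
W = \frac{c}{2}\left(e^{Q} - 1\right),
\qquad
Q = E_{IJ}\,\mathbb{C}_{IJKL}\,E_{KL},
\qquad
\mathbb{C} = \phi_{g}\,\mathbb{C}^{(g)}
  + \phi_{1}\,\mathbb{C}^{(1)}
  + \phi_{2}\,\mathbb{C}^{(2)},
```

where E is the Green-Lagrange strain, the fourth-order tensor ℂ is proportional to the linear elasticity tensor of the linearised theory, and the φ are the volume fractions of the ground matrix and the two fibre families, as proposed in the abstract.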
Theory-driven research in pediatric psychology: a little bit on why and how.
Wallander, J L
1992-10-01
Introduces a Special Issue, covering two published issues (5 and 6) of this journal, on theory-driven research in pediatric psychology. A rationale for conducting research from a conceptual basis is presented. It is emphasized that science is primarily an intellectual activity, demonstrated in the form of theory building, testing, and reformulation. Furthermore, it is argued that theory serves as a planning and communication aid for scientific pursuit. The process and components of theory-driven research are then highlighted. Theoretical constructs, theoretical and empirical definitions of constructs, and the use of variables are discussed. A definition of scientific theory is offered. Theory testing is distinguished from post hoc theorizing. Differences in the scope of theories are noted. Connections between theory and hypothesis testing and research design are addressed, especially for nonexperimental or correlational research.
ERIC Educational Resources Information Center
Penningroth, Suzanna L.; Scott, Walter D.
2012-01-01
Two prominent theories of lifespan development, socioemotional selectivity theory and selection, optimization, and compensation theory, make similar predictions for differences in the goal representations of younger and older adults. Our purpose was to test whether the goals of younger and older adults differed in ways predicted by these two…
ERIC Educational Resources Information Center
Sharma, Kshitij; Chavez-Demoulin, Valérie; Dillenbourg, Pierre
2017-01-01
The statistics used in education research are based on central trends such as the mean or standard deviation, discarding outliers. This paper adopts another viewpoint that has emerged in statistics, called extreme value theory (EVT). EVT claims that the bulk of normal distribution is comprised mainly of uninteresting variations while the most…
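The extreme-value viewpoint sketched above can be illustrated with a block-maxima calculation in standard-library Python. This is a hedged toy sketch, not the authors' analysis: the simulated "test score" data, block size, and the method-of-moments Gumbel fit are all illustrative assumptions.

```python
import math
import random
import statistics

random.seed(0)

# Block maxima: the largest score in each simulated "class" of 100 test scores.
# EVT studies the distribution of these extremes rather than the bulk.
block_maxima = [max(random.gauss(500, 100) for _ in range(100))
                for _ in range(2000)]

# Method-of-moments fit of a Gumbel distribution (a classical EVT limit law):
#   variance = pi^2 * beta^2 / 6,   mean = mu + gamma * beta
# where gamma is the Euler-Mascheroni constant.
beta = math.sqrt(6 * statistics.variance(block_maxima)) / math.pi
mu = statistics.mean(block_maxima) - 0.5772156649 * beta

def gumbel_sf(x):
    """Survival function: probability a block maximum exceeds x."""
    return 1.0 - math.exp(-math.exp(-(x - mu) / beta))

# Probability that the best score in a class exceeds a high threshold.
p_exceed = gumbel_sf(900)
print(mu, beta, p_exceed)
```

The point of the sketch is that the fitted tail model lets one reason quantitatively about the "most extreme" observations that mean-and-standard-deviation summaries discard.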
Referential Communication Abilities and Theory of Mind Development in Preschool Children
ERIC Educational Resources Information Center
Resches, Mariela; Pereira, Miguel Perez
2007-01-01
This work aims to analyse the specific contribution of social abilities (here considered as the capacity for attributing knowledge to others) in a particular communicative context. 74 normally developing children (aged 3;4 to 5;9, M=4.6) were given two Theory of Mind (ToM) tasks, which are considered to assess increasing complexity levels of…
NASA Technical Reports Server (NTRS)
Unz, H.; Roskam, J.
1979-01-01
The theory of acoustic plane wave normally incident on a clamped panel in a rectangular duct is developed. The coupling theory between the elastic vibrations of the panel (plate) and the acoustic wave propagation in infinite space and in the rectangular duct is considered. The partial differential equation which governs the vibration of the panel (plate) is modified by adding to its stiffness (spring) forces and damping forces, and the fundamental resonance frequency and the attenuation factor are discussed. The noise reduction expression based on the theory is found to agree well with the corresponding experimental data of a sample aluminum panel in the mass controlled region, the damping controlled region, and the stiffness controlled region. All the frequency positions of the upward and downward resonance spikes in the sample experimental data are identified theoretically as resulting from four cross interacting major resonance phenomena: the cavity resonance, the acoustic resonance, the plate resonance, and the wooden back panel resonance.
Analysis and modification of theory for impact of seaplanes on water
NASA Technical Reports Server (NTRS)
Mayo, Wilbur L
1945-01-01
An analysis of available theory on seaplane impact and a proposed modification thereto are presented. In previous methods the overall momentum of the float and virtual mass has been assumed to remain constant during the impact but the present analysis shows that this assumption is rigorously correct only when the resultant velocity of the float is normal to the keel. The proposed modification chiefly involves consideration of the fact that forward velocity of the seaplane float causes momentum to be passed into the hydrodynamic downwash (an action that is the entire consideration in the case of the planing float) and consideration of the fact that, for an impact with trim, the rate of penetration is determined not only by the velocity component normal to the keel but also by the velocity component parallel to the keel, which tends to reduce the penetration. Experimental data for planing, oblique impact, and vertical drop are used to show that the accuracy of the proposed theory is good.
ERIC Educational Resources Information Center
Guion, Robert M.; Ironson, Gail H.
Challenges to classical psychometric theory are examined in the context of a broader range of fundamental, derived, and intuitive measurements in psychology; the challenges include content-referenced testing, latent trait theory, and generalizability theory. A taxonomy of psychological measurement is developed, based on: (1) purposes of…
I-Love-Q relations for neutron stars in dynamical Chern Simons gravity
NASA Astrophysics Data System (ADS)
Gupta, Toral; Majumder, Barun; Yagi, Kent; Yunes, Nicolás
2018-01-01
Neutron stars are ideal to probe, not only nuclear physics, but also strong-field gravity. Approximate universal relations insensitive to the star’s internal structure exist among certain observables and are useful in testing general relativity, as they project out the uncertainties in the equation of state. One such set of universal relations between the moment of inertia (I), the tidal Love number and the quadrupole moment (Q) has been studied both in general relativity and in modified theories. In this paper, we study the relations in dynamical Chern–Simons gravity, a well-motivated, parity-violating effective field theory, extending previous work in various ways. First, we study how projected constraints on the theory using the I-Love relation depend on the measurement accuracy of I with radio observations and that of the Love number with gravitational-wave observations. Provided these quantities can be measured with future observations, we find that the latter could place bounds on dynamical Chern–Simons gravity that are six orders of magnitude stronger than current bounds. Second, we study the I–Q and Q-Love relations in this theory by constructing slowly-rotating neutron star solutions to quadratic order in spin. We find that the approximate universality continues to hold in dynamical Chern–Simons gravity, and in fact, it becomes stronger than in general relativity, although its existence depends on the normalization of the dimensional coupling constant of the theory. Finally, we study the variation of the eccentricity of isodensity contours inside a star and its relation to the degree of universality. We find that, in most cases, the eccentricity variation is smaller in dynamical Chern–Simons gravity than in general relativity, providing further support to the idea that the approximate self-similarity of isodensity contours is responsible for universality.
Conceptualizing patient empowerment in cancer follow-up by combining theory and qualitative data.
Johnsen, Anna Thit; Eskildsen, Nanna Bjerg; Thomsen, Thora Grothe; Grønvold, Mogens; Ross, Lone; Jørgensen, Clara R
2017-02-01
Patient empowerment (PE) may be defined as the opportunity for patients to master issues important to their own health. The aim of this study was to conceptualize PE and how the concept manifests itself for cancer patients attending follow-up, in order to develop a relevant and sensitive questionnaire for this population. A theoretical model of PE was made, based on Zimmerman's theory of psychological empowerment. Patients who were in follow-up after first line treatment for their cancer (n = 16) were interviewed about their experiences with follow-up. A deductive thematic analysis was conducted to contextualize the theory and find concrete manifestations of empowerment. Data were analyzed to find situations that expressed empowerment or lack of empowerment. We then analyzed what abilities these situations called for and we further analyzed how these abilities fitted Zimmerman's theory. In all, 16 patients from two different hospitals participated in the interviews. PE in cancer follow-up was conceptualized as: (1) the perception that one had the possibility of mastering treatment and care (e.g. the possibility of 'saying no' to treatment and getting in contact with health care when needed); (2) having knowledge and skills regarding, for example treatment, care, plan of treatment and care, normal reactions and late effects, although knowledge and information was not always considered positively; and (3) being able to make the health care system address one's concerns and needs and, for some patients, also being able to monitor one's treatment, tests and care. We conceptualized PE based on Zimmerman's theory and empirical data to contextualize the concept in cancer follow-up. When developing a patient reported outcome measure measuring PE for this group of patients, one needs to be attentive to differences in wishes regarding mastery.
TESTING THE PROPAGATING FLUCTUATIONS MODEL WITH A LONG, GLOBAL ACCRETION DISK SIMULATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogg, J Drew; Reynolds, Christopher S.
2016-07-20
The broadband variability of many accreting systems displays characteristic structures: log-normal flux distributions, root-mean-square (rms)-flux relations, and long inter-band lags. These characteristics are usually interpreted as inward propagating fluctuations of the mass accretion rate in an accretion disk driven by stochasticity of the angular momentum transport mechanism. We present the first analysis of propagating fluctuations in a long-duration, high-resolution, global three-dimensional magnetohydrodynamic (MHD) simulation of a geometrically thin (h/r ≈ 0.1) accretion disk around a black hole. While the dynamical-timescale turbulent fluctuations in the Maxwell stresses are too rapid to drive radially coherent fluctuations in the accretion rate, we find that the low-frequency quasi-periodic dynamo action introduces low-frequency fluctuations in the Maxwell stresses, which then drive the propagating fluctuations. Examining both the mass accretion rate and emission proxies, we recover log-normality, linear rms-flux relations, and radial coherence that would produce inter-band lags. Hence, we successfully relate and connect the phenomenology of propagating fluctuations to modern MHD accretion disk theory.
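The two statistical signatures named in the abstract (log-normal fluxes and a linear rms-flux relation) both arise naturally from multiplicative variability. A stdlib-Python toy sketch, under the illustrative assumption that the light curve is an exponentiated red-noise process (not the MHD simulation itself):

```python
import math
import random

random.seed(1)

# Exponentiating a smoothed Gaussian (AR(1) "red noise") process yields a
# log-normally distributed light curve, a standard phenomenological model
# for multiplicative, inward-propagating accretion-rate fluctuations.
n = 20000
g = 0.0
flux = []
for _ in range(n):
    g = 0.99 * g + random.gauss(0.0, 0.1)
    flux.append(math.exp(g))

# rms-flux relation: split the light curve into segments and compare each
# segment's mean flux with its rms (standard deviation).
seg = 500
means, rmss = [], []
for i in range(0, n, seg):
    chunk = flux[i:i + seg]
    m = sum(chunk) / len(chunk)
    rms = math.sqrt(sum((x - m) ** 2 for x in chunk) / (len(chunk) - 1))
    means.append(m)
    rmss.append(rms)

def pearson(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

r = pearson(means, rmss)
print(r)  # brighter segments are also more variable
```

A strongly positive correlation between segment mean and segment rms is the hallmark of multiplicative (as opposed to additive) variability.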
A mechanical model for deformable and mesh pattern wheel of lunar roving vehicle
NASA Astrophysics Data System (ADS)
Liang, Zhongchao; Wang, Yongfu; Chen, Gang (Sheng); Gao, Haibo
2015-12-01
As an indispensable tool for astronauts on the lunar surface, the lunar roving vehicle (LRV) is of great significance for manned lunar exploration. An LRV moves on loose, soft lunar soil, so the mechanical properties of its wheels directly affect its mobility performance. The wheels used on an LRV are deformable and have a mesh pattern; therefore, the existing mechanical theory of vehicle wheels cannot be applied directly to LRV wheels. In this paper, a new mechanical model for the LRV wheel is proposed. First, a mechanical model for a rigid normal wheel is presented, which involves multiple conventional parameters such as vertical load, tangential traction force, lateral force, and slip ratio. Second, six equivalent coefficients are introduced to amend the rigid normal wheel model to fit the deformable, mesh-pattern wheels used in LRV applications. Third, the values of the six equivalent coefficients are identified using experimental data from single-wheel testing of an LRV. Finally, the identified mechanical model for the LRV's deformable, mesh-pattern wheel is further verified and validated using additional experimental results.
Smith, Adam L; Villar, Sofía S
2018-01-01
Adaptive designs for multi-armed clinical trials have become increasingly popular recently because of their potential to shorten development times and to increase patient response. However, developing response-adaptive designs that offer patient benefit while ensuring that the resulting trial provides a statistically rigorous and unbiased comparison of the different treatments included is highly challenging. In this paper, the theory of multi-armed bandit problems is used to define near-optimal adaptive designs in the context of a clinical trial with a normally distributed endpoint with known variance. We report the operating characteristics (type I error, power, bias) and patient benefit of these approaches and alternative designs using simulation studies based on an ongoing trial. These results are then compared to those recently published in the context of Bernoulli endpoints. Many limitations and advantages are similar in both cases, but there are also important differences, especially with respect to type I error control. This paper proposes a simulation-based testing procedure to correct for the observed type I error inflation that bandit-based and adaptive rules can induce.
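The kind of simulation-based type I error check described above can be sketched in a few lines. This is a hedged illustration only: the two-arm Thompson-style allocation rule, the sample size, and the naive z-test below are invented for the sketch and are not the near-optimal bandit designs of the paper.

```python
import math
import random
import statistics

random.seed(2)

def run_trial(delta, n_patients=100):
    """Toy response-adaptive two-arm trial with Normal(mean, 1) endpoints.

    Each patient is allocated to the arm whose posterior-mean draw is
    larger (a Thompson-sampling-style rule with a flat prior); returns
    the final z-statistic comparing the two arm means.
    """
    obs = {0: [], 1: []}
    for _ in range(n_patients):
        draws = []
        for arm in (0, 1):
            d = obs[arm]
            m = statistics.mean(d) if d else 0.0
            s = 1.0 / math.sqrt(len(d) + 1)  # approximate posterior sd
            draws.append(random.gauss(m, s))
        chosen = 0 if draws[0] >= draws[1] else 1
        obs[chosen].append(random.gauss(delta if chosen == 1 else 0.0, 1.0))
    n0, n1 = len(obs[0]), len(obs[1])
    if n0 < 2 or n1 < 2:
        return 0.0
    return (statistics.mean(obs[1]) - statistics.mean(obs[0])) \
        / math.sqrt(1.0 / n0 + 1.0 / n1)

# Simulation-based calibration: estimate the null rejection rate of the
# nominal-5% test |z| > 1.96 under the adaptive allocation rule.
reps = 400
rate = sum(abs(run_trial(0.0)) > 1.96 for _ in range(reps)) / reps
print(rate)
```

Comparing the simulated null rejection rate with the nominal 5% level is precisely how adaptive-design-induced type I error inflation is detected and corrected by simulation.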
Statistical analysis of the calibration procedure for personnel radiation measurement instruments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bush, W.J.; Bengston, S.J.; Kalbeitzer, F.L.
1980-11-01
Thermoluminescent analyzer (TLA) calibration procedures were used to estimate personnel radiation exposure levels at the Idaho National Engineering Laboratory (INEL). A statistical analysis is presented herein based on data collected over a six-month period in 1979 on four TLAs located in the Department of Energy (DOE) Radiological and Environmental Sciences Laboratory at the INEL. The data were collected according to the day-to-day procedure in effect at that time. Both gamma and beta radiation models are developed. Observed TLA readings of thermoluminescent dosimeters are correlated with known radiation levels. This correlation is then used to predict unknown radiation doses from future analyzer readings of personnel thermoluminescent dosimeters. The statistical techniques applied in this analysis include weighted linear regression, estimation of systematic and random error variances, prediction interval estimation using Scheffe's theory of calibration, estimation of the ratio of the means of two bivariate normally distributed random variables and their corresponding confidence limits according to Kendall and Stuart, tests of normality, experimental design, a comparison between instruments, and quality control.
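The core calibration logic (weighted linear regression of readings on known doses, then inverse prediction of an unknown dose from a new reading) can be sketched as follows. The dose levels, noise model, and weights are invented for illustration; they are not the INEL data.

```python
import random

random.seed(3)

# Simulated calibration data: dosimeter readings y at known doses x,
# with noise whose spread grows with dose -- the motivation for
# weighting each point by the inverse of its error variance.
doses = [10.0, 20.0, 50.0, 100.0, 200.0, 500.0]
readings = [d * 1.05 + 2.0 + random.gauss(0.0, 0.02 * d + 0.5) for d in doses]
weights = [1.0 / (0.02 * d + 0.5) ** 2 for d in doses]

# Closed-form weighted least squares fit of y = a + b*x.
sw = sum(weights)
xbar = sum(w * x for w, x in zip(weights, doses)) / sw
ybar = sum(w * y for w, y in zip(weights, readings)) / sw
b = (sum(w * (x - xbar) * (y - ybar)
         for w, x, y in zip(weights, doses, readings))
     / sum(w * (x - xbar) ** 2 for w, x in zip(weights, doses)))
a = ybar - b * xbar

# Inverse prediction (calibration): estimate an unknown dose from a
# future analyzer reading by inverting the fitted line.
new_reading = 108.0
estimated_dose = (new_reading - a) / b
print(a, b, estimated_dose)
```

Prediction intervals for `estimated_dose` (Scheffe's theory of calibration, cited in the abstract) would then quantify the uncertainty of this inverse prediction; that step is omitted here.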
NASA Technical Reports Server (NTRS)
Dugan, Duane W.
1959-01-01
The possibility of obtaining useful estimates of the static longitudinal stability of aircraft flying at high supersonic Mach numbers at angles of attack between 0 and +/-180 deg is explored. Existing theories, empirical formulas, and graphical procedures are employed to estimate the normal-force and pitching-moment characteristics of an example airplane configuration consisting of an ogive-cylinder body, trapezoidal wing, and cruciform trapezoidal tail. Existing wind-tunnel data for this configuration at a Mach number of 6.86 provide an evaluation of the estimates up to an angle of attack of 35 deg. Evaluation at higher angles of attack is afforded by data obtained from wind-tunnel tests made with the same configuration at angles of attack between 30 and 150 deg at five Mach numbers between 2.5 and 3.55. Over the ranges of Mach numbers and angles of attack investigated, predictions of normal force and center-of-pressure locations for the configuration considered agree well with those obtained experimentally, particularly at the higher Mach numbers.
The hopelessness theory of depression: attributional aspects.
Alloy, L B; Abramson, L Y; Metalsky, G I; Hartlage, S
1988-02-01
In this article, we clarify, expand and revise the basic postulates of the hopelessness theory of depression (Abramson, Alloy & Metalsky, 1988a; Abramson, Metalsky & Alloy, 1987, 1988b; previously referred to as the reformulated helplessness theory of depression: Abramson, Seligman & Teasdale, 1978) and place the theory more explicitly in the context of work in descriptive psychiatry about the heterogeneity among the depressive disorders. We suggest that the hopelessness theory hypothesizes the existence in nature of an, as yet, unidentified subtype of depression--'hopelessness depression'--defined, in part, by its cause. We then give a critique of work conducted to test the hopelessness theory and explicate the limitations in research strategy associated with this line of work. Our critique includes a logical analysis that deduces the conceptual and methodological inadequacies of the research strategies used to test the theory. Finally, we suggest more adequate research strategies for testing the hopelessness theory and discuss conceptual and assessment issues that will arise in conducting such tests with special emphasis on attributional styles.
The Interrelationship of Ego, Moral, and Conceptual Development in a College Group.
ERIC Educational Resources Information Center
Lutwak, Nita
1984-01-01
Compared three personality theories (ego development, moral development, and conceptual systems theory) in 102 college students who completed the Sentence Completion Test, This I Believe Test, and Defining Issues Test. Results indicated a significant relationship between all three pairs of theories. (JAC)
ERIC Educational Resources Information Center
Hole, Arne; Grønmo, Liv Sissel; Onstad, Torgeir
2018-01-01
Background: This paper discusses a framework for analyzing the dependence on mathematical theory in test items, that is, a framework for discussing to what extent knowledge of mathematical theory is helpful for the student in solving the item. The framework can be applied to any test in which some knowledge of mathematical theory may be useful,…
[The research in a foot pressure measuring system based on LabVIEW].
Li, Wei; Qiu, Hong; Xu, Jiang; He, Jiping
2011-01-01
This paper presents a foot pressure measuring system based on LabVIEW. The designs of the hardware and software are described, and LabVIEW is used to build the application interface for displaying plantar pressure. The system realizes plantar pressure data acquisition, data storage, waveform display, and waveform playback. Test results were consistent with the changing trend of a normal gait and conformed to human systems engineering theory, demonstrating the reliability of the system. The system gives vivid, visual results and provides a new method of measuring foot pressure as well as a reference for the design of insole systems.
Simons, Jeffery P; Wilson, Jacob M; Wilson, Gabriel J; Theall, Stephen
2009-09-01
We tested expert baseball pitchers for evidence of especial skills at the regulation pitching distance. Seven college pitchers threw indoors to a target placed at 60.5 feet (18.44 m) and four closer and four further distances away. Accuracy at the regulation distance was significantly better than predicted by regression on the nonregulation distances (p < .02), indicating an especial skill effect emerged despite the absence of normal contextual cues. Self-efficacy data failed to support confidence as a mediating factor in especial skill effect. We concluded that cognitive theories fail to fully account for the patterns of observed data, and therefore theoretical explanations of the especial skills must address noncognitive aspects of motor learning and control.
A New Reynolds Stress Algebraic Equation Model
NASA Technical Reports Server (NTRS)
Shih, Tsan-Hsing; Zhu, Jiang; Lumley, John L.
1994-01-01
A general turbulent constitutive relation is directly applied to propose a new Reynolds stress algebraic equation model. In the development of this model, the constraints based on rapid distortion theory and realizability (i.e. the positivity of the normal Reynolds stresses and the Schwarz' inequality between turbulent velocity correlations) are imposed. Model coefficients are calibrated using well-studied basic flows such as homogeneous shear flow and the surface flow in the inertial sublayer. The performance of this model is then tested in complex turbulent flows including the separated flow over a backward-facing step and the flow in a confined jet. The calculation results are encouraging and point to the success of the present model in modeling turbulent flows with complex geometries.
Can autistic children read the mind of an animated triangle?
Salter, Gemma; Seigal, Anna; Claxton, Melanie; Lawrence, Kate; Skuse, David
2008-07-01
Are children with an autism spectrum disorder (ASD), but normal-range intelligence, impaired on theory of mind skills measured by responses to abstract animations in the form of a computerized cartoon? Fifty-six cases and closely matched comparisons were tested. We rated verbal responses according to the length of their descriptions, their appropriateness and the children's use of 'mentalizing' terms. Children with ASD used 'mentalizing' language to describe the animations as well as comparisons, although the content of their descriptions was significantly less appropriate. Performance on this task was not well correlated with standardized measures of parent-reported behaviour or the child's interactions with an observer. The implications of our results are discussed in relation to previous studies that have used this methodology.
Theories of Impaired Consciousness in Epilepsy
Yu, Lissa; Blumenfeld, Hal
2015-01-01
Although the precise mechanisms for control of consciousness are not fully understood, emerging data show that conscious information processing depends on the activation of certain networks in the brain and that the impairment of consciousness is related to abnormal activity in these systems. Epilepsy can lead to transient impairment of consciousness, providing a window into the mechanisms necessary for normal consciousness. Thus, despite differences in behavioral manifestations, cause, and electrophysiology, generalized tonic–clonic, absence, and partial seizures engage similar anatomical structures and pathways. We review prior concepts of impaired consciousness in epilepsy, focusing especially on temporal lobe complex partial seizures, which are a common and debilitating form of epileptic unconsciousness. We discuss a “network inhibition hypothesis” in which focal temporal lobe seizure activity disrupts normal cortical–subcortical interactions, leading to depressed neocortical function and impaired consciousness. This review of the major prior theories of impaired consciousness in epilepsy allows us to put more recent data into context and to reach a better understanding of the mechanisms important for normal consciousness. PMID:19351355
NASA Astrophysics Data System (ADS)
Nikitenko, V. R.; von Seggern, H.
2007-11-01
An analytic theory of nonequilibrium hopping charge transport in disordered organic materials includes quasiequilibrium (normal) and extremely nonequilibrium (dispersive) regimes as limiting cases at long and short times, respectively. In the intermediate time interval, the quasiequilibrium value of the mobility is nearly established while the coefficient of field-assisted diffusion continues to increase (the quasidispersive regime). Therefore, normalized time dependences of the transient current under time-of-flight (TOF) conditions are practically independent of field strength and sample thickness, in good agreement both with TOF experimental data for molecularly doped polymers and with results of numerical simulations of the Gaussian disorder model. An analytic model of transient electroluminescence (TEL) is developed on the basis of this theory, presuming a strong asymmetry of mobilities. In analogy with TOF transients, the dispersion parameter of the normalized TEL intensity is anomalously large and almost field independent in the quasidispersive transport regime. A method for determining the mobility from TEL data is proposed.
Electric field induced sheeting and breakup of dielectric liquid jets
NASA Astrophysics Data System (ADS)
Khoshnevis, Ahmad; Tsai, Scott S. H.; Esmaeilzadeh, Esmaeil
2014-01-01
We report experimental observations of the controlled deformation of a dielectric liquid jet subjected to a local high-voltage electrostatic field directed normal to the jet. The jet deforms into an elliptic cylinder upon application of the normal electrostatic field. As the applied electric field strength is increased, the elliptic cylindrical jet deforms permanently into a flat sheet and eventually breaks up into droplets. We interpret this observation (the jet stretches in the direction normal to the applied electric field) qualitatively using the Taylor-Melcher leaky dielectric theory, and develop a simple scaling model that predicts the critical electric field strength for the jet-to-sheet transition. Our model shows good agreement with experimental results and has a form consistent with the classical drop deformation criterion in the Taylor-Melcher theory. Finally, we statistically analyze the droplets resulting from sheet breakup and find that increasing the applied electric field strength improves droplet uniformity and reduces droplet size.
Dunford, Jeffrey L; Dhirani, Al-Amin
2008-11-12
Interfaces between disordered normal materials and superconductors (S) can exhibit 'reflectionless tunnelling' (RT)-a phenomenon that arises from repeated disorder-driven elastic scattering, multiple Andreev reflections, and electron/hole interference. RT has been used to explain zero-bias conductance peaks (ZBCPs) observed using doped semiconductors and evaporated granular metal films as the disordered normal materials. Recently, in addition to ZBCPs, magnetoconductance oscillations predicted by RT theory have been observed using a novel normal disordered material: self-assembled nanoparticle films. In the present study, we find that the period of these oscillations decreases as temperature (T) increases. This suggests that the magnetic flux associated with interfering pathways increases accordingly. We propose that the increasing flux can be attributed to magnetic field penetration into S as [Formula: see text]. This model agrees remarkably well with known T dependence of penetration depth predicted by Bardeen-Cooper-Schrieffer theory. Our study shows that this additional region of flux is significant and must be considered in experimental and theoretical studies of RT.
Perinatal sadness among Shuar women: support for an evolutionary theory of psychic pain.
Hagen, Edward H; Barrett, H Clark
2007-03-01
Psychiatry faces an internal contradiction in that it regards mild sadness and low mood as normal emotions, yet when these emotions are directed toward a new infant, it regards them as abnormal. We apply parental investment theory, a widely used framework from evolutionary biology, to maternal perinatal emotions, arguing that negative emotions directed toward a new infant could serve an important evolved function. If so, then under some definitions of psychiatric disorder, these emotions are not disorders. We investigate the applicability of parental investment theory to maternal postpartum emotions among Shuar mothers. Shuar mothers' conceptions of perinatal sadness closely match predictions of parental investment theory.
ERIC Educational Resources Information Center
Bukach, Cindy M.; Bub, Daniel N.; Masson, Michael E. J.; Lindsay, D. Stephen
2004-01-01
Studies of patients with category-specific agnosia (CSA) have given rise to multiple theories of object recognition, most of which assume the existence of a stable, abstract semantic memory system. We applied an episodic view of memory to questions raised by CSA in a series of studies examining normal observers' recall of newly learned attributes…
ERIC Educational Resources Information Center
Chasiotis, Athanasios; Kiessling, Florian; Winter, Vera; Hofer, Jan
2006-01-01
After distinguishing between neocortical abilities for executive control and subcortical sensory motor skills for proprioceptive and vestibular integration, we compare a sample of 116 normal preschoolers with a sample of 31 preschoolers receiving occupational therapeutical treatment. This is done in an experimental design controlled for age (mean:…
Hyperscaling breakdown and Ising spin glasses: The Binder cumulant
NASA Astrophysics Data System (ADS)
Lundow, P. H.; Campbell, I. A.
2018-02-01
Among the Renormalization Group Theory scaling rules relating critical exponents, there are hyperscaling rules involving the dimension of the system. It is well known that in Ising models hyperscaling breaks down above the upper critical dimension. It was shown by Schwartz (1991) that the standard Josephson hyperscaling rule can also break down in Ising systems with quenched random interactions. A related Renormalization Group Theory hyperscaling rule links the critical exponents for the normalized Binder cumulant and the correlation length in the thermodynamic limit. An appropriate scaling approach for analyzing measurements from criticality to infinite temperature is first outlined. Numerical data on the scaling of the normalized correlation length and the normalized Binder cumulant are shown for the canonical Ising ferromagnet model in dimension three where hyperscaling holds, for the Ising ferromagnet in dimension five (so above the upper critical dimension) where hyperscaling breaks down, and then for Ising spin glass models in dimension three where the quenched interactions are random. For the Ising spin glasses there is a breakdown of the normalized Binder cumulant hyperscaling relation in the thermodynamic limit regime, with a return to size independent Binder cumulant values in the finite-size scaling regime around the critical region.
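For reference, the normalized Binder cumulant discussed above is U = 1 - <m^4> / (3<m^2>^2), which tends to 0 for a Gaussian (disordered-phase) magnetization distribution and to 2/3 for a sharply two-peaked (ordered-phase) distribution. A quick stdlib-Python check of these two limiting values on synthetic samples (an illustration of the definition, not the spin-glass computation of the paper):

```python
import random

random.seed(4)

def binder(samples):
    """Normalized Binder cumulant U = 1 - <m^4> / (3 <m^2>^2)."""
    n = len(samples)
    m2 = sum(m * m for m in samples) / n
    m4 = sum(m ** 4 for m in samples) / n
    return 1.0 - m4 / (3.0 * m2 * m2)

n = 200_000
# Disordered phase: Gaussian-distributed magnetization, U -> 0
# (since <m^4> = 3 sigma^4 = 3 <m^2>^2 for a centered Gaussian).
u_gauss = binder([random.gauss(0.0, 1.0) for _ in range(n)])
# Ordered phase: magnetization concentrated at +/-1, U = 2/3 exactly.
u_peaks = binder([random.choice((-1.0, 1.0)) for _ in range(n)])

print(u_gauss, u_peaks)
```

The crossing of U(T) curves for different system sizes between these two limits is what makes the Binder cumulant a standard finite-size-scaling diagnostic of the critical point.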
Gravitational mass of relativistic matter and antimatter
Kalaydzhyan, Tigran
2015-10-13
The universality of free fall, the weak equivalence principle (WEP), is a cornerstone of the general theory of relativity, the most precise theory of gravity confirmed in all experiments to date. The WEP states the equivalence of the inertial mass, m, and the gravitational mass, m_g, and was tested on numerous occasions with normal matter at relatively low energies. However, there is no confirmation for matter and antimatter at high energies. For antimatter the situation is even less clear: current direct observations of trapped antihydrogen suggest the limits -65 < m_g/m < 110, not excluding the so-called antigravity phenomenon, i.e., repulsion of antimatter by Earth. Here we demonstrate an indirect bound 0.96 < m_g/m < 1.04 on the gravitational mass of relativistic electrons and positrons coming from the absence of vacuum Cherenkov radiation at the Large Electron-Positron Collider (LEP) and the stability of photons at the Tevatron collider in the presence of annual variations of the solar gravitational potential. Our result clearly rules out the speculated antigravity. By considering the absolute potential of the Local Supercluster (LS), we also predict the bounds 1 - 4×10^-7 < m_g/m < 1 + 2×10^-7 for an electron and positron. Finally, we comment on the possibility of performing complementary tests at the future International Linear Collider (ILC) and Compact Linear Collider (CLIC).
How Children’s Mentalistic Theory Widens their Conception of Pictorial Possibilities
Gilli, Gabriella M.; Ruggi, Simona; Gatti, Monica; Freeman, Norman H.
2016-01-01
An interpretative theory of mind enables young children to grasp that people fulfill varying intentions when making pictures. We tested the hypothesis that in middle childhood a unifunctional conception of artists' intention to produce a picture widens to include artists' intention to display their pictures to others. Children aged between 5 and 10 years viewed a brief video of an artist deliberately hiding her picture, but her intention was thwarted when her picture was discovered and displayed. By 8 years of age children were almost unanimous that a picture-producer without an intention to show her work to others cannot be considered an artist. Further exploratory studies centered on aspects of picture-display involving normal public display as well as the contrary intentions of hiding an original picture and of deceitfully displaying a forgery. Interviews suggested that the concept of exhibition widened to take others' minds into account: viewers' critical judgments and the effects of forgeries on viewers' minds. The approach of interpolating probes of typical possibilities between atypical intentions generated evidence that in middle childhood the foundations are laid for a conception of communication between artists' minds and viewers' minds via pictorial display. The combination of hypothesis-testing and exploratory opening-up of the area generates a new testable hypothesis about how an increasingly mentalistic approach enables children to understand diverse possibilities in the pictorial domain. PMID:26955360
Gravitational mass of relativistic matter and antimatter
NASA Astrophysics Data System (ADS)
Kalaydzhyan, Tigran
2015-12-01
The universality of free fall, the weak equivalence principle (WEP), is a cornerstone of the general theory of relativity, the most precise theory of gravity confirmed in all experiments to date. The WEP states the equivalence of the inertial mass, m, and the gravitational mass, m_g, and was tested on numerous occasions with normal matter at relatively low energies. However, there is no confirmation for matter and antimatter at high energies. For antimatter the situation is even less clear: current direct observations of trapped antihydrogen suggest the limits -65 < m_g/m < 110.
Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D
2016-08-01
Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.
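The procedure the authors advocate can be sketched in miniature: standardized path coefficients follow from a pooled (meta-analytic) correlation matrix by solving the normal equations beta = Rxx^-1 rxy. The correlations below are invented for illustration and are not the meta-analytic estimates from the article:

```python
import numpy as np

# Hedged sketch of a meta-analytic path analysis for theory-of-planned-
# behavior constructs. The pooled correlations are HYPOTHETICAL
# illustration values, not estimates from the article.
names = ["attitude", "norm", "pbc", "intention"]
R = np.array([
    [1.00, 0.35, 0.30, 0.55],   # attitude
    [0.35, 1.00, 0.25, 0.40],   # subjective norm
    [0.30, 0.25, 1.00, 0.45],   # perceived behavioural control
    [0.55, 0.40, 0.45, 1.00],   # intention
])

# Standardized paths attitude/norm/pbc -> intention: beta = Rxx^-1 rxy
Rxx = R[:3, :3]                 # predictor intercorrelations
rxy = R[:3, 3]                  # predictor-criterion correlations
betas = np.linalg.solve(Rxx, rxy)
print(dict(zip(names[:3], betas.round(3))))
```

Extending the same step to a second equation (intention plus past behavior predicting behavior) gives the mediation structure the article tests.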
Towards a theory of tiered testing.
Hansson, Sven Ove; Rudén, Christina
2007-06-01
Tiered testing is an essential part of any resource-efficient strategy for the toxicity testing of a large number of chemicals, which is required, for instance, in the risk management of general (industrial) chemicals. In spite of this, no general theory seems to be available for the combination of single tests into efficient tiered testing systems. A first outline of such a theory is developed. It is argued that chemical, toxicological, and decision-theoretical knowledge should be combined in the construction of such a theory. A decision-theoretical approach for the optimization of test systems is introduced. It is based on expected-utility maximization with simplified assumptions covering factual and value-related information that is usually missing in the development of test systems.
Sundqvist, Annette; Lyxell, Björn; Jönsson, Radoslava; Heimann, Mikael
2014-03-01
The present study investigates how auditory stimulation from cochlear implants (CI) is associated with the development of Theory of Mind (ToM) in severely and profoundly hearing-impaired children with hearing parents. Previous research has shown that deaf children of hearing parents have a delayed ToM development. This is, however, not always the case for deaf children of deaf parents, who presumably are immersed in a more vivid signing environment. Sixteen children with CI (4.25 to 9.5 years of age) were tested on measures of cognitive and emotional ToM, language, and cognition. Eight of the children received their first implant relatively early (before 27 months) and eight relatively late (after 27 months). The two groups did not differ in age, gender, language, or cognition at entry into the study. ToM tests included the unexpected-location task and a newly developed Swedish social-emotional ToM test; the tests aimed to assess both cognitive and emotional ToM. A comparison group of typically developing, hearing, age-matched children was also added (n = 18). Compared to the comparison group, the early-CI group did not differ in emotional ToM. The late-CI group differed significantly from the comparison group on both the cognitive and emotional ToM tests. The results revealed that children with early cochlear implants solved ToM problems to a significantly higher degree than children with late implants, although the groups did not differ on language or cognitive measures at baseline. The outcome suggests that early cochlear implantation for deaf children in hearing families, in conjunction with early social and communicative stimulation in a language that is native to the parents, can provide a foundation for a more normalized ToM development. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
How to test gravitation theories by means of gravitational-wave measurements
NASA Technical Reports Server (NTRS)
Thorne, K. S.
1974-01-01
Gravitational-wave experiments are a potentially powerful tool for testing gravitation theories. Most theories in the literature predict rather different polarization properties for gravitational waves than are predicted by general relativity; and many theories predict anomalies in the propagation speeds of gravitational waves.
Theory-Based University Admissions Testing for a New Millennium
ERIC Educational Resources Information Center
Sternberg, Robert J.
2004-01-01
This article describes two projects based on Robert J. Sternberg's theory of successful intelligence and designed to provide theory-based testing for university admissions. The first, Rainbow Project, provided a supplementary test of analytical, practical, and creative skills to augment the SAT in predicting college performance. The Rainbow…
Robust Approach to Verifying the Weak Form of the Efficient Market Hypothesis
NASA Astrophysics Data System (ADS)
Střelec, Luboš
2011-09-01
The weak form of the efficient markets hypothesis states that prices incorporate only past information about the asset. An implication of this form of the hypothesis is that one cannot detect mispriced assets and consistently outperform the market through technical analysis of past prices. One possible formulation of the efficient market hypothesis used for weak-form tests is that share prices follow a random walk, i.e. that returns are realizations of an IID sequence of random variables. Consequently, for verifying the weak form of the efficient market hypothesis, we can use distribution tests, among others, such as tests of normality and/or graphical methods. Many procedures for testing the normality of univariate samples have been proposed in the literature [7]. Today the most popular omnibus test of normality for general use is the Shapiro-Wilk test. The Jarque-Bera test is the most widely adopted omnibus test of normality in econometrics and related fields. In particular, the Jarque-Bera test (a test based on the classical measures of skewness and kurtosis) is frequently used when one is more concerned about heavy-tailed alternatives. As these measures are based on moments of the data, this test has a zero breakdown value [2]. In other words, a single outlier can make the test worthless. The reason so many classical procedures are nonrobust to outliers is that the parameters of the model are expressed in terms of moments, and their classical estimators are expressed in terms of sample moments, which are very sensitive to outliers. Another approach to robustness is to concentrate on the parameters of interest suggested by the problem under study.
Consequently, novel robust procedures for testing normality are presented in this paper to overcome the shortcomings of classical normality tests on financial data, which typically contain remote data points (outliers) and other deviations from normality. This study also discusses some results of simulation power studies of these tests for normality against selected alternatives. Based on the outcome of the power simulation study, selected normality tests were then used to verify the weak form of efficiency in Central European stock markets.
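The zero breakdown value of the Jarque-Bera test is easy to demonstrate. The sketch below (using scipy, with an arbitrary seed and contamination value chosen for illustration) shows how a single extreme observation appended to an otherwise normal sample inflates the moment-based statistic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
clean = rng.normal(size=500)           # draws from a true normal
contaminated = np.append(clean, 25.0)  # one extreme outlier

# Jarque-Bera is built from sample skewness and kurtosis, so a single
# outlier dominates the statistic and drives the p-value toward zero.
jb_clean, p_clean = stats.jarque_bera(clean)
jb_dirty, p_dirty = stats.jarque_bera(contaminated)

print(f"JB on clean sample:        {jb_clean:.2f} (p = {p_clean:.3f})")
print(f"JB with one outlier added: {jb_dirty:.2f} (p = {p_dirty:.3g})")
```

A robust alternative in the spirit of the paper would replace the moment estimators with, e.g., median- and MAD-based measures of skewness and tail weight.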
NASA Technical Reports Server (NTRS)
Lundquist, Eugene E; Schwartz, Edward B
1942-01-01
The results of a theoretical and experimental investigation to determine the critical compression load for a universal testing machine are presented for specimens loaded through knife edges. The critical load for the testing machine is the load at which one of the loading heads becomes laterally unstable in relation to the other. For very short specimens the critical load was found to be less than the rated capacity given by the manufacturer for the machine. A load-length diagram is proposed for defining the safe limits of the test region for the machine. Although this report is particularly concerned with a universal testing machine of a certain type, the basic theory which led to the derivation of the general equation for the critical load, P_cr = αL, can be applied to any testing machine operated in compression where the specimen is loaded through knife edges. In this equation, L is the length of the specimen between knife edges and α is the force necessary to displace the upper end of the specimen a unit horizontal distance relative to the lower end of the specimen in a direction normal to the knife edges through which the specimen is loaded.
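The linear load-length relation P_cr = αL, and the resulting safe-region check against the machine's rated capacity, can be sketched numerically. The stiffness α and rated capacity below are invented illustration values, not figures from the report:

```python
# Sketch of the load-length relation P_cr = alpha * L for a testing machine
# loaded through knife edges. Both constants are HYPOTHETICAL illustration
# values, not data from the report.
ALPHA_LB_PER_IN = 4000.0       # lateral stiffness alpha, lb per inch (assumed)
RATED_CAPACITY_LB = 300000.0   # manufacturer's rated capacity (assumed)

def critical_load(length_in: float) -> float:
    """Critical compression load P_cr = alpha * L for specimen length L."""
    return ALPHA_LB_PER_IN * length_in

def within_safe_region(length_in: float) -> bool:
    """A test stays safe when P_cr is at least the rated capacity."""
    return critical_load(length_in) >= RATED_CAPACITY_LB

# For a very short specimen the critical load falls below the rated
# capacity, exactly the hazard the report identifies:
for L in (10.0, 100.0):
    print(L, critical_load(L), within_safe_region(L))
```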
Selective, sustained, and shift in attention in patients with diagnoses of schizophrenia.
Hagh-Shenas, H; Toobai, S; Makaremi, A
2002-12-01
Attentional deficits are a prominent aspect of cognitive dysfunction in schizophrenia. The present study was designed to investigate attention deficits in a group of patients with a diagnosis of schizophrenia. According to the segmental set theory suggested by Hogarty and Flesher, three aspects of attention problems, selective, sustained, and shift in attention, were studied. The 30 patients hospitalized on three psychiatric wards at Shiraz and Isfahan and 30 normal healthy subjects matched for age, sex, and years of education were administered a computerized Continuous Performance Test, the Stroop Color-Word Test, and the Wisconsin Card Sorting Test. Analysis showed patients performed more poorly than control subjects on the measured aspects of attention. The acute/chronic classification did not predict differences in attention scores between subtypes of schizophrenia, while the positive/negative classification did. The paranoid, undifferentiated, and residual subtypes of schizophrenia showed similar performance on the Continuous Performance Test, but differed significantly on errors on the Wisconsin Card Sorting Test and on reaction time to Stroop stimuli in the incongruent color-word condition. Patients with a paranoid diagnosis performed better than other subtypes on these tasks. The present results suggest that the Continuous Performance Test is valuable for differentiating schizophrenia spectrum disorder, while scores on the Stroop and Wisconsin Card Sorting Tests may have better diagnostic value for differentiating subtypes of the disorder.
Jha, Paridhi; Christensson, Kyllike; Svanberg, Agneta Skoog; Larsson, Margareta; Sharma, Bharati; Johansson, Eva
2016-08-01
This study aimed to explore and understand the perceptions and experiences of women regarding the quality of care received during childbirth in public health facilities. Qualitative in-depth interviews were conducted and analysed using the Grounded Theory approach. Thirteen women who had given vaginal birth to a healthy newborn infant participated; they were interviewed in their homes in one district of Chhattisgarh, India. The interviews followed a pre-tested guide comprising one key question: how did the women experience and perceive the care provided during labour and childbirth? 'Cashless childbirth but at a cost: subordination during childbirth' was identified as the core category. Women chose a public health facility due to their socio-economic limitations, and to have a cashless and safe childbirth. Participants expressed a sense of trust in public health facilities, and verbalised that the free food and ambulance services provided by the government were appreciated. Care during normal birth was medicalised, and women lacked control over the process of their labour. Often, the women experienced verbal and physical abuse, which led to passive acceptance of all the services provided in order to avoid confrontation with the providers. Increasingly higher numbers of women give birth in public health facilities in Chhattisgarh, India, and women who have no alternative place to have a safe and normal birth are the main beneficiaries. The labour rooms are functional, but there is a need for improvement of interpersonal processes, information-sharing, and sensitive treatment of women seeking childbirth services in public health facilities. Copyright © 2016 Elsevier Ltd. All rights reserved.
Color preference in red-green dichromats.
Álvaro, Leticia; Moreira, Humberto; Lillo, Julio; Franklin, Anna
2015-07-28
Around 2% of males have red-green dichromacy, which is a genetic disorder of color vision where one type of cone photoreceptor is missing. Here we investigate the color preferences of dichromats. We aim (i) to establish whether the systematic and reliable color preferences of normal trichromatic observers (e.g., preference maximum at blue, minimum at yellow-green) are affected by dichromacy and (ii) to test theories of color preference with a dichromatic sample. Dichromat and normal trichromat observers named and rated how much they liked saturated, light, dark, and focal colors twice. Trichromats had the expected pattern of preference. Dichromats had a reliable pattern of preference that was different to trichromats, with a preference maximum rather than minimum at yellow and a much weaker preference for blue than trichromats. Color preference was more affected in observers who lacked the cone type sensitive to long wavelengths (protanopes) than in those who lacked the cone type sensitive to medium wavelengths (deuteranopes). Trichromats' preferences were summarized effectively in terms of cone-contrast between color and background, and yellow-blue cone-contrast could account for dichromats' pattern of preference, with some evidence for residual red-green activity in deuteranopes' preference. Dichromats' color naming also could account for their color preferences, with colors named more accurately and quickly being more preferred. This relationship between color naming and preference also was present for trichromat males but not females. Overall, the findings provide novel evidence on how dichromats experience color, advance the understanding of why humans like some colors more than others, and have implications for general theories of aesthetics.
Strengthening Theoretical Testing in Criminology Using Agent-based Modeling.
Johnson, Shane D; Groff, Elizabeth R
2014-07-01
The Journal of Research in Crime and Delinquency (JRCD) has published important contributions to both criminological theory and associated empirical tests. In this article, we consider some of the challenges associated with traditional approaches to social science research, and discuss a complementary approach that is gaining popularity, agent-based computational modeling, which may offer new opportunities to strengthen theories of crime and develop insights into phenomena of interest. Two literature reviews are completed. The aim of the first is to identify those articles published in JRCD that have been the most influential and to classify the theoretical perspectives taken. The second is intended to identify those studies that have used an agent-based model (ABM) to examine criminological theories and to identify which theories have been explored. Ecological theories of crime pattern formation have received the most attention from researchers using ABMs, but many other criminological theories are amenable to testing using such methods. Traditional methods of theory development and testing suffer from a number of potential issues that a more systematic use of ABMs, though not without its own issues, may help to overcome. ABMs should become another method in the criminologist's toolbox to aid theory testing and falsification.
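To make the approach concrete, here is a minimal toy ABM. It is not drawn from the article or the studies it reviews; it crudely encodes routine activity theory (a crime requires a motivated offender and a suitable target in the absence of a capable guardian) on a grid of randomly moving agents, and it generates a pattern, guardian density suppressing crime counts, that could in principle be confronted with data. All parameters are illustrative:

```python
import random

SIZE = 10  # side length of the (wraparound) grid

def step(agents, rng):
    """Move every agent one random step and count crime events this tick."""
    for a in agents:
        a["x"] = (a["x"] + rng.choice([-1, 0, 1])) % SIZE
        a["y"] = (a["y"] + rng.choice([-1, 0, 1])) % SIZE
    crimes = 0
    for off in (a for a in agents if a["role"] == "offender"):
        cell = [a for a in agents if (a["x"], a["y"]) == (off["x"], off["y"])]
        has_target = any(a["role"] == "target" for a in cell)
        has_guardian = any(a["role"] == "guardian" for a in cell)
        if has_target and not has_guardian:   # routine activity conjunction
            crimes += 1
    return crimes

def run(n_offenders=5, n_targets=20, n_guardians=10, steps=200, seed=1):
    """Total crimes over a run; all counts are illustrative parameters."""
    rng = random.Random(seed)
    agents = ([{"role": "offender"} for _ in range(n_offenders)]
              + [{"role": "target"} for _ in range(n_targets)]
              + [{"role": "guardian"} for _ in range(n_guardians)])
    for a in agents:
        a["x"], a["y"] = rng.randrange(SIZE), rng.randrange(SIZE)
    return sum(step(agents, rng) for _ in range(steps))

# A theory-derived, testable pattern: more guardians, fewer crimes.
print(run(n_guardians=0), run(n_guardians=50))
```

The value of the exercise is exactly what the article argues: the theory must be specified precisely enough to run, and its aggregate predictions can then be checked and falsified.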
What Goes Up... Gravity and Scientific Method
NASA Astrophysics Data System (ADS)
Kosso, Peter
2017-02-01
Preface; 1. Introduction; 2. Forces and fields; 3. Basic Newtonian theory; 4. Gravity before Newton; 5. Early modern astronomy; 6. Connecting physics and astronomy; 7. Connecting kinematics and dynamics; 8. Testing the Newtonian theory; 9. Challenging the Newtonian theory; 10. Geometry and equivalence; 11. The general theory of relativity; 12. Testing the general theory of relativity; 13. Using the theory to explore the universe; 14. Dark matter; 15. The structure of scientific knowledge; Glossary; Bibliography.
The MOC reflex during active listening to speech.
Garinis, Angela C; Glattke, Theodore; Cone, Barbara K
2011-10-01
The purpose of this study was to test the hypothesis that active listening to speech would increase medial olivocochlear (MOC) efferent activity for the right vs. the left ear. Click-evoked otoacoustic emissions (CEOAEs) were evoked by 60-dB p.e. SPL clicks in 13 normally hearing adults in 4 test conditions for each ear: (a) in quiet; (b) with 60-dB SPL contralateral broadband noise; (c) with words embedded (at -3-dB signal-to-noise ratio [SNR]) in 60-dB SPL contralateral noise during which listeners directed attention to the words; and (d) for the same SNR as in the 3rd condition, with words played backwards. There was greater suppression during active listening compared with passive listening that was apparent in the latency range of 6- to 18-ms poststimulus onset. Ear differences in CEOAE amplitude were observed in all conditions, with right-ear amplitudes larger than those for the left. The absolute difference between CEOAE amplitude in quiet and with contralateral noise, a metric of suppression, was equivalent for right and left ears. When the amplitude differences were normalized, suppression was greater for noise presented to the right and the effect measured for a probe in the left ear. The findings support the theory that cortical mechanisms involved in listening to speech affect cochlear function through the MOC efferent system.
Helps, Suzannah K; Bamford, Susan; Sonuga-Barke, Edmund J S; Söderlund, Göran B W
2014-01-01
Noise often has detrimental effects on performance. However, because of the phenomenon of stochastic resonance (SR), auditory white noise (WN) can alter the "signal to noise" ratio and improve performance. The Moderate Brain Arousal (MBA) model postulates different levels of internal "neural noise" in individuals with different attentional capacities. This in turn determines the particular WN level most beneficial in each individual case, with one level of WN facilitating poor attenders but hindering super-attentive children. The objective of the present study is to find out whether added WN affects cognitive performance differently in children who differ in attention ability. Participants were teacher-rated super-attentive (N = 25), normal-attentive (N = 29), and sub-attentive (N = 36) children (aged 8 to 10 years). Two non-executive-function (EF) tasks (a verbal episodic recall task and a delayed verbal recognition task) and two EF tasks (a visuo-spatial working memory test and a Go/NoGo task) were performed under three WN levels. The non-WN condition was used only to control for potential differences in background noise in the group testing situations. There were different effects of WN on performance in the three groups: adding moderate WN worsened the performance of super-attentive children on both task types and improved EF performance in sub-attentive children. The normal-attentive children's performance was unaffected by WN exposure. The shift from moderate to high levels of WN had little further effect on performance in any group. The predicted differential effect of WN on performance was confirmed. However, the failure to find evidence for an inverted-U function challenges current theories. Alternative explanations are discussed. We propose that WN therapy should be further investigated as a possible non-pharmacological treatment for inattention.
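The stochastic resonance mechanism itself is easy to simulate. The sketch below (illustrative only, not the MBA model; signal amplitude, threshold, and noise levels are arbitrary choices) shows a subthreshold sinusoid that is best recovered from threshold crossings at a moderate noise level, producing the inverted-U profile the theory predicts:

```python
import numpy as np

# Toy stochastic-resonance demo: a signal below a fixed detection threshold
# only produces threshold crossings when noise is added, and the crossings
# track the signal best at an intermediate noise level.
rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 5000)
signal = 0.8 * np.sin(t)    # amplitude below the threshold of 1.0
THRESHOLD = 1.0

def detection_correlation(noise_sd: float) -> float:
    """Correlate the thresholded noisy signal with the clean signal."""
    noisy = signal + rng.normal(scale=noise_sd, size=signal.size)
    detected = (noisy > THRESHOLD).astype(float)
    if detected.std() == 0.0:   # nothing ever crossed the threshold
        return 0.0
    return float(np.corrcoef(detected, signal)[0, 1])

levels = [0.05, 0.5, 5.0]       # low, moderate, and high noise
scores = [detection_correlation(sd) for sd in levels]
print(dict(zip(levels, [round(s, 3) for s in scores])))
```

The moderate-noise score exceeds both extremes, which is the signature SR effect; the study's empirical question is whether children's performance follows the analogous curve.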
Superselection Structure of Massive Quantum Field Theories in 1+1 Dimensions
NASA Astrophysics Data System (ADS)
Müger, Michael
We show that a large class of massive quantum field theories in 1+1 dimensions, characterized by Haag duality and the split property for wedges, does not admit locally generated superselection sectors in the sense of Doplicher, Haag and Roberts. Thereby the extension of DHR theory to 1+1 dimensions due to Fredenhagen, Rehren and Schroer is vacuous for such theories. Even charged representations which are localizable only in wedge regions are ruled out. Furthermore, Haag duality holds in all locally normal representations. These results are applied to the theory of soliton sectors. Furthermore, the extension of localized representations of a non-Haag dual net to the dual net is reconsidered. It must be emphasized that these statements do not apply to massless theories since they do not satisfy the above split property. In particular, it is known that positive energy representations of conformally invariant theories are DHR representations.