Sample records for source theory analysis

  1. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    PubMed

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. This approach has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source MATLAB software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
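
    The ERA Toolbox itself is MATLAB software; purely as an illustration of the variance-partitioning idea behind G theory (not toolbox code, and with a hypothetical function name), a single-facet persons × trials design can be analyzed from ANOVA mean squares as follows:

```python
def g_study(scores):
    """Variance components for a fully crossed persons x trials design.

    scores: list of rows, one row per person, one column per trial.
    Returns (var_person, var_trial, var_error, phi), where phi is the
    absolute-error dependability coefficient for the observed trial count.
    """
    n_p, n_t = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n_p * n_t)
    p_means = [sum(row) / n_t for row in scores]
    t_means = [sum(row[j] for row in scores) / n_p for j in range(n_t)]

    # Mean squares from the two-way ANOVA without replication.
    ms_p = n_t * sum((m - grand) ** 2 for m in p_means) / (n_p - 1)
    ms_t = n_p * sum((m - grand) ** 2 for m in t_means) / (n_t - 1)
    ss_e = sum((scores[i][j] - p_means[i] - t_means[j] + grand) ** 2
               for i in range(n_p) for j in range(n_t))
    ms_e = ss_e / ((n_p - 1) * (n_t - 1))

    # Expected-mean-square solutions, clipped at zero.
    var_e = ms_e
    var_p = max((ms_p - ms_e) / n_t, 0.0)
    var_t = max((ms_t - ms_e) / n_p, 0.0)
    # Trial (facet) effects count as error for absolute decisions.
    phi = var_p / (var_p + (var_t + var_e) / n_t)
    return var_p, var_t, var_e, phi
```

    For a toy table in which trial effects are the only source of error, phi comes out just over 0.92; a D study that averaged over more trials would raise it further.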

  2. Information Foraging Theory: A Framework for Intelligence Analysis

    DTIC Science & Technology

    2014-11-01

    oceanographic information, human intelligence (HUMINT), open-source intelligence (OSINT), and information provided by other governmental departments [1][5...Human Intelligence, IFT Information Foraging Theory, LSA Latent Semantic Similarity, MVT Marginal Value Theorem, OFT Optimal Foraging Theory, OSINT

  3. An Unsolved Electric Circuit: A Common Misconception

    ERIC Educational Resources Information Center

    Harsha, N. R. Sree; Sreedevi, A.; Prakash, Anupama

    2015-01-01

    Despite a number of theories in circuit analysis, little is known about the behaviour of ideal equal voltage sources connected in parallel across a resistive load. We have neither a theory that can predict which voltage source provides the load current nor a method to test it experimentally. In a series of experiments performed…

  4. Point source moving above a finite impedance reflecting plane - Experiment and theory

    NASA Technical Reports Server (NTRS)

    Norum, T. D.; Liu, C. H.

    1978-01-01

    A widely used experimental version of the acoustic monopole consists of an acoustic driver of restricted opening forced by a discrete frequency oscillator. To investigate the effects of forward motion on this source, it was mounted above an automobile and driven over an asphalt surface at constant speed past a microphone array. The shapes of the received signal were compared to results computed from an analysis of a fluctuating-mass-type point source moving above a finite impedance reflecting plane. Good agreement was found between experiment and theory when a complex normal impedance representative of a fairly hard acoustic surface was used in the analysis.

  5. Cultural Geography Model Validation

    DTIC Science & Technology

    2010-03-01

    the Cultural Geography Model (CGM), a government owned, open source multi-agent system utilizing Bayesian networks, queuing systems, the Theory of…referent determined either from theory or SME opinion. 4. CGM Overview The CGM is a government-owned, open source, data driven multi-agent social…HSCB, validation, social network analysis ABSTRACT: In the current warfighting environment, the military needs robust modeling and simulation (M&S

  6. Communicating Science to Impact Learning? A Phenomenological Inquiry into 4th and 5th Graders' Perceptions of Science Information Sources

    ERIC Educational Resources Information Center

    Gelmez Burakgazi, Sevinc; Yildirim, Ali; Weeth Feinstein, Noah

    2016-01-01

    Rooted in science education and science communication studies, this study examines 4th and 5th grade students' perceptions of science information sources (SIS) and their use in communicating science to students. It combines situated learning theory with uses and gratifications theory in a qualitative phenomenological analysis. Data were gathered…

  7. The Use of Narrative Paradigm Theory in Assessing Audience Value Conflict in Image Advertising.

    ERIC Educational Resources Information Center

    Stutts, Nancy B.; Barker, Randolph T.

    1999-01-01

    Presents an analysis of image advertisement developed from Narrative Paradigm Theory. Suggests that the nature of postmodern culture makes image advertising an appropriate external communication strategy for generating stake holder loyalty. Suggests that Narrative Paradigm Theory can identify potential sources of audience conflict by illuminating…

  8. Challenges in combining different data sets during analysis when using grounded theory.

    PubMed

    Rintala, Tuula-Maria; Paavilainen, Eija; Astedt-Kurki, Päivi

    2014-05-01

    To describe the challenges in combining two data sets during grounded theory analysis. The use of grounded theory in nursing research is common. It is a suitable method for studying human action and interaction. It is recommended that many alternative sources of data are collected to create as rich a dataset as possible. Data from interviews with people with diabetes (n=19) and their family members (n=19). Combining two data sets. When using grounded theory, there are numerous challenges in collecting and managing data, especially for the novice researcher. One challenge is to combine different data sets during the analysis. There are many methodological textbooks about grounded theory but there is little written in the literature about combining different data sets. Discussion is needed on the management of data and the challenges of grounded theory. This article provides a means for combining different data sets in the grounded theory analysis process.

  9. Refraction and scattering of sound by a shear layer

    NASA Technical Reports Server (NTRS)

    Schlinker, R. H.; Amiet, R. K.

    1980-01-01

    The angle and amplitude changes for acoustic waves refracted by a circular open jet shear layer were determined. The generalized refraction theory was assessed experimentally for on axis and off axis acoustic source locations as source frequency varied from 1 kHz to 10 kHz and free stream Mach number varied from 0.1 to 0.4. Angle and amplitude changes across the shear layer show good agreement with theory. Experiments confirm that the refraction theory is independent of shear layer thickness, acoustic source frequency, and source type. A generalized theory is, thus, available for correcting far field noise data acquired in open jet test facilities. The effect of discrete tone scattering by the open jet turbulent shear layer was also studied. Scattering effects were investigated over the same Mach number range as frequency varied from 5 kHz to 15 kHz. Attenuation of discrete tone amplitude and tone broadening were measured as a function of acoustic source position and radiation angle. Scattering was found to be stronger at angles close to the open jet axis than at 90 deg, and becomes stronger as the acoustic source position shifts downstream. A scattering analysis provided an estimate of the onset of discrete tone scattering.

  10. Applying circular economy innovation theory in business process modeling and analysis

    NASA Astrophysics Data System (ADS)

    Popa, V.; Popa, L.

    2017-08-01

    The overall aim of this paper is to develop a new conceptual framework for business process modeling and analysis using circular economy innovation theory as a source for business knowledge management. The last part of the paper presents the authors' proposed basic structure for new business models applying circular economy innovation theories. For people working on new innovative business models in the field of the circular economy, this paper provides new ideas for clustering their concepts.

  11. A Grounded Theory Approach to the Development of a Framework for Researching Children's Decision-Making Skills within Design and Technology Education

    ERIC Educational Resources Information Center

    Mettas, Alexandros; Norman, Eddie

    2011-01-01

    This paper discusses the establishment of a framework for researching children's decision-making skills in design and technology education through taking a grounded theory approach. Three data sources were used: (1) analysis of available literature; (2) curriculum analysis and interviews with teachers concerning their practice in relation to their…

  12. Facilitating Constructive Alignment in Power Systems Engineering Education Using Free and Open-Source Software

    ERIC Educational Resources Information Center

    Vanfretti, L.; Milano, F.

    2012-01-01

    This paper describes how the use of free and open-source software (FOSS) can facilitate the application of constructive alignment theory in power systems engineering education by enabling the deep learning approach in power system analysis courses. With this aim, this paper describes the authors' approach in using the Power System Analysis Toolbox…

  13. Acoustic waveform logging--Advances in theory and application

    USGS Publications Warehouse

    Paillet, F.L.; Cheng, C.H.; Pennington, W.D.

    1992-01-01

    Full-waveform acoustic logging has made significant advances in both theory and application in recent years, and these advances have greatly increased the capability of log analysts to measure the physical properties of formations. Advances in theory provide the analytical tools required to understand the properties of measured seismic waves, and to relate those properties to such quantities as shear and compressional velocity and attenuation, and primary and fracture porosity and permeability of potential reservoir rocks. The theory demonstrates that all parts of recorded waveforms are related to various modes of propagation, even in the case of dipole and quadrupole source logging. However, the theory also indicates that these mode properties can be used to design velocity and attenuation picking schemes, and shows how source frequency spectra can be selected to optimize results in specific applications. Synthetic microseismogram computations are an effective tool in waveform interpretation theory; they demonstrate how shear arrival picks and mode attenuation can be used to compute shear velocity and intrinsic attenuation, and formation permeability for monopole, dipole and quadrupole sources. Array processing of multi-receiver data offers the opportunity to apply even more sophisticated analysis techniques. Synthetic microseismogram data is used to illustrate the application of the maximum-likelihood method, semblance cross-correlation, and Prony's method analysis techniques to determine seismic velocities and attenuations. The interpretation of acoustic waveform logs is illustrated by reviews of various practical applications, including synthetic seismogram generation, lithology determination, estimation of geomechanical properties in situ, permeability estimation, and design of hydraulic fracture operations.

  14. An exploratory analysis of the nature of informal knowledge underlying theories of planned action used for public health oriented knowledge translation.

    PubMed

    Kothari, Anita; Boyko, Jennifer A; Campbell-Davison, Andrea

    2015-09-09

    Informal knowledge is used in public health practice to make sense of research findings. Although knowledge translation theories highlight the importance of informal knowledge, it is not clear to what extent the same literature provides guidance in terms of how to use it in practice. The objective of this study was to address this gap by exploring what planned action theories suggest in terms of using three types of informal knowledge: local, experiential and expert. We carried out an exploratory secondary analysis of the planned action theories that informed the development of a popular knowledge translation theory. Our sample included twenty-nine (n = 29) papers. We extracted information from these papers about sources of and guidance for using informal knowledge, and then carried out a thematic analysis. We found that theories of planned action provide guidance (including sources of, methods for identifying, and suggestions for use) for using local, experiential and expert knowledge. This study builds on previous knowledge translation related work to provide insight into the practical use of informal knowledge. Public health practitioners can refer to the guidance summarized in this paper to inform their decision-making. Further research about how to use informal knowledge in public health practice is needed given the value being accorded to using informal knowledge in public health decision-making processes.

  15. Statistical correlation analysis for comparing vibration data from test and analysis

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.; Purves, L. R.; Hershfeld, D. J.

    1986-01-01

    A theory was developed to compare vibration modes obtained by NASTRAN analysis with those obtained experimentally. Because many more analytical modes can be obtained than experimental modes, the analytical set was treated as expansion functions for putting both sources in comparative form. The dimensional symmetry was developed for three general cases: nonsymmetric whole model compared with a nonsymmetric whole structural test, symmetric analytical portion compared with a symmetric experimental portion, and analytical symmetric portion with a whole experimental test. The theory was coded and a statistical correlation program was installed as a utility. The theory is established with small classical structures.
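
    The paper develops its own statistical correlation measure, but a widely used metric for putting analytical and experimental mode shapes "in comparative form" (shown here purely as an illustrative stand-in, not the authors' method) is the Modal Assurance Criterion:

```python
def mac(phi_a, phi_e):
    """Modal Assurance Criterion between two real mode-shape vectors.

    Returns 1.0 when the shapes are identical up to scale,
    and 0.0 when they are orthogonal.
    """
    num = sum(a * e for a, e in zip(phi_a, phi_e)) ** 2
    den = sum(a * a for a in phi_a) * sum(e * e for e in phi_e)
    return num / den
```

    A NASTRAN mode paired with a scaled copy of itself scores 1.0; orthogonal shapes score 0.0, flagging a likely mode-pairing mismatch.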

  16. The Economics of Terrorism: Economics Methods of Analysis in the Study of Terrorism and Counterterrorism

    DTIC Science & Technology

    2010-12-01

    addition to outlining definitions, data sources, choice theory, game theory, and the economic consequences of terrorism, this study identifies how…stratégiques. The authors are Maj Alain Rollin, Maj Meaghan Setter, and Rachel Lea Heide, Ph.D., under the direction of LCol William Yee…Choice Theory and its Applications…

  17. Transformative Learning: A Case for Using Grounded Theory as an Assessment Analytic

    ERIC Educational Resources Information Center

    Patterson, Barbara A. B.; Munoz, Leslie; Abrams, Leah; Bass, Caroline

    2015-01-01

    Transformative Learning Theory and pedagogies leverage disruptive experiences as catalysts for learning and teaching. By facilitating processes of critical analysis and reflection that challenge assumptions, transformative learning reframes what counts as knowledge and the sources and processes for gaining and producing it. Students develop a…

  18. Verbal Stimulus Control and the Intraverbal Relation

    ERIC Educational Resources Information Center

    Sundberg, Mark L.

    2016-01-01

    The importance of the intraverbal relation is missed in most theories of language. Skinner (1957) attributes this to traditional semantic theories of meaning that focus on the nonverbal referents of words and neglect verbal stimuli as separate sources of control for linguistic behavior. An analysis of verbal stimulus control is presented, along…

  19. Case II: Conflict recognition--the case of the misdirected faculty.

    PubMed

    Hoelscher, Diane C; Comer, Robert W

    2002-04-01

    Early recognition is fundamental to managing conflict. Successful leaders rely upon their ability to recognize conflict before it escalates into crisis. This article reviews the signs and sources of conflict along with related management theories. Conflict management includes understanding the sources and types of conflict as well as the impact potential; in the case presented, the leaders were unaware of conflict. Dr. Forester, the faculty member "in the middle," was in a precarious position. Her performance evaluation reflected unacceptable accomplishments. However, her self-assessment, based on the hiring agreement, was successful. Her requests for guidance and clarification were unproductive. What does she do now? The management theories that apply to the case of "the misdirected faculty" include analysis and discussion of communication, feedback, and expectancy theory. Action alternatives are presented to explore some of the options available to stimulate discussion and to provide readers with an eclectic approach to applying a case analysis.

  20. Risk Dimensions and Political Decisions Frame Environmental Communication: A Content Analysis of Seven U.S. Newspapers from 1970-2010

    ERIC Educational Resources Information Center

    Grantham, Susan; Vieira, Edward T., Jr.

    2014-01-01

    This project examined the focus of environmental news frames used in seven American newspapers between 1970 and 2010. During this time newspapers were a primary source of news. Based on gatekeeping and agenda-setting theory, as well as source credibility, the content analysis of 2,123 articles examined the environmental topics within the articles,…

  1. Quantum theory for 1D X-ray free electron laser

    DOE PAGES

    Anisimov, Petr Mikhaylovich

    2017-09-19

    Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where photon recoil approaches an electron energy spread value, push the classical theory to its limits of applicability. After substantial efforts by the community to find what those limits are, there is no universally agreed upon quantum approach to design and development of future X-ray sources. We offer a new approach to formulate the quantum theory for 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. In conclusion, we exploit this connection in order to draw quantum mechanical conclusions about the quantum nature of electrons and generated radiation in terms of FEL variables.

  2. Quantum theory for 1D X-ray free electron laser

    NASA Astrophysics Data System (ADS)

    Anisimov, Petr M.

    2018-06-01

    Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where photon recoil approaches an electron energy spread value, push the classical theory to its limits of applicability. After substantial efforts by the community to find what those limits are, there is no universally agreed upon quantum approach to design and development of future X-ray sources. We offer a new approach to formulate the quantum theory for 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. We exploit this connection in order to draw quantum mechanical conclusions about the quantum nature of electrons and generated radiation in terms of FEL variables.

  3. Gender differences in sexuality: a meta-analysis.

    PubMed

    Oliver, M B; Hyde, J S

    1993-07-01

    This meta-analysis surveyed 177 usable sources that reported data on gender differences on 21 different measures of sexual attitudes and behaviors. The largest gender difference was in incidence of masturbation: Men had the greater incidence (d = .96). There was also a large gender difference in attitudes toward casual sex: Males had considerably more permissive attitudes (d = .81). There were no gender differences in attitudes toward homosexuality or in sexual satisfaction. Most other gender differences were in the small-to-moderate range. Gender differences narrowed from the 1960s to the 1980s for many variables. Chodorow's neoanalytic theory, sociobiology, social learning theory, social role theory, and script theory are discussed in relation to these findings.
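
    The d values quoted above are standardized mean differences (Cohen's d). As a reminder of how such an effect size is formed from two groups' summary statistics (an illustrative sketch, not the meta-analysis code):

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference using a pooled standard deviation.

    m, s, n are each group's mean, standard deviation, and sample size.
    """
    pooled_sd = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2)
                          / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd
```

    Two groups of 20 with means 10 and 8 and a common SD of 2 give d = 1.0, i.e., the groups differ by one pooled standard deviation.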

  4. EEGNET: An Open Source Tool for Analyzing and Visualizing M/EEG Connectome.

    PubMed

    Hassan, Mahmoud; Shamas, Mohamad; Khalil, Mohamad; El Falou, Wassim; Wendling, Fabrice

    2015-01-01

    The brain is a large-scale complex network often referred to as the "connectome". Exploring the dynamic behavior of the connectome is a challenging issue, as both excellent time and space resolution are required. In this context, Magneto/Electroencephalography (M/EEG) are effective neuroimaging techniques allowing for analysis of the dynamics of functional brain networks at scalp level and/or at reconstructed sources. However, a tool that can cover all the processing steps of identifying brain networks from M/EEG data is still missing. In this paper, we report a novel software package, called EEGNET, running under MATLAB (MathWorks, Inc.) and allowing for analysis and visualization of functional brain networks from M/EEG recordings. EEGNET is developed to analyze networks either at the level of scalp electrodes or at the level of reconstructed cortical sources. It includes i) basic steps in preprocessing M/EEG signals, ii) the solution of the inverse problem to localize/reconstruct the cortical sources, iii) the computation of functional connectivity among signals collected at surface electrodes and/or time courses of reconstructed sources, and iv) the computation of network measures based on graph theory analysis. EEGNET is the only tool that combines M/EEG functional connectivity analysis with the computation of network measures derived from graph theory. The first version of EEGNET is easy to use, flexible, and user-friendly. EEGNET is an open source tool and can be freely downloaded from this webpage: https://sites.google.com/site/eegnetworks/.

  5. EEGNET: An Open Source Tool for Analyzing and Visualizing M/EEG Connectome

    PubMed Central

    Hassan, Mahmoud; Shamas, Mohamad; Khalil, Mohamad; El Falou, Wassim; Wendling, Fabrice

    2015-01-01

    The brain is a large-scale complex network often referred to as the “connectome”. Exploring the dynamic behavior of the connectome is a challenging issue, as both excellent time and space resolution are required. In this context, Magneto/Electroencephalography (M/EEG) are effective neuroimaging techniques allowing for analysis of the dynamics of functional brain networks at scalp level and/or at reconstructed sources. However, a tool that can cover all the processing steps of identifying brain networks from M/EEG data is still missing. In this paper, we report a novel software package, called EEGNET, running under MATLAB (MathWorks, Inc.) and allowing for analysis and visualization of functional brain networks from M/EEG recordings. EEGNET is developed to analyze networks either at the level of scalp electrodes or at the level of reconstructed cortical sources. It includes i) basic steps in preprocessing M/EEG signals, ii) the solution of the inverse problem to localize/reconstruct the cortical sources, iii) the computation of functional connectivity among signals collected at surface electrodes and/or time courses of reconstructed sources, and iv) the computation of network measures based on graph theory analysis. EEGNET is the only tool that combines M/EEG functional connectivity analysis with the computation of network measures derived from graph theory. The first version of EEGNET is easy to use, flexible, and user-friendly. EEGNET is an open source tool and can be freely downloaded from this webpage: https://sites.google.com/site/eegnetworks/. PMID:26379232
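
    EEGNET's final step computes graph-theoretic measures from a functional connectivity matrix. The core idea can be sketched in a few lines (illustrative only, not EEGNET code): threshold the connectivity matrix into an adjacency matrix, then read off simple measures such as node degree and network density:

```python
def graph_measures(conn, threshold):
    """Degrees and density of the graph obtained by thresholding a
    symmetric connectivity matrix (self-connections are ignored)."""
    n = len(conn)
    adj = [[1 if i != j and conn[i][j] >= threshold else 0
            for j in range(n)] for i in range(n)]
    degrees = [sum(row) for row in adj]
    n_edges = sum(degrees) // 2          # each edge counted twice
    density = 2 * n_edges / (n * (n - 1))
    return degrees, density
```

    For a three-channel matrix with strong links 1-2 and 2-3 only, the degrees are [1, 2, 1] and the density is 2/3; real pipelines add weighted and path-based measures on top of this adjacency step.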

  6. Defense Acquisition Research Journal. Volume 21, Number 1, Issue 68

    DTIC Science & Technology

    2014-01-01

    Harrison’s game theory model of competition examines the bidding behavior of two equal competitors, but it does not address characteristics that…analysis examines a series of outcomes in both competitive and sole-source acquisition programs, using a statistical model that builds on a game theory…modeling, within a game theory framework developed by Todd Harrison, to show that the DoD may actually incur increased costs from competition

  7. A meta-analysis of work-family conflict and social support.

    PubMed

    French, Kimberly A; Dumani, Soner; Allen, Tammy D; Shockley, Kristen M

    2018-03-01

    The relationship between social support and work-family conflict is well-established, but the notion that different forms, sources, and types of social support as well as contextual factors can alter this relationship has been relatively neglected. To address this limitation, the current study provides the most comprehensive and in-depth examination of the relationship between social support and work-family conflict to date. We conduct a meta-analysis based on 1021 effect sizes and 46 countries to dissect the social support and work-family conflict relationship. Using social support theory as a theoretical framework, we challenge the assumption that social support measures are interchangeable by comparing work/family support relationships with work-family conflict across different support forms (behavior, perceptions), sources (e.g., supervisor, coworker, spouse), types (instrumental, emotional), and national contexts (cultural values, economic factors). National context hypotheses use a strong inferences paradigm in which utility and value congruence theoretical perspectives are pitted against one another. Significant results concerning support source are in line with social support theory, indicating that broad sources of support are more strongly related to work-family conflict than are specific sources of support. In line with utility perspective from social support theory, culture and economic national context significantly moderate some of the relationships between work/family support and work interference with family, indicating that social support is most beneficial in contexts in which it is needed or perceived as useful. The results suggest that organizational support may be the most important source of support overall. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  8. Quantum theory for 1D X-ray free electron laser

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anisimov, Petr Mikhaylovich

    Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where photon recoil approaches an electron energy spread value, push the classical theory to its limits of applicability. After substantial efforts by the community to find what those limits are, there is no universally agreed upon quantum approach to design and development of future X-ray sources. We offer a new approach to formulate the quantum theory for 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. In conclusion, we exploit this connection in order to draw quantum mechanical conclusions about the quantum nature of electrons and generated radiation in terms of FEL variables.

  9. Introduction to Generalized Functions with Applications in Aerodynamics and Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Farassat, F.

    1994-01-01

    Generalized functions have many applications in science and engineering. One useful aspect is that discontinuous functions can be handled as easily as continuous or differentiable functions, providing a powerful tool in formulating and solving many problems of aerodynamics and acoustics. Furthermore, generalized function theory elucidates and unifies many ad hoc mathematical approaches used by engineers and scientists. We define generalized functions as continuous linear functionals on the space of infinitely differentiable functions with compact support, then introduce the concept of generalized differentiation. Generalized differentiation is the most important concept in generalized function theory, and the applications we present utilize mainly this concept. First, some results of classical analysis are derived with the generalized function theory. Other applications of the generalized function theory in aerodynamics discussed here are the derivations of general transport theorems for deriving governing equations of fluid mechanics, the interpretation of the finite part of divergent integrals, the derivation of the Oswatitsch integral equation of transonic flow, and the analysis of velocity field discontinuities as sources of vorticity. Applications in aeroacoustics include the derivation of the Kirchhoff formula for moving surfaces, the noise from moving surfaces, and shock noise source strength based on the Ffowcs Williams-Hawkings equation.

  10. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohd, Shukri; Holford, Karen M.; Pullin, Rhys

    2014-02-12

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter, with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with that of other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the DeltaT location method but requires no initial acoustic calibration of the structure.
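
    For contrast with WTML, the classical time-of-arrival (TOA) idea on a linear sensor pair reduces to one formula: with sensors at 0 and L and a known wave speed v, the arrival-time difference fixes the source position. A minimal sketch (illustrative numbers, not the paper's test parameters):

```python
def toa_linear_location(length, speed, dt):
    """1D TOA source location between two sensors.

    length: sensor spacing; speed: wave speed (same distance unit per
    time unit as dt); dt: arrival time at sensor 1 minus sensor 2.
    Returns the source distance from sensor 1.
    """
    return (length + speed * dt) / 2
```

    For sensors 1500 mm apart and a wave speed of 5 mm/us, a source 500 mm from sensor 1 arrives 100 us earlier there (dt = -100 us), and the formula recovers x = 500 mm. The dependence on a single assumed wave speed is exactly what DeltaT and WTML-style methods aim to relax.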

  11. Intuitive theories of information: beliefs about the value of redundancy.

    PubMed

    Soll, J B

    1999-03-01

    In many situations, quantity estimates from multiple experts or diagnostic instruments must be collected and combined. Normatively, and all else equal, one should value information sources that are nonredundant, in the sense that correlation in forecast errors should be minimized. Past research on the preference for redundancy has been inconclusive. While some studies have suggested that people correctly place higher value on uncorrelated inputs when collecting estimates, others have shown that people either ignore correlation or, in some cases, even prefer it. The present experiments show that the preference for redundancy depends on one's intuitive theory of information. The most common intuitive theory identified is the Error Tradeoff Model (ETM), which explicitly distinguishes between measurement error and bias. According to ETM, measurement error can only be averaged out by consulting the same source multiple times (normatively false), and bias can only be averaged out by consulting different sources (normatively true). As a result, ETM leads people to prefer redundant estimates when the ratio of measurement error to bias is relatively high. Other participants favored different theories. Some adopted the normative model, while others were reluctant to mathematically average estimates from different sources in any circumstance. In a post hoc analysis, science majors were more likely than others to subscribe to the normative model. While tentative, this result lends insight into how intuitive theories might develop and also has potential ramifications for how statistical concepts such as correlation might best be learned and internalized. Copyright 1999 Academic Press.
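
    The normative contrast the abstract draws can be made concrete with mean-squared-error arithmetic. Suppose each source carries an idiosyncratic bias (standard deviation tau) plus independent measurement noise (standard deviation sigma). Averaging k reads from the same source averages only the noise; one read from each of k independent sources averages both. This formalization is our illustration, not the paper's notation:

```python
def mse_same_source(tau, sigma, k):
    # The shared bias never averages out across repeated reads.
    return tau ** 2 + sigma ** 2 / k

def mse_different_sources(tau, sigma, k):
    # Independent biases and noises both shrink by a factor of k.
    return (tau ** 2 + sigma ** 2) / k
```

    With tau = sigma = 1 and k = 4, same-source averaging leaves an MSE of 1.25 while four different sources give 0.5, which is why nonredundant sources are normatively preferred whenever bias is nonzero.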

  12. Mixture Modeling for Background and Sources Separation in x-ray Astronomical Images

    NASA Astrophysics Data System (ADS)

    Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker

    2004-11-01

    A probabilistic technique for the joint estimation of background and sources in high-energy astrophysics is described. Bayesian probability theory is applied to gain insight into the coexistence of background and sources through a probabilistic two-component mixture model, which provides consistent uncertainties of background and sources. The present analysis is applied to ROSAT PSPC data (0.1-2.4 keV) in Survey Mode. A background map is modelled using a Thin-Plate spline. Source probability maps are obtained for each pixel (45 arcsec) independently and for larger correlation lengths, revealing faint and extended sources. We will demonstrate that the described probabilistic method allows for detection improvement of faint extended celestial sources compared to the Standard Analysis Software System (SASS) used for the production of the ROSAT All-Sky Survey (RASS) catalogues.
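    The two-component idea can be sketched in a few lines. In this toy version the Poisson rates and source fraction are assumptions for illustration; the paper's full model additionally fits a thin-plate-spline background and exploits spatial correlation. Each pixel's photon count is scored under a background-only and a background-plus-source hypothesis:

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)

# Assumed rates: background b everywhere, extra source flux s in a few pixels.
b, s, p_src = 3.0, 6.0, 0.05
npix = 10_000
is_src = rng.random(npix) < p_src
counts = rng.poisson(b + s * is_src)

# Two-component mixture: posterior probability that a pixel hosts a source.
like_bg  = poisson.pmf(counts, b)
like_src = poisson.pmf(counts, b + s)
post_src = p_src * like_src / (p_src * like_src + (1 - p_src) * like_bg)

print(post_src[is_src].mean())    # source pixels: high posterior on average
print(post_src[~is_src].mean())   # background pixels: low posterior
```

    The output is a per-pixel probability map rather than a hard detection threshold, which is what allows faint, extended emission to accumulate evidence over many pixels.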

  13. Reliability analysis of the objective structured clinical examination using generalizability theory.

    PubMed

    Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián

    2016-01-01

    The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements.
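    For a fully crossed persons x stations design, the relative G coefficient follows from ANOVA variance components. The sketch below uses simulated scores with assumed variance components (chosen only so that the resulting coefficient lands near the reported 0.93); it is not a reconstruction of the study's four-version design.

```python
import numpy as np

rng = np.random.default_rng(2)
n_p, n_s = 278, 18   # students x stations, matching the study's dimensions

# Simulated scores: student effect + station effect + residual (assumed SDs).
sp, ss, se = 1.0, 0.7, 1.2
scores = (rng.normal(0, sp, (n_p, 1)) + rng.normal(0, ss, (1, n_s))
          + rng.normal(0, se, (n_p, n_s)))

# Variance components from the crossed p x s ANOVA mean squares.
grand = scores.mean()
ms_p = n_s * np.var(scores.mean(axis=1), ddof=1)
ms_res = (np.sum((scores - scores.mean(axis=1, keepdims=True)
                  - scores.mean(axis=0, keepdims=True) + grand) ** 2)
          / ((n_p - 1) * (n_s - 1)))
var_p = (ms_p - ms_res) / n_s     # person (universe-score) variance
var_res = ms_res                  # person x station + error variance

# Relative G coefficient for a score averaged over n_s stations.
g_coef = var_p / (var_p + var_res / n_s)
print(round(g_coef, 3))
```

    The same decomposition supports decision studies: re-evaluating `g_coef` with a different `n_s` shows how many stations are needed to reach a target reliability.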

  14. Reliability analysis of the objective structured clinical examination using generalizability theory.

    PubMed

    Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián

    2016-01-01

    Background The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. Methods An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. Results The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Conclusions Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements.

  15. Phase noise in oscillators as differential-algebraic systems with colored noise sources

    NASA Astrophysics Data System (ADS)

    Demir, Alper

    2004-05-01

    Oscillators are key components of many kinds of systems, particularly electronic and opto-electronic systems. Undesired perturbations, i.e. noise, in practical systems adversely affect the spectral and timing properties of the signals generated by oscillators resulting in phase noise and timing jitter, which are key performance limiting factors, being major contributors to bit-error-rate (BER) of RF and possibly optical communication systems, and creating synchronization problems in clocked and sampled-data electronic systems. In this paper, we review our work on the theory and numerical methods for nonlinear perturbation and noise analysis of oscillators described by a system of differential-algebraic equations (DAEs) with white and colored noise sources. The bulk of the work reviewed in this paper first appeared in [1], then in [2] and [3]. Prior to the work mentioned above, we developed a theory and numerical methods for nonlinear perturbation and noise analysis of oscillators described by a system of ordinary differential equations (ODEs) with white noise sources only [4, 5]. In this paper, we also discuss some open problems and issues in the modeling and analysis of phase noise both in free running oscillators and in phase/injection-locked ones.

  16. Asymptotic/numerical analysis of supersonic propeller noise

    NASA Technical Reports Server (NTRS)

    Myers, M. K.; Wydeven, R.

    1989-01-01

    An asymptotic analysis based on the Mach surface structure of the field of a supersonic helical source distribution is applied to predict thickness and loading noise radiated by high speed propeller blades. The theory utilizes an integral representation of the Ffowcs Williams-Hawkings equation in a fully linearized form. The asymptotic results are used for chordwise strips of the blade, while required spanwise integrations are performed numerically. The form of the analysis enables predicted waveforms to be interpreted in terms of Mach surface propagation. A computer code developed to implement the theory is described and found to yield results in close agreement with more exact computations.

  17. Testing of motor unit synchronization model for localized muscle fatigue.

    PubMed

    Naik, Ganesh R; Kumar, Dinesh K; Yadav, Vivek; Wheeler, Katherine; Arjunan, Sridhar

    2009-01-01

    Spectral compression of surface electromyogram (sEMG) is associated with onset of localized muscle fatigue. The spectral compression has been explained based on motor unit synchronization theory. According to this theory, motor units are pseudo randomly excited during muscle contraction, and with the onset of muscle fatigue the recruitment pattern changes such that motor unit firings become more synchronized. While this is widely accepted, there is little experimental proof of this phenomenon. This paper has used source dependence measures developed in research related to independent component analysis (ICA) to test this theory.

  18. Theoretical prediction of thick wing and pylon-fuselage-fanpod-nacelle aerodynamic characteristics at subcritical speeds. Part 1: Theory and results

    NASA Technical Reports Server (NTRS)

    Tulinius, J. R.

    1974-01-01

    The theoretical development and the comparison of results with data of a thick wing and pylon-fuselage-fanpod-nacelle analysis are presented. The analysis utilizes potential flow theory to compute the surface velocities and pressures, section lift and center of pressure, and the total configuration lift, moment, and vortex drag. The skin friction drag is also estimated in the analysis. The perturbation velocities induced by the wing and pylon, fuselage and fanpod, and nacelle are represented by source and vortex lattices, quadrilateral vortices, and source frustums, respectively. The strengths of these singularities are solved for simultaneously, including all interference effects. The wing and pylon planforms, twists, cambers, and thickness distributions, and the fuselage and fanpod geometries can be arbitrary in shape, provided the surface gradients are smooth. The flow through the nacelle is assumed to be axisymmetric. An axisymmetric center engine hub can also be included. The pylon and nacelle can be attached to the wing, fuselage, or fanpod.

  19. EEG and MEG data analysis in SPM8.

    PubMed

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is a free and open source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools.

  20. EEG and MEG Data Analysis in SPM8

    PubMed Central

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is a free and open source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools. PMID:21437221

  1. Influences of system uncertainties on the numerical transfer path analysis of engine systems

    NASA Astrophysics Data System (ADS)

    Acri, A.; Nijman, E.; Acri, A.; Offner, G.

    2017-10-01

    Practical mechanical systems operate with some degree of uncertainty. In numerical models uncertainties can result from poorly known or variable parameters, from geometrical approximation, from discretization or numerical errors, from uncertain inputs or from rapidly changing forcing that can be best described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In particular in this paper, Wishart random matrix theory is applied on a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool largely implemented during the design of new engines. In this paper the influence of model parameter variability on the results obtained from the multi-body simulation of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and the assessment of the different engine vibration sources. Examples of the effects of different levels of uncertainties are illustrated by means of examples using a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and statistical distribution of results. The derived statistical information can be used to advance the knowledge of the multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.
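    A minimal sketch of the Wishart idea follows; the matrix values and dispersion parameter are assumptions for illustration, not taken from the powertrain model. Random symmetric positive-definite perturbations of a nominal system matrix are drawn so that the ensemble mean equals the nominal matrix:

```python
import numpy as np
from scipy.stats import wishart

# Nominal (mean) system matrix, e.g. a stiffness-like matrix (assumed values).
K0 = np.array([[ 4.0, -1.0,  0.0],
               [-1.0,  4.0, -1.0],
               [ 0.0, -1.0,  4.0]])

# Wishart model: random symmetric positive-definite matrices with mean K0.
# Larger df means smaller dispersion, i.e. lower parameter uncertainty.
df = 50
samples = wishart.rvs(df=df, scale=K0 / df, size=1000, random_state=3)

print(samples.mean(axis=0).round(2))   # ensemble mean stays close to K0
print(bool(np.all(np.linalg.eigvalsh(samples) > 0)))   # every draw stays SPD
```

    Positive definiteness is preserved by construction, which is the key reason Wishart matrices are preferred over naive entrywise perturbation when randomizing mass or stiffness properties.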

  2. Local tsunamis and earthquake source parameters

    USGS Publications Warehouse

    Geist, Eric L.; Dmowska, Renata; Saltzman, Barry

    1999-01-01

    This chapter establishes the relationship among earthquake source parameters and the generation, propagation, and run-up of local tsunamis. In general terms, displacement of the seafloor during the earthquake rupture is modeled using the elastic dislocation theory for which the displacement field is dependent on the slip distribution, fault geometry, and the elastic response and properties of the medium. Specifically, nonlinear long-wave theory governs the propagation and run-up of tsunamis. A parametric study is devised to examine the relative importance of individual earthquake source parameters on local tsunamis, because the physics that describes tsunamis from generation through run-up is complex. Analysis of the source parameters of various tsunamigenic earthquakes have indicated that the details of the earthquake source, namely, nonuniform distribution of slip along the fault plane, have a significant effect on the local tsunami run-up. Numerical methods have been developed to address the realistic bathymetric and shoreline conditions. The accuracy of determining the run-up on shore is directly dependent on the source parameters of the earthquake, which provide the initial conditions used for the hydrodynamic models.

  3. Quantum Theory of Superresolution for Incoherent Optical Imaging

    NASA Astrophysics Data System (ADS)

    Tsang, Mankei

    Rayleigh's criterion for resolving two incoherent point sources has been the most influential measure of optical imaging resolution for over a century. In the context of statistical image processing, violation of the criterion is especially detrimental to the estimation of the separation between the sources, and modern far-field superresolution techniques rely on suppressing the emission of close sources to enhance the localization precision. Using quantum optics, quantum metrology, and statistical analysis, here we show that, even if two close incoherent sources emit simultaneously, measurements with linear optics and photon counting can estimate their separation from the far field almost as precisely as conventional methods do for isolated sources, rendering Rayleigh's criterion irrelevant to the problem. Our results demonstrate that superresolution can be achieved not only for fluorophores but also for stars. Recent progress in generalizing our theory for multiple sources and spectroscopy will also be discussed. This work is supported by the Singapore National Research Foundation under NRF Grant No. NRF-NRFF2011-07 and the Singapore Ministry of Education Academic Research Fund Tier 1 Project R-263-000-C06-112.

  4. Evaluating Cognitive Theory: A Joint Modeling Approach Using Responses and Response Times

    ERIC Educational Resources Information Center

    Klein Entink, Rinke H.; Kuhn, Jorg-Tobias; Hornke, Lutz F.; Fox, Jean-Paul

    2009-01-01

    In current psychological research, the analysis of data from computer-based assessments or experiments is often confined to accuracy scores. Response times, although being an important source of additional information, are either neglected or analyzed separately. In this article, a new model is developed that allows the simultaneous analysis of…

  5. Investigation of the Statistics of Pure Tone Sound Power Injection from Low Frequency, Finite Sized Sources in a Reverberant Room

    NASA Technical Reports Server (NTRS)

    Smith, Wayne Farrior

    1973-01-01

    The effect of finite source size on the power statistics in a reverberant room for pure tone excitation was investigated. Theoretical results indicate that the standard deviation of low frequency, pure tone finite sources is always less than that predicted by point source theory and considerably less when the source dimension approaches one-half an acoustic wavelength or greater. A supporting experimental study was conducted utilizing an eight inch loudspeaker and a 30 inch loudspeaker at eleven source positions. The resulting standard deviation of sound power output of the smaller speaker is in excellent agreement with both the derived finite source theory and existing point source theory, if the theoretical data is adjusted to account for experimental incomplete spatial averaging. However, the standard deviation of sound power output of the larger speaker is measurably lower than point source theory indicates, but is in good agreement with the finite source theory.

  6. Instantaneous and time-averaged dispersion and measurement models for estimation theory applications with elevated point source plumes

    NASA Technical Reports Server (NTRS)

    Diamante, J. M.; Englar, T. S., Jr.; Jazwinski, A. H.

    1977-01-01

    Estimation theory, which originated in guidance and control research, is applied to the analysis of air quality measurements and atmospheric dispersion models to provide reliable area-wide air quality estimates. A method for low dimensional modeling (in terms of the estimation state vector) of the instantaneous and time-average pollutant distributions is discussed. In particular, the fluctuating plume model of Gifford (1959) is extended to provide an expression for the instantaneous concentration due to an elevated point source. Individual models are also developed for all parameters in the instantaneous and the time-average plume equations, including the stochastic properties of the instantaneous fluctuating plume.

  7. Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory

    PubMed Central

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-01-01

    Objective To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principal Findings We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625

  8. Qualitative data analysis for health services research: developing taxonomy, themes, and theory.

    PubMed

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-08-01

    To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data sources and design: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines.

  9. Processable English: The Theory Behind the PENG System

    DTIC Science & Technology

    2009-06-01

    implicit - is often buried amongst masses of irrelevant data. Heralding from unstructured sources such as natural language documents, email, audio ...estimation and prediction, data-mining, social network analysis, and semantic search and visualisation. This report describes the theoretical

  10. Exploring the Epileptic Brain Network Using Time-Variant Effective Connectivity and Graph Theory.

    PubMed

    Storti, Silvia Francesca; Galazzo, Ilaria Boscolo; Khan, Sehresh; Manganotti, Paolo; Menegaz, Gloria

    2017-09-01

    The application of time-varying measures of causality between source time series can be very informative to elucidate the direction of communication among the regions of an epileptic brain. The aim of the study was to identify the dynamic patterns of epileptic networks in focal epilepsy by applying multivariate adaptive directed transfer function (ADTF) analysis and graph theory to high-density electroencephalographic recordings. The cortical network was modeled after source reconstruction and topology modulations were detected during interictal spikes. First a distributed linear inverse solution, constrained to the individual grey matter, was applied to the averaged spikes and the mean source activity over 112 regions, as identified by the Harvard-Oxford Atlas, was calculated. Then, the ADTF, a dynamic measure of causality, was used to quantify the connectivity strength between pairs of regions acting as nodes in the graph, and the measure of node centrality was derived. The proposed analysis was effective in detecting the focal regions as well as in characterizing the dynamics of the spike propagation, providing evidence of the fact that the node centrality is a reliable feature for the identification of the epileptogenic zones. Validation was performed by multimodal analysis as well as from surgical outcomes. In conclusion, the time-variant connectivity analysis applied to the epileptic patients can distinguish the generator of the abnormal activity from the propagation spread and identify the connectivity pattern over time.
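    As a toy illustration of the node-centrality idea: the region names, connectivity values, and the simple outflow measure below are all assumptions; the paper derives centrality from time-varying ADTF values over 112 atlas regions. A region whose total outgoing causal strength dominates is a candidate generator:

```python
import numpy as np

# Hypothetical time-averaged ADTF-style connectivity matrix: A[i, j] is the
# causal-flow strength from region j to region i (all values assumed).
regions = ["focus", "spread1", "spread2", "remote"]
A = np.array([[0.0, 0.1, 0.1, 0.0],   # inflow to focus
              [0.7, 0.0, 0.2, 0.1],   # inflow to spread1
              [0.6, 0.3, 0.0, 0.1],   # inflow to spread2
              [0.2, 0.1, 0.1, 0.0]])  # inflow to remote

# Outflow node centrality: total outgoing causal strength (column sums).
outflow = A.sum(axis=0)
generator = regions[int(np.argmax(outflow))]
print(dict(zip(regions, outflow.round(2))))
print(generator)   # dominant-outflow region: candidate epileptogenic zone
```

    In the time-variant setting the same computation is repeated per time window, so the ranking of regions can be tracked across the spike onset and propagation phases.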

  11. Probing the self-assembled nanostructures of functional polymers with synchrotron grazing incidence X-ray scattering.

    PubMed

    Ree, Moonhor

    2014-05-01

    For advanced functional polymers such as biopolymers, biomimic polymers, brush polymers, star polymers, dendritic polymers, and block copolymers, information about their surface structures, morphologies, and atomic structures is essential for understanding their properties and investigating their potential applications. Over the last 15 years, grazing incidence X-ray scattering (GIXS) has become established as the most powerful, versatile, and nondestructive tool for determining these structural details when performed with the aid of an advanced third-generation synchrotron radiation source with high flux, high energy resolution, energy tunability, and small beam size. One particular merit of this technique is that GIXS data can be obtained facilely for material specimens of any size, type, or shape. However, GIXS data analysis requires an understanding of GIXS theory and of refraction and reflection effects, and for any given material specimen, the best methods for extracting the form factor and the structure factor from the data need to be established. GIXS theory is reviewed here from the perspective of practical GIXS measurements and quantitative data analysis. In addition, schemes are discussed for the detailed analysis of GIXS data for the various self-assembled nanostructures of functional homopolymers, brush, star, and dendritic polymers, and block copolymers. Moreover, enhancements to the GIXS technique are discussed that can significantly improve its structure analysis by using the new synchrotron radiation sources such as third-generation X-ray sources with picosecond pulses and partial coherence and fourth-generation X-ray laser sources with femtosecond pulses and full coherence. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Power System Oscillatory Behaviors: Sources, Characteristics, & Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Follum, James D.; Tuffner, Francis K.; Dosiek, Luke A.

    This document is intended to provide a broad overview of the sources, characteristics, and analyses of natural and forced oscillatory behaviors in power systems. These aspects are necessarily linked. Oscillations appear in measurements with distinguishing characteristics derived from the oscillation’s source. These characteristics determine which analysis methods can be appropriately applied, and the results from these analyses can only be interpreted correctly with an understanding of the oscillation’s origin. To describe oscillations both at their source within a physical power system and within measurements, a perspective from the boundary between power system and signal processing theory has been adopted.

  13. Concurrent analysis: towards generalisable qualitative research.

    PubMed

    Snowden, Austyn; Martin, Colin R

    2011-10-01

    This study develops an original method of qualitative analysis coherent with its interpretivist principles. The objective is to increase the likelihood of achieving generalisability and so improve the chance of the findings being translated into practice. Good qualitative research depends on coherent analysis of different types of data. The limitations of existing methodologies are first discussed to justify the need for a novel approach. To illustrate this approach, primary evidence is presented using the new methodology. The primary evidence consists of a constructivist grounded theory of how mental health nurses with prescribing authority integrate prescribing into practice. This theory is built concurrently from interviews, reflective accounts and case study data from the literature. Concurrent analysis. Ten research articles and 13 semi-structured interviews were sampled purposively and then theoretically and analysed concurrently using constructivist grounded theory. A theory of the process of becoming competent in mental health nurse prescribing was generated through this process. This theory was validated by 32 practising mental health nurse prescribers as an accurate representation of their experience. The methodology generated a coherent and generalisable theory. It is therefore claimed that concurrent analysis engenders consistent and iterative treatment of different sources of qualitative data in a manageable manner. This process supports facilitation of the highest standard of qualitative research. Concurrent analysis removes the artificial delineation of relevant literature from other forms of constructed data. This gives researchers clear direction to treat qualitative data consistently raising the chances of generalisability of the findings. Raising the generalisability of qualitative research will increase its chances of informing clinical practice. © 2010 Blackwell Publishing Ltd.

  14. ANALYSIS OF THE MOMENTS METHOD EXPERIMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kloster, R.L.

    1959-09-01

    Monte Carlo calculations show the effects of a plane water-air boundary on both fast neutron and gamma dose rates. A multigroup diffusion theory calculation for a reactor source shows the effects of a plane water-air boundary on the thermal neutron dose rate. The results of the Monte Carlo and multigroup calculations are compared with experimental values. The predicted boundary effect for fast neutrons of 7.3% agrees within 16% with the measured effect of 6.3%. The gamma detector did not measure a boundary effect because it lacked sensitivity at low energies. However, the effect predicted for gamma rays, 5 to 10%, is as large as that for neutrons. An estimate of the boundary effect for thermal neutrons from a PoBe source is obtained from the results of multigroup diffusion theory calculations for a reactor source. The calculated boundary effect agrees within 13% with the measured values.

  15. Learning Over Time: Using Rapid Prototyping Generative Analysis Experts and Reduction of Scope to Operationalize Design

    DTIC Science & Technology

    2010-05-04

    during the Vietnam Conflict. 67 David A. Kolb, Experiential Learning: Experience as the Source of Learning and Development. (Upper Saddle River, NJ...Essentials for Military Applications. Newport Paper #10. Newport: Newport War College Press. 1996. Kolb, David A. Experiential Learning: Experience... learning over analysis. A broad review of design theory suggests that four techniques - rapid prototyping, generative analysis, use of experts, and

  16. Analysis on Dangerous Source of Large Safety Accident in Storage Tank Area

    NASA Astrophysics Data System (ADS)

    Wang, Tong; Li, Ying; Xie, Tiansheng; Liu, Yu; Zhu, Xueyuan

    2018-01-01

    The difference between a large safety accident and a general accident is that the consequences of a large safety accident are particularly serious. This paper studies which factors in the tank area directly or indirectly lead to the occurrence of large-scale safety accidents. Based on three kinds of hazard source theory and consequence-cause analysis of major safety accidents, the dangerous sources of such accidents in the tank area are analyzed from four aspects: energy sources, direct causes of large-scale safety accidents, missing management, and environmental impact. From the analysis of the three kinds of hazard sources together with the environmental analysis, the main risk factors are derived and an AHP evaluation model is established; after rigorous and scientific calculation, the weights of the related factors within the four kinds of risk factors, and of each type of risk factor, are obtained. The result of the analytic hierarchy process shows that management reasons are the most important, followed by environmental factors, direct causes, and energy sources. It should be noted that although the direct causes are of relatively low overall importance, the direct causes "failure of emergency measures" and "failure of prevention and control facilities" carry greater weight.
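    The AHP step can be sketched generically. The pairwise-comparison judgments below are hypothetical (the paper does not publish its matrices); the weights are the normalized principal eigenvector of the comparison matrix, and the consistency ratio checks that the judgments are acceptably coherent:

```python
import numpy as np

# Hypothetical Saaty-scale pairwise comparisons of the four factor groups:
# management, environment, direct cause, energy source (assumed judgments,
# reciprocal by construction).
P = np.array([[1.0, 3.0, 5.0, 5.0],
              [1/3, 1.0, 3.0, 3.0],
              [1/5, 1/3, 1.0, 2.0],
              [1/5, 1/3, 1/2, 1.0]])

# AHP weights: normalized principal (largest-eigenvalue) eigenvector of P.
vals, vecs = np.linalg.eig(P)
k = int(np.argmax(np.real(vals)))
w = np.real(vecs[:, k])
w = w / w.sum()

# Consistency ratio; CR < 0.1 is the usual acceptance threshold
# (random index RI = 0.90 for a 4 x 4 matrix).
lam = float(np.real(vals[k]))
cr = ((lam - 4) / (4 - 1)) / 0.90
print(w.round(3))        # weights, ordered management > environment > ...
print(round(cr, 3))      # consistency ratio
```

    With these assumed judgments the management group dominates, mirroring the qualitative ranking reported in the abstract; the actual numbers depend entirely on the expert comparisons fed in.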

  17. Ion diffusion may introduce spurious current sources in current-source density (CSD) analysis.

    PubMed

    Halnes, Geir; Mäki-Marttunen, Tuomo; Pettersen, Klas H; Andreassen, Ole A; Einevoll, Gaute T

    2017-07-01

    Current-source density (CSD) analysis is a well-established method for analyzing recorded local field potentials (LFPs), that is, the low-frequency part of extracellular potentials. Standard CSD theory is based on the assumption that all extracellular currents are purely ohmic, and thus neglects the possible impact from ionic diffusion on recorded potentials. However, it has previously been shown that in physiological conditions with large ion-concentration gradients, diffusive currents can evoke slow shifts in extracellular potentials. Using computer simulations, we here show that diffusion-evoked potential shifts can introduce errors in standard CSD analysis, and can lead to prediction of spurious current sources. Further, we here show that the diffusion-evoked prediction errors can be removed by using an improved CSD estimator which accounts for concentration-dependent effects. NEW & NOTEWORTHY Standard CSD analysis does not account for ionic diffusion. Using biophysically realistic computer simulations, we show that unaccounted-for diffusive currents can lead to the prediction of spurious current sources. This finding may be of strong interest for in vivo electrophysiologists doing extracellular recordings in general, and CSD analysis in particular. Copyright © 2017 the American Physiological Society.
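The standard estimator the authors start from is the double spatial derivative of the LFP along a laminar electrode, assuming a purely ohmic medium. A minimal sketch on synthetic data follows; the conductivity, contact spacing, and Gaussian LFP profile are assumptions for illustration, not values from the paper.

```python
import numpy as np

sigma = 0.3   # assumed extracellular conductivity (S/m), taken constant
h = 100e-6    # assumed inter-contact spacing (m)

# Synthetic LFP depth profile peaking at the 9th of 16 contacts.
z = np.arange(16) * h
lfp = 1e-3 * np.exp(-((z - z[8]) ** 2) / (2 * (2 * h) ** 2))  # volts

# Standard (ohmic) CSD at the interior contacts:
#   CSD_i = -sigma * (phi_{i-1} - 2 phi_i + phi_{i+1}) / h^2
csd = -sigma * (lfp[:-2] - 2 * lfp[1:-1] + lfp[2:]) / h**2
```

By this sign convention a local maximum of the potential maps to a positive CSD (a current source); the paper's point is that diffusive currents violate the ohmic assumption baked into this formula and can make such inferred sources spurious.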

  18. Searching for Order Within Chaos: Complexity Theory's Implications to Intelligence Support During Joint Operational Planning

    DTIC Science & Technology

    2017-06-09

    structures constantly arise in firefights and skirmishes on the battlefield. Source: Andrew Ilachinski, Artificial War: Multiagent-Based Simulation of...Alternative Methods of Analysis and Innovative Organizational Structures." Conference, Rome, Italy March 31-April 2. ...Intelligence Analysis, Joint Operational Planning, Cellular Automata, Agent-Based Modeling

  19. How Many Separable Sources? Model Selection In Independent Components Analysis

    PubMed Central

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian. PMID:25811988
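The separability argument above rests on a basic premise: the higher-order statistics standard ICA exploits vanish for Gaussian sources, so a Gaussian subspace carries no signal for ICA to use. A small illustration of that premise (not the mixed ICA/PCA algorithm itself), using excess kurtosis:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
gaussian = rng.standard_normal(n)   # a Gaussian source
laplacian = rng.laplace(size=n)     # a non-Gaussian (heavy-tailed) source

def excess_kurtosis(x):
    """Fourth standardized moment minus 3; zero for a Gaussian."""
    x = (x - x.mean()) / x.std()
    return (x ** 4).mean() - 3.0

kg = excess_kurtosis(gaussian)    # ~0: invisible to higher-order ICA
kl = excess_kurtosis(laplacian)   # ~3: separable by higher-order ICA
```

This is why the model described in the abstract hands the Gaussian subspace to PCA, which only needs second-order structure.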

  20. Student nurses' experiences of caring for infectious patients in source isolation. A hermeneutic phenomenological study.

    PubMed

    Cassidy, Irene

    2006-10-01

    To illuminate issues central to general student nurses' experiences of caring for isolated patients within the hospital environment, which may assist facilitators of learning to prepare students for caring roles. Because of the development of hospital-resistant micro-organisms, caring for patients in source isolation is a frequent occurrence for supernumerary students on the general nursing programme. Despite this, students' perceptions of caring for this client group remain under-researched. Through methods grounded in hermeneutic phenomenology, eight students in the second year of the three-year undergraduate programme in general nursing were interviewed using an unstructured, open-ended, face-to-face approach. Data were analysed thematically. Four themes emerged: "The organization: caring in context," "Barriers and breaking the barriers," "Theory and practice," and "Only a student." The imposed physical, psychological, social and emotional barriers of isolation dramatically alter the caring experience. Balancing the care of isolated patients to meet their individual needs while preventing the spread of infection has significance for students. Applying infection-control theory to the care of patients in source isolation is vital for students' personal and professional development. Perceptions of supernumerary status influence students' experiences of caring for these patients. Designating equipment for the sole use of isolated patients assists students in maintaining infection-control standards. Balancing the art and science of caring for patients in source isolation is important to reduce barriers to the student-patient relationship and to promote delivery of holistic care. Staff nurses should consider using available opportunities to impart recommended isolation practices to students, thereby linking the theory of infection control to patient care. Providing structured, continuing education for all grades of staff would acknowledge the interdependence of all healthcare workers in controlling hospital-acquired infection.

  1. Quantifying learning in biotracer studies.

    PubMed

    Brown, Christopher J; Brett, Michael T; Adame, Maria Fernanda; Stewart-Koster, Ben; Bunn, Stuart E

    2018-04-12

    Mixing models have become requisite tools for analyzing biotracer data, most commonly stable isotope ratios, to infer dietary contributions of multiple sources to a consumer. However, Bayesian mixing models will always return a result that defaults to their priors if the data poorly resolve the source contributions, and thus, their interpretation requires caution. We describe an application of information theory to quantify how much has been learned about a consumer's diet from new biotracer data. We apply the approach to two example data sets. We find that variation in the isotope ratios of sources limits the precision of estimates for the consumer's diet, even with a large number of consumer samples. Thus, the approach which we describe is a type of power analysis that uses a priori simulations to find an optimal sample size. Biotracer data are fundamentally limited in their ability to discriminate consumer diets. We suggest that other types of data, such as gut content analysis, must be used as prior information in model fitting, to improve model learning about the consumer's diet. Information theory may also be used to identify optimal sampling protocols in situations where sampling of consumers is limited due to expense or ethical concerns.
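The quantity proposed above, how far new biotracer data move the model away from its prior, can be sketched as a Kullback-Leibler divergence between the prior and posterior diet proportions. The two distributions below are invented for illustration.

```python
import math

# Uninformative prior over three food sources, and a hypothetical
# posterior after fitting a mixing model to biotracer data.
prior = [1/3, 1/3, 1/3]
posterior = [0.6, 0.3, 0.1]

# KL(posterior || prior) in bits: zero would mean the data taught us
# nothing beyond the prior.
kl_bits = sum(p * math.log2(p / q)
              for p, q in zip(posterior, prior) if p > 0)
```

Run in a priori simulation over candidate sample sizes, a statistic like this acts as the power analysis the abstract describes: increase samples until the expected information gain plateaus.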

  2. Antenna theory: Analysis and design

    NASA Astrophysics Data System (ADS)

    Balanis, C. A.

    The book's main objective is to introduce the fundamental principles of antenna theory and to apply them to the analysis, design, and measurements of antennas. In a description of antennas, the radiation mechanism is discussed along with the current distribution on a thin wire. Fundamental parameters of antennas are examined, taking into account the radiation pattern, radiation power density, radiation intensity, directivity, numerical techniques, gain, antenna efficiency, half-power beamwidth, beam efficiency, bandwidth, polarization, input impedance, and antenna temperature. Attention is given to radiation integrals and auxiliary potential functions, linear wire antennas, loop antennas, linear and circular arrays, self- and mutual impedances of linear elements and arrays, broadband dipoles and matching techniques, traveling wave and broadband antennas, frequency independent antennas and antenna miniaturization, the geometrical theory of diffraction, horns, reflectors and lens antennas, antenna synthesis and continuous sources, and antenna measurements.

  3. Sierra Structural Dynamics Theory Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reese, Garth M.

    Sierra/SD provides a massively parallel implementation of structural dynamics finite element analysis, required for high-fidelity, validated models used in modal, vibration, static and shock analysis of structural systems. This manual describes the theory behind many of the constructs in Sierra/SD. For a more detailed description of how to use Sierra/SD, we refer the reader to the Sierra/SD User's Notes. Many of the constructs in Sierra/SD are pulled directly from published material. Where possible, these materials are referenced herein. However, certain functions in Sierra/SD are specific to our implementation. We try to be far more complete in those areas. The theory manual was developed from several sources, including general notes, a programmer notes manual, the user's notes and, of course, the material in the open literature.

  4. Ray-optical theory of broadband partially coherent emission

    NASA Astrophysics Data System (ADS)

    Epstein, Ariel; Tessler, Nir; Einziger, Pinchas D.

    2013-04-01

    We present a rigorous formulation of the effects of spectral broadening on emission of partially coherent source ensembles embedded in multilayered formations with arbitrarily shaped interfaces, provided geometrical optics is valid. The resulting ray-optical theory, applicable to a variety of optical systems from terahertz lenses to photovoltaic cells, quantifies the fundamental interplay between bandwidth and layer dimensions, and sheds light on common practices in optical analysis of statistical fields, e.g., disregarding multiple reflections or neglecting interference cross terms.

  5. The theory, practice, and future of process improvement in general thoracic surgery.

    PubMed

    Freeman, Richard K

    2014-01-01

    Process improvement, in its broadest sense, is the analysis of a given set of actions with the aim of elevating quality and reducing costs. The tenets of process improvement have been applied to medicine in increasing frequency for at least the last quarter century including thoracic surgery. This review outlines the theory underlying process improvement, the currently available data sources for process improvement and possible future directions of research. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Accretion of magnetized matter into a black hole.

    NASA Astrophysics Data System (ADS)

    Bisnovatyj-Kogan, G. S.

    1999-12-01

    Accretion is the main source of energy in binary X-ray sources inside the Galaxy, and most probably in active galactic nuclei, where numerous observational data for the existence of supermassive black holes have been obtained. Standard accretion disk theory is formulated on the basis of local heat balance: the whole energy produced by turbulent viscous heating is supposed to be emitted to the sides of the disk. Sources of turbulence in the accretion disk are discussed, including nonlinear hydrodynamic turbulence, convection and magnetic field. In the standard theory there are two branches of solution, optically thick and optically thin, which are individually self-consistent. The choice between these solutions should be made on the basis of a stability analysis. Advection in the accretion disks is described by differential equations, which makes the theory nonlocal. The low-luminosity optically thin accretion disk model with advection may, under some conditions, become advection dominated, carrying almost all the energy inside the black hole. A proper account of the magnetic field in the process of accretion limits the energy advected into a black hole and does not allow the radiative efficiency of accretion to become lower than about 1/4 of the standard accretion disk model efficiency.

  7. An analysis of innovation in materials and energy

    NASA Astrophysics Data System (ADS)

    Connelly, Michael

    This dissertation presents an analysis of innovation in engineering materials and energy sources. More than fifty engineering materials and fourteen energy sources were selected for an evaluation of the relationship between the yearly production activity and yearly patent counts, which may be considered as a measure of innovation, for each. Through the employment of correlation theory, best-fit and origin shift analyses, it has been determined here that engineering materials and energy sources display similar life cycle and innovative activity behaviors. Correlation theory revealed a relationship between the yearly production and yearly patent counts indicating the extent that production and innovation affect each other. Best-fit analysis determined that four-stage life cycles exist for both engineering materials (metals and non-metals) and energy sources. Correlation and best-fit indicators of an estimated Stage III are confirmed by the presence of an origin shift of the patent data when compared to the production data which indicates that patents, or innovation, are driving, or being driven by, production. This driving force could represent the constructive or destructive side of the innovative process, with such sides being delineated by a possible universal constant above which there is destructive innovative behavior and below which exists constructive innovation. The driving force may also illustrate the manner in which an engineering material or energy source transitions into an innovatively less active state, enter Stage IV and possibly become a commodity. A possible Stage V, indicating "Final Death", is introduced in which production is on a steep decline with no signs of recovery. Additionally, innovatively active energy sources are often found to utilize or be supported by innovatively active engineering materials. 
A model is presented for the evaluation of innovation and production in both engineering materials and energy sources; it may be used to predict the innovative behavior of these resources so that they can be more effectively allocated and utilized.
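The origin-shift analysis described above amounts to finding the lag at which the yearly patent series best aligns with the yearly production series. A minimal sketch on synthetic series (not the dissertation's data):

```python
def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Synthetic yearly series in which patents lead production by two years.
production = [10, 12, 15, 20, 28, 40, 55, 70, 82, 90]
patents = production[2:] + [95, 97]

# Correlate patents[t] against production[t + k] for candidate lags k;
# the maximizing k is the "origin shift".
corrs = [pearson(patents[:len(patents) - k], production[k:]) for k in range(4)]
best_lag = max(range(4), key=lambda k: corrs[k])
```

Here the recovered shift of two years indicates the direction in which one series drives the other, the signature the dissertation reads as innovation driving (or being driven by) production.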

  8. The Impact of the Photocopier on Peer Review and Nursing Theory.

    PubMed

    Nicoll, Leslie H

    Two influential publications in nursing, Nursing Research and Perspectives on Nursing Theory, are used to illustrate how a specific technological change, the invention and marketing of the photocopier, influenced knowledge dissemination and information utilization in nursing, perhaps in ways not immediately apparent. Content analysis and historical comparison, using editorials from Nursing Research, historical reports on technology development, and personal reflections on the genesis of Perspectives on Nursing Theory, are used to build an argument for the role of technology in peer review, information utilization, and knowledge development in nursing. Multiple forces influence nursing science. Scholars should be alert to data inputs from many sources and respond accordingly.

  9. ANALYSIS OF DEUTERON STRIPPING EXPERIMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amado, R.D.

    1959-05-01

    Deuteron stripping experiments analyzed according to the theory of Butler are a nearly unique source of information on the orbital angular momentum and single-particle widths of nuclear bound states. A number of problems in the Butler theory remain. Chew and Low show that in reactions in which there is a contribution from the exchange of a single particle, there can appear isolated poles in the normalized Born approximation to the cross section, and that the residue at these poles can be related to quantities of physical interest. Stripping is such a reaction, and the Butler theory is the renormalized Born approximation. (A.C.)

  10. Pangenesis as a source of new genetic information. The history of a now disproven theory.

    PubMed

    Bergman, Gerald

    2006-01-01

    Evolution is based on natural selection of existing biological phenotypic traits. Natural selection can only eliminate traits. It cannot create new ones, requiring a theory to explain the origin of new genetic information. The theory of pangenesis was a major attempt to explain the source of new genetic information required to produce phenotypic variety. This theory, advocated by Darwin as the main source of genetic variety, has now been empirically disproved. It is currently a theory mainly of interest to science historians.

  11. Children's Ability to Distinguish between Memories from Multiple Sources: Implications for the Quality and Accuracy of Eyewitness Statements.

    ERIC Educational Resources Information Center

    Roberts, Kim P.

    2002-01-01

    Outlines five perspectives addressing alternate aspects of the development of children's source monitoring: source-monitoring theory, fuzzy-trace theory, schema theory, person-based perspective, and mental-state reasoning model. Discusses research areas with relation to forensic developmental psychology: agent identity, prospective processing,…

  12. The mass-zero spin-two field and gravitational theory.

    NASA Technical Reports Server (NTRS)

    Coulter, C. A.

    1972-01-01

    Demonstration that the conventional theory of the mass-zero spin-two field with sources introduces extraneous nonspin-two field components in source regions and fails to be covariant under the full or restricted conformal group. A modified theory is given, expressed in terms of the physical components of mass-zero spin-two field rather than in terms of 'potentials,' which has no extraneous components inside or outside sources, and which is covariant under the full conformal group. For a proper choice of source term, this modified theory has the correct Newtonian limit and automatically implies that a symmetric second-rank source tensor has zero divergence. It is shown that possibly a generally covariant form of the spin-two theory derived here can be constructed to agree with general relativity in all currently accessible experimental situations.

  13. An Ontology of Power: Perception and Reality in Conflict

    DTIC Science & Technology

    2016-12-01

    synthetic model was developed as the constant comparative analysis was resumed through the application of selected theory toward the original source...The synthetic model represents a series of maxims for the analysis of a complex social system, developed through a study of contemporary national...and categories. A model of strategic agency is proposed as an alternative framework for developing security strategy. The strategic agency model draws

  14. An analysis of the radiation from apertures in curved surfaces by the geometrical theory of diffraction. [ray technique for electromagnetic fields

    NASA Technical Reports Server (NTRS)

    Pathak, P. H.; Kouyoumjian, R. G.

    1974-01-01

    In this paper the geometrical theory of diffraction is extended to treat the radiation from apertures or slots in convex perfectly conducting surfaces. It is assumed that the tangential electric field in the aperture is known so that an equivalent infinitesimal source can be defined at each point in the aperture. Surface rays emanate from this source, which is a caustic of the ray system. A launching coefficient is introduced to describe the excitation of the surface ray modes. If the field radiated from the surface is desired, the ordinary diffraction coefficients are used to determine the field of the rays shed tangentially from the surface rays. The field of the surface ray modes is not the field on the surface; hence if the mutual coupling between slots is of interest, a second coefficient related to the launching coefficient must be employed. In the region adjacent to the shadow boundary, the component of the field directly radiated from the source is represented by Fock-type functions. In the illuminated region the incident radiation from the source (this does not include the diffracted field components) is treated by geometrical optics. This extension of the geometrical theory of diffraction is applied to calculate the radiation from slots on elliptic cylinders, spheres, and spheroids.

  15. Job satisfaction of nurse practitioners: an analysis using Herzberg's theory.

    PubMed

    Koelbel, P W; Fuller, S G; Misener, T R

    1991-04-01

    The current sociopolitical and economic forces affecting health care may lead to job dissatisfaction among nurse practitioners, according to results of a South Carolina study. A mailed survey that consisted of the Index of Job Satisfaction and the Minnesota Satisfaction Questionnaire--Short Form was used to test Herzberg's dual-factor theory of job satisfaction. A response rate of 90 percent was attained, with a final sample of 132 nurse practitioners and midwives. Consistent with the predictions of Herzberg's model, intrinsic factors served as sources of job satisfaction, while extrinsic factors were the primary sources of job dissatisfaction. Nurse practitioners in the sample reported a moderate amount of satisfaction with their "overall jobs." Suggestions are provided for ways both nurse practitioners and health administrators can enhance job satisfaction.

  16. Occipital MEG Activity in the Early Time Range (<300 ms) Predicts Graded Changes in Perceptual Consciousness.

    PubMed

    Andersen, Lau M; Pedersen, Michael N; Sandberg, Kristian; Overgaard, Morten

    2016-06-01

    Two electrophysiological components have been extensively investigated as candidate neural correlates of perceptual consciousness: an early, occipitally realized component occurring 130-320 ms after stimulus onset and a late, frontally realized component occurring 320-510 ms after stimulus onset. Recent studies have suggested that the late component may not be uniquely related to perceptual consciousness, but also to sensory expectations, task associations, and selective attention. We conducted a magnetoencephalographic study; using multivariate analysis, we compared classification accuracies when decoding perceptual consciousness from the 2 components using sources from occipital and frontal lobes. We found that occipital sources during the early time range were significantly more accurate in decoding perceptual consciousness than frontal sources during both the early and late time ranges. These results are the first of their kind in which the predictive values of the 2 components are quantitatively compared, and they provide further evidence for the primary importance of occipital sources in realizing perceptual consciousness. The results have important consequences for current theories of perceptual consciousness, especially theories emphasizing the role of frontal sources. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  17. The importance of quadrupole sources in prediction of transonic tip speed propeller noise

    NASA Technical Reports Server (NTRS)

    Hanson, D. B.; Fink, M. R.

    1978-01-01

    A theoretical analysis is presented for the harmonic noise of high-speed, open rotors. Far-field acoustic radiation equations based on the Ffowcs Williams/Hawkings theory are derived for a static rotor with thin blades and zero lift. Near the plane of rotation, the dominant sources are the volume displacement and the ρu² quadrupole, where u is the disturbance velocity component in the direction of blade motion. These sources are compared in both the time domain and the frequency domain using two-dimensional airfoil theories valid in the subsonic, transonic, and supersonic speed ranges. For nonlifting parabolic-arc blades, the two sources are equally important at speeds between the section critical Mach number and a Mach number of one. However, for moderately subsonic or fully supersonic flow over thin blade sections, the quadrupole term is negligible. It is concluded for thin blades that significant quadrupole noise radiation is strictly a transonic phenomenon and that it can be suppressed with blade sweep. Noise calculations are presented for two rotors, one simulating a helicopter main rotor and the other a model propeller. For the latter, agreement with test data was substantially improved by including the quadrupole source term.

  18. Open source tools for the information theoretic analysis of neural data.

    PubMed

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
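The basic quantity such toolboxes compute is the mutual information between stimulus and neural response. A minimal plug-in estimate in bits is sketched below on made-up stimulus/response pairs; the reviewed toolboxes add bias corrections that this sketch omits.

```python
import math
from collections import Counter

# Illustrative paired observations: (stimulus label, discrete response).
pairs = [("left", 0), ("left", 0), ("left", 1), ("left", 0),
         ("right", 2), ("right", 2), ("right", 1), ("right", 2)]

n = len(pairs)
p_sr = Counter(pairs)                    # joint counts
p_s = Counter(s for s, _ in pairs)       # stimulus marginal counts
p_r = Counter(r for _, r in pairs)       # response marginal counts

# Plug-in estimator: I(S;R) = sum p(s,r) log2[ p(s,r) / (p(s) p(r)) ]
mi_bits = sum((c / n) * math.log2((c / n) / ((p_s[s] / n) * (p_r[r] / n)))
              for (s, r), c in p_sr.items())
```

For these eight samples the estimate comes out to 0.75 bits; with real spike-count data the naive plug-in value is biased upward, which is precisely why the toolboxes reviewed here implement bias-correction procedures.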

  19. The deuteron-radius puzzle is alive: A new analysis of nuclear structure uncertainties

    NASA Astrophysics Data System (ADS)

    Hernandez, O. J.; Ekström, A.; Nevo Dinur, N.; Ji, C.; Bacca, S.; Barnea, N.

    2018-03-01

    To shed light on the deuteron radius puzzle we analyze the theoretical uncertainties of the nuclear structure corrections to the Lamb shift in muonic deuterium. We find that the discrepancy between the calculated two-photon exchange correction and the corresponding experimentally inferred value by Pohl et al. [1] remains. The present result is consistent with our previous estimate, although the discrepancy is reduced from 2.6 σ to about 2 σ. The error analysis includes statistical as well as systematic uncertainties stemming from the use of nucleon-nucleon interactions derived from chiral effective field theory at various orders. We therefore conclude that nuclear theory uncertainty is most likely not the source of the discrepancy.
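The "number of sigmas" quoted above is simply the difference between the calculated and experimentally inferred corrections divided by their combined uncertainty. A toy numerical illustration with invented values (not the paper's numbers):

```python
# Hypothetical two-photon-exchange correction: theory vs. inferred value.
calc, calc_err = 1.717, 0.015   # invented calculated value and uncertainty
exp, exp_err = 1.664, 0.020     # invented experimentally inferred value

# Combine independent uncertainties in quadrature, then express the
# difference in units of that combined uncertainty.
combined_err = (calc_err ** 2 + exp_err ** 2) ** 0.5
n_sigma = abs(calc - exp) / combined_err
```

Enlarging the theory uncertainty (here `calc_err`) is exactly how a discrepancy shrinks from 2.6 σ toward 2 σ without either central value moving.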

  20. Cartan symmetries and global dynamical systems analysis in a higher-order modified teleparallel theory

    NASA Astrophysics Data System (ADS)

    Karpathopoulos, L.; Basilakos, S.; Leon, G.; Paliathanasis, A.; Tsamparlis, M.

    2018-07-01

    We present analytical cosmological solutions in a higher-order modified teleparallel theory. In particular, we determine forms of the unknown potential which drives the scalar field such that the field equations form a Liouville integrable system. For the determination of the conservation laws we apply the Cartan symmetries. Furthermore, inspired by our solutions, a toy model is studied and it is shown that it can describe the Supernova data, while at the same time introducing dark matter components in the Hubble function. When the extra matter source is a stiff fluid, we show how analytical solutions for Bianchi I universes can be constructed from our analysis. Finally, we perform a global dynamical analysis of the field equations by using variables different from those of the Hubble normalization.

  1. Preservice Biology Teachers' Conceptions About the Tentative Nature of Theories and Models in Biology

    NASA Astrophysics Data System (ADS)

    Reinisch, Bianca; Krüger, Dirk

    2018-02-01

    In research on the nature of science, there is a need to investigate the role and status of different scientific knowledge forms. Theories and models are two of the most important knowledge forms within biology and are the focus of this study. During interviews, preservice biology teachers (N = 10) were asked about their understanding of theories and models. They were requested to give reasons why they see theories and models as either tentative or certain constructs. Their conceptions were then compared to philosophers' positions (e.g., Popper, Giere). A category system was developed from the qualitative content analysis of the interviews. These categories include 16 conceptions for theories (n tentative = 11; n certain = 5) and 18 conceptions for models (n tentative = 10; n certain = 8). The analysis of the interviews showed that the preservice teachers gave reasons for the tentativeness or certainty of theories and models either due to their understanding of the terms or due to their understanding of the generation or evaluation of theories and models. Therefore, a variety of different terminology, from different sources, should be used in learning-teaching situations. Additionally, an understanding of which processes lead to the generation, evaluation, and refinement or rejection of theories and models should be discussed with preservice teachers. Within philosophy of science, there has been a shift from theories to models. This should be transferred to educational contexts by firstly highlighting the role of models and also their connections to theories.

  2. Finding joy in social work. II: Intrapersonal sources.

    PubMed

    Pooler, David Kenneth; Wolfer, Terry; Freeman, Miriam

    2014-07-01

    Despite the social work profession's strengths orientation, research on its workforce tends to focus on problems (for example, depression, problem drinking, compassion fatigue, burnout). In contrast, this study explored ways in which social workers find joy in their work. The authors used an appreciative inquiry approach, semistructured interviews (N = 26), and a collaborative grounded theory method of analysis. Participants identified interpersonal (making connections and making a difference) and intrapersonal (making meaning and making a life) sources of joy and reflected significant personal initiative in the process of finding joy. The authors present findings regarding these intrapersonal sources of joy.

  3. Experiments on the applicability of MAE techniques for predicting sound diffraction by irregular terrains. [Matched Asymptotic Expansion

    NASA Technical Reports Server (NTRS)

    Berthelot, Yves H.; Pierce, Allan D.; Kearns, James A.

    1987-01-01

    The sound field diffracted by a single smooth hill of finite impedance is studied both analytically, within the context of the theory of Matched Asymptotic Expansions (MAE), and experimentally, under laboratory scale modeling conditions. Special attention is given to the sound field on the diffracting surface and throughout the transition region between the illuminated and the shadow zones. The MAE theory yields integral equations that are amenable to numerical computations. Experimental results are obtained with a spark source producing a pulse of 42 microsec duration and about 130 Pa at 1 m. The insertion loss of the hill is inferred from measurements of the acoustic signals at two locations in the field, with subsequent Fourier analysis on an IBM PC/AT. In general, experimental results support the predictions of the MAE theory, and provide a basis for the analysis of more complicated geometries.

  4. The source-filter theory of whistle-like calls in marmosets: Acoustic analysis and simulation of helium-modulated voices.

    PubMed

    Koda, Hiroki; Tokuda, Isao T; Wakita, Masumi; Ito, Tsuyoshi; Nishimura, Takeshi

    2015-06-01

    Whistle-like high-pitched "phee" calls are often used as long-distance vocal advertisements by small-bodied marmosets and tamarins in the dense forests of South America. While the source-filter theory proposes that vibration of the vocal fold is modified independently from the resonance of the supralaryngeal vocal tract (SVT) in human speech, a source-filter coupling that constrains the vibration frequency to SVT resonance effectively produces loud tonal sounds in some musical instruments. Here, a combined approach of acoustic analyses and simulation with helium-modulated voices was used to show that phee calls are produced principally with the same mechanism as in human speech. The animal keeps the fundamental frequency (f0) close to the first formant (F1) of the SVT, to amplify f0. Although f0 and F1 are primarily independent, the degree of their tuning can be strengthened further by a flexible source-filter interaction, the variable strength of which depends upon the cross-sectional area of the laryngeal cavity. The results highlight the evolutionary antiquity and universality of the source-filter model in primates, and provide a basis for exploring the diversification of vocal physiology, including source-filter interaction and its anatomical basis, in non-human primates.
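
    The helium manipulation in this abstract works because, in the source-filter model, formant frequencies scale with the speed of sound in the vocal tract gas, while the vocal-fold source frequency f0 does not. A minimal numeric sketch of that scaling for an idealized uniform closed-open tube; the tube length and sound speeds are rough assumed values for illustration, not the paper's measurements:

```python
# Formants of a uniform closed-open tube: F_n = (2n - 1) * c / (4 * L).
# They scale with the speed of sound c; f0 (vocal-fold vibration) does not,
# which is why helium shifts formants but leaves f0 roughly in place.

def formants(c, length, n=3):
    """First n resonances (Hz) of a closed-open tube of given length (m)."""
    return [(2 * k - 1) * c / (4 * length) for k in range(1, n + 1)]

L_tract = 0.05      # ~5 cm vocal tract, an assumed marmoset-scale value
c_air = 350.0       # m/s in warm humid air (approximate)
c_heliox = 620.0    # m/s in a helium-rich mix (approximate)

air = formants(c_air, L_tract)
helium = formants(c_heliox, L_tract)
shift = helium[0] / air[0]          # all formants scale by c_heliox / c_air
print(air[0], helium[0], round(shift, 2))
```

Under these assumed numbers F1 moves from 1750 Hz in air to 3100 Hz in the helium mix; any f0 tuned to F1 in air would therefore fall well below F1 in helium unless the source re-tunes.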

  5. Application and Exploration of Big Data Mining in Clinical Medicine

    PubMed Central

    Zhang, Yue; Guo, Shu-Li; Han, Li-Na; Li, Tie-Ling

    2016-01-01

    Objective: To review theories and technologies of big data mining and their application in clinical medicine. Data Sources: Literature published in English or Chinese regarding theories and technologies of big data mining and the concrete applications of data mining technology in clinical medicine was obtained from PubMed and the Chinese Hospital Knowledge Database from 1975 to 2015. Study Selection: Original articles regarding big data mining theory/technology and big data mining's application in the medical field were selected. Results: This review characterized the basic theories and technologies of big data mining, including fuzzy theory, rough set theory, cloud theory, Dempster–Shafer theory, artificial neural networks, genetic algorithms, inductive learning theory, Bayesian networks, decision trees, pattern recognition, high-performance computing, and statistical analysis. The application of big data mining in clinical medicine was analyzed in the fields of disease risk assessment, clinical decision support, prediction of disease development, guidance of rational use of drugs, medical management, and evidence-based medicine. Conclusion: Big data mining has the potential to play an important role in clinical medicine. PMID:26960378

  6. Defining Adapted Physical Activity: International Perspectives

    ERIC Educational Resources Information Center

    Hutzler, Yeshayahu; Sherrill, Claudine

    2007-01-01

    The purpose of this study was to describe international perspectives concerning terms, definitions, and meanings of adapted physical activity (APA) as (a) activities or service delivery, (b) a profession, and (c) an academic field of study. Gergen's social constructionism, our theory, guided analysis of multiple sources of data via qualitative…

  7. Discovering Authorities and Hubs in Different Topological Web Graph Structures.

    ERIC Educational Resources Information Center

    Meghabghab, George

    2002-01-01

    Discussion of citation analysis on the Web considers Web hyperlinks as a source to analyze citations. Topics include basic graph theory applied to Web pages, including matrices, linear algebra, and Web topology; and hubs and authorities, including a search technique called HITS (Hyperlink Induced Topic Search). (Author/LRW)
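
    The HITS technique named in this record can be sketched directly from its definition: every page carries a hub score and an authority score, each updated from the other over the hyperlink graph until they stabilize. A minimal sketch on an invented toy graph (pages and links are made up for illustration):

```python
# Minimal HITS (Hyperlink-Induced Topic Search) iteration on a toy graph.
from math import sqrt

links = {               # page -> pages it links to (invented example)
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
pages = sorted(links)
hub = {p: 1.0 for p in pages}
auth = {p: 1.0 for p in pages}

for _ in range(50):
    # authority score: sum of hub scores of the pages linking to it
    auth = {p: sum(hub[q] for q in pages if p in links[q]) for p in pages}
    # hub score: sum of authority scores of the pages it links to
    hub = {p: sum(auth[q] for q in links[p]) for p in pages}
    # normalize so the scores stay bounded
    na = sqrt(sum(v * v for v in auth.values()))
    nh = sqrt(sum(v * v for v in hub.values()))
    auth = {p: v / na for p, v in auth.items()}
    hub = {p: v / nh for p, v in hub.items()}

best_authority = max(auth, key=auth.get)
print(best_authority)   # "c": it receives links from a, b, and d
```

The same mutual-reinforcement idea (good hubs point at good authorities, good authorities are pointed at by good hubs) is what the matrix formulation in the article expresses with eigenvectors of the link matrix.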

  8. Generating Ideas in Jazz Improvisation: Where Theory Meets Practice

    ERIC Educational Resources Information Center

    Hargreaves, Wendy

    2012-01-01

    Idea generation is an integral component of jazz improvising. This article merges theoretical origins and practical experiences through the examination of two seminal works from Pressing and Sudnow. A comparative analysis yields three common sources with distinct characteristics. The greater body of jazz literature supports this potential link…

  9. Curriculum in Early Childhood Education: Re-Examined, Rediscovered, Renewed

    ERIC Educational Resources Information Center

    File, Nancy, Ed.; Mueller, Jennifer J., Ed.; Wisneski, Debora Basler, Ed.

    2011-01-01

    This book provides a critical examination of the sources, aims, and features of early childhood curricula. Providing a theoretical and philosophical foundation for examining teaching and learning, this book will provoke discussion and analysis among all readers. How has theory been used to understand, develop, and critique curriculum? Whose…

  10. The Body of Persuasion: A Theory of the Enthymeme.

    ERIC Educational Resources Information Center

    Walker, Jeffrey

    1994-01-01

    Examines the primary and not exclusively Aristotelian sources from which a more adequate concept of the enthymeme can be derived. Considers the relevance of that concept to the analysis of modern discourse. Analyzes works by Martin Luther King, Jr., and Roland Barthes as examples of enthymeming. (HB)

  11. Deciphering Chinese Strategic Deception: The Middle Kingdom’s First Aircraft Carrier

    DTIC Science & Technology

    2013-06-01

    Table of contents excerpts: C. Theory of Analysis of Competitive Hypotheses; Step 1: Identifying Competing Hypotheses; Step 2: Listing of Consistent and Inconsistent Evidence and Arguments for Hypotheses.

  12. Systemic Planning: An Annotated Bibliography and Literature Guide. Exchange Bibliography No. 91.

    ERIC Educational Resources Information Center

    Catanese, Anthony James

    Systemic planning is an operational approach to using scientific rigor and qualitative judgment in a complementary manner. It integrates rigorous techniques and methods from systems analysis, cybernetics, decision theory, and work programming. The annotated reference sources in this bibliography include those works that have been most influential…

  13. Neighborhood Context and Police Vigor: A Multilevel Analysis

    ERIC Educational Resources Information Center

    Sobol, James J.; Wu, Yuning; Sun, Ivan Y.

    2013-01-01

    This study provides a partial test of Klinger's ecological theory of police behavior using hierarchical linear modeling on 1,677 suspects who had encounters with police within 24 beats. The current study used data from four sources originally collected by the Project on Policing Neighborhoods (POPN), including systematic social observation,…

  14. Age Differences and Dimensions of Religious Behavior

    ERIC Educational Resources Information Center

    Johnson, Arthur L.; And Others

    1974-01-01

    This research explores the magnitude, sources, and consequences of differences among age strata in various dimensions of religious orientation and practice. Analysis of a national sample of 4444 Lutheran church members, ages 16-65, revealed patterns of age strata differences that supported a "selective gap" theory rather than a "great gap"…

  15. In Search of a Pony: Sources, Methods, Outcomes, and Motivated Reasoning.

    PubMed

    Stone, Marc B

    2018-05-01

    It is highly desirable to be able to evaluate the effect of policy interventions. Such evaluations should have expected outcomes based upon sound theory and be carefully planned, objectively evaluated and prospectively executed. In many cases, however, assessments originate with investigators' poorly substantiated beliefs about the effects of a policy. Instead of designing studies that test falsifiable hypotheses, these investigators adopt methods and data sources that serve as little more than descriptions of these beliefs in the guise of analysis. Interrupted time series analysis is one of the most popular forms of analysis used to present these beliefs. It is intuitively appealing but, in most cases, it is based upon false analogies, fallacious assumptions and analytical errors.
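
    Interrupted time series analysis, the method this abstract criticizes, typically fits a pre-intervention trend and reads the policy effect off post-intervention departures from it. A minimal sketch on simulated data (the series and the injected effect are invented; real applications must also defend the counterfactual against autocorrelation and confounding, which is exactly where the abstract locates the trouble):

```python
# Minimal interrupted time series sketch: fit a linear trend to the
# pre-intervention period, extrapolate it as the counterfactual, and
# estimate the level change from post-intervention residuals.

def linfit(xs, ys):
    """Closed-form ordinary least squares for y = a + b*x."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    return ybar - b * xbar, b

t0 = 30                            # intervention time (simulated)
series = [10.0 + 0.5 * t for t in range(60)]
for t in range(t0, 60):            # inject a known level drop of 4 units
    series[t] -= 4.0

a, b = linfit(list(range(t0)), series[:t0])

# residuals of the post period against the extrapolated pre-trend
resid = [series[t] - (a + b * t) for t in range(t0, 60)]
level_change = sum(resid) / len(resid)
print(round(level_change, 2))      # recovers the injected -4.0 shift
```

The sketch works here only because the simulated counterfactual really is the extrapolated pre-trend; the abstract's point is that in practice that assumption is often a false analogy rather than a tested hypothesis.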

  16. Analysis of the United States Marine Corps Utilization of Defense Logistics Agency Disposition Services as a Source of Supply

    DTIC Science & Technology

    2011-12-01

    initiatives. However, this alignment is difficult within the DoD because “most incentives and motivations are not apparent for either government or industry ...p. 84). Doane and Spencer (1997) stated that industry incentives and motivation seem to be based on the same profit and loss theories that were...clothing and eyewear, plasma televisions, lawn equipment, and so forth. These items could potentially be sourced from DDS on a free-issue basis

  17. Exact image theory for the problem of dielectric/magnetic slab

    NASA Technical Reports Server (NTRS)

    Lindell, I. V.

    1987-01-01

    The exact image method, recently introduced for the exact solution of electromagnetic field problems involving homogeneous half spaces and microstrip-like geometries, is developed for the problem of a homogeneous slab of dielectric and/or magnetic material in free space. Expressions for the image sources, creating the exact reflected and transmitted fields, are given, and their numerical evaluation is demonstrated. Nonradiating modes, guided by the slab and responsible for the loss of convergence of the image functions, are considered and extracted. The theory allows, for example, an analysis of finite ground planes in microstrip antenna structures.

  18. Theory of electrohydrodynamic instabilities in electrolytic cells

    NASA Technical Reports Server (NTRS)

    Bruinsma, R.; Alexander, S.

    1990-01-01

    The paper develops the theory of the hydrodynamic stability of an electrolytic cell as a function of the imposed electric current. A new electrohydrodynamic instability is encountered when the current is forced to exceed the Nernst limit. The convection is driven by the volume force exerted by the electric field on space charges in the electrolyte. This intrinsic instability is found to be easily masked by extrinsic convection sources such as gravity or stirring. A linear stability analysis is performed and a dimensionless number Le is derived whose value determines the convection pattern.

  19. DYNAMICS AND STAGNATION IN THE MALTHUSIAN EPOCH.

    PubMed

    Ashraf, Quamrul; Galor, Oded

    2011-08-01

    This paper examines the central hypothesis of the influential Malthusian theory, according to which improvements in the technological environment during the pre-industrial era generated only temporary gains in income per capita, eventually leading to a larger, but not significantly richer, population. Exploiting exogenous sources of cross-country variation in land productivity and the level of technological advancement, the analysis demonstrates that, in accordance with the theory, technological superiority and higher land productivity had significant positive effects on population density but insignificant effects on the standard of living during the time period 1-1500 CE.

  20. Every document and picture tells a story: using internal corporate document reviews, semiotics, and content analysis to assess tobacco advertising.

    PubMed

    Anderson, S J; Dewhirst, T; Ling, P M

    2006-06-01

    In this article we present communication theory as a conceptual framework for conducting documents research on tobacco advertising strategies, and we discuss two methods for analysing advertisements: semiotics and content analysis. We provide concrete examples of how we have used tobacco industry documents archives and tobacco advertisement collections iteratively in our research to yield a synergistic analysis of these two complementary data sources. Tobacco promotion researchers should consider adopting these theoretical and methodological approaches.

  1. A conceptual care model for individualized care approach in cardiac rehabilitation--combining both illness representation and self-efficacy.

    PubMed

    Lau-Walker, Margaret

    2006-02-01

    This paper analyses the two prominent psychological theories of patient response--illness representation and self-efficacy--and explores the possibility of developing a conceptual individualized care model that makes use of both theories. Analysis of the literature established common themes that were used as the basis for a conceptual framework intended to assist in the joint application of these theories to therapeutic settings. Both theories emphasize personal experience, pre-construction of self, and individual response to illness and treatment, and hold that patients' beliefs are more influential in their recovery than the severity of the illness. Where the theories are most divergent is in their application to therapeutic interventions, which reflects the different sources of influence that each theory emphasizes. Based on their similarities and differences, it is possible to integrate the two theories into a conceptual care model. The Interactive Care Model combines both theories of patient response and provides an explicit framework for further research into the design of effective therapeutic interventions in rehabilitation care.

  2. A source mechanism producing HF-induced plasma lines (HFPLS) with up-shifted frequencies

    NASA Technical Reports Server (NTRS)

    Kuo, S. P.; Lee, M. C.

    1992-01-01

    Attention is given to a nonlinear scattering process analyzed as a source mechanism producing the frequency up-shifted HFPLs observed in the Arecibo ionospheric heating experiments. A physical picture is offered to explain how Langmuir waves with frequencies greater than the HF heater wave frequency can be produced in the heating experiments and be detected by incoherent radars as frequency up-shifted HFPLs. Since the considered scattering process occurs in a region near the reflection height, it explains why the frequency up-shifted HFPLs should originate from the altitude near the reflection height as observed. The theory also shows that the amount of frequency up-shift is inversely proportional to the frequency of the HF heater and increases linearly with the electron temperature. The quantitative analysis of the theory shows a good agreement with the experimental results.

  3. Theory-of-mind development influences suggestibility and source monitoring.

    PubMed

    Bright-Paul, Alexandra; Jarrold, Christopher; Wright, Daniel B

    2008-07-01

    According to the mental-state reasoning model of suggestibility, 2 components of theory of mind mediate reductions in suggestibility across the preschool years. The authors examined whether theory-of-mind performance may be legitimately separated into 2 components and explored the memory processes underlying the associations between theory of mind and suggestibility, independent of verbal ability. Children 3 to 6 years old completed 6 theory-of-mind tasks and a postevent misinformation procedure. Contrary to the model's prediction, a single latent theory-of-mind factor emerged, suggesting a single-component rather than a dual-component conceptualization of theory-of-mind performance. This factor provided statistical justification for computing a single composite theory-of-mind score. Improvements in theory of mind predicted reductions in suggestibility, independent of verbal ability (Study 1, n = 72). Furthermore, once attribution biases were controlled (Study 2, n = 45), there was also a positive relationship between theory of mind and source memory, but not recognition performance. The findings suggest a substantial, and possibly causal, association between theory-of-mind development and resistance to suggestion, driven specifically by improvements in source monitoring.

  4. Bayesian Analysis of Recognition Memory: The Case of the List-Length Effect

    ERIC Educational Resources Information Center

    Dennis, Simon; Lee, Michael D.; Kinnell, Angela

    2008-01-01

    Recognition memory experiments are an important source of empirical constraints for theories of memory. Unfortunately, standard methods for analyzing recognition memory data have problems that are often severe enough to prevent clear answers being obtained. A key example is whether longer lists lead to poorer recognition performance. The presence…

  5. Conceptualizations of School Leadership among High School Principals in Jamaica

    ERIC Educational Resources Information Center

    Newman, Mairette

    2013-01-01

    Drawing on evidence from research that adopted a qualitative case study design and used grounded theory methods of data analysis, this study examined how selected high school principals in Jamaica conceptualize school leadership. Data were sourced from semi-structured interviews, field observations as well as from school, principal and official…

  6. Electronically Recorded Music as a Communication Medium: A Structural Analysis with Selected Bibliography.

    ERIC Educational Resources Information Center

    Jorgensen, Earl; Mabry, Edward A.

    During the past decade, the influence of electronically recorded music and the message it transmits have caused media scholars to reexamine and modify the theories upon which the basic process of communication is dependent. While the five primary functions (source, transmitter, channel, receiver, and destination) remain unchanged, an additional…

  7. A Culturally Sensitive Analysis of Culture in the Context of Context: When Is Enough Enough?

    ERIC Educational Resources Information Center

    Kahn, Peter H., Jr.

    Cultural context is not the sole source of human knowledge. Postmodern theory, in both its deconstructionist and affirmative approaches, offers an incomplete basis by which to study race, class, and gender, and undermines ethical interaction. Deconstructionism calls for the abandonment of generalizable research findings, asserting that the concept…

  8. Communicating Science to Impact Learning? A Phenomenological Inquiry into 4th and 5th Graders' Perceptions of Science Information Sources

    NASA Astrophysics Data System (ADS)

    Gelmez Burakgazi, Sevinc; Yildirim, Ali; Weeth Feinstein, Noah

    2016-04-01

    Rooted in science education and science communication studies, this study examines 4th and 5th grade students' perceptions of science information sources (SIS) and their use in communicating science to students. It combines situated learning theory with uses and gratifications theory in a qualitative phenomenological analysis. Data were gathered through classroom observations and interviews in four Turkish elementary schools. Focus group interviews with 47 students and individual interviews with 17 teachers and 10 parents were conducted. Participants identified a wide range of SIS, including TV, magazines, newspapers, internet, peers, teachers, families, science centers/museums, science exhibitions, textbooks, science books, and science camps. Students reported using various SIS in school-based and non-school contexts to satisfy their cognitive, affective, personal, and social integrative needs. SIS were used for science courses, homework/project assignments, examination/test preparations, and individual science-related research. Students assessed SIS in terms of the perceived accessibility of the sources, the quality of the content, and the content presentation. In particular, some sources such as teachers, families, TV, science magazines, textbooks, and science centers/museums ("directive sources") predictably led students to other sources such as teachers, families, internet, and science books ("directed sources"). A small number of sources crossed context boundaries, being useful in both school and out. Results shed light on the connection between science education and science communication in terms of promoting science learning.

  9. Near real-time estimation of the seismic source parameters in a compressed domain

    NASA Astrophysics Data System (ADS)

    Rodriguez, Ismael A. Vera

    Seismic events can be characterized by their origin time, location, and moment tensor. Fast estimates of these source parameters are important in areas of geophysics such as earthquake seismology and the monitoring of seismic activity produced by volcanoes, mining operations, and hydraulic injections in geothermal and oil and gas reservoirs. Most available monitoring systems estimate the source parameters sequentially: first determining origin time and location (e.g., epicentre, hypocentre, or centroid of the stress glut density), and then using this information to initialize the evaluation of the moment tensor. A more efficient estimation of the source parameters requires a concurrent evaluation of the three variables. The main objective of the present thesis is to address the simultaneous estimation of origin time, location, and moment tensor of seismic events. The proposed method has the benefits of being (1) automatic, (2) continuous, and (3), depending on the scale of application, able to provide results in real-time or near real-time. The inversion algorithm is based on theoretical results from sparse representation theory and compressive sensing. The feasibility of implementation is determined through the analysis of synthetic and real data examples. The numerical experiments focus on the microseismic monitoring of hydraulic fractures in oil and gas wells; however, an example using real earthquake data is also presented for validation. The thesis is complemented with a resolvability analysis of the moment tensor. The analysis targets common monitoring geometries employed in hydraulic fracturing in oil wells. Additionally, an application of sparse representation theory to the denoising of one-component and three-component microseismicity records is presented, together with an algorithm for improved automatic time-picking using non-linear inversion constraints.

  10. An introduction to generalized functions with some applications in aerodynamics and aeroacoustics

    NASA Technical Reports Server (NTRS)

    Farassat, F.

    1994-01-01

    In this paper, we start with the definition of generalized functions as continuous linear functionals on the space of infinitely differentiable functions with compact support. The concept of generalized differentiation is introduced next. This is the most important concept in generalized function theory, and the applications we present utilize mainly this concept. First, some of the results of classical analysis, such as the Leibniz rule of differentiation under the integral sign and the divergence theorem, are derived using generalized function theory. It is shown that the divergence theorem remains valid for discontinuous vector fields provided that the derivatives are all viewed as generalized derivatives. This implies that all conservation laws of fluid mechanics are valid as they stand for discontinuous fields with all derivatives treated as generalized derivatives. Once these derivatives are written as ordinary derivatives and jumps in the field parameters across discontinuities, the jump conditions can be easily found. For example, the unsteady shock jump conditions can be derived from the mass and momentum conservation laws; using generalized function theory, this derivation becomes trivial. Other applications of generalized function theory in aerodynamics discussed in this paper are the derivation of general transport theorems for deriving governing equations of fluid mechanics, the interpretation of the finite part of divergent integrals, the derivation of the Oswatitsch integral equation of transonic flow, and the analysis of velocity field discontinuities as sources of vorticity. Applications in aeroacoustics presented here include the derivation of the Kirchhoff formula for moving surfaces, the noise from moving surfaces, and shock noise source strength based on the Ffowcs Williams-Hawkings equation.
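
    The jump conditions referred to in this abstract follow from one identity of generalized differentiation: for a function f(x) that is smooth except for a jump at x_0, the generalized derivative (written with an overbar here, a notational convention assumed for this sketch) equals the ordinary derivative plus a concentrated delta term:

```latex
% Generalized derivative of a piecewise-smooth function with a jump at x_0
\[
  \overline{\frac{df}{dx}} = \frac{df}{dx} + \Delta f\,\delta(x - x_0),
  \qquad \Delta f = f(x_0^{+}) - f(x_0^{-}).
\]
```

Writing a conservation law with all derivatives taken in this generalized sense and then requiring the coefficient of the delta term to vanish is what yields the jump (e.g., shock) conditions described above.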

  11. Strategy of investment in electricity sources--Market value of a power plant and the electricity market

    NASA Astrophysics Data System (ADS)

    Bartnik, R.; Hnydiuk-Stefan, A.; Buryn, Z.

    2017-11-01

    This paper reports the results of an investment strategy analysis for different electricity sources. A new methodology and theory for calculating the market value of a power plant and the value of the electricity market it supplies are presented. Financial gain is the most important criterion in an investor's assessment of an investment. An investment strategy has to involve a careful analysis of each considered project so that the right decision and selection can be made once the various components of the projects, above all risk and uncertainty, have been considered. The profitability of an investment in electricity sources (as in others) is assessed with measures of economic effectiveness based on calculations of, for example, the power plant's market value and the value of the electricity it supplies. The values of such measures decide an investment strategy in energy sources. This paper contains exemplary calculation results for the market value of a power plant and of the electricity market it supplies.

  12. Theory of Noise Generation from Moving Bodies with an Application to Helicopter Rotors

    NASA Technical Reports Server (NTRS)

    Farassat, F.

    1975-01-01

    Several expressions for the determination of the acoustic field of moving bodies are presented. The analysis is based on the Ffowcs Williams-Hawkings equation. Applying some proposed criteria, one of these expressions is singled out for numerical computation of acoustic pressure signature. The compactness of sources is not assumed and the main results are not restricted by the observer position. The distinction between compact and noncompact sources on moving surfaces is discussed. Some thickness noise calculations of helicopter rotors and comparison with experiments are included which suggest this mechanism as the source of high-speed blade slap of rotors.

  13. Numerical and experimental analysis of fusion offset in splicing photonic crystal fiber with CO2 laser

    NASA Astrophysics Data System (ADS)

    Jin, Wa; Bi, Weihong; Fu, Guangwei

    2014-09-01

    Single mode fibers (SMFs) need more fusion energy than photonic crystal fibers (PCFs) during a splicing process, so it is necessary to offset the center of the heat source toward the SMF. Based on a study of the heat transfer characteristics of PCFs and SMFs during splicing with a CO2 laser as the heat source, this paper reports the first systematic theoretical and experimental analysis of the optimal offset for splicing SMFs to PCFs. The results show that fusion splicing offsets can be applied to control air-hole collapse and realize a practical low-loss splicing process between PCFs and SMFs.

  14. Systematic review of empiricism and theory in domestic minor sex trafficking research.

    PubMed

    Twis, Mary K; Shelton, Beth Anne

    2018-01-01

    Empiricism and the application of human behavior theory to inquiry are regarded as markers of high-quality research. Unfortunately, scholars have noted that there are many gaps in theory and empiricism within the human trafficking literature, calling into question the legitimacy of policies and practices that are derived from the available data. To date, there has not been an analysis of the extent to which empirical methods and human behavior theory have been applied to domestic minor sex trafficking (DMST) research as a subcategory of human trafficking inquiry. To fill this gap in the literature, this systematic review was designed to assess the degree to which DMST publications (a) are empirical and (b) apply human behavior theory to inquiry. This analysis also focuses on answering research questions related to patterns within DMST study data sources and patterns of human behavior theory application. The results of this review indicate that a minority of sampled DMST publications are empirical, that a minority of the empirical articles apply a specific human behavior theory in their research design and reporting of results, that a minority of articles utilize data collected directly from DMST victims, and that there are no discernible patterns in the application of human behavior theory to DMST research. This research note suggests that DMST research is limited by the same challenges as the larger body of human trafficking scholarship. Based upon these overarching findings, specific recommendations are offered to DMST researchers who are committed to enhancing the quality of DMST scholarship.

  15. Modeling the Hydraulics of Root Growth in Three Dimensions with Phloem Water Sources

    PubMed Central

    Wiegers, Brandy S.; Cheer, Angela Y.; Silk, Wendy K.

    2009-01-01

    Primary growth is characterized by cell expansion facilitated by water uptake generating hydrostatic (turgor) pressure to inflate the cell, stretching the rigid cell walls. The multiple source theory of root growth hypothesizes that root growth involves transport of water both from the soil surrounding the growth zone and from the mature tissue higher in the root via phloem and protophloem. Here, protophloem water sources are used as boundary conditions in a classical, three-dimensional model of growth-sustaining water potentials in primary roots. The model predicts small radial gradients in water potential, with a significant longitudinal gradient. The results improve the agreement of theory with empirical studies for water potential in the primary growth zone of roots of maize (Zea mays). A sensitivity analysis quantifies the functional importance of apical phloem differentiation in permitting growth and reveals that the presence of phloem water sources makes the growth-sustaining water relations of the root relatively insensitive to changes in root radius and hydraulic conductivity. Adaptation to drought and other environmental stresses is predicted to involve more apical differentiation of phloem and/or higher phloem delivery rates to the growth zone. PMID:19542299

  16. Modeling the hydraulics of root growth in three dimensions with phloem water sources.

    PubMed

    Wiegers, Brandy S; Cheer, Angela Y; Silk, Wendy K

    2009-08-01

    Primary growth is characterized by cell expansion facilitated by water uptake generating hydrostatic (turgor) pressure to inflate the cell, stretching the rigid cell walls. The multiple source theory of root growth hypothesizes that root growth involves transport of water both from the soil surrounding the growth zone and from the mature tissue higher in the root via phloem and protophloem. Here, protophloem water sources are used as boundary conditions in a classical, three-dimensional model of growth-sustaining water potentials in primary roots. The model predicts small radial gradients in water potential, with a significant longitudinal gradient. The results improve the agreement of theory with empirical studies for water potential in the primary growth zone of roots of maize (Zea mays). A sensitivity analysis quantifies the functional importance of apical phloem differentiation in permitting growth and reveals that the presence of phloem water sources makes the growth-sustaining water relations of the root relatively insensitive to changes in root radius and hydraulic conductivity. Adaptation to drought and other environmental stresses is predicted to involve more apical differentiation of phloem and/or higher phloem delivery rates to the growth zone.

  17. Para-hydrogen and helium cluster size distributions in free jet expansions based on Smoluchowski theory with kernel scaling.

    PubMed

    Kornilov, Oleg; Toennies, J Peter

    2015-02-21

    The size distribution of para-H2 (pH2) clusters produced in free jet expansions at a source temperature of T0 = 29.5 K and pressures of P0 = 0.9-1.96 bars is reported and analyzed according to a cluster growth model based on the Smoluchowski theory with kernel scaling. Good overall agreement is found between the measured distribution and the predicted shape, N_k = A k^a e^(-bk). The fit yields values for A and b for values of a derived from simple collision models. The small remaining deviations between measured abundances and theory imply a (pH2)_k magic number cluster of k = 13, as has been observed previously by Raman spectroscopy. The predicted linear dependence of b^(-(a+1)) on source gas pressure was verified and used to determine the value of the basic effective agglomeration reaction rate constant. A comparison of the corresponding effective growth cross sections σ11 with results from a similar analysis of He cluster size distributions indicates that the latter are larger by a factor of 6-10. An analysis of the three-body recombination rates, the geometric sizes, and the fact that He clusters are liquid independent of their size can explain the larger cross sections found for He.
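
    The kernel-scaling form N_k = A k^a e^(-bk) described above is easy to explore numerically. The sketch below evaluates it with purely hypothetical parameters (A, a, b are illustrative, not the fitted values from the paper); the distribution peaks near k ≈ a/b, which is where the logarithmic derivative a/k - b vanishes.

    ```python
    import math

    def cluster_abundance(k: int, A: float, a: float, b: float) -> float:
        """Smoluchowski kernel-scaling abundance N_k = A * k**a * exp(-b*k)."""
        return A * k**a * math.exp(-b * k)

    # Hypothetical parameters for illustration only.
    A, a, b = 1.0, 2.0, 0.5
    dist = [cluster_abundance(k, A, a, b) for k in range(1, 31)]

    # The distribution peaks where d/dk (a*ln k - b*k) = 0, i.e. near k = a/b.
    peak_k = max(range(1, 31), key=lambda k: dist[k - 1])
    print(peak_k)  # prints 4 (= a/b for these parameters)
    ```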

  18. Development of Additional Hazard Assessment Models

    DTIC Science & Technology

    1977-03-01

    globules, their trajectory (the distance from the spill point to the impact point on the river bed), and the time required for sinking. Established theories ...chemicals, the dissolution rate is estimated by using eddy diffusivity surface renewal theories . The validity of predictions of these theories has been... theories and experimental data on aeration of rivers. * Describe dispersion in rivers with stationary area source and sources moving with the stream

  19. Reconstruction of Vectorial Acoustic Sources in Time-Domain Tomography

    PubMed Central

    Xia, Rongmin; Li, Xu; He, Bin

    2009-01-01

    A new theory is proposed for the reconstruction of curl-free vector field, whose divergence serves as acoustic source. The theory is applied to reconstruct vector acoustic sources from the scalar acoustic signals measured on a surface enclosing the source area. It is shown that, under certain conditions, the scalar acoustic measurements can be vectorized according to the known measurement geometry and subsequently be used to reconstruct the original vector field. Theoretically, this method extends the application domain of the existing acoustic reciprocity principle from a scalar field to a vector field, indicating that the stimulating vectorial source and the transmitted acoustic pressure vector (acoustic pressure vectorized according to certain measurement geometry) are interchangeable. Computer simulation studies were conducted to evaluate the proposed theory, and the numerical results suggest that reconstruction of a vector field using the proposed theory is not sensitive to variation in the detecting distance. The present theory may be applied to magnetoacoustic tomography with magnetic induction (MAT-MI) for reconstructing current distribution from acoustic measurements. A simulation on MAT-MI shows that, compared to existing methods, the present method can give an accurate estimation on the source current distribution and a better conductivity reconstruction. PMID:19211344

  20. Enhancing source location protection in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Chen, Juan; Lin, Zhengkui; Wu, Di; Wang, Bailing

    2015-12-01

    Wireless sensor networks are widely deployed in the internet of things to monitor valuable objects. Once an object is monitored, the sensor nearest to it, known as the source, periodically informs the base station about the object's information. Attackers can therefore capture the object by localizing the source, and many protocols have been proposed to secure the source location. In this paper, however, we show that typical source location protection protocols generate phantom locations that are not only near the source but also highly localized, so attackers can easily trace the source from them. To address these limitations, we propose a protocol to enhance source location protection (SLE). With phantom locations far away from the source and widely distributed, SLE improves source location anonymity significantly. Theoretical analysis and simulation results show that SLE provides strong source location privacy preservation, and the average safety period increases by nearly an order of magnitude compared with existing work, at low communication cost.

  1. Sound produced by an oscillating arc in a high-pressure gas

    NASA Astrophysics Data System (ADS)

    Popov, Fedor K.; Shneider, Mikhail N.

    2017-08-01

    We suggest a simple theory to describe the sound generated by small periodic perturbations of a cylindrical arc in a dense gas. The theoretical analysis was done within the framework of the non-self-consistent channel arc model, supplemented with time-dependent gas dynamic equations. It is shown that an arc with power oscillations on the order of several percent is a source of sound whose intensity is comparable with that of the external ultrasound sources used in experiments to increase the yield of nanoparticles in high-pressure arc systems for nanoparticle synthesis.

  2. Cytoscape.js: a graph theory library for visualisation and analysis.

    PubMed

    Franz, Max; Lopes, Christian T; Huck, Gerardo; Dong, Yue; Sumer, Onur; Bader, Gary D

    2016-01-15

    Cytoscape.js is an open-source JavaScript-based graph library. Its most common use case is as a visualization software component, rendering interactive graphs in a web browser. It can also be used in a headless manner, which is useful for graph operations on a server, for example under Node.js. Cytoscape.js is implemented in JavaScript. Documentation, downloads and source code are available at http://js.cytoscape.org. Contact: gary.bader@utoronto.ca. © The Author 2015. Published by Oxford University Press.

  3. DYNAMICS AND STAGNATION IN THE MALTHUSIAN EPOCH

    PubMed Central

    Ashraf, Quamrul; Galor, Oded

    2013-01-01

    This paper examines the central hypothesis of the influential Malthusian theory, according to which improvements in the technological environment during the pre-industrial era had generated only temporary gains in income per capita, eventually leading to a larger, but not significantly richer, population. Exploiting exogenous sources of cross-country variations in land productivity and the level of technological advancement the analysis demonstrates that, in accordance with the theory, technological superiority and higher land productivity had significant positive effects on population density but insignificant effects on the standard of living, during the time period 1–1500 CE. PMID:25506082

  4. Item response theory analysis applied to the Spanish version of the Personal Outcomes Scale.

    PubMed

    Guàrdia-Olmos, J; Carbó-Carreté, M; Peró-Cebollero, M; Giné, C

    2017-11-01

    The study of measurements of quality of life (QoL) is one of the great challenges of modern psychology and psychometric approaches. This issue has greater importance when examining QoL in populations that were historically treated on the basis of their deficiency; recently, the focus has shifted to what each person values and desires in their life, as in the case of people with intellectual disability (ID). Many studies of QoL scales applied in this area have attempted to improve the validity and reliability of their components by incorporating various sources of information to achieve consistency in the data obtained. The Spanish adaptation of the Personal Outcomes Scale (POS) has shown excellent psychometric attributes, and its administration draws on three sources of information: self-assessment, practitioner and family. The study of possible congruence or incongruence of the observed distributions of each item between sources is therefore essential to ensure a correct interpretation of the measure. The aim of this paper was to analyse the observed distribution of items and dimensions from the three Spanish POS information sources cited earlier, using item response theory. We studied a sample of 529 people with ID and their respective practitioners and family members, and in each case, we analysed items and factors using Samejima's model for polytomous ordinal scales. The results indicated an important number of items with differential effects across sources and, in some cases, significant differences in the distribution of items, factors and sources of information. As a result of this analysis, we conclude that administration of the POS with three sources of information was adequate overall, but a correct interpretation of the results requires taking considerably more information into account, including some specific items in specific dimensions. If these considerations are ignored, the overall ratings could be biased.
© 2017 MENCAP and International Association of the Scientific Study of Intellectual and Developmental Disabilities and John Wiley & Sons Ltd.
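
    Samejima's graded response model, mentioned in the abstract above, expresses the probability of each ordered response category as the difference between adjacent cumulative logistic curves. A minimal sketch follows; the discrimination parameter and thresholds are hypothetical, not estimated POS values.

    ```python
    import math

    def graded_response_probs(theta, a, thresholds):
        """Category probabilities for one polytomous item under Samejima's
        graded response model. `theta` is the latent trait, `a` the item
        discrimination, and `thresholds` a strictly increasing list of
        category boundary parameters."""
        def cum(b):
            # P(response falls at or above the category bounded below by b)
            return 1.0 / (1.0 + math.exp(-a * (theta - b)))
        cums = [1.0] + [cum(b) for b in thresholds] + [0.0]
        return [cums[i] - cums[i + 1] for i in range(len(cums) - 1)]

    # Four-category item with hypothetical parameters: probabilities sum to 1.
    probs = graded_response_probs(theta=0.0, a=1.5, thresholds=[-1.0, 0.5, 1.5])
    ```

    Differences of cumulative curves guarantee that the category probabilities are non-negative whenever the thresholds are increasing, which is why the model suits ordered rating scales such as the POS items.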

  5. Biological Embedding: Evaluation and Analysis of an Emerging Concept for Nursing Scholarship

    PubMed Central

    Nist, Marliese Dion

    2016-01-01

    Aim The purpose of this paper is to report the analysis of the concept of biological embedding. Background Research that incorporates a life course perspective is becoming increasingly prominent in the health sciences. Biological embedding is a central concept in life course theory and may be important for nursing theories to enhance our understanding of health states in individuals and populations. Before the concept of biological embedding can be used in nursing theory and research, an analysis of the concept is required to advance it toward full maturity. Design Concept analysis. Data Sources PubMed, CINAHL and PsycINFO were searched for publications using the term ‘biological embedding’ or ‘biological programming’ and published through 2015. Methods An evaluation of the concept was first conducted to determine the concept’s level of maturity and was followed by a concept comparison, using the methods for concept evaluation and comparison described by Morse. Results A consistent definition of biological embedding – the process by which early life experience alters biological processes to affect adult health outcomes – was found throughout the literature. The concept has been used in several theories that describe the mechanisms through which biological embedding might occur and highlight its role in the development of health trajectories. Biological embedding is a partially mature concept, requiring concept comparison with an overlapping concept – biological programming – to more clearly establish the boundaries of biological embedding. Conclusions Biological embedding has significant potential for theory development and application in multiple academic disciplines, including nursing. PMID:27682606

  6. Millimeter wave generation by relativistic electron beams and microwave-plasma interaction

    NASA Astrophysics Data System (ADS)

    Kuo, Spencer

    1990-12-01

    The design and operation of a compact, high power, millimeter wave source (cusptron) has been completed and proven successful. Extensive theoretical analysis of cusptron beam and rf dynamics has been carried out and published. Theory agrees closely with experiment. Microwave Bragg scattering has been achieved by using expanding plasmas to upshift rf signal frequencies.

  7. New York City's Children First Networks: Turning Accountability on Its Head

    ERIC Educational Resources Information Center

    Wohlstetter, Priscilla; Smith, Joanna; Gallagher, Andrew

    2013-01-01

    Purpose: The purpose of this paper is to report findings from an exploratory study of New York's Children First Networks (CFNs); to examine what is known about the CFNs thus far, drawing on new empirical research, as well as document review and analysis of secondary sources. Design/methodology/approach: Organizational learning theory guided this…

  8. Management Science in Higher Education Institutions: Case Studies from Greece

    ERIC Educational Resources Information Center

    Saiti, Anna

    2010-01-01

    Regardless of the source of funding, university quality is based on knowledge, teaching, and research; hence universities cannot be run like private enterprises, as they are expert organisations that provide solely a public service. The purpose of this paper is to investigate, through the analysis of case studies, whether or not management theory,…

  9. Equipping Every Student with Psychological Tools: A Vygotskian Guide to Establishing the Goals of Education

    ERIC Educational Resources Information Center

    Eun, Barohny

    2016-01-01

    The present conceptual analysis begins with an assertion that the most fundamental act in any educational endeavors is establishing their goals. The discussion proceeds to reviewing recent pertinent literature that presents Vygotsky's theory of development as a useful source in providing guidance to establishing the goals of education in rapidly…

  10. Disavowed Knowledge: Psychoanalysis, Education and Teaching. Studies in Curriculum Theory Series

    ERIC Educational Resources Information Center

    Taubman, Peter Maas

    2011-01-01

    This is the first and only book to detail the history of the century-long relationship between education and psychoanalysis. Relying on primary and secondary sources, it provides not only a historical context but also a psychoanalytically informed analysis. In considering what it means to think about teaching from a psychoanalytic perspective and…

  11. Corporate Characteristics, Political Embeddedness and Environmental Pollution by Large U.S. Corporations

    ERIC Educational Resources Information Center

    Prechel, Harland; Zheng, Lu

    2012-01-01

    Organizational and environmental sociology contain surprisingly few studies of the corporation as one of the sources of environmental pollution. To fill this gap, we focus on the parent company as the unit of analysis and elaborate environmental theories that focus on the organizational and political-legal causes of pollution. Using a compiled…

  12. A Content Analysis of the Value of Humanities Literature in Educational Leadership

    ERIC Educational Resources Information Center

    Monday, Ralph

    2012-01-01

    This qualitative study was completed to understand the themes that emerged from scholarly works on the use of humanities in leadership theory and practice between 1960 and 2011. Educational leadership has expanded from using only the methods of the social sciences to using methods from other sources to inform leadership, such as using the…

  13. Analysing and correcting the differences between multi-source and multi-scale spatial remote sensing observations.

    PubMed

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations that are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models or methods. These differences can be quantitatively described from three main aspects: multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposed a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide a reference for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Statistical theory was used to extract the statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variation of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, Gaussian distribution theory was used to correct the multiple surface reflectance datasets on the basis of the obtained physical characteristics, mathematical distribution properties and their spatial variations. The proposed method was verified with two sets of multiple satellite images obtained in two experimental fields, located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. 
    Experimental results indicate that differences between surface reflectance datasets at multiple spatial scales can be effectively corrected over non-homogeneous underlying surfaces, providing a database for further multi-source and multi-scale crop growth monitoring and yield prediction and their corresponding consistency analysis and evaluation.
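
    The core idea of correcting one reflectance dataset against a small-scale baseline using Gaussian statistics can be sketched as a moment-matching transform: shift and rescale the data so its mean and standard deviation match the baseline's. This is an illustrative simplification, not the authors' exact correction algorithm, and the reflectance values below are hypothetical.

    ```python
    import statistics

    def match_to_baseline(values, baseline):
        """Rescale a reflectance dataset so its mean and standard deviation
        match those of a baseline dataset, assuming both are approximately
        Gaussian. An illustrative moment-matching sketch of the correction
        idea, not the paper's exact method."""
        mu_v, sd_v = statistics.mean(values), statistics.stdev(values)
        mu_b, sd_b = statistics.mean(baseline), statistics.stdev(baseline)
        return [(x - mu_v) / sd_v * sd_b + mu_b for x in values]

    # Hypothetical coarse-scale reflectances corrected against a fine-scale baseline.
    corrected = match_to_baseline([0.20, 0.30, 0.40, 0.50], [0.25, 0.30, 0.35])
    ```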

  14. Analysing and Correcting the Differences between Multi-Source and Multi-Scale Spatial Remote Sensing Observations

    PubMed Central

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations that are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models or methods. These differences can be quantitatively described from three main aspects: multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposed a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide a reference for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Statistical theory was used to extract the statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variation of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, Gaussian distribution theory was used to correct the multiple surface reflectance datasets on the basis of the obtained physical characteristics, mathematical distribution properties and their spatial variations. The proposed method was verified with two sets of multiple satellite images obtained in two experimental fields, located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. 
    Experimental results indicate that differences between surface reflectance datasets at multiple spatial scales can be effectively corrected over non-homogeneous underlying surfaces, providing a database for further multi-source and multi-scale crop growth monitoring and yield prediction and their corresponding consistency analysis and evaluation. PMID:25405760

  15. Gravitational wave astronomy - astronomy of the 21st century

    NASA Astrophysics Data System (ADS)

    Dhurandhar, S. V.

    2011-03-01

    An enigmatic prediction of Einstein's general theory of relativity is gravitational waves. With the observed decay in the orbit of the Hulse-Taylor binary pulsar agreeing within a fraction of a percent with the theoretically computed decay from Einstein's theory, the existence of gravitational waves was firmly established. Currently there is a worldwide effort to detect gravitational waves with interferometric gravitational wave observatories or detectors, and several such detectors have been built or are being built. The initial detectors have reached their design sensitivities and now the effort is on to construct advanced detectors which are expected to detect gravitational waves from astrophysical sources. The era of gravitational wave astronomy has arrived. This article describes the worldwide effort which includes the effort on the Indian front - the IndIGO project -, the principle underlying interferometric detectors both on ground and in space, the principal noise sources that plague such detectors, the astrophysical sources of gravitational waves that one expects to detect by these detectors and some glimpse of the data analysis methods involved in extracting the very weak gravitational wave signals from detector noise.

  16. Gravitational wave astronomy— astronomy of the 21st century

    NASA Astrophysics Data System (ADS)

    Dhurandhar, S. V.

    2011-12-01

    An enigmatic prediction of Einstein's general theory of relativity is gravitational waves. With the observed decay in the orbit of the Hulse-Taylor binary pulsar agreeing within a fraction of a percent with the theoretically computed decay from Einstein's theory, the existence of gravitational waves was firmly established. Currently there is a worldwide effort to detect gravitational waves with interferometric gravitational wave observatories or detectors, and several such detectors have been built or are being built. The initial detectors have reached their design sensitivities and now the effort is on to construct advanced detectors which are expected to detect gravitational waves from astrophysical sources. The era of gravitational wave astronomy has arrived. This article describes the worldwide effort which includes the effort on the Indian front— the IndIGO project —, the principle underlying interferometric detectors both on ground and in space, the principal noise sources that plague such detectors, the astrophysical sources of gravitational waves that one expects to detect by these detectors and some glimpse of the data analysis methods involved in extracting the very weak gravitational wave signals from detector noise.

  17. Every document and picture tells a story: using internal corporate document reviews, semiotics, and content analysis to assess tobacco advertising

    PubMed Central

    Anderson, S J; Dewhirst, T; Ling, P M

    2006-01-01

    In this article we present communication theory as a conceptual framework for conducting documents research on tobacco advertising strategies, and we discuss two methods for analysing advertisements: semiotics and content analysis. We provide concrete examples of how we have used tobacco industry documents archives and tobacco advertisement collections iteratively in our research to yield a synergistic analysis of these two complementary data sources. Tobacco promotion researchers should consider adopting these theoretical and methodological approaches. PMID:16728758

  18. Progress on a cognitive-motivational-relational theory of emotion.

    PubMed

    Lazarus, R S

    1991-08-01

    The 2 main tasks of this article are 1st, to examine what a theory of emotion must do and basic issues that it must address. These include definitional issues, whether or not physiological activity should be a defining attribute, categorical versus dimensional strategies, the reconciliation of biological universals with sociocultural sources of variability, and a classification of the emotions. The 2nd main task is to apply an analysis of appraisal patterns and the core relational themes that they produce to a number of commonly identified emotions. Anger, anxiety, sadness, and pride (to include 1 positive emotion) are used as illustrations. The purpose is to show the capability of a cognitive-motivational-relational theory to explain and predict the emotions. The role of coping in emotion is also discussed, and the article ends with a response to criticisms of a phenomenological, folk-theory outlook.

  19. Graph theory applied to noise and vibration control in statistical energy analysis models.

    PubMed

    Guasch, Oriol; Cortés, Lluís

    2009-06-01

    A fundamental aspect of noise and vibration control in statistical energy analysis (SEA) models consists in first identifying and then reducing the energy flow paths between subsystems. In this work, it is proposed to make use of some results from graph theory to address both issues. On the one hand, linear and path algebras applied to adjacency matrices of SEA graphs are used to determine the existence of any order paths between subsystems, counting and labeling them, finding extremal paths, or determining the power flow contributions from groups of paths. On the other hand, a strategy is presented that makes use of graph cut algorithms to reduce the energy flow from a source subsystem to a receiver one, modifying as few internal and coupling loss factors as possible.
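
    The path-algebra idea above rests on a standard result: the (i, j) entry of the n-th power of the adjacency matrix counts the paths of exactly n edges from subsystem i to subsystem j. A minimal sketch, using a hypothetical three-subsystem SEA graph rather than any example from the paper:

    ```python
    def count_paths(adj, src, dst, length):
        """Count paths with exactly `length` edges from src to dst by
        repeated multiplication of the adjacency matrix."""
        n = len(adj)
        power = [row[:] for row in adj]
        for _ in range(length - 1):
            power = [[sum(power[i][k] * adj[k][j] for k in range(n))
                      for j in range(n)] for i in range(n)]
        return power[src][dst]

    # Hypothetical SEA graph: source subsystem 0 couples to 1 and 2; 1 couples to 2.
    adj = [[0, 1, 1],
           [0, 0, 1],
           [0, 0, 0]]
    print(count_paths(adj, 0, 2, 1), count_paths(adj, 0, 2, 2))  # prints 1 1
    ```

    Here the source-to-receiver energy flow splits into one first-order path (0 → 2) and one second-order path (0 → 1 → 2); in an SEA model the same bookkeeping, weighted by coupling loss factors, ranks the paths worth cutting.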

  20. Theory for noise of propellers in angular inflow with parametric studies and experimental verification

    NASA Technical Reports Server (NTRS)

    Hanson, Donald B.; Parzych, David J.

    1993-01-01

    This report presents the derivation of a frequency domain theory and working equations for radiation of propeller harmonic noise in the presence of angular inflow. In applying the acoustic analogy, integration over the tangential coordinate of the source region is performed numerically, permitting the equations to be solved without approximation for any degree of angular inflow. Inflow angle is specified in terms of yaw, pitch, and roll angles of the aircraft. Since these can be arbitrarily large, the analysis applies with equal accuracy to propellers and helicopter rotors. For thickness and loading, the derivation is given in complete detail with working equations for near and far field. However, the quadrupole derivation has been carried only far enough to show feasibility of the numerical approach. Explicit formulas are presented for computation of source elements, evaluation of Green's functions, and location of observer points in various visual and retarded coordinate systems. The resulting computer program, called WOBBLE, has been written in FORTRAN and follows the notation of this report very closely. The new theory is explored to establish the effects of varying inflow angle on axial and circumferential directivity. Also, parametric studies were performed to evaluate various phenomena outside the capabilities of earlier theories, such as an unsteady thickness effect. Validity of the theory was established by comparison with test data from conventional propellers and Prop Fans in flight and in wind tunnels under a variety of operating conditions and inflow angles.

  1. Gene by Social-Context Interactions for Number of Sexual Partners Among White Male Youths: Genetics-informed Sociology

    PubMed Central

    Guo, Guang; Tong, Yuying; Cai, Tianji

    2010-01-01

    In this study, we set out to investigate whether introducing molecular genetic measures into an analysis of sexual partner variety will yield novel sociological insights. The data source is the white male DNA sample in the National Longitudinal Study of Adolescent Health. Our empirical analysis has produced a robust protective effect of the 9R/9R genotype relative to the Any10R genotype in the dopamine transporter gene (DAT1). The gene-environment interaction analysis demonstrates that the protective effect of 9R/9R tends to be lost in schools in which higher proportions of students start having sex early or among those with relatively low levels of cognitive ability. Our genetics-informed sociological analysis suggests that the “one size” of a single social theory may not fit all. Explaining a human trait or behavior may require a theory that accommodates the complex interplay between social contextual and individual influences and genetic predispositions. PMID:19569400

  2. Global crop production forecasting data system analysis

    NASA Technical Reports Server (NTRS)

    Castruccio, P. A. (Principal Investigator); Loats, H. L.; Lloyd, D. G.

    1978-01-01

    The author has identified the following significant results. Findings led to the development of a theory of radiometric discrimination employing the mathematical framework of the theory of discrimination between scintillating radar targets. The theory indicated that the functions which drive accuracy of discrimination are the contrast ratio between targets, and the number of samples, or pixels, observed. Theoretical results led to three primary consequences, as regards the data system: (1) agricultural targets must be imaged at correctly chosen times, when the relative evolution of the crop's development is such as to maximize their contrast; (2) under these favorable conditions, the number of observed pixels can be significantly reduced with respect to wall-to-wall measurements; and (3) remotely sensed radiometric data must be suitably mixed with other auxiliary data, derived from external sources.

  3. Immigration theory for a new century: some problems and opportunities.

    PubMed

    Portes, A

    1997-01-01

    "This essay examines some of the pitfalls in contemporary immigration theory and reviews some of the most promising developments in research in this field. As a data-driven field [of] study, immigration has not had to contend with grand generalizations for highly abstract theorizing. On the contrary, the bias has run in the opposite direction, that is toward ground-level studies of particular migrant groups or analysis of official migration policies. As the distillate of past research in the field and a source of guidance for future work, theory represents one of the most valuable products of our collective intellectual endeavor. Ways to foster it and problems presented by certain common misunderstandings about the meaning and scope of scientific theorizing are discussed." The geographical focus is on the United States. excerpt

  4. Balancing practicality and hydrologic realism: a parsimonious approach for simulating rapid groundwater recharge via unsaturated-zone preferential flow

    USGS Publications Warehouse

    Mirus, Benjamin B.; Nimmo, J.R.

    2013-01-01

    The impact of preferential flow on recharge and contaminant transport poses a considerable challenge to water-resources management. Typical hydrologic models require extensive site characterization, but can underestimate fluxes when preferential flow is significant. A recently developed source-responsive model incorporates film-flow theory with conservation of mass to estimate unsaturated-zone preferential fluxes with readily available data. The term source-responsive describes the sensitivity of preferential flow in response to water availability at the source of input. We present the first rigorous tests of a parsimonious formulation for simulating water table fluctuations using two case studies, both in arid regions with thick unsaturated zones of fractured volcanic rock. Diffuse flow theory cannot adequately capture the observed water table responses at both sites; the source-responsive model is a viable alternative. We treat the active area fraction of preferential flow paths as a scaled function of water inputs at the land surface then calibrate the macropore density to fit observed water table rises. Unlike previous applications, we allow the characteristic film-flow velocity to vary, reflecting the lag time between source and deep water table responses. Analysis of model performance and parameter sensitivity for the two case studies underscores the importance of identifying thresholds for initiation of film flow in unsaturated rocks, and suggests that this parsimonious approach is potentially of great practical value.

  5. General theory of remote gaze estimation using the pupil center and corneal reflections.

    PubMed

    Guestrin, Elias Daniel; Eizenman, Moshe

    2006-06-01

    This paper presents a general theory for the remote estimation of the point-of-gaze (POG) from the coordinates of the centers of the pupil and corneal reflections. Corneal reflections are produced by light sources that illuminate the eye, and the centers of the pupil and corneal reflections are estimated in video images from one or more cameras. The general theory covers the full range of possible system configurations. Using one camera and one light source, the POG can be estimated only if the head is completely stationary. Using one camera and multiple light sources, the POG can be estimated with free head movements, following the completion of a multiple-point calibration procedure. When multiple cameras and multiple light sources are used, the POG can be estimated following a simple one-point calibration procedure. Experimental and simulation results suggest that the main sources of gaze estimation errors are the discrepancy between the shape of real corneas and the spherical corneal shape assumed in the general theory, and the noise in the estimation of the centers of the pupil and corneal reflections. A detailed example of a system that uses the general theory to estimate the POG on a computer screen is presented.

  6. An Analysis of the Factors That Motivate Undergraduate Alumni Donors at University of the Pacific Based on Social Exchange Theory

    ERIC Educational Resources Information Center

    Dial, Janet Schellhase

    2012-01-01

    Institutions of higher education rely upon the support of their alumni to provide financial stability. This outward show of confidence by alumni is also an important indication for external constituents who rank colleges and universities based on funding sources such as corporations and foundations. Private universities, in particular, have been…

  7. Exploring the validity and reliability of a questionnaire for evaluating veterinary clinical teachers' supervisory skills during clinical rotations.

    PubMed

    Boerboom, T B B; Dolmans, D H J M; Jaarsma, A D C; Muijtjens, A M M; Van Beukelen, P; Scherpbier, A J J A

    2011-01-01

    Feedback to aid teachers in improving their teaching requires validated evaluation instruments. When implementing an evaluation instrument in a different context, it is important to collect validity evidence from multiple sources. We examined the validity and reliability of the Maastricht Clinical Teaching Questionnaire (MCTQ) as an instrument to evaluate individual clinical teachers during short clinical rotations in veterinary education. We examined four sources of validity evidence: (1) Content was examined based on theory of effective learning. (2) Response process was explored in a pilot study. (3) Internal structure was assessed by confirmatory factor analysis using 1086 student evaluations and reliability was examined utilizing generalizability analysis. (4) Relations with other relevant variables were examined by comparing factor scores with other outcomes. Content validity was supported by theory underlying the cognitive apprenticeship model on which the instrument is based. The pilot study resulted in an additional question about supervision time. A five-factor model showed a good fit with the data. Acceptable reliability was achievable with 10-12 questionnaires per teacher. Correlations between the factors and overall teacher judgement were strong. The MCTQ appears to be a valid and reliable instrument to evaluate clinical teachers' performance during short rotations.

  8. [Regulation framework of watershed landscape pattern for non-point source pollution control based on 'source-sink' theory: A case study in the watershed of Maluan Bay, Xiamen City, China].

    PubMed

    Huang, Ning; Wang, Hong Ying; Lin, Tao; Liu, Qi Ming; Huang, Yun Feng; Li, Jian Xiong

    2016-10-01

    Watershed landscape pattern regulation and optimization based on 'source-sink' theory for non-point source pollution control is a cost-effective measure and still in the exploratory stage. Taking the whole watershed as the research object, on the basis of landscape ecology, related theories and existing research results, a regulation framework of watershed landscape pattern for non-point source pollution control was developed at two levels based on 'source-sink' theory in this study: 1) at watershed level: reasonable basic combination and spatial pattern of 'source-sink' landscape was analyzed, and then a holistic regulation and optimization method of landscape pattern was constructed; 2) at landscape patch level: key 'source' landscape was taken as the focus of regulation and optimization. Firstly, four identification criteria of key 'source' landscape were developed, including landscape pollutant loading per unit area, landscape slope, long and narrow transfer 'source' landscape, and pollutant loading per unit length of 'source' landscape along the riverbank. Secondly, nine types of regulation and optimization methods for different key 'source' landscape in rural and urban areas were established, according to three regulation and optimization rules: 'sink' landscape inlay, banding 'sink' landscape supplement, and enhancement of the pollutant capacity of the original 'sink' landscape. Finally, the regulation framework was applied to the watershed of Maluan Bay in Xiamen City. A holistic regulation and optimization mode of the watershed landscape pattern of Maluan Bay and key 'source' landscape regulation and optimization measures for the three zones were developed, based on GIS technology, remote sensing images and a DEM model.

  9. Para-hydrogen and helium cluster size distributions in free jet expansions based on Smoluchowski theory with kernel scaling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kornilov, Oleg; Toennies, J. Peter

    The size distribution of para-H2 (pH2) clusters produced in free jet expansions at a source temperature of T0 = 29.5 K and pressures of P0 = 0.9-1.96 bars is reported and analyzed according to a cluster growth model based on the Smoluchowski theory with kernel scaling. Good overall agreement is found between the measured and predicted, N_k = A k^a e^(-bk), shape of the distribution. The fit yields values for A and b for values of a derived from simple collision models. The small remaining deviations between measured abundances and theory imply a (pH2)_k magic number cluster of k = 13, as has been observed previously by Raman spectroscopy. The predicted linear dependence of b^-(a+1) on source gas pressure was verified and used to determine the value of the basic effective agglomeration reaction rate constant. A comparison of the corresponding effective growth cross sections σ11 with results from a similar analysis of He cluster size distributions indicates that the latter are much larger, by a factor of 6-10. An analysis of the three-body recombination rates, the geometric sizes, and the fact that the He clusters are liquid independent of their size can explain the larger cross sections found for He.
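
    The fitted form quoted above, N_k = A k^a e^(-bk), is straightforward to evaluate numerically. The sketch below uses illustrative parameter values (not the paper's fitted ones) to locate the peak and mean cluster size of the distribution:

    ```python
    import math

    def cluster_distribution(A, a, b, k_max=200):
        """Smoluchowski kernel-scaling form: N_k = A * k^a * exp(-b * k)."""
        return [A * k**a * math.exp(-b * k) for k in range(1, k_max + 1)]

    # Illustrative values only; the paper fits A and b for a given a.
    A, a, b = 1.0, 2.0, 0.25
    N = cluster_distribution(A, a, b)
    peak_k = N.index(max(N)) + 1            # mode of the distribution
    mean_k = sum(k * n for k, n in enumerate(N, start=1)) / sum(N)
    ```

    For these values the distribution peaks near k = a/b = 8 and has a mean cluster size near (a + 1)/b = 12, which is how a and b jointly control the shape.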

  10. Nature of the Galactic centre NIR-excess sources. I. What can we learn from the continuum observations of the DSO/G2 source?

    NASA Astrophysics Data System (ADS)

    Zajaček, Michal; Britzen, Silke; Eckart, Andreas; Shahzamanian, Banafsheh; Busch, Gerold; Karas, Vladimír; Parsa, Marzieh; Peissker, Florian; Dovčiak, Michal; Subroweit, Matthias; Dinnbier, František; Zensus, J. Anton

    2017-06-01

    Context. The Dusty S-cluster Object (DSO/G2) orbiting the supermassive black hole (Sgr A*) in the Galactic centre has been monitored in both near-infrared continuum and line emission. There has been a dispute about the character and the compactness of the object: it being interpreted as either a gas cloud or a dust-enshrouded star. A recent analysis of polarimetry data in Ks-band (2.2 μm) allows us to put further constraints on the geometry of the DSO. Aims: The purpose of this paper is to constrain the nature and the geometry of the DSO. Methods: We compared 3D radiative transfer models of the DSO with the near-infrared (NIR) continuum data including polarimetry. In the analysis, we used basic dust continuum radiative transfer theory implemented in the 3D Monte Carlo code Hyperion. Moreover, we implemented analytical results of the two-body problem mechanics and the theory of non-thermal processes. Results: We present a composite model of the DSO - a dust-enshrouded star that consists of a stellar source, dusty, optically thick envelope, bipolar cavities, and a bow shock. This scheme can match the NIR total as well as polarized properties of the observed spectral energy distribution (SED). The SED may be also explained in theory by a young pulsar wind nebula that typically exhibits a large linear polarization degree due to magnetospheric synchrotron emission. Conclusions: The analysis of NIR polarimetry data combined with the radiative transfer modelling shows that the DSO is a peculiar source of compact nature in the S cluster (r ≲ 0.04 pc). It is most probably a young stellar object embedded in a non-spherical dusty envelope, whose components include optically thick dusty envelope, bipolar cavities, and a bow shock. Alternatively, the continuum emission could be of a non-thermal origin due to the presence of a young neutron star and its wind nebula. 
Although there has so far been no detection of X-ray or radio counterparts of the DSO, the analysis of the neutron star model shows that young, energetic neutron stars similar to the Crab pulsar could in principle be detected in the S cluster with current NIR facilities, where they would appear as reddened, near-infrared-excess sources. Searches for pulsars in the NIR bands can thus complement standard radio searches and put further constraints on the unexplored pulsar population in the Galactic centre. Both thermal and non-thermal models are in accordance with the observed compactness and the total as well as polarized continuum emission of the DSO.

  11. The mathematical theory of signal processing and compression-designs

    NASA Astrophysics Data System (ADS)

    Feria, Erlan H.

    2006-05-01

    The mathematical theory of signal processing, named processor coding, will be shown to arise inherently as the computational-time dual of Shannon's mathematical theory of communication, also known as source coding. Source coding is concerned with signal source memory space compression, while processor coding deals with signal processor computational time compression. Their combination is named compression-designs, or Conde for short. A compelling and pedagogically appealing diagram will be discussed, highlighting Conde's remarkably successful application to real-world knowledge-aided (KA) airborne moving target indicator (AMTI) radar.

  12. Optical damage observed in the LHMEL II output coupler

    NASA Astrophysics Data System (ADS)

    Eric, John J.; Bagford, John O.; Devlin, Christie L. H.; Hull, Robert J.; Seibert, Daniel B.

    2008-01-01

    During the annual NIST calibration testing done at the LHMEL facility in FY06 on its high energy Carbon-Dioxide lasers, the LHMEL II device suffered severe damage to the internal surface of its ZnSe output coupler optics. The damage occurred during a high-power, short-duration run and was believed to have been the result of a significant amount of surface contaminants interacting with the LHMEL cavity beam. Initial theories as to the source of the contamination led to the inspection of the vacuum grease that seals the piping that supplies the source gases to the laser cavity. Other contamination sources were considered, and analysis was conducted in an effort to identify the material found at the damage sites on the optic, but the tests were mainly inconclusive. Some procedure changes were initiated to identify possible contamination before high energy laser operation, in an attempt to mitigate and possibly prevent the continued occurrence of damage to the output coupler window. This paper illustrates the type and extent of the damage encountered, highlights some of the theories as to the contamination source, and serves as a notice of the severity and consequences of damage that is possible even from small amounts of foreign material in a high energy laser environment.

  13. Exact Holography of Massive M2-brane Theories and Entanglement Entropy

    NASA Astrophysics Data System (ADS)

    Jang, Dongmin; Kim, Yoonbai; Kwon, O.-Kab; Tolla, D. D.

    2018-01-01

    We test the gauge/gravity duality between the N = 6 mass-deformed ABJM theory with U_k(N) × U_{-k}(N) gauge symmetry and the 11-dimensional supergravity on LLM geometries with SO(4)/ℤ_k × SO(4)/ℤ_k isometry. Our analysis is based on the evaluation of vacuum expectation values of chiral primary operators from the supersymmetric vacua of the mass-deformed ABJM theory and from the implementation of Kaluza-Klein (KK) holography on the LLM geometries. We focus on the chiral primary operator (CPO) with conformal dimension Δ = 1. The non-vanishing vacuum expectation value (vev) implies the breaking of conformal symmetry. In that case, we show that the variation of the holographic entanglement entropy (HEE) from its value in the CFT is related to the non-vanishing one-point function due to the relevant deformation as well as the source field. Applying the Ryu-Takayanagi HEE conjecture to the 4-dimensional gravity solutions, which are obtained from the KK reduction of the 11-dimensional LLM solutions, we calculate the variation of the HEE. We show how the vev and the value of the source field determine the HEE.

  14. The Relationship between Finnish Student Teachers' Practical Theories, Sources, and Teacher Education

    ERIC Educational Resources Information Center

    Pitkäniemi, Harri; Karlsson, Liisa; Stenberg, Katariina

    2014-01-01

    The purpose of this research is two-fold: 1) to describe what kind of practical theories student teachers have in the Finnish class teacher education context and 2) to analyse their differences and similarities at the initial and final phase of teacher education. We further analyse the relationship between the practical theories and their sources.…

  15. NLO QCD effective field theory analysis of W+W- production at the LHC including fermionic operators

    NASA Astrophysics Data System (ADS)

    Baglio, Julien; Dawson, Sally; Lewis, Ian M.

    2017-10-01

    We study the impact of anomalous gauge boson and fermion couplings on the production of W+W- pairs at the LHC. Helicity amplitudes are presented separately to demonstrate the sources of new physics contributions and the impact of QCD and electroweak corrections. The QCD corrections have important effects on the fits to anomalous couplings, in particular when one W boson is longitudinally polarized and the other is transversely polarized. In effective field theory language, we demonstrate that the dimension-6 approximation to constraining new physics effects in W+W- pair production fails at pT ~ 500-1000 GeV.

  16. Integrals and integral equations in linearized wing theory

    NASA Technical Reports Server (NTRS)

    Lomax, Harvard; Heaslet, Max A; Fuller, Franklyn B

    1951-01-01

    The formulas of subsonic and supersonic wing theory for source, doublet, and vortex distributions are reviewed and a systematic presentation is provided which relates these distributions to the pressure and to the vertical induced velocity in the plane of the wing. It is shown that care must be used in treating the singularities involved in the analysis and that the order of integration is not always reversible. Concepts suggested by the irreversibility of order of integration are shown to be useful in the inversion of singular integral equations when operational techniques are used. A number of examples are given to illustrate the methods presented, attention being directed to supersonic flight speed.

  17. Body and Surface Wave Modeling of Observed Seismic Events. Part 2.

    DTIC Science & Technology

    1987-05-12

    A method is described for generating synthetic point-source seismograms for shear dislocation sources using line source (2-D) theory. It is based on expanding the complete three-dimensional solution of the wave equation, expressed in cylindrical coordinates, in an asymptotic form.

  18. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen

    2015-11-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.
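
    Among the constructions pyunicorn provides, visibility graphs turn a time series into a network: each sample becomes a node, and two samples are linked if the straight line between them passes above every sample in between. The idea can be sketched in a few lines of standalone Python (this sketch deliberately does not use the pyunicorn API; the toy series is illustrative):

    ```python
    def visibility_graph(y):
        """Natural visibility graph of a time series (Lacasa et al., 2008).

        Nodes are time indices; i < j are linked when the line segment
        from (i, y[i]) to (j, y[j]) clears every intermediate sample.
        """
        n = len(y)
        edges = set()
        for i in range(n):
            for j in range(i + 1, n):
                if all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                       for k in range(i + 1, j)):
                    edges.add((i, j))
        return edges

    edges = visibility_graph([1.0, 3.0, 2.0, 4.0, 1.5])
    ```

    Adjacent samples are always linked, while a tall intermediate sample (here the 4.0 at index 3) blocks longer-range visibility; network measures computed on the resulting graph then characterize the dynamics of the series.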

  19. Softening the Blow of Social Exclusion: The Responsive Theory of Social Exclusion.

    PubMed

    Freedman, Gili; Williams, Kipling D; Beer, Jennifer S

    2016-01-01

    Social exclusion is an interactive process between multiple people, yet previous research has focused almost solely on the negative impacts on targets. What advice is there for people on the other side (i.e., sources) who want to minimize its negative impact and preserve their own reputation? To provide an impetus for research on the interactive nature of exclusion, we propose the Responsive Theory of Social Exclusion. Our theory postulates that targets and sources' needs are better maintained if sources use clear, explicit verbal communication. We propose that sources have three options: explicit rejection (clearly stating no), ostracism (ignoring), and ambiguous rejection (being unclear). Drawing on psychology, sociology, communications, and business research, we propose that when sources use explicit rejection, targets' feelings will be less hurt, their needs will be better protected, and sources will experience less backlash and emotional toil than if sources use ambiguous rejection or ostracism. Finally, we propose how the language of rejections may impact both parties.

  20. Using Generalizability Theory to Examine Sources of Variance in Observed Behaviors within High School Classrooms

    ERIC Educational Resources Information Center

    Abry, Tashia; Cash, Anne H.; Bradshaw, Catherine P.

    2014-01-01

    Generalizability theory (GT) offers a useful framework for estimating the reliability of a measure while accounting for multiple sources of error variance. The purpose of this study was to use GT to examine multiple sources of variance in and the reliability of school-level teacher and high school student behaviors as observed using the tool,…

  1. Efficient image enhancement using sparse source separation in the Retinex theory

    NASA Astrophysics Data System (ADS)

    Yoon, Jongsu; Choi, Jangwon; Choe, Yoonsik

    2017-11-01

    Color constancy is the feature of the human vision system (HVS) that ensures the relative constancy of the perceived color of objects under varying illumination conditions. The Retinex theory of machine vision systems is based on the HVS. Among Retinex algorithms, the physics-based algorithms are efficient; however, they generally do not satisfy the local characteristics of the original Retinex theory because they eliminate global illumination from their optimization. We apply the sparse source separation technique to the Retinex theory to present a physics-based algorithm that satisfies the locality characteristic of the original Retinex theory. Previous Retinex algorithms have limited use in image enhancement because the total variation Retinex results in an overly enhanced image and the sparse source separation Retinex cannot completely restore the original image. In contrast, our proposed method preserves the image edge and can very nearly replicate the original image without any special operation.
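
    As background to the abstract above: Retinex algorithms decompose an image into reflectance and illumination. A classic single-scale Retinex (a simpler relative of the sparse-source-separation approach described here, not the authors' algorithm) estimates log-reflectance by subtracting a Gaussian-blurred log-illumination estimate:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def single_scale_retinex(image, sigma=15.0, eps=1e-6):
        """Classic single-scale Retinex: R = log(I) - log(G_sigma * I).

        The Gaussian blur acts as a crude estimate of the illumination;
        the difference approximates the log-reflectance of the scene.
        """
        img = image.astype(float) + eps
        illumination = gaussian_filter(img, sigma=sigma) + eps
        return np.log(img) - np.log(illumination)

    # Under uniform illumination the estimated log-reflectance is ~0.
    flat = np.full((32, 32), 128.0)
    r = single_scale_retinex(flat)
    ```

    The global smoothing in this simple form is exactly the kind of step the paper contrasts with the local character of the original Retinex theory.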

  2. Getting down to cases: the revival of casuistry in bioethics.

    PubMed

    Arras, J D

    1991-02-01

    This article examines the emergence of casuistical case analysis as a methodological alternative to more theory-driven approaches in bioethics research and education. Focusing on The Abuse of Casuistry by A. Jonsen and S. Toulmin, the article articulates the most characteristic features of this modern-day casuistry (e.g., the priority allotted to case interpretation and analogical reasoning over abstract theory, the resemblance of casuistry to common law traditions, the 'open texture' of its principles, etc.) and discusses some problems with casuistry as an 'anti-theoretical' method. It is argued that casuistry so defined is 'theory modest' rather than 'theory free' and that ethical theory can still play a significant role in casuistical analysis; that casuistical analyses will encounter conflicting 'deep' interpretations of our social practices and institutions, and are therefore unlikely sources of increased social consensus on controversial bioethical questions; that its conventionalism raises questions about casuistry's ability to criticize norms embedded in the societal consensus; and that casuistry's emphasis upon analogical reasoning may tend to reinforce the individualistic nature of much bioethical writing. It is concluded that, notwithstanding these problems, casuistry represents a promising alternative to the regnant model of 'applied ethics' (i.e., to the ritualistic invocation of the so-called 'principles of bioethics'). The pedagogical implications of casuistry are addressed throughout the paper and include the following recommendations: (1) use real cases, (2) make them long, richly detailed and comprehensive, (3) present complex sequences of cases, (4) stress the problem of 'moral diagnosis', and (5) be ever mindful of the limits of casuistical analysis.

  3. Effect of transverse nonuniformity of the rf field on the efficiency of microwave sources driven by linear electron beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nusinovich, G.S.; Sinitsyn, O.V.

    This paper contains a simple analytical theory that allows one to evaluate the effect of transverse nonuniformity of the rf field on the interaction efficiency in various microwave sources driven by linear electron beams. The theory is, first, applied to the systems where the beams of cylindrical symmetry interact with rf fields of microwave circuits having Cartesian geometry. Also, various kinds of microwave devices driven by sheet electron beams (orotrons, clinotrons) are considered. The theory can be used for evaluating the efficiency of novel sources of coherent terahertz radiation.

  4. Common law morality.

    PubMed

    Arras, John D

    1990-01-01

    Arras reviews A.R. Jonsen and S. Toulmin's The Abuse of Casuistry: A History of Moral Reasoning (University of California Press; 1988). Jonsen and Toulmin, both of whom worked with the federal National Commission for the Protection of Human Subjects, critique the role of ethical theory in practical deliberation and attack the belief that moral theories are "universal, mutually exclusive sources of...ethical truth." They argue that "all genuine ethics is already concrete and particular," and champion casuistical reasoning with its emphasis on particularity and practical judgment. Arras finds The Abuse of Casuistry to be "a fascinating and thought-provoking study of moral methodology that will enrich our understanding of moral reasoning and quicken the ongoing debate over the appropriate role of ethical theory in bioethical analysis." He questions, however, the authors' faith in casuistry as the means to achieve social consensus on biomedical controversies.

  5. Landscape ecological risk assessment study in arid land

    NASA Astrophysics Data System (ADS)

    Gong, Lu; Amut, Aniwaer; Shi, Qingdong; Wang, Gary Z.

    2007-09-01

    Ecosystem risk assessment is an essential decision-making tool for predicting the reconstruction and recovery of an ecosystem damaged by intensive human activity. The sustainability of the environment and resources of lake ecosystems in arid districts has received close attention from international communities as well as numerous experts and scholars. Ecological risk assessment offers a scientific foundation for the formulation and execution of ecological risk management decisions. Bosten Lake, the largest inland freshwater lake in China, is the main water source for industrial and agricultural production as well as local residents in Yanqi basin, Kuara city and Yuri County in southern Xinjiang. Bosten Lake also provides a direct water source for emergency transportation in the Lower Reaches of the Tarim River. However, with the intensive utilization of water and soil resources, the environmental condition of Bosten Lake has become increasingly serious. In this study, the theory and methods of landscape ecological risk assessment were applied using 3S technologies combined with frontier theories of landscape ecology. The main risk sources, including flood, drought, water pollution, and eutrophication, were identified and evaluated within the ecosystem risk assessment framework. The main process includes five stages: regional natural resources analysis, risk receptor selection, risk source evaluation, exposure and hazard analysis, and integrated risk assessment. Based on the risk assessment results, environmental risk management countermeasures were determined.

  6. Local spectrum analysis of field propagation in an anisotropic medium. Part I. Time-harmonic fields.

    PubMed

    Tinkelman, Igor; Melamed, Timor

    2005-06-01

    The phase-space beam summation is a general analytical framework for local analysis and modeling of radiation from extended source distributions. In this formulation, the field is expressed as a superposition of beam propagators that emanate from all points in the source domain and in all directions. In this Part I of a two-part investigation, the theory is extended to include propagation in anisotropic medium characterized by a generic wave-number profile for time-harmonic fields; in a companion paper [J. Opt. Soc. Am. A 22, 1208 (2005)], the theory is extended to time-dependent fields. The propagation characteristics of the beam propagators in a homogeneous anisotropic medium are considered. With use of Gaussian windows for the local processing of either ordinary or extraordinary electromagnetic field distributions, the field is represented by a phase-space spectral distribution in which the propagating elements are Gaussian beams that are formulated by using Gaussian plane-wave spectral distributions over the extended source plane. By applying saddle-point asymptotics, we extract the Gaussian beam phenomenology in the anisotropic environment. The resulting field is parameterized in terms of the spatial evolution of the beam curvature, beam width, etc., which are mapped to local geometrical properties of the generic wave-number profile. The general results are applied to the special case of uniaxial crystal, and it is found that the asymptotics for the Gaussian beam propagators, as well as the physical phenomenology attached, perform remarkably well.

  7. Flow of Funds Modeling for Localized Financial Markets: An Application of Spatial Price and Allocation Activity Analysis Models.

    DTIC Science & Technology

    1981-01-01

    on modeling the managerial aspects of the firm. The second has been the application to economic theory led by ... individual portfolio optimization problems which were embedded in a larger global optimization problem. In the global problem, portfolios were linked by market ... demand quantities or be given by linear demand relationships. As in the source markets, the model

  8. Interior Noise

    NASA Technical Reports Server (NTRS)

    Mixson, John S.; Wilby, John F.

    1991-01-01

    The generation and control of flight vehicle interior noise is discussed. Emphasis is placed on the mechanisms of transmission through airborne and structure-borne paths and the control of cabin noise by path modification. Techniques for identifying the relative contributions of the various source-path combinations are also discussed along with methods for the prediction of aircraft interior noise such as those based on the general modal theory and statistical energy analysis.

  9. Finite-key security analysis of quantum key distribution with imperfect light sources

    DOE PAGES

    Mizutani, Akihiro; Curty, Marcos; Lim, Charles Ci Wen; ...

    2015-09-09

    In recent years, the gap between theory and practice in quantum key distribution (QKD) has been significantly narrowed, particularly for QKD systems with arbitrarily flawed optical receivers. The status for QKD systems with imperfect light sources is however less satisfactory, in the sense that the resulting secure key rates are often overly dependent on the quality of state preparation. This is especially the case when the channel loss is high. Very recently, to overcome this limitation, Tamaki et al proposed a QKD protocol based on the so-called 'rejected data analysis', and showed that its security in the limit of infinitely long keys is almost independent of any encoding flaw in the qubit space, being this protocol compatible with the decoy state method. Here, as a step towards practical QKD, we show that a similar conclusion is reached in the finite-key regime, even when the intensity of the light source is unstable. More concretely, we derive security bounds for a wide class of realistic light sources and show that the bounds are also efficient in the presence of high channel loss. Our results strongly suggest the feasibility of long distance provably secure communication with imperfect light sources.

  10. Gamma-ray burst theory: Back to the drawing board

    NASA Technical Reports Server (NTRS)

    Harding, Alice K.

    1994-01-01

    Gamma-ray bursts have always been intriguing sources to study in terms of particle acceleration, but not since their discovery two decades ago has the theory of these objects been in such turmoil. Prior to the launch of Compton Gamma-Ray Observatory and observations by Burst and Transient Source Experiment (BATSE), there was strong evidence pointing to magnetized Galactic neutron stars as the sources of gamma-ray bursts. However, since BATSE the observational picture has changed dramatically, requiring much more distant and possibly cosmological sources. I review the history of gamma-ray burst theory from the era of growing consensus for nearby neutron stars to the recent explosion of halo and cosmological models and the impact of the present confusion on the particle acceleration problem.

  11. A Preliminary ZEUS Lightning Location Error Analysis Using a Modified Retrieval Theory

    NASA Technical Reports Server (NTRS)

    Elander, Valjean; Koshak, William; Phanord, Dieudonne

    2004-01-01

    The ZEUS long-range VLF arrival time difference lightning detection network now covers both Europe and Africa, and there are plans for further expansion into the western hemisphere. In order to fully optimize and assess ZEUS lightning location retrieval errors and to determine the best placement of future receivers expected to be added to the network, a software package is being developed jointly between the NASA Marshall Space Flight Center (MSFC) and the University of Nevada Las Vegas (UNLV). The software package, called the ZEUS Error Analysis for Lightning (ZEAL), will be used to obtain global scale lightning location retrieval error maps using both a Monte Carlo approach and chi-squared curvature matrix theory. At the core of ZEAL will be an implementation of an Iterative Oblate (IO) lightning location retrieval method recently developed at MSFC. The IO method will be appropriately modified to account for variable wave propagation speed, and the new retrieval results will be compared with the current ZEUS retrieval algorithm to assess potential improvements. In this preliminary ZEAL work effort, we defined 5000 source locations evenly distributed across the Earth. We then used the existing ZEUS sites (as well as potential future sites) to simulate arrival time data between each source and each ZEUS site. A total of 100 sources were considered at each of the 5000 locations, and timing errors were selected from a normal distribution having a mean of 0 seconds and a standard deviation of 20 microseconds. This simulated "noisy" dataset was analyzed using the IO algorithm to estimate source locations. The exact locations were compared with the retrieved locations, and the results are summarized via several color-coded "error maps."
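    The arrival-time simulation step described above can be sketched as follows. The station coordinates, the constant propagation speed, and the function names are illustrative assumptions; the actual IO retrieval and the variable VLF propagation speed are not reproduced here.

    ```python
    import numpy as np

    C = 2.998e8          # propagation speed (m/s); the real analysis uses a variable VLF speed
    R_EARTH = 6.371e6    # Earth radius (m)

    def great_circle(p1, p2):
        """Great-circle distance (m) between two (lat, lon) points in radians."""
        (la1, lo1), (la2, lo2) = p1, p2
        return R_EARTH * np.arccos(
            np.clip(np.sin(la1) * np.sin(la2)
                    + np.cos(la1) * np.cos(la2) * np.cos(lo1 - lo2), -1.0, 1.0))

    def simulate_arrivals(source, stations, n_trials=100, sigma=20e-6, seed=0):
        """Noisy arrival times (s), one row per trial, one column per station.
        Timing errors are drawn from N(0, sigma), as in the abstract."""
        rng = np.random.default_rng(seed)
        t_true = np.array([great_circle(source, s) / C for s in stations])
        return t_true + rng.normal(0.0, sigma, size=(n_trials, len(stations)))

    # Hypothetical receiver sites (lat, lon in radians) standing in for ZEUS stations
    stations = [np.radians((48.0, 11.0)), np.radians((52.0, -1.0)),
                np.radians((38.0, 23.0)), np.radians((-1.0, 37.0))]
    source = np.radians((30.0, 10.0))

    arrivals = simulate_arrivals(source, stations)
    # Arrival-time differences relative to the first station, as used by the network
    atd = arrivals - arrivals[:, :1]
    ```

    In the study itself this simulation is repeated for each of the 5000 source locations, and the resulting noisy differences are fed to the IO algorithm to map retrieval errors.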

  12. Source Similarity and Social Media Health Messages: Extending Construal Level Theory to Message Sources.

    PubMed

    Young, Rachel

    2015-09-01

    Social media users post messages about health goals and behaviors to online social networks. Compared with more traditional sources of health communication such as physicians or health journalists, peer sources are likely to be perceived as more socially close or similar, which influences how messages are processed. This experimental study uses construal level theory of psychological distance to predict how mediated health messages from peers influence health-related cognition and behavioral intention. Participants were exposed to source cues that identified peer sources as being either highly attitudinally and demographically similar to or different from participants. As predicted by construal level theory, participants who perceived sources of social media health messages as highly similar listed a greater proportion of beliefs about the feasibility of health behaviors and a greater proportion of negative beliefs, while participants who perceived sources as more dissimilar listed a greater proportion of positive beliefs about the health behaviors. Results of the study could be useful in determining how health messages from peers could encourage individuals to set realistic health goals.

  13. Dogs and Monsters: Moral Status Claims in the Fiction of Dean Koontz.

    PubMed

    Smith, Stephen W

    2016-03-01

    This article explores conceptions of moral status in the work of American thriller author Dean Koontz. It begins by examining some of the general theories of moral status used by philosophers to determine whether particular entities have moral status. This includes both uni-criterial theories and multi-criterial theories of moral status. After this examination, the article argues for exploring bioethics conceptions in popular fiction. Popular fiction is considered a rich source for analysis because it provides not only a good approximation of the beliefs of ordinary members of the moral community, but also explores important issues in a context where ordinary individuals are likely to encounter them. Following on from this, the article then explores theories of moral status in the context of Koontz's novels. In particular, the article focuses on the novel Watchers and Koontz's Frankenstein series. Through these works, Koontz indicates that entities have moral status for a variety of reasons and thus presumably, he is a proponent of multi-criterial theories of moral status. The article concludes with an examination of what this might mean for our understanding of moral status claims generally.

  14. Sociological analysis and comparative education

    NASA Astrophysics Data System (ADS)

    Woock, Roger R.

    1981-12-01

    It is argued that comparative education is essentially a derivative field of study, in that it borrows theories and methods from academic disciplines. After a brief humanistic phase, in which history and philosophy were central for comparative education, sociology became an important source. In the mid-50's and 60's, sociology in the United States was characterised by Structural Functionalism as a theory, and Social Survey as a dominant methodology. Both were incorporated into the development of comparative education. Increasingly in the 70's, and certainly today, the new developments in sociology are characterised by an attack on Positivism, which is seen as the philosophical position underlying both functionalism and survey methods. New or re-discovered theories with their attendant methodologies included Marxism, Phenomenological Sociology, Critical Theory, and Historical Social Science. The current relationship between comparative education and social science is one of uncertainty, but since social science is seen to be returning to its European roots, the hope is held out for the development of an integrated social theory and method which will provide a much stronger basis for developments in comparative education.

  15. Relations between episodic memory, suggestibility, theory of mind, and cognitive inhibition in the preschool child.

    PubMed

    Melinder, Annika; Endestad, Tor; Magnussen, Svein

    2006-12-01

    The development of episodic memory, its relation to theory of mind (ToM), executive functions (e.g., cognitive inhibition), and to suggestibility was studied. Children (n = 115) between 3 and 6 years of age saw two versions of a video film and were tested for their memory of critical elements of the videos. Results indicated similar developmental trends for all memory measures, ToM, and inhibition, but ToM and inhibition were not associated with any memory measures. Correlations involving source memory were found in relation to specific questions, whereas inhibition and ToM were significantly correlated with resistance to suggestions. A regression analysis showed that age was the main contributor to resistance to suggestions, to correct source monitoring, and to correct responses to specific questions. Inhibition was also a significant main predictor of resistance to suggestive questions, whereas the relative contribution of ToM was wiped out when an extended model was tested.

  16. Vertical tilts of tropospheric waves - Observations and theory

    NASA Technical Reports Server (NTRS)

    Ebisuzaki, Wesley

    1991-01-01

    Two methods are used to investigate the vertical tilts of planetary waves as functions of zonal wavenumber and frequency. The vertical tilts are computed by cross-spectral analysis of the geopotential heights at different pressures. In the midlatitude troposphere, the eastward-moving waves had a westward tilt with height, as expected, but the westward-moving waves with frequencies higher than 0.2/d showed statistically significant eastward vertical tilts. For a free Rossby wave, this implies that the Eliassen-Palm flux is downward along with its energy propagation. A downward energy propagation suggests an upper-level source of these waves. It is proposed that the eastward-tilting waves were forced by the nonlinear interaction of stationary waves and baroclinically unstable cyclone-scale waves. The predicted vertical tilt and phase speed were consistent with the observations. In addition, simulations of a general circulation model were analyzed. In the control run, eastward-tilting waves disappeared when the sources of stationary waves were removed. This is consistent with the present theory.
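    The cross-spectral tilt computation mentioned above can be illustrated with a toy example: the phase of the cross-spectrum between height series at two pressure levels gives the phase shift, and hence the tilt with height, at each frequency. The series and the phase shift below are synthetic, and the sign of the recovered phase depends on which series is conjugated.

    ```python
    import numpy as np

    def cross_spectrum_phase(z_lower, z_upper):
        """Phase (radians) of the cross-spectrum of two equally sampled series.
        A nonzero phase at a frequency bin indicates a phase shift, i.e. a
        tilt with height, of the wave at that frequency."""
        Z1 = np.fft.rfft(z_lower - z_lower.mean())
        Z2 = np.fft.rfft(z_upper - z_upper.mean())
        return np.angle(Z1 * np.conj(Z2))

    # Synthetic geopotential-height series at two pressure levels: the upper
    # level leads the lower by a known phase (standing in for a vertical tilt)
    t = np.arange(256)
    shift = 0.5                                  # radians, chosen arbitrarily
    z_lower = np.cos(2 * np.pi * t / 32)
    z_upper = np.cos(2 * np.pi * t / 32 + shift)

    phases = cross_spectrum_phase(z_lower, z_upper)
    k = np.argmax(np.abs(np.fft.rfft(z_lower - z_lower.mean())))  # dominant bin
    ```

    With this conjugation convention, `phases[k]` recovers the imposed shift with opposite sign; a statistical analysis like the one in the abstract would additionally average cross-spectra over many samples and test significance.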

  17. An acoustic experimental and theoretical investigation of single disc propellers

    NASA Technical Reports Server (NTRS)

    Bumann, Elizabeth A.; Korkan, Kenneth D.

    1989-01-01

    An experimental study of the acoustic field associated with two, three, and four blade propeller configurations with a blade root angle of 50 deg was performed in the Texas A&M University 5 ft. x 6 ft. acoustically-insulated subsonic wind tunnel. A waveform analysis package was utilized to obtain experimental acoustic time histories, frequency spectra, and overall sound pressure level (OASPL) and served as a basis for comparison to the theoretical acoustic compact source theory of Succi (1979). Valid for subsonic tip speeds, the acoustic analysis replaced each blade by an array of spiraling point sources which exhibited a unique force vector and volume. The computer analysis of Succi was modified to include a propeller performance strip analysis which used a NACA 4-digit series airfoil data bank to calculate lift and drag for each blade segment given the geometry and motion of the propeller. Theoretical OASPL predictions were found to moderately overpredict experimental values for all operating conditions and propeller configurations studied.

  18. Density matrix perturbation theory for magneto-optical response of periodic insulators

    NASA Astrophysics Data System (ADS)

    Lebedeva, Irina; Tokatly, Ilya; Rubio, Angel

    2015-03-01

    Density matrix perturbation theory offers an ideal theoretical framework for the description of the response of solids to arbitrary electromagnetic fields. In particular, it allows one to consider perturbations introduced by uniform electric and magnetic fields under periodic boundary conditions, though the corresponding potentials break the translational invariance of the Hamiltonian. We have implemented density matrix perturbation theory in the open-source Octopus code on the basis of the efficient Sternheimer approach. The procedures for responses of different order to electromagnetic fields, including electric polarizability, orbital magnetic susceptibility and magneto-optical response, have been developed and tested by comparison with the results for finite systems and for wavefunction-based perturbation theory, which is already available in the code. Additional analysis of the orbital magneto-optical response is performed on the basis of analytical models. Symmetry limitations to observation of the magneto-optical response are discussed. The financial support from the Marie Curie Fellowship PIIF-GA-2012-326435 (RespSpatDisp) is gratefully acknowledged.

  19. A multidimensional analysis of the epistemic origins of nursing theories, models, and frameworks.

    PubMed

    Beckstead, Jason W; Beckstead, Laura Grace

    2006-01-01

    The purpose of this article is to introduce our notion of epistemic space and to demonstrate its utility for understanding the origins and trajectories of nursing theory in the 20th century using multidimensional scaling (MDS). A literature review was conducted on primary and secondary sources written by and about 20 nurse theorists to investigate whether or not they cited 129 different scholars in the fields of anthropology, biology, nursing, philosophy, psychology, and sociology. Seventy-four scholars were identified as having been cited by at least two nurse theorists (319 citations total). Proximity scores, quantifying the similarity among nurse theorists based on proportions of shared citations, were calculated and analyzed using MDS. The emergent model of epistemic space that accommodated these similarities among nurse theorists revealed the systematic influence of scholars from various fields, notably psychology, biology, and philosophy. We believe that this schema and resulting taxonomy will prove useful for furthering our understanding of the relationships among nursing theories and theories in other fields of science.
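    The MDS step described above can be sketched with classical (Torgerson) multidimensional scaling. The citation matrix below is a toy stand-in, and since the abstract does not give the proximity measure in closed form, a Jaccard-style dissimilarity based on proportions of shared citations is assumed here for illustration.

    ```python
    import numpy as np

    def classical_mds(D, k=2):
        """Classical (Torgerson) MDS: embed n points in k dimensions from an
        n x n matrix of pairwise dissimilarities D."""
        n = D.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
        B = -0.5 * J @ (D ** 2) @ J                  # double-centered Gram matrix
        w, V = np.linalg.eigh(B)
        idx = np.argsort(w)[::-1][:k]                # keep the largest eigenvalues
        return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

    # Toy citation profiles (rows: theorists, columns: cited scholars), hypothetical
    cites = np.array([[1, 1, 0, 0],
                      [1, 1, 1, 0],
                      [0, 0, 1, 1],
                      [0, 1, 1, 1]], dtype=float)

    # Dissimilarity = 1 - proportion of shared citations (Jaccard-style)
    inter = cites @ cites.T
    union = cites.sum(1)[:, None] + cites.sum(1)[None, :] - inter
    D = 1.0 - inter / union
    coords = classical_mds(D, k=2)                   # 2-D "epistemic space"
    ```

    Plotting `coords` places theorists with overlapping citation profiles near one another, which is the kind of spatial model of epistemic space the article analyzes.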

  20. [Three types of self-esteem: its characteristic differences of contingency and contentment of sources of self-esteem].

    PubMed

    Ito, Masaya; Kawasaki, Naoki; Kodama, Masahiro

    2011-02-01

    Previous research and theory (Crocker & Wolfe, 2001; Kernis, 2003) suggests that adaptive self-esteem stems from just being oneself, and is characterized by a sense of authenticity (SOA). Maladaptive self-esteem is derived from meeting external standards and social comparisons, and is characterized by a sense of superiority (SOS). Thus, the qualitative difference between SOA and SOS depends on the sources of self-esteem. We hypothesized that SOA is related to internal sources of self-esteem, while SOS is related to external sources. In order to control for covariance, global self-esteem was also examined in a questionnaire survey of self-esteem that was administered to 273 university students. The results of a partial correlation analysis showed that SOA was positively correlated with internal sources of self-esteem such as committed activities and efforts for self-development. In contrast, SOS was positively correlated with external sources of self-esteem such as approval from others and appearance. These results mainly support our hypotheses.
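    The partial correlation analysis described above, correlating each self-esteem type with a source score while controlling for global self-esteem, can be sketched on synthetic data. The variable names, effect sizes, and generated data are invented for illustration and are not the study's data.

    ```python
    import numpy as np

    def partial_corr(x, y, z):
        """Correlation between x and y controlling for z: correlate the
        residuals of x and y after regressing each on z (with intercept)."""
        z1 = np.column_stack([np.ones_like(z), z])
        rx = x - z1 @ np.linalg.lstsq(z1, x, rcond=None)[0]
        ry = y - z1 @ np.linalg.lstsq(z1, y, rcond=None)[0]
        return np.corrcoef(rx, ry)[0, 1]

    rng = np.random.default_rng(1)
    n = 273                                            # sample size, as in the survey
    global_se = rng.normal(size=n)                     # global self-esteem (covariate)
    soa = 0.6 * global_se + rng.normal(size=n)         # sense of authenticity
    internal = 0.5 * soa + rng.normal(size=n)          # internal-sources score

    r = partial_corr(soa, internal, global_se)         # SOA vs internal sources, given global SE
    ```

    Because the synthetic `internal` score depends on `soa` beyond their shared link to `global_se`, the partial correlation stays positive, mirroring the reported SOA result.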

  1. Determining the optimal forensic DNA analysis procedure following investigation of sample quality.

    PubMed

    Hedell, Ronny; Hedman, Johannes; Mostad, Petter

    2018-07-01

    Crime scene traces of various types are routinely sent to forensic laboratories for analysis, generally with the aim of addressing questions about the source of the trace. The laboratory may choose to analyse the samples in different ways depending on the type and quality of the sample, the importance of the case and the cost and performance of the available analysis methods. Theoretically well-founded guidelines for the choice of analysis method are, however, lacking in most situations. In this paper, it is shown how such guidelines can be created using Bayesian decision theory. The theory is applied to forensic DNA analysis, showing how the information from the initial qPCR analysis can be utilized. It is assumed the alternatives for analysis are using a standard short tandem repeat (STR) DNA analysis assay, using the standard assay and a complementary assay, or the analysis may be cancelled following quantification. The decision is based on information about the DNA amount and level of DNA degradation of the forensic sample, as well as case circumstances and the cost for analysis. Semi-continuous electropherogram models are used for simulation of DNA profiles and for computation of likelihood ratios. It is shown how tables and graphs, prepared beforehand, can be used to quickly find the optimal decision in forensic casework.
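    The Bayesian decision step described above, choosing among the three analysis alternatives given the qPCR quality information, can be sketched as an expected-utility comparison. All state probabilities, utilities, and action names below are hypothetical illustrations, not values from the paper, which derives its utilities from simulated electropherograms and likelihood ratios.

    ```python
    def expected_utility(p_states, utilities):
        """Expected utility of one action: sum over sample-quality states."""
        return sum(p * u for p, u in zip(p_states, utilities))

    def best_action(p_states, action_utilities):
        """Pick the action with the highest expected utility."""
        return max(action_utilities,
                   key=lambda a: expected_utility(p_states, action_utilities[a]))

    # P(sample state | qPCR result): good DNA, degraded DNA, too little DNA
    p_states = [0.6, 0.3, 0.1]

    # Utility of each action in each state (informativeness minus analysis cost)
    action_utilities = {
        "standard STR assay":             [10.0, 3.0, -2.0],
        "standard + complementary assay": [9.0,  6.0, -4.0],
        "cancel after quantification":    [0.0,  0.0,  0.0],
    }

    choice = best_action(p_states, action_utilities)
    ```

    Tabulating `best_action` over a grid of DNA amounts and degradation levels is one way to produce the kind of look-up tables and graphs for casework that the paper proposes.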

  2. Detailed Requirements Analysis for a Management Information System for the Department of Family Practice and Community Medicine at Silas B. Hays Army Community Hospital, Fort Ord, California

    DTIC Science & Technology

    1989-03-01

    Chapter II. Chapter II discusses the theory of information systems and the analysis and design of such systems. The last section of Chapter II introduces..." Improved personnel morale and job satisfaction. Doctors and hospital administrators are trying to recover from the medical computing lag which has...discussed below). The primary source of equipment authorizations is the Table of Distribution and Allowances (TDA), which shows the equipment authorized to be

  3. Coherent transport and energy flow patterns in photosynthesis under incoherent excitation.

    PubMed

    Pelzer, Kenley M; Can, Tankut; Gray, Stephen K; Morr, Dirk K; Engel, Gregory S

    2014-03-13

    Long-lived coherences have been observed in photosynthetic complexes after laser excitation, inspiring new theories regarding the extreme quantum efficiency of photosynthetic energy transfer. Whether coherent (ballistic) transport occurs in nature and whether it improves photosynthetic efficiency remain topics of debate. Here, we use a nonequilibrium Green's function analysis to model exciton transport after excitation from an incoherent source (as opposed to coherent laser excitation). We find that even with an incoherent source, the rate of environmental dephasing strongly affects exciton transport efficiency, suggesting that the relationship between dephasing and efficiency is not an artifact of coherent excitation. The Green's function analysis provides a clear view of both the pattern of excitonic fluxes among chromophores and the multidirectionality of energy transfer that is a feature of coherent transport. We see that even in the presence of an incoherent source, transport occurs by qualitatively different mechanisms as dephasing increases. Our approach can be generalized to complex synthetic systems and may provide a new tool for optimizing synthetic light harvesting materials.

  4. Thermodynamics of greenhouse systems for the northern latitudes: analysis, evaluation and prospects for primary energy saving.

    PubMed

    Bronchart, Filip; De Paepe, Michel; Dewulf, Jo; Schrevens, Eddie; Demeyer, Peter

    2013-04-15

    In Flanders and the Netherlands greenhouse production systems produce economically important quantities of vegetables, fruit and ornamentals. Indoor environmental control has resulted in high primary energy use. Until now, the research on saving primary energy in greenhouse systems has been mainly based on analysis of energy balances. However, according to thermodynamic theory, an analysis based on the concept of exergy (free energy) and energy can result in new insights and primary energy savings. Therefore in this paper, we analyse the exergy and energy of various processes, inputs and outputs of a general greenhouse system. A total system analysis is then performed by linking the exergy analysis with a dynamic greenhouse climate growth simulation model. The exergy analysis indicates that some processes ("Sources") lie at the origin of several other processes, both destroying the exergy of primary energy inputs. The exergy destruction of these Sources is caused primarily by heat and vapour loss. Their impact can be compensated by exergy input from heating, solar radiation, or both. If the exergy destruction of these Sources is reduced, the necessary compensation can also be reduced. This can be accomplished by insulating the greenhouse and making the building more airtight. Other necessary Sources, namely transpiration and loss of CO2, have a low exergy destruction compared to the other Sources. They are therefore the best candidates for "pump" technologies ("vapour heat pump" and "CO2 pump") designed to have a low primary energy use. The combination of these proposed technologies results in an exergy-efficient greenhouse with the highest primary energy savings. It can be concluded that exergy analyses add information compared to energy analyses alone, supporting the development of primary-energy-efficient greenhouse systems. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Softening the Blow of Social Exclusion: The Responsive Theory of Social Exclusion

    PubMed Central

    Freedman, Gili; Williams, Kipling D.; Beer, Jennifer S.

    2016-01-01

    Social exclusion is an interactive process between multiple people, yet previous research has focused almost solely on the negative impacts on targets. What advice is there for people on the other side (i.e., sources) who want to minimize its negative impact and preserve their own reputation? To provide an impetus for research on the interactive nature of exclusion, we propose the Responsive Theory of Social Exclusion. Our theory postulates that targets and sources’ needs are better maintained if sources use clear, explicit verbal communication. We propose that sources have three options: explicit rejection (clearly stating no), ostracism (ignoring), and ambiguous rejection (being unclear). Drawing on psychology, sociology, communications, and business research, we propose that when sources use explicit rejection, targets’ feelings will be less hurt, their needs will be better protected, and sources will experience less backlash and emotional toil than if sources use ambiguous rejection or ostracism. Finally, we propose how the language of rejections may impact both parties. PMID:27777566

  6. Systems thinking applied to safety during manual handling tasks in the transport and storage industry.

    PubMed

    Goode, Natassia; Salmon, Paul M; Lenné, Michael G; Hillard, Peter

    2014-07-01

    Injuries resulting from manual handling tasks represent an on-going problem for the transport and storage industry. This article describes an application of a systems theory-based approach, Rasmussen's (1997, Safety Science 27, 183) risk management framework, to the analysis of the factors influencing safety during manual handling activities in a freight handling organisation. Observations of manual handling activities, cognitive decision method interviews with workers (n = 27) and interviews with managers (n = 35) were used to gather information about three manual handling activities. Hierarchical task analysis and thematic analysis were used to identify potential risk factors and performance shaping factors across the levels of Rasmussen's framework. These different data sources were then integrated using Rasmussen's Accimap technique to provide an overall analysis of the factors influencing safety during manual handling activities in this context. The findings demonstrate how a systems theory-based approach can be applied to this domain, and suggest that policy-orientated, rather than worker-orientated, changes are required to prevent future manual handling injuries. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. NLO QCD effective field theory analysis of W+W- production at the LHC including fermionic operators

    DOE PAGES

    Baglio, Julien; Dawson, Sally; Lewis, Ian M.

    2017-10-03

    In this paper, we study the impact of anomalous gauge boson and fermion couplings on the production of W+W- pairs at the LHC. Helicity amplitudes are presented separately to demonstrate the sources of new physics contributions and the impact of QCD and electroweak corrections. The QCD corrections have important effects on the fits to anomalous couplings, in particular when one W boson is longitudinally polarized and the other is transversely polarized. In effective field theory language, we demonstrate that the dimension-6 approximation to constraining new physics effects in W+W- pair production fails at p_T ~ 500-1000 GeV.

  8. On the long range propagation of sound over irregular terrain

    NASA Technical Reports Server (NTRS)

    Howe, M. S.

    1984-01-01

    The theory of sound propagation over randomly irregular, nominally plane terrain of finite impedance is discussed. The analysis is an extension of the theory of coherent scatter originally proposed by Biot for an irregular rigid surface. It combines Biot's approach, wherein the surface irregularities are modeled by a homogeneous distribution of hemispherical bosses, with more conventional analyses in which the ground is modeled as a smooth plane of finite impedance. At sufficiently low frequencies the interaction of the surface irregularities with the nearfield of a ground-based source leads to the production of surface waves, which are effective in penetrating the ground shadow zone predicted for a smooth surface of the same impedance.

  9. NLO QCD effective field theory analysis of W+W- production at the LHC including fermionic operators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baglio, Julien; Dawson, Sally; Lewis, Ian M.

    In this paper, we study the impact of anomalous gauge boson and fermion couplings on the production of W+W- pairs at the LHC. Helicity amplitudes are presented separately to demonstrate the sources of new physics contributions and the impact of QCD and electroweak corrections. The QCD corrections have important effects on the fits to anomalous couplings, in particular when one W boson is longitudinally polarized and the other is transversely polarized. In effective field theory language, we demonstrate that the dimension-6 approximation to constraining new physics effects in W+W- pair production fails at p_T ~ 500-1000 GeV.

  10. [Evaluation of national prevention campaigns against AIDS: analysis model].

    PubMed

    Hausser, D; Lehmann, P; Dubois, F; Gutzwiller, F

    1987-01-01

    The evaluation of the "Stop-Aids" campaign is based upon a model of behaviour modification (McAlister) which includes the communication theory of McGuire and the social learning theory of Bandura. Using this model, it is possible to define key variables that are used to measure the impact of the campaign. Process evaluation allows identification of multipliers that reinforce and confirm the initial message of prevention (source) thereby encouraging behaviour modifications that are likely to reduce the transmission of HIV (condom use, no sharing of injection material, monogamous relationship, etc.). Twelve studies performed by seven teams in the three linguistic areas contribute to the project. A synthesis of these results will be performed by the IUMSP.

  11. Source-Free Exchange-Correlation Magnetic Fields in Density Functional Theory.

    PubMed

    Sharma, S; Gross, E K U; Sanna, A; Dewhurst, J K

    2018-03-13

    Spin-dependent exchange-correlation energy functionals in use today depend on the charge density and the magnetization density: E_xc[ρ, m]. However, it is also correct to define the functional in terms of the curl of m for physical external fields: E_xc[ρ, ∇ × m]. The exchange-correlation magnetic field, B_xc, then becomes source-free. We study this variation of the theory by uniquely removing the source term from local and generalized gradient approximations to the functional. By doing so, the total Kohn-Sham moments are improved for a wide range of materials for both functionals. Significantly, the moments for the pnictides are now in good agreement with experiment. This source-free method is simple to implement in all existing density functional theory codes.

  12. Communal Resources in Open Source Software Development

    ERIC Educational Resources Information Center

    Spaeth, Sebastian; Haefliger, Stefan; von Krogh, Georg; Renzl, Birgit

    2008-01-01

    Introduction: Virtual communities play an important role in innovation. The paper focuses on the particular form of collective action in virtual communities underlying Open Source software development projects. Method: Building on resource mobilization theory and private-collective innovation, we propose a theory of collective action in…

  13. Quantitative estimation of minimum offset for multichannel surface-wave survey with actively exciting source

    USGS Publications Warehouse

    Xu, Y.; Xia, J.; Miller, R.D.

    2006-01-01

    Multichannel analysis of surface waves is a developing method widely used in shallow subsurface investigations. The field procedures and related parameters are very important for successful applications. Among these parameters, the source-receiver offset range is seldom discussed in theory and normally determined by empirical or semi-quantitative methods in current practice. This paper discusses the problem from a theoretical perspective. A formula for quantitatively evaluating a layered homogenous elastic model was developed. The analytical results based on simple models and experimental data demonstrate that the formula is correct for surface wave surveys for near-surface applications. © 2005 Elsevier B.V. All rights reserved.

  14. Experimental assessment of theory for refraction of sound by a shear layer

    NASA Technical Reports Server (NTRS)

    Schlinker, R. H.; Amiet, R. K.

    1978-01-01

    The refraction angle and amplitude changes associated with sound transmission through a circular, open-jet shear layer were studied in a 0.91 m diameter open jet acoustic research tunnel. Free stream Mach number was varied from 0.1 to 0.4. Good agreement between refraction angle correction theory and experiment was obtained over the test Mach number, frequency and angle measurement range for all on-axis acoustic source locations. For off-axis source positions, good agreement was obtained at a source-to-shear layer separation distance greater than the jet radius. Measurable differences between theory and experiment occurred at a source-to-shear layer separation distance less than one jet radius. A shear layer turbulence scattering experiment was conducted at 90 deg to the open jet axis for the same free stream Mach numbers and axial source locations used in the refraction study. Significant discrete tone spectrum broadening and tone amplitude changes were observed at open jet Mach numbers above 0.2 and at acoustic source frequencies greater than 5 kHz. More severe turbulence scattering was observed for downstream source locations.

  15. Quantum noise in SIS mixers

    NASA Astrophysics Data System (ADS)

    Zorin, A. B.

    1985-03-01

    In the present quantum-statistical analysis of SIS heterodyne mixer performance, the conventional three-port model of the mixer circuit and the microscopic theory of superconducting tunnel junctions are used to derive a general expression for a noise parameter previously used for the case of parametric amplifiers. This expression is numerically evaluated for various quasiparticle current step widths, dc bias voltages, local oscillator powers, signal frequencies, signal source admittances, and operation temperatures.

  16. A Theory of Conditional Information with Applications.

    DTIC Science & Technology

    1994-03-01

    ...residing at the nexus of so many intellectual subtleties that have come into scientific... In the final analysis all information has a common context, namely the... a ∧ b = 0). There is all too little explicit distinction made between absolutely true statements... useful in managing data bases, combining data...

  17. Design and analysis of optimised class E power amplifier using shunt capacitance in the output structure

    NASA Astrophysics Data System (ADS)

    Hayati, Mohsen; Roshani, Sobhan; Zirak, Ali Reza

    2017-05-01

    In this paper, a class E power amplifier (PA) with an operating frequency of 1 MHz is presented. MOSFET non-linear drain-to-source parasitic capacitance, linear external capacitance at the drain-to-source port and linear shunt capacitance in the output structure are considered in the design theory. One degree of freedom is added to the design of the class E PA by assuming the shunt capacitance in the output structure in the analysis. With this added design degree of freedom it is possible to achieve desired values for several parameters, such as output voltage, load resistance and operating frequency, while both zero-voltage and zero-derivative switching (ZVS and ZDS) conditions are satisfied. In the conventional class E PA, a high value of peak switch voltage results in limitations for the design of the amplifier, while in the presented structure the desired specifications can be achieved with a safe margin of peak switch voltage. The results show that higher operating frequency and output voltage can also be achieved, compared to the conventional structure. PSpice software is used to simulate the designed circuit. The presented class E PA is designed, fabricated and measured. The measured results are in good agreement with the simulation and theory results.

  18. Associations between cognitive biases and domains of schizotypy in a non-clinical sample.

    PubMed

    Aldebot Sacks, Stephanie; Weisman de Mamani, Amy Gina; Garcia, Cristina Phoenix

    2012-03-30

    Schizotypy is a non-clinical manifestation of the same underlying biological factors that give rise to psychotic disorders (Claridge and Beech, 1995). Research on normative populations scoring high on schizotypy is valuable because it may help elucidate the predisposition to schizophrenia (Jahshan and Sergi, 2007) and because performance is not confounded by issues present in schizophrenia samples. In the current study, a Confirmatory Factor Analysis was conducted using several comprehensive measures of schizotypy. As expected and replicating prior research, a four-factor model of schizotypy emerged including a positive, a negative, a cognitive disorganization, and an impulsive nonconformity factor. We also evaluated how each factor related to distinct cognitive biases. In support of hypotheses, increased self-certainty, decreased theory of mind, and decreased source memory were associated with higher scores on the positive factor; decreased theory of mind was associated with higher scores on the negative factor; and increased self-certainty was associated with greater impulsive nonconformity. Unexpectedly, decreased self-certainty and increased theory of mind were associated with greater cognitive disorganization, and decreased source memory was associated with greater impulsive nonconformity. These findings offer new insights by highlighting cognitive biases that may be risk factors for psychosis. Published by Elsevier Ireland Ltd.

  19. Type 2 and type 3 burst theory

    NASA Technical Reports Server (NTRS)

    Smith, D. F.

    1973-01-01

    The present state of the theory of type 3 bursts is reviewed by dividing the problem into the exciting agency, radiation source, and propagation of radiation between the source and the observer. In-situ measurements indicate that the excitors are electron streams of energy about 40 keV which are continuously relaxing. An investigation of neutralization of an electron stream indicates that n sub s is much less than 100,000 n sub e, where n sub s is the stream density and n sub e the coronal electron density. In situ observations are consistent with this result. An analysis of propagation of electrons in the current sheets of coronal streamers shows that such propagation at heights greater than 1 solar radius is impossible. The mechanisms for radiation are reviewed; it is shown that fundamental radiation at high frequencies (approximately 100 MHz) is highly beamed in the radial direction and that near the earth second harmonic radiation must be dominant. Because of beaming of the fundamental at high frequencies, it can often be quite weak near the limb so that the second harmonic is dominant. In considering propagation to the observer, the results of scattering of radiation are discussed. The present state of the theory of type 2 bursts is reviewed in the same manner as type 3 bursts.

  20. Theory-based design and field-testing of an intervention to support women choosing surgery for breast cancer: BresDex.

    PubMed

    Sivell, Stephanie; Marsh, William; Edwards, Adrian; Manstead, Antony S R; Clements, Alison; Elwyn, Glyn

    2012-02-01

Design and undertake usability and field-testing evaluation of a theory-guided decision aid (BresDex) in supporting women choosing surgery for early breast cancer. An extended Theory of Planned Behavior (TPB) and the Common Sense Model of Illness Representations (CSM) guided the design of BresDex. BresDex was evaluated and refined across 3 cycles by interviewing 6 women without a personal history of breast cancer, 8 women with a personal history of breast cancer who had completed treatment, and 11 women newly diagnosed with breast cancer. Participants were interviewed for views on content, presentation (usability) and perceived usefulness towards deciding on treatment (utility). Framework analysis was used, guided by the extended TPB and the CSM. BresDex was positively received in content and presentation (usability). It appeared to be an effective support for decision-making and a useful source of further information, particularly in clarifying attitudes, social norms and perceived behavioral control, and in presenting the consequences of decisions (utility). This study illustrates the potential benefit of the extended TPB and CSM in designing a decision aid to support women choosing breast cancer surgery. BresDex could provide decision-making support and serve as an additional source of information, to complement the care received from the clinical team. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  1. The application of foraging theory to the information searching behaviour of general practitioners.

    PubMed

    Dwairy, Mai; Dowell, Anthony C; Stahl, Jean-Claude

    2011-08-23

General Practitioners (GPs) employ strategies to identify and retrieve medical evidence for clinical decision making which take workload and time constraints into account. Optimal Foraging Theory (OFT), initially developed to study animal foraging for food, is used to explore the information searching behaviour of General Practitioners. This study is the first to apply foraging theory within this context. Study objectives were: 1. To identify the sequence and steps deployed in identifying and retrieving evidence for clinical decision making. 2. To utilise Optimal Foraging Theory to assess the effectiveness and efficiency of General Practitioner information searching. GPs from the Wellington region of New Zealand were asked to document in a pre-formatted logbook the steps and outcomes of an information search linked to their clinical decision making, and to fill in a questionnaire about their personal, practice and information-searching backgrounds. A total of 115/155 eligible GPs returned a background questionnaire, and 71 completed their information search logbook. GPs spent an average of 17.7 minutes addressing their search for clinical information. Their preferred information sources were discussions with colleagues (38% of sources) and books (22%). These were the two most profitable information foraging sources (15.9 min and 9.5 min search time per answer, compared to 34.3 minutes in databases). GPs nearly always accessed another source when unsuccessful (95% after 1st source), and frequently when successful (43% after 2nd source). Use of multiple sources accounted for 41% of searches, and increased search success from 70% to 89%. By consulting in foraging terms the most 'profitable' sources of information (colleagues, books), rapidly switching sources when unsuccessful, and frequently double checking, GPs achieve an efficient trade-off between maximizing search success and information reliability, and minimizing searching time.
As predicted by foraging theory, GPs trade time-consuming evidence-based (electronic) information sources for sources with a higher information reward per unit time searched. Evidence-based practice must accommodate these 'real world' foraging pressures, and Internet resources should evolve to deliver information as effectively as traditional methods of information gathering.
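The foraging-profitability comparison in this abstract reduces to a small calculation; a minimal sketch using the per-answer search times reported above (variable and function names are mine, not the study's):

```python
# Ranking information sources by foraging "profitability" (answers gained per
# minute of searching), using the average search times per successful answer
# reported in the abstract.
search_time_per_answer = {   # minutes per successful answer
    "colleagues": 15.9,
    "books": 9.5,
    "databases": 34.3,
}

def profitability(minutes_per_answer: float) -> float:
    """Answers per minute of searching; higher means a more profitable source."""
    return 1.0 / minutes_per_answer

# Most profitable first: books and colleagues outrank electronic databases.
ranked = sorted(search_time_per_answer,
                key=lambda src: profitability(search_time_per_answer[src]),
                reverse=True)
```

In foraging terms, the GP strategy of switching away from a low-yield source mirrors the marginal value theorem: leave a patch when its return rate drops below the best alternative.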

  2. An asymptotic theory of supersonic propeller noise

    NASA Technical Reports Server (NTRS)

    Envia, Edmane

    1992-01-01

A theory for predicting the noise field of supersonic propellers with realistic blade geometries is presented. The theory, which utilizes a large-blade-count approximation, provides an efficient formula for predicting the radiation of sound from all three sources of propeller noise. Comparisons with a full numerical integration indicate that the levels predicted by this formula are quite accurate. Calculations also show that, for high speed propellers, the noise radiated by the Lighthill quadrupole source is rather substantial when compared with the noise radiated by the blade thickness and loading sources. Results from a preliminary application of the theory indicate that the peak noise level generated by a supersonic propeller initially increases with increasing tip helical Mach number, but eventually reaches a plateau and does not increase further. The predicted trend shows qualitative agreement with the experimental observations.

  3. Constructing and Verifying Program Theory Using Source Documentation

    ERIC Educational Resources Information Center

    Renger, Ralph

    2010-01-01

Making the program theory explicit is an essential first step in Theory Driven Evaluation (TDE). Once explicit, the program logic can be established, making the necessary links between the program theory, activities, and outcomes. Despite its importance, evaluators often encounter situations where the program theory is not explicitly stated. Under such…

  4. Energetic Phenomena on the Sun: The Solar Maximum Mission Flare Workshop. Proceedings

    NASA Technical Reports Server (NTRS)

    Kundu, Mukul (Editor); Woodgate, Bruce (Editor)

    1986-01-01

The general objectives of the conference were as follows: (1) Synthesize flare studies after three years of Solar Maximum Mission (SMM) data analysis. Encourage a broader participation in the SMM data analysis and combine this more fully with theory and other data sources - data obtained with other spacecraft such as HINOTORI, P78-1, and ISEE-3, and with the Very Large Array (VLA) and many other ground-based instruments. Many coordinated data sets, unprecedented in their breadth of coverage and multiplicity of sources, had been obtained within the structure of the Solar Maximum Year (SMY). (2) Stimulate joint studies and publication in the general scientific literature. The intended primary benefit was for informal collaborations to be started or broadened at the Workshops, with subsequent publications. (3) Provide a special publication resulting from the Workshop.

  5. Automorphic properties of low energy string amplitudes in various dimensions

    NASA Astrophysics Data System (ADS)

    Green, Michael B.; Russo, Jorge G.; Vanhove, Pierre

    2010-04-01

    This paper explores the moduli-dependent coefficients of higher-derivative interactions that appear in the low-energy expansion of the four-supergraviton amplitude of maximally supersymmetric string theory compactified on a d torus. These automorphic functions are determined for terms up to order ∂6R4 and various values of d by imposing a variety of consistency conditions. They satisfy Laplace eigenvalue equations with or without source terms, whose solutions are given in terms of Eisenstein series, or more general automorphic functions, for certain parabolic subgroups of the relevant U-duality groups. The ultraviolet divergences of the corresponding supergravity field theory limits are encoded in various logarithms, although the string theory expressions are finite. This analysis includes intriguing representations of SL(d) and SO(d,d) Eisenstein series in terms of toroidally compactified one and two-loop string and supergravity amplitudes.
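The Laplace eigenvalue property cited above is easiest to state in the simplest case: for SL(2,Z) the non-holomorphic Eisenstein series satisfies a source-free eigenvalue equation (a standard fact; the coefficient functions in the paper obey d-dependent analogues of this, with source terms appearing at order ∂6R4):

```latex
E_s(\tau) \;=\; \sum_{(m,n)\neq(0,0)} \frac{\tau_2^{\,s}}{|m+n\tau|^{2s}},
\qquad
\Delta E_s \;=\; s(s-1)\,E_s,
\qquad
\Delta \;\equiv\; \tau_2^{2}\left(\partial_{\tau_1}^{2}+\partial_{\tau_2}^{2}\right).
```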

  6. Comparison between the Health Belief Model and Subjective Expected Utility Theory: predicting incontinence prevention behaviour in post-partum women.

    PubMed

    Dolman, M; Chase, J

    1996-08-01

A small-scale study was undertaken to test the relative predictive power of the Health Belief Model and Subjective Expected Utility Theory for the uptake of a behaviour (pelvic floor exercises) to reduce post-partum urinary incontinence in primigravida females. A structured questionnaire was used to gather data relevant to both models from a sample of antenatal and postnatal primigravida women. Questions examined the perceived probability of becoming incontinent, the perceived (dis)utility of incontinence, the perceived probability of pelvic floor exercises preventing future urinary incontinence, the costs and benefits of performing pelvic floor exercises, and sources of information and knowledge about incontinence. Multiple regression analysis focused on whether or not respondents intended to perform pelvic floor exercises and the factors influencing their decisions. Aggregated data were analysed to compare the Health Belief Model and Subjective Expected Utility Theory directly.
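The Subjective Expected Utility side of this comparison has a simple computational core: each action is scored by the probability-weighted utility of its outcomes. A hedged sketch with hypothetical probabilities and utilities (not the study's data):

```python
# Subjective Expected Utility (SEU): score each action by the sum of
# subjective_probability * utility over its outcomes, then pick the maximum.
# All numbers below are illustrative placeholders, not survey estimates.
def seu(outcomes):
    """outcomes: list of (subjective_probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Hypothetical beliefs: exercising lowers the perceived chance of future
# incontinence (disutility -100) from 0.30 to 0.10, at an effort cost of -5.
do_exercises = seu([(0.10, -100.0)]) + (-5.0)   # expected value of exercising
no_exercises = seu([(0.30, -100.0)])            # expected value of not exercising
choice = "exercise" if do_exercises > no_exercises else "don't"
```

Under these made-up beliefs the model predicts intention to exercise; the study's regression effectively tests whether such SEU scores predict stated intentions better than Health Belief Model constructs.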

  7. A steady and oscillatory kernel function method for interfering surfaces in subsonic, transonic and supersonic flow. [prediction analysis techniques for airfoils

    NASA Technical Reports Server (NTRS)

    Cunningham, A. M., Jr.

    1976-01-01

    The theory, results and user instructions for an aerodynamic computer program are presented. The theory is based on linear lifting surface theory, and the method is the kernel function. The program is applicable to multiple interfering surfaces which may be coplanar or noncoplanar. Local linearization was used to treat nonuniform flow problems without shocks. For cases with imbedded shocks, the appropriate boundary conditions were added to account for the flow discontinuities. The data describing nonuniform flow fields must be input from some other source such as an experiment or a finite difference solution. The results are in the form of small linear perturbations about nonlinear flow fields. The method was applied to a wide variety of problems for which it is demonstrated to be significantly superior to the uniform flow method. Program user instructions are given for easy access.

  8. International Test Comparisons: Reviewing Translation Error in Different Source Language-Target Language Combinations

    ERIC Educational Resources Information Center

    Zhao, Xueyu; Solano-Flores, Guillermo; Qian, Ming

    2018-01-01

    This article addresses test translation review in international test comparisons. We investigated the applicability of the theory of test translation error--a theory of the multidimensionality and inevitability of test translation error--across source language-target language combinations in the translation of PISA (Programme of International…

  9. A Framework for Developing Vocational Education Theory and Practice.

    ERIC Educational Resources Information Center

    Nelson, Eugene A.; Pautler, Albert J.

    1988-01-01

    Asserts that education lacks a validated theory of learning and teaching. Chief among the causes is the lack of a framework within which diverse theories can be integrated. General systems theory is proposed as a source for a framework. (JOW)

  10. An application of the theory of planned behaviour to study the influencing factors of participation in source separation of food waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karim Ghani, Wan Azlina Wan Ab., E-mail: wanaz@eng.upm.edu.my; Rusli, Iffah Farizan, E-mail: iffahrusli@yahoo.com; Biak, Dayang Radiah Awang, E-mail: dayang@eng.upm.edu.my

Highlights: ► The theory of planned behaviour (TPB) was applied to identify the factors influencing participation in source separation of food waste, using self-administered questionnaires. ► The findings suggest several implications for the development and implementation of a waste-separation-at-home programme. ► The analysis indicates that the attitude towards waste separation is the main predictor, which in turn could be a significant predictor of the respondent's actual food waste separation behaviour. ► To date, no similar studies have been reported elsewhere, and these findings will be beneficial to local authorities as indicators in designing campaigns that promote the use of waste separation programmes and reinforce positive attitudes. - Abstract: Tremendous increases in biodegradable (food waste) generation significantly impact the local authorities, who are responsible for managing, treating and disposing of this waste. Separation of food waste at its generation source is identified as an effective means of reducing the amount of food waste sent to landfill; the separated waste can be reused as feedstock for downstream treatment processes, namely composting or anaerobic digestion. However, these efforts will only succeed with positive attitudes and high participation rates among the public. Thus, a social survey (using questionnaires) analysing the public's views and the factors influencing participation in source separation of food waste in households, based on the theory of planned behaviour (TPB), was performed in June and July 2011 among selected staff in Universiti Putra Malaysia, Serdang, Selangor. The survey demonstrates that the public has positive intentions to participate, provided the opportunities, facilities and knowledge on waste separation at source are adequately prepared by the respective local authorities.
Furthermore, good moral values and situational factors such as storage convenience and collection times also encourage the public's involvement and, consequently, the participation rate. The findings from this study may provide useful indicators to the waste management authorities in Malaysia in identifying mechanisms for the future development and implementation of food waste source separation activities in household programmes and communication campaigns which advocate the use of these programmes.
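The TPB structure behind this survey is a linear one: intention is modelled as a weighted combination of attitude, subjective norm, and perceived behavioural control. A minimal sketch (the weights and scores are hypothetical, not the survey's regression estimates):

```python
# Theory of Planned Behaviour: predicted intention as a weighted sum of the
# three standard predictors, each scored on the same scale (e.g. 1-7 Likert).
# Weights below are illustrative placeholders, not fitted coefficients.
def tpb_intention(attitude: float, subjective_norm: float, pbc: float,
                  weights=(0.5, 0.2, 0.3)) -> float:
    w_att, w_sn, w_pbc = weights
    return w_att * attitude + w_sn * subjective_norm + w_pbc * pbc

# With attitude weighted most heavily (consistent with the abstract's finding
# that attitude was the main predictor), intention tracks attitude strongly:
high_attitude = tpb_intention(attitude=7, subjective_norm=4, pbc=4)  # 5.5
low_attitude = tpb_intention(attitude=2, subjective_norm=4, pbc=4)   # 3.0
```

In the actual study the weights would be estimated by regressing stated intention on the three questionnaire-derived predictor scores.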

  11. Multiscale and Multitemporal Urban Remote Sensing

    NASA Astrophysics Data System (ADS)

    Mesev, V.

    2012-07-01

The remote sensing of urban areas has received much attention from scientists conducting studies on measuring sprawl, congestion, pollution, poverty, and environmental encroachment. Yet much of the research is case- and data-specific, where results are greatly influenced by prevailing local conditions. There seems to be a lack of epistemological links between remote sensing and conventional theoretical urban geography; in other words, an oversight in appreciating how urban theory fuels urban change and how urban change is measured by remotely sensed data. This paper explores basic urban theories such as centrality, mobility, materiality, nature, public space, consumption, segregation and exclusion, and how they can be measured by remote sensing sources. In particular, the link between structure (tangible objects) and function (intangible or immaterial behavior) is addressed as the theory that supports the well-known contrast between land cover and land use classification from remotely sensed data. The paper then couches these urban theories and contributions from urban remote sensing within two analytical fields. The first is the search for an "appropriate" spatial scale of analysis, which is conveniently divided between micro and macro urban remote sensing for measuring urban structure, understanding urban processes, and perhaps contributing to urban theory at a variety of scales of analysis. The second is the existence of a temporal lag between the materiality of urban objects and the planning process that approved their construction, specifically how time-dependence in urban structural-functional models produces temporal lags that alter the causal links between societal and political functional demands and structural ramifications.

  12. Improving access to health information for older migrants by using grounded theory and social network analysis to understand their information behaviour and digital technology use.

    PubMed

    Goodall, K T; Newman, L A; Ward, P R

    2014-11-01

    Migrant well-being can be strongly influenced by the migration experience and subsequent degree of mainstream language acquisition. There is little research on how older Culturally And Linguistically Diverse (CALD) migrants who have 'aged in place' find health information, and the role which digital technology plays in this. Although the research for this paper was not focused on cancer, we draw out implications for providing cancer-related information to this group. We interviewed 54 participants (14 men and 40 women) aged 63-94 years, who were born in Italy or Greece, and who migrated to Australia mostly as young adults after World War II. Constructivist grounded theory and social network analysis were used for data analysis. Participants identified doctors, adult children, local television, spouse, local newspaper and radio as the most important information sources. They did not generally use computers, the Internet or mobile phones to access information. Literacy in their birth language, and the degree of proficiency in understanding and using English, influenced the range of information sources accessed and the means used. The ways in which older CALD migrants seek and access information has important implications for how professionals and policymakers deliver relevant information to them about cancer prevention, screening, support and treatment, particularly as information and resources are moved online as part of e-health. © 2014 John Wiley & Sons Ltd.

  13. Accuracy analysis and design of A3 parallel spindle head

    NASA Astrophysics Data System (ADS)

    Ni, Yanbing; Zhang, Biao; Sun, Yupeng; Zhang, Yuan

    2016-03-01

As functional components of machine tools, parallel mechanisms are widely used in the high-efficiency machining of aviation components, and accuracy is one of the critical technical indexes. Many researchers have focused on the accuracy problem of parallel mechanisms, but in terms of controlling the errors and improving the accuracy at the design and manufacturing stage, further efforts are required. Aiming at the accuracy design of a 3-DOF parallel spindle head (A3 head), its error model, sensitivity analysis and tolerance allocation are investigated. Based on the inverse kinematic analysis, the error model of the A3 head is established by using first-order perturbation theory and the vector chain method. According to the mapping property of the motion and constraint Jacobian matrix, the compensatable and uncompensatable error sources which affect the accuracy of the end-effector are separated. Furthermore, sensitivity analysis is performed on the uncompensatable error sources. A sensitivity probabilistic model is established and a global sensitivity index is proposed to analyze the influence of the uncompensatable error sources on the accuracy of the end-effector of the mechanism. The results show that orientation error sources have a bigger effect on the accuracy of the end-effector. Based upon the sensitivity analysis results, the tolerance design is converted into a nonlinearly constrained optimization problem with minimum manufacturing cost as the optimization objective. By utilizing a genetic algorithm, the allocation of the tolerances on each component is finally determined. According to the tolerance allocation results, the tolerance ranges of ten kinds of geometric error sources are obtained. These research achievements can provide fundamental guidelines for the component manufacturing and assembly of this kind of parallel mechanism.
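The first-order error-propagation step described above can be sketched in a few lines: end-effector error is approximated as the Jacobian times the vector of small geometric error sources. The matrix and error values below are hypothetical placeholders, not the A3 head's actual parameters:

```python
# First-order error model: dx ≈ J @ dq, where dq collects small geometric
# error sources and J maps them to end-effector error. J here is a made-up
# 3x3 example, not the A3 head's real Jacobian.
import numpy as np

J = np.array([[1.0, 0.2, 0.0],
              [0.0, 1.0, 0.1],
              [0.3, 0.0, 1.0]])          # error sources -> end-effector error

dq = np.array([1e-3, 2e-3, 0.5e-3])      # small geometric errors (m or rad)
dx = J @ dq                              # first-order end-effector error

# A simple sensitivity measure: |J[i, j]| gives the influence of source j on
# output i; summing over outputs ranks the sources (a global index would
# aggregate this over the workspace rather than a single pose).
sensitivity = np.abs(J)
most_influential = int(np.argmax(sensitivity.sum(axis=0)))
```

Tolerance allocation then tightens the tolerances on the highest-sensitivity sources first, subject to the manufacturing-cost objective.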

  14. Downscattering due to Wind Outflows in Compact X-ray Sources: Theory and Interpretation

    NASA Technical Reports Server (NTRS)

    Titarchuk, Lev; Shrader, Chris

    2004-01-01

A number of recent lines of evidence point towards the presence of hot, outflowing plasma from the central regions of compact Galactic and extragalactic X-ray sources. Additionally, it has long been noted that many of these sources exhibit an "excess" continuum component, above approx. 10 keV, usually attributed to Compton reflection from a static medium. Motivated by these facts, as well as by recent observational constraints on the Compton reflection models - specifically, apparently discrepant variability timescales for line and continuum components in some cases - we consider possible effects of outflowing plasma on the high-energy continuum spectra of accretion-powered compact objects. We present a general formulation for photon downscattering diffusion which includes recoil and Comptonization effects due to divergence of the flow. We then develop an analytical theory for the spectral formation in such systems that allows us to derive formulae for the emergent spectrum. Finally, we perform the analytical model fitting on several Galactic X-ray binaries. Objects which have been modeled with high-covering-fraction Compton reflectors, such as GS1353-64, are included in our analysis. In addition, Cyg X-3, which is widely believed to be characterized by dense circumstellar winds with temperatures of order 10(exp 6) K, provides an interesting test case. Data from INTEGRAL and RXTE covering the approx. 3 - 300 keV range are used in our analysis. We further consider the possibility that the widely noted distortion of the power-law continuum above 10 keV may in some cases be explained by these spectral softening effects.

  15. On the role of glottis-interior sources in the production of voiced sound.

    PubMed

    Howe, M S; McGowan, R S

    2012-02-01

    The voice source is dominated by aeroacoustic sources downstream of the glottis. In this paper an investigation is made of the contribution to voiced speech of secondary sources within the glottis. The acoustic waveform is ultimately determined by the volume velocity of air at the glottis, which is controlled by vocal fold vibration, pressure forcing from the lungs, and unsteady backreactions from the sound and from the supraglottal air jet. The theory of aerodynamic sound is applied to study the influence on the fine details of the acoustic waveform of "potential flow" added-mass-type glottal sources, glottis friction, and vorticity either in the glottis-wall boundary layer or in the portion of the free jet shear layer within the glottis. These sources govern predominantly the high frequency content of the sound when the glottis is near closure. A detailed analysis performed for a canonical, cylindrical glottis of rectangular cross section indicates that glottis-interior boundary/shear layer vortex sources and the surface frictional source are of comparable importance; the influence of the potential flow source is about an order of magnitude smaller. © 2012 Acoustical Society of America

  16. Evaluation of Littoral Combat Ships for Open-Ocean Anti-Submarine Warfare

    DTIC Science & Technology

    2016-03-01

known. Source: R. R. Hill, R. G. Carl, and L. E. Champagne, “Using Agent-Based Simulation to Empirically Examine Search Theory Using a Historical Case...coverage over a small area. Source: R. R. Hill, R. G. Carl, and L. E. Champagne, “Using Agent-Based Simulation to Empirically Examine Search Theory...Defense Tech, May 30. Hill, R. R., R. G. Carl, and L. E. Champagne. “Using agent-based simulation to empirically examine search theory using a

  17. Wormhole solutions in f(R) gravity satisfying energy conditions

    NASA Astrophysics Data System (ADS)

    Mazharimousavi, S. Habib; Halilsoy, M.

    2016-10-01

Without reference to exotic sources, the construction of viable wormholes in Einstein's general relativity has remained a myth. With the advent of modified theories, however, specifically the f(R) theory, new hopes arose for the possibility of such objects. By this token, we construct traversable wormholes in f(R) theory supported by a fluid source which respects at least the weak energy conditions. We provide an example (Example 1) of an asymptotically flat wormhole in f(R) gravity without ghosts.

  18. [Simulation of CO2 exchange between forest canopy and atmosphere].

    PubMed

    Diao, Yiwei; Wang, Anzhi; Jin, Changjie; Guan, Dexin; Pei, Tiefan

    2006-12-01

    Estimating the scalar source/sink distribution of CO2 and its vertical fluxes within and above forest canopy continues to be a critical research problem in biosphere-atmosphere exchange processes and plant ecology. With broad-leaved Korean pine forest in Changbai Mountains as test object, and based on Raupach's localized near field theory, the source/sink and vertical flux distribution of CO2 within and above forest canopy were modeled through an inverse Lagrangian dispersion analysis. This model correctly predicted a strong positive CO2 source strength in the deeper layers of the canopy due to soil-plant respiration, and a strong CO2 sink in the upper layers of the canopy due to the assimilation by sunlit foliage. The foliage in the top layer of canopy changed from a CO2 source in the morning to a CO2 sink in the afternoon, while the soil constituted a strong CO2 source all the day. The simulation results accorded well with the eddy covariance CO2 flux measurements within and above the canopy, and the average precision was 89%. The CO2 exchange predicted by the analysis was averagely 15% higher than that of the eddy correlation, but exhibited identical temporal trend. Atmospheric stability remarkably affected the CO2 exchange between forest canopy and atmosphere.
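The inverse Lagrangian dispersion analysis described above reduces, in outline, to inverting a linear forward model: measured mean concentrations relate to layer source strengths through a dispersion matrix. A toy sketch with made-up numbers (not the Changbai Mountains data, and only the linear-algebra skeleton of the Raupach approach):

```python
# Inverse dispersion in outline: mean concentration profile C (at several
# measurement heights) relates linearly to layer source/sink strengths s via
# a dispersion matrix D, i.e. C = D @ s, so s can be recovered from C by
# least squares. D and s_true below are illustrative placeholders.
import numpy as np

D = np.array([[2.0, 0.5, 0.1],
              [0.8, 1.5, 0.4],
              [0.2, 0.6, 1.2],
              [0.1, 0.3, 0.9]])          # 4 measurement heights x 3 canopy layers

s_true = np.array([-1.0, 0.2, 1.5])      # sink in upper canopy, source near soil
C = D @ s_true                           # synthetic "measured" concentration profile

# Recover the source/sink profile from the concentration measurements.
s_hat, *_ = np.linalg.lstsq(D, C, rcond=None)
```

The sign pattern in `s_true` mirrors the abstract's finding: assimilation (a sink) in the sunlit upper canopy and respiration (a source) near the soil.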

  19. Sources for Developing a Theory of Visual Literacy.

    ERIC Educational Resources Information Center

    Hortin, John A.

    Organized as a bibliographic essay, this paper examines the many sources available for developing a theory of visual literacy. Several definitions are offered in order to clarify the meaning of the term "visual literacy" so that meaningful research can be conducted on the topic. Based on the review of resources, three recommendations are offered…

  20. Hamiltonian surface charges using external sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Troessaert, Cédric, E-mail: troessaert@cecs.cl

    2016-05-15

    In this work, we interpret part of the boundary conditions as external sources in order to partially solve the integrability problem present in the computation of surface charges associated to gauge symmetries in the hamiltonian formalism. We start by describing the hamiltonian structure of external symmetries preserving the action up to a transformation of the external sources of the theory. We then extend these results to the computation of surface charges for field theories with non-trivial boundary conditions.

  1. FIA: An Open Forensic Integration Architecture for Composing Digital Evidence

    NASA Astrophysics Data System (ADS)

    Raghavan, Sriram; Clark, Andrew; Mohay, George

    The analysis and value of digital evidence in an investigation has been the domain of discourse in the digital forensic community for several years. While many works have considered different approaches to model digital evidence, a comprehensive understanding of the process of merging different evidence items recovered during a forensic analysis is still a distant dream. With the advent of modern technologies, pro-active measures are integral to keeping abreast of all forms of cyber crimes and attacks. This paper motivates the need to formalize the process of analyzing digital evidence from multiple sources simultaneously. In this paper, we present the forensic integration architecture (FIA) which provides a framework for abstracting the evidence source and storage format information from digital evidence and explores the concept of integrating evidence information from multiple sources. The FIA architecture identifies evidence information from multiple sources that enables an investigator to build theories to reconstruct the past. FIA is hierarchically composed of multiple layers and adopts a technology independent approach. FIA is also open and extensible making it simple to adapt to technological changes. We present a case study using a hypothetical car theft case to demonstrate the concepts and illustrate the value it brings into the field.

  2. Probing Cosmic Infrared Sources: A Computer Modeling Approach

    DTIC Science & Technology

    1992-06-01

developed to study various physical phenomena involving dust grains, e.g., molecule formation on grains, grain formation in expanding circumstellar...EVALUATION OF METHODS OF ANALYSIS IN INFRARED ASTRONOMY...4.0 THEORETICAL STUDIES INVOLVING DUST GRAINS...4.1 Theory of Molecule Formation on Dust Grains...4.2 Modeling Grain Formation in Stellar Outflows...4.3 Infrared Emission from Fractal Grains...4.4 Photochemistry in Circumstellar Envelopes

  3. Adaptations and Analysis of the AFIT Noise Radar Network for Indoor Navigation

    DTIC Science & Technology

    2013-03-01

    capable of producing bistatic/multistatic radar images. NTR is unique because it utilizes amplified random thermal noise as its transmission waveform...structure and operation of NTR is described. The minutiae of the EM theory describing the various phenomena found when operating RF devices in indoor...construction of NTR is simple in comparison to other CW radars. The system begins with a commercial thermal noise source, which produces a uniform

  4. Crowdsourced Formal Verification: A Business Case Analysis Toward a Human-Centered Business Model

    DTIC Science & Technology

    2015-06-01

    literacycampaignmc.org/wp-content/uploads/2011/11/ Compressed-State-of-Literacy-MC1.pdf Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the...crowdsourced formal verification games provide intrinsic motivation. Ryan and Deci (2000) summarized three needs that drive the intrinsic motivation...competence, relatedness, and autonomy. Therefore, such games have to embrace the self-determination of the customers. Games, per se, can satisfy

  5. Crime in Schools and Colleges: A Study of Offenders and Arrestees Reported via National Incident-Based Reporting System Data. The CARD Report: Crime Analysis, Research and Development Unit

    ERIC Educational Resources Information Center

    Noonan, James H.; Vavra, Malissa C.

    2007-01-01

    Data from a variety of sources about crime in schools and colleges and characteristics of the people who commit these offenses provide key input in developing theories and operational applications that can help combat crime in this nation's schools, colleges, and universities. Given the myriad of data available, the objective of this study is to…

  6. Urban air quality estimation study, phase 1

    NASA Technical Reports Server (NTRS)

    Diamante, J. M.; Englar, T. S., Jr.; Jazwinski, A. H.

    1976-01-01

    Possibilities are explored for applying estimation theory to the analysis, interpretation, and use of air quality measurements in conjunction with simulation models to provide a cost effective method of obtaining reliable air quality estimates for wide urban areas. The physical phenomenology of real atmospheric plumes from elevated localized sources is discussed. A fluctuating plume dispersion model is derived. Individual plume parameter formulations are developed along with associated a priori information. Individual measurement models are developed.
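
    The fluctuating-plume model derived in the study generalizes the standard steady-state Gaussian plume relation for an elevated point source. As a point of reference, that baseline relation can be sketched as follows; the function and all parameter values are illustrative (in practice the dispersion widths sigma_y and sigma_z grow with downwind distance according to atmospheric stability), not the paper's model.

```python
import numpy as np

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration (g/m^3) from an elevated
    point source of strength q (g/s), wind speed u (m/s), stack height h (m),
    at crosswind offset y and height z, with ground-reflection term."""
    lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2.0 * sigma_z**2))
                + np.exp(-(z + h)**2 / (2.0 * sigma_z**2)))  # image source at -h
    return q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical
```

The estimation-theory approach of the paper would then treat parameters such as q, h, and the dispersion widths as states to be estimated from measured concentrations.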

  7. Origins of life: a comparison of theories and application to Mars

    NASA Technical Reports Server (NTRS)

    Davis, W. L.; McKay, C. P.

    1996-01-01

    The field of study that deals with the origins of life does not have a consensus for a theory of life's origin. An analysis of the range of theories offered shows that they share some common features that may be reliable predictors when considering the possible origins of life on another planet. The fundamental datum dealing with the origins of life is that life appeared early in the history of the Earth, probably before 3.5 Ga and possibly before 3.8 Ga. What might be called the standard theory (the Oparin-Haldane theory) posits the production of organic molecules on the early Earth followed by chemical reactions that produced increased organic complexity leading eventually to organic life capable of reproduction, mutation, and selection using organic material as nutrients. A distinct class of other theories (panspermia theories) suggests that life was carried to Earth from elsewhere--these theories receive some support from recent work on planetary impact processes. Other alternatives to the standard model suggest that life arose as an inorganic (clay) form and/or that the initial energy source was not organic material but chemical energy or sunlight. We find that the entire range of current theories suggests that liquid water is the quintessential environmental criterion for both the origin and sustenance of life. It is therefore of interest that during the time that life appeared on Earth we have evidence for liquid water present on the surface of Mars.

  8. Constructive neutral evolution: exploring evolutionary theory's curious disconnect.

    PubMed

    Stoltzfus, Arlin

    2012-10-13

    Constructive neutral evolution (CNE) suggests that neutral evolution may follow a stepwise path to extravagance. Whether or not CNE is common, the mere possibility raises provocative questions about causation: in classical neo-Darwinian thinking, selection is the sole source of creativity and direction, the only force that can cause trends or build complex features. However, much of contemporary evolutionary genetics departs from the conception of evolution underlying neo-Darwinism, resulting in a widening gap between what formal models allow, and what the prevailing view of the causes of evolution suggests. In particular, a mutationist conception of evolution as a 2-step origin-fixation process has been a source of theoretical innovation for 40 years, appearing not only in the Neutral Theory, but also in recent breakthroughs in modeling adaptation (the "mutational landscape" model), and in practical software for sequence analysis. In this conception, mutation is not a source of raw materials, but an agent that introduces novelty, while selection is not an agent that shapes features, but a stochastic sieve. This view, which now lays claim to important theoretical, experimental, and practical results, demands our attention. CNE provides a way to explore its most significant implications about the role of variation in evolution. Alex Kondrashov, Eugene Koonin and Johann Peter Gogarten reviewed this article.

  9. Thermal Damage Analysis in Biological Tissues Under Optical Irradiation: Application to the Skin

    NASA Astrophysics Data System (ADS)

    Fanjul-Vélez, Félix; Ortega-Quijano, Noé; Solana-Quirós, José Ramón; Arce-Diego, José Luis

    2009-07-01

    The use of optical sources in medical practice is increasing. In this study, different approaches using thermo-optical principles that allow us to predict thermal damage in irradiated tissues are analyzed. Optical propagation is studied by means of the radiation transport theory (RTT) equation, solved via a Monte Carlo analysis. The data obtained are included in a bio-heat equation, solved via a numerical finite-difference approach. Optothermal properties are considered so that the model is accurate and reliable. The thermal distribution is calculated as a function of optical source parameters, mainly optical irradiance, wavelength and exposure time. Two thermal damage models, the cumulative equivalent minutes (CEM) 43 °C approach and the Arrhenius analysis, are used. The former is appropriate when dealing with dosimetry considerations at constant temperature. The latter is adequate to predict thermal damage with arbitrary temperature time dependence. Both models are applied and compared for the particular application of skin thermotherapy irradiation.
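
    Both damage models mentioned in the abstract reduce to short computations over a temperature-time history. The sketch below is a minimal implementation, assuming the commonly quoted R breakpoint values (0.25 below 43 °C, 0.5 at or above) for CEM43 and the frequently cited Henriques skin parameters (A = 3.1e98 s^-1, Ea = 6.28e5 J/mol) for the Arrhenius integral; it is not the paper's specific model.

```python
import numpy as np

def cem43(temps_c, dt_min):
    """Cumulative equivalent minutes at 43 deg C for a sampled temperature
    history temps_c (deg C) with sampling interval dt_min (minutes)."""
    temps_c = np.asarray(temps_c, dtype=float)
    r = np.where(temps_c >= 43.0, 0.5, 0.25)  # common breakpoint values
    return float(np.sum(dt_min * r ** (43.0 - temps_c)))

def arrhenius_omega(temps_c, dt_s, a=3.1e98, ea=6.28e5):
    """Arrhenius damage integral Omega = sum A*exp(-Ea/(R*T))*dt; Omega >= 1
    is conventionally taken as the threshold of irreversible damage."""
    R = 8.314  # J/(mol K)
    t_k = np.asarray(temps_c, dtype=float) + 273.15
    return float(np.sum(a * np.exp(-ea / (R * t_k)) * dt_s))
```

For example, ten minutes held exactly at 43 °C yields CEM43 = 10 by construction, while the Arrhenius integral stays far below 1 at body temperature and exceeds 1 within seconds at 70 °C.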

  10. Testing theories to explore the drivers of cities' atmospheric emissions.

    PubMed

    Lankao, Patricia Romero; Tribbia, John L; Nychka, Doug

    2009-06-01

    Despite a growing body of evidence demonstrating the importance of cities as sources of many local, regional, and global impacts on the atmosphere, ecosystems, and human populations, most theories on the relationship between society and the environment have focused on the global or national level. A variety of theories exist on human-environment interactions; for example, ecological modernization, urban transitions, and human ecology. However, with the exception of urban transitions, these theories have been mainly concerned with nation states and have ignored the subnational and local (city) levels. This article aims at filling this gap by employing ordinary least squares regression to examine these theories at the city level using the STIRPAT formula. It finds that, with the exception of population (which shows an unstable relationship with the impact indicators applied in the analysis), a remarkable level of variation exists in the importance of drivers across the three exercises. This led us to conclude that urban atmospheric pollutants result from diverse activities (e.g., transportation, industrial), are formed through different processes (vehicle combustion, biomass burning), have residence times ranging from hours to years, and are the outcome of diverse sets of societal and environmental drivers.
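
    The STIRPAT formula models impact as I = a P^b A^c T^d e (population, affluence, technology, multiplicative error), which becomes linear in logs and can be fit by OLS. The sketch below demonstrates this on synthetic city-level data; all variable ranges and coefficient values are illustrative, not the study's.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Hypothetical city-level drivers: population P, affluence A, technology T
P = rng.uniform(1e5, 1e7, n)
A = rng.uniform(5e3, 5e4, n)
T = rng.uniform(0.5, 2.0, n)
# Synthetic emissions following I = a * P^b * A^c * T^d * e
true_b, true_c, true_d = 0.9, 0.6, 0.4
I = 1e-3 * P**true_b * A**true_c * T**true_d * np.exp(rng.normal(0, 0.05, n))

# OLS on the log-linearized model: ln I = ln a + b ln P + c ln A + d ln T
X = np.column_stack([np.ones(n), np.log(P), np.log(A), np.log(T)])
coefs, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
print(coefs[1:])  # estimated elasticities (b, c, d)
```

The fitted slopes are elasticities: a coefficient b means a 1% increase in population is associated with roughly a b% increase in emissions, holding the other drivers fixed.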

  11. Numerical evaluation of longitudinal motions of Wigley hulls advancing in waves by using Bessho form translating-pulsating source Green's function

    NASA Astrophysics Data System (ADS)

    Xiao, Wenbin; Dong, Wencai

    2016-06-01

    In the framework of 3D potential flow theory, the Bessho form translating-pulsating source Green's function in the frequency domain is chosen as the integral kernel in this study, and a hybrid source-and-dipole distribution model of the boundary element method is applied to directly solve the velocity potential for a ship advancing in regular waves. Numerical characteristics of the Green's function show that the contribution of local-flow components to the velocity potential is concentrated near the source point, while the wave component dominates the magnitude of the velocity potential in the far field. Two mathematical models, with and without local-flow components taken into account, are adopted to numerically calculate the longitudinal motions of Wigley hulls, which demonstrates the applicability of the translating-pulsating source Green's function method for various ship forms. In addition, a mesh analysis of the discretized surface is carried out from the perspective of ship-form characteristics. The study shows that the longitudinal motion results from the simplified model are somewhat greater than the experimental data in the resonant zone, and the model can be used as an effective tool to predict ship seakeeping properties. However, the translating-pulsating source Green's function method is only appropriate for qualitative analysis of motion response in waves if the ship's geometry fails to satisfy the slender-body assumption.

  12. Persistence and resistance to extinction in the domestic dog: Basic research and applications to canine training.

    PubMed

    Hall, Nathaniel J

    2017-08-01

    This review summarizes the research investigating behavioral persistence and resistance to extinction in the dog. The first part of this paper reviews Behavioral Momentum Theory and its applications to Applied Behavior Analysis and training of pet dogs with persistent behavioral problems. I also highlight how research on Behavioral Momentum Theory can be applied to the training of detection dogs in an attempt to enhance detection performance in the presence of behavioral disruptors common in operational settings. In the second part of this review, I highlight more basic research on behavioral persistence with dogs, and how breed differences and experiences with humans as alternative sources of reinforcement can influence dogs' resistance to extinction of a target behavior. Applied Behavior Analysis and Behavioral Momentum Theory have important applications for behavioral treatments to reduce the persistence of problem behavior in dogs and for the development of training methods that enhance the persistence of working dogs. Dogs can also be leveraged as natural models of stereotypic behavior and for exploring individual differences in behavioral persistence by evaluating breed and environmental variables associated with differences in canine persistence. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Inter-noise 89 - Engineering for environmental noise control; Proceedings of the International Conference on Noise Control Engineering, Newport Beach, CA, Dec. 4-6, 1989. Vols. 1 & 2

    NASA Astrophysics Data System (ADS)

    Maling, George C., Jr.

    Recent advances in noise analysis and control theory and technology are discussed in reviews and reports. Topics addressed include noise generation; sound-wave propagation; noise control by external treatments; vibration and shock generation, transmission, isolation, and reduction; multiple sources and paths of environmental noise; noise perception and the physiological and psychological effects of noise; instrumentation, signal processing, and analysis techniques; and noise standards and legal aspects. Diagrams, drawings, graphs, photographs, and tables of numerical data are provided.

  14. Generalizability Theory and Classical Test Theory

    ERIC Educational Resources Information Center

    Brennan, Robert L.

    2011-01-01

    Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…

  15. Toward a Unified Theory of Context Dependence.

    ERIC Educational Resources Information Center

    Hanna, Gerald S.; Oaster, Thomas R.

    1978-01-01

    Traces a major source of confusion in the literature on passage dependence and integrates the relevant concepts into a general theory of context dependence. Sample items and data illustrate practical applications of the theory. (AA)

  16. Source phase shift - A new phenomenon in wave propagation due to anelasticity. [in free oscillations of earth model

    NASA Technical Reports Server (NTRS)

    Buland, R.; Yuen, D. A.; Konstanty, K.; Widmer, R.

    1985-01-01

    The free oscillations of an anelastic earth model due to earthquakes were calculated directly by means of the correspondence principle from wave propagation theory. The formulation made it possible to find the source phase which is not predictable using first order perturbation theory. The predicted source phase was largest for toroidal modes with source components proportional to the radial strain scalar instead of the radial displacement scalar. The source phase increased in relation to the overtone number. In addition, large relative differences were found in the excitation modulus and the phase when the elastic excitation was small. The effect was sufficient to bias estimates of source properties and elastic structure.

  17. Einstein’s quadrupole formula from the kinetic-conformal Hořava theory

    NASA Astrophysics Data System (ADS)

    Bellorín, Jorge; Restuccia, Alvaro

    We analyze the radiative and nonradiative linearized variables in a gravity theory within the family of nonprojectable Hořava theories: the Hořava theory at the kinetic-conformal point. There is no extra mode in this formulation; the theory shares the same number of degrees of freedom with general relativity. The large-distance effective action, which is the one we consider, can be given a generally covariant form under asymptotically flat boundary conditions, namely the Einstein-aether theory under the condition of hypersurface orthogonality on the aether vector. In the linearized theory, we find that only the transverse-traceless tensorial modes obey a sourced wave equation, as in general relativity. The rest of the variables are nonradiative. The result is gauge-independent at the level of the linearized theory. For the case of a weak source, we find that the leading mode in the far zone is exactly Einstein's quadrupole formula of general relativity, if some coupling constants are properly identified. There are no monopoles or dipoles in this formulation, in distinction to the nonprojectable Hořava theory outside the kinetic-conformal point. We also discuss some constraints on the theory arising from the observational bounds on Lorentz-violating theories.
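
    For context, the general-relativity result recovered in the far zone is the standard quadrupole radiation formula (reproduced here from textbook GR, not from the paper itself):

```latex
P = \frac{G}{5c^{5}} \left\langle \dddot{Q}_{ij}\,\dddot{Q}_{ij} \right\rangle,
\qquad
Q_{ij} = \int \rho \left( x_i x_j - \tfrac{1}{3}\,\delta_{ij}\, r^2 \right) d^3x,
```

    where $Q_{ij}$ is the traceless mass quadrupole moment of the source, triple dots denote third time derivatives, and the angle brackets denote a time average over several wave periods.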

  18. Flight School in the Virtual Environment: Capabilities and Risks of Executing a Simulations-Based Flight Training Program

    DTIC Science & Technology

    2012-05-17

    theories work together to explain learning in aviation—behavioral learning theory, cognitive learning theory, constructivism, experiential...solve problems, and make decisions. Experiential learning theory incorporates both behavioral and cognitive theories. This theory harnesses the..."Evaluation of the Effectiveness of Flight School XXI," 7. David A. Kolb, Experiential Learning: Experience as the Source of

  19. Formative research to develop theory-based messages for a Western Australian child drowning prevention television campaign: study protocol

    PubMed Central

    Denehy, Mel; Crawford, Gemma; Leavy, Justine; Nimmo, Lauren; Jancey, Jonine

    2016-01-01

    Introduction: Worldwide, children under the age of 5 years are at particular risk of drowning. Responding to this need requires the development of evidence-informed drowning prevention strategies. Historically, drowning prevention strategies have included denying access, learning survival skills and providing supervision, as well as education and information which includes the use of mass media. Interventions underpinned by behavioural theory and formative evaluation tend to be more effective, yet few practical examples exist in the drowning and/or injury prevention literature. The Health Belief Model and Social Cognitive Theory will be used to explore participants' perspectives regarding proposed mass media messaging. This paper describes a qualitative protocol to undertake formative research to develop theory-based messages for a child drowning prevention campaign. Methods and analysis: The primary data source will be focus group interviews with parents and caregivers of children under 5 years of age in metropolitan and regional Western Australia. Qualitative content analysis will be used to analyse the data. Ethics and dissemination: This study will contribute to the drowning prevention literature to inform the development of future child drowning prevention mass media campaigns. Findings from the study will be disseminated to practitioners, policymakers and researchers via international conferences, peer and non-peer-reviewed journals and evidence summaries. The study was submitted and approved by the Curtin University Human Research Ethics Committee. PMID:27207621

  20. Selection of important ecological source patches based on Green Infrastructure theory: A case study of Wuhan city

    NASA Astrophysics Data System (ADS)

    Ke, Yuanyuan; Yu, Yan; Tong, Yan

    2018-01-01

    Selecting urban ecological patches is of great significance for constructing an urban green infrastructure network and for protecting urban biodiversity and the ecological environment. With the support of GIS technology, a criterion for selecting source patches was developed according to existing planning, and ecological source patches for terrestrial organisms and for aquatic and amphibious organisms were selected in Wuhan city. To increase the connectivity of the ecological patches and achieve greater ecological protection benefits, green infrastructure networks in Wuhan city were constructed with the minimum path analysis method. Finally, the characteristics of the ecological source patches were analyzed with landscape metrics, and the ecological-protection importance of each source patch was evaluated comprehensively. The results showed that there were 23 important ecological source patches in Wuhan city, among which the Sushan Temple forest patch and the Lu Lake and Shangshe Lake wetland patch were the most important of all patch types for ecological protection. This study can provide a scientific basis for the preservation of urban ecological space, the delineation of natural conservation areas and the protection of biological diversity.
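
    The "minimum path analysis" step amounts to a least-cost-path search over a resistance surface connecting source patches. The sketch below uses Dijkstra's algorithm on a tiny 4-neighbour cost grid; the grid values are illustrative, and GIS packages apply the same idea to raster cost layers at landscape scale.

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra least-cost path over a 2D resistance grid (4-neighbour moves).
    Total cost counts the resistance of every cell on the path, start included."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Reconstruct the path by walking predecessors back from the goal
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]
```

On a grid with a high-resistance band in the middle row, the cheapest corridor routes around the band rather than through it, which is exactly how corridor routing between ecological source patches behaves.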

  1. MpTheory Java library: a multi-platform Java library for systems biology based on the Metabolic P theory.

    PubMed

    Marchetti, Luca; Manca, Vincenzo

    2015-04-15

    MpTheory Java library is an open-source project collecting a set of objects and algorithms for modeling observed dynamics by means of the Metabolic P (MP) theory, that is, a mathematical theory introduced in 2004 for modeling biological dynamics. By means of the library, it is possible to model biological systems in both continuous and discrete time. Moreover, the library comprises a set of regression algorithms for inferring MP models starting from time series of observations. To enhance the modeling experience, besides pure Java usage, the library can be used directly within the most popular computing environments, such as MATLAB, GNU Octave, Mathematica and R. The library is open-source and licensed under the GNU Lesser General Public License (LGPL) Version 3.0. Source code, binaries and complete documentation are available at http://mptheory.scienze.univr.it. luca.marchetti@univr.it, marchetti@cosbi.eu Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  2. Local Choices: Rationality and the Contextuality of Decision-Making

    PubMed Central

    Vlaev, Ivo

    2018-01-01

    Rational explanation is ubiquitous in psychology and social sciences, ranging from rational analysis, expectancy-value theories, ideal observer models, mental logic to probabilistic frameworks, rational choice theory, and informal “folk psychological” explanation. However, rational explanation appears to be challenged by apparently systematic irrationality observed in psychological experiments, especially in the field of judgement and decision-making (JDM). Here, it is proposed that the experimental results require not that rational explanation should be rejected, but that rational explanation is local, i.e., within a context. Thus, rational models need to be supplemented with a theory of contextual shifts. We review evidence in JDM that patterns of choices are often consistent within contexts, but unstable between contexts. We also demonstrate that for a limited, though reasonably broad, class of decision-making domains, recent theoretical models can be viewed as providing theories of contextual shifts. It is argued that one particular significant source of global inconsistency arises from a cognitive inability to represent absolute magnitudes, whether for perceptual variables, utilities, payoffs, or probabilities. This overall argument provides a fresh perspective on the scope and limits of human rationality. PMID:29301289

  3. Local Choices: Rationality and the Contextuality of Decision-Making.

    PubMed

    Vlaev, Ivo

    2018-01-02

    Rational explanation is ubiquitous in psychology and social sciences, ranging from rational analysis, expectancy-value theories, ideal observer models, mental logic to probabilistic frameworks, rational choice theory, and informal "folk psychological" explanation. However, rational explanation appears to be challenged by apparently systematic irrationality observed in psychological experiments, especially in the field of judgement and decision-making (JDM). Here, it is proposed that the experimental results require not that rational explanation should be rejected, but that rational explanation is local, i.e., within a context. Thus, rational models need to be supplemented with a theory of contextual shifts. We review evidence in JDM that patterns of choices are often consistent within contexts, but unstable between contexts. We also demonstrate that for a limited, though reasonably broad, class of decision-making domains, recent theoretical models can be viewed as providing theories of contextual shifts. It is argued that one particular significant source of global inconsistency arises from a cognitive inability to represent absolute magnitudes, whether for perceptual variables, utilities, payoffs, or probabilities. This overall argument provides a fresh perspective on the scope and limits of human rationality.

  4. Qualitative Evaluation Methods in Ethics Education: A Systematic Review and Analysis of Best Practices.

    PubMed

    Watts, Logan L; Todd, E Michelle; Mulhearn, Tyler J; Medeiros, Kelsey E; Mumford, Michael D; Connelly, Shane

    2017-01-01

    Although qualitative research offers some unique advantages over quantitative research, qualitative methods are rarely employed in the evaluation of ethics education programs and are often criticized for a lack of rigor. This systematic review investigated the use of qualitative methods in studies of ethics education. Following a review of the literature in which 24 studies were identified, each study was coded on 16 best-practice characteristics in qualitative research. General thematic analysis and grounded theory were found to be the dominant approaches used. Researchers are effectively executing a number of best practices, such as using direct data sources, structured data collection instruments, non-leading questioning, and expert raters. However, other best practices were rarely present in the studies reviewed, such as collecting data using multiple sources, methods, raters, and timepoints, evaluating reliability, and employing triangulation analyses to assess convergence. Recommendations are presented for improving future qualitative research studies in ethics education.

  5. Compressive sensing sectional imaging for single-shot in-line self-interference incoherent holography

    NASA Astrophysics Data System (ADS)

    Weng, Jiawen; Clark, David C.; Kim, Myung K.

    2016-05-01

    A numerical reconstruction method based on compressive sensing (CS) for self-interference incoherent digital holography (SIDH) is proposed to achieve sectional imaging by single-shot in-line self-interference incoherent hologram. The sensing operator is built up based on the physical mechanism of SIDH according to CS theory, and a recovery algorithm is employed for image restoration. Numerical simulation and experimental studies employing LEDs as discrete point-sources and resolution targets as extended sources are performed to demonstrate the feasibility and validity of the method. The intensity distribution and the axial resolution along the propagation direction of SIDH by angular spectrum method (ASM) and by CS are discussed. The analysis result shows that compared to ASM the reconstruction by CS can improve the axial resolution of SIDH, and achieve sectional imaging. The proposed method may be useful to 3D analysis of dynamic systems.
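
    The CS recovery step can be illustrated with a generic sparse-recovery solver. The sketch below uses plain ISTA (iterative soft-thresholding) to solve the l1-regularized least-squares problem on a random sensing matrix; it is a minimal stand-in for the paper's reconstruction algorithm, and all dimensions, the sensing matrix, and the regularization weight are illustrative.

```python
import numpy as np

def ista(A, y, lam=0.01, n_iter=2000):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = x - A.T @ (A @ x - y) / L      # gradient step on the quadratic term
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(1)
n, m, k = 200, 80, 5                        # signal length, measurements, sparsity
A = rng.normal(size=(m, n)) / np.sqrt(m)    # random sensing matrix
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k) * 3
y = A @ x_true                              # compressed measurements
x_hat = ista(A, y)                          # sparse reconstruction
```

With far fewer measurements than unknowns (80 versus 200), the l1 penalty recovers the sparse signal that ordinary least squares cannot, which is the core mechanism CS-based sectional imaging exploits.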

  6. Identification and addressing reduction-related misconceptions

    NASA Astrophysics Data System (ADS)

    Gal-Ezer, Judith; Trakhtenbrot, Mark

    2016-07-01

    Reduction is one of the key techniques used for problem-solving in computer science. In particular, in the theory of computation and complexity (TCC), mapping and polynomial reductions are used for analysis of decidability and computational complexity of problems, including the core concept of NP-completeness. Reduction is a highly abstract technique that involves revealing close non-trivial connections between problems that often seem to have nothing in common. As a result, proper understanding and application of reduction is a serious challenge for students and a source of numerous misconceptions. The main contribution of this paper is detection of such misconceptions, analysis of their roots, and proposing a way to address them in an undergraduate TCC course. Our observations suggest that the main source of the misconceptions is the false intuitive rule "the bigger is a set/problem, the harder it is to solve". Accordingly, we developed a series of exercises for proactive prevention of these misconceptions.

  7. Testing for EMC (electromagnetic compatibility) in the clinical environment.

    PubMed

    Paperman, D; David, Y; Martinez, M

    1996-01-01

    Testing for electromagnetic compatibility (EMC) in the clinical environment introduces a host of complex conditions not normally encountered under laboratory conditions. In the clinical environment, various radio-frequency (RF) sources of electromagnetic interference (EMI) may be present throughout the entire spectrum of interest. Isolating and analyzing the impact from the sources of interference to medical devices involves a multidisciplinary approach based on training in, and knowledge of, the following: operation of medical devices and their susceptibility to EMI; RF propagation modalities and interaction theory; spectrum analysis systems and techniques (preferably with signature analysis capabilities) and calibrated antennas; the investigation methodology of suspected EMC problems, and testing protocols and standards. Using combinations of standard test procedures adapted for the clinical environment with personnel that have an understanding of radio-frequency behavior increases the probability of controlling, proactively, EMI in the clinical environment, thus providing for a safe and more effective patient care environment.

  8. Gravitational Lenses and the Structure and Evolution of Galaxies

    NASA Technical Reports Server (NTRS)

    Oliversen, Ronald J. (Technical Monitor); Kochanek, Christopher

    2004-01-01

    During the first year of the project we completed five papers, each of which represents a new direction in the theory and interpretation of gravitational lenses. In the first paper, The Importance of Einstein Rings, we developed the first theory for the formation and structure of the Einstein rings formed by lensing extended sources like the host galaxies of quasars and radio sources. In the second paper, Cusped Mass Models Of Gravitational Lenses, we introduced a new class of lens models. In the third paper, Global Probes of the Impact of Baryons on Dark Matter Halos, we made the first globally consistent models for the separation distribution of gravitational lenses including both galaxy and cluster lenses. The last two papers explore the properties of two lenses in detail. During the second year we focused more closely on the relationship of baryons and dark matter. In the third year we further examined the relationship between baryons and dark matter. In the present year we extended our statistical analysis of lens mass distributions using a self-similar model for the halo mass distribution as compared to the luminous galaxy.

  9. Theory of noise equivalent power of a high-temperature superconductor far-infrared bolometer in a photo-thermoelectrical mode of operation

    NASA Astrophysics Data System (ADS)

    Kaila, M. M.; Russell, G. J.

    2000-12-01

    We present a theory of noise equivalent power (NEP) and related parameters for a high-temperature superconductor (HTSC) bolometer in which temperature and resistance are the noise sources for open circuit operation and phonon and resistance are the noise sources for voltage-biased operation of the bolometer. The bolometer is designed to use a photo-thermoelectrical mode of operation. A mathematical formulation for the open circuit operation is first presented, followed by an analysis of the heterodyne case with a bias applied in constant voltage mode. For the first time, electrothermal (ET) and thermoelectric (TE) feedback are treated simultaneously in the heat balance equation. A parallel resistance geometry consisting of thermoelectric and HTSC material legs has been chosen for the device. Computations for the ET-TE feedback show that the response time improves by three orders of magnitude and the responsivity doubles for the same TE feedback. In the heat balance equation we have included among the heat transfer processes the temperature dependence of the thermal conductance at the bolometer-substrate interface for the dynamic state.

  10. Theory-of-Mind Development Influences Suggestibility and Source Monitoring

    ERIC Educational Resources Information Center

    Bright-Paul, Alexandra; Jarrold, Christopher; Wright, Daniel B.

    2008-01-01

    According to the mental-state reasoning model of suggestibility, 2 components of theory of mind mediate reductions in suggestibility across the preschool years. The authors examined whether theory-of-mind performance may be legitimately separated into 2 components and explored the memory processes underlying the associations between theory of mind…

  11. Coding Issues in Grounded Theory

    ERIC Educational Resources Information Center

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  12. Applications of Jungian Type Theory to Counselor Education.

    ERIC Educational Resources Information Center

    Dilley, Josiah S.

    1987-01-01

    Describes Carl Jung's theory of psychological type and the Myers-Briggs Type Indicator (MBTI), an instrument to assess Jungian type. Cites sources of information on the research and application of the theory and the MBTI. Explores how knowledge of type theory can be useful to counselor educators. (Author)

  13. Unique effects and moderators of effects of sources on self-efficacy: A model-based meta-analysis.

    PubMed

    Byars-Winston, Angela; Diestelmann, Jacob; Savoy, Julia N; Hoyt, William T

    2017-11-01

    Self-efficacy beliefs are strong predictors of academic pursuits, performance, and persistence, and in theory are developed and maintained by 4 classes of experiences Bandura (1986) referred to as sources: performance accomplishments (PA), vicarious learning (VL), social persuasion (SP), and affective arousal (AA). The effects of sources on self-efficacy vary by performance domain and individual difference factors. In this meta-analysis (k = 61 studies of academic self-efficacy; N = 8,965), we employed B. J. Becker's (2009) model-based approach to examine cumulative effects of the sources as a set and unique effects of each source, controlling for the others. Following Becker's recommendations, we used available data to create a correlation matrix for the 4 sources and self-efficacy, then used these meta-analytically derived correlations to test our path model. We further examined moderation of these associations by subject area (STEM vs. non-STEM), grade, sex, and ethnicity. PA showed by far the strongest unique association with self-efficacy beliefs. Subject area was a significant moderator, with sources collectively predicting self-efficacy more strongly in non-STEM (k = 14) compared with STEM (k = 47) subjects (R2 = .37 and .22, respectively). Within studies of STEM subjects, grade level was a significant moderator of the coefficients in our path model, as were 2 continuous study characteristics (percent non-White and percent female). Practical implications of the findings and future research directions are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
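
    Becker's model-based approach fits a path model directly to a meta-analytically pooled correlation matrix: the standardized path coefficients are the regression weights of self-efficacy on the four sources, obtained from the correlations alone. A minimal numpy sketch of that step follows; the correlation values are illustrative placeholders, not the values estimated in this study.

```python
import numpy as np

# Variable order: PA, VL, SP, AA, self-efficacy (SE).
# These pooled correlations are HYPOTHETICAL, for illustration only.
R = np.array([
    [1.00, 0.40, 0.45, 0.35, 0.55],
    [0.40, 1.00, 0.40, 0.25, 0.35],
    [0.45, 0.40, 1.00, 0.30, 0.40],
    [0.35, 0.25, 0.30, 1.00, 0.35],
    [0.55, 0.35, 0.40, 0.35, 1.00],
])

Rxx = R[:4, :4]   # correlations among the four sources
rxy = R[:4, 4]    # correlation of each source with SE

# Standardized path coefficients: beta = Rxx^-1 * r_xy
# (OLS regression on standardized variables, computed from correlations)
beta = np.linalg.solve(Rxx, rxy)
# Variance in SE explained by the sources as a set
R2 = float(rxy @ beta)

for name, b in zip(["PA", "VL", "SP", "AA"], beta):
    print(f"{name}: beta = {b:+.3f}")
print(f"R^2 = {R2:.3f}")
```

    With these placeholder correlations, PA carries by far the largest unique weight, mirroring the qualitative finding reported above; moderator analyses would repeat the computation on subgroup-specific correlation matrices.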

  14. Theory of CW lidar aerosol backscatter measurements and development of a 2.1 microns solid-state pulsed laser radar for aerosol backscatter profiling

    NASA Technical Reports Server (NTRS)

    Kavaya, Michael J.; Henderson, Sammy W.; Frehlich, R. G.

    1991-01-01

    The performance and calibration of a focused, continuous wave, coherent detection CO2 lidar operated for the measurement of atmospheric backscatter coefficient, B(m), was examined. This instrument functions by transmitting infrared (10 micron) light into the atmosphere and collecting the light which is scattered in the rearward direction. Two distinct modes of operation were considered. In volume mode (VM), the scattered light energy from many aerosols is detected simultaneously, whereas in single particle mode (SPM), the scattered light energy from a single aerosol is detected. The analysis considered possible sources of error for each of these two cases, and also considered the conditions under which each technique would have superior performance. The analysis showed that, within reasonable assumptions, the value of B(m) could be accurately measured by either the VM or the SPM method. The understanding of the theory developed during the analysis was also applied to a pulsed CO2 lidar. Preliminary results of field testing of a solid-state 2 micron lidar using a CW oscillator are included.

  15. Psychotherapy research needs theory. Outline for an epistemology of the clinical exchange.

    PubMed

    Salvatore, Sergio

    2011-09-01

    This paper provides an analysis of a basic assumption grounding clinical research: the ontological autonomy of psychotherapy, based on the idea that the clinical exchange is sufficiently distinct from other social exchanges (e.g. between teacher and pupils, between buyer and seller, or interaction during dinner). A criticism of this assumption is discussed, together with the proposal of a different epistemological interpretation based on the distinction between communicative dynamics and the process of psychotherapy: psychotherapy is a goal-oriented process based on the general dynamics of human communication. Theoretical and methodological implications are drawn from this view: it allows further sources of knowledge to be integrated within clinical research (i.e. those coming from other domains of the analysis of human communication), and it enables a more abstract definition of the psychotherapy process to be developed, leading to innovative views of classical critical issues such as the specific-nonspecific debate. The final part of the paper presents a model of human communication, the Semiotic Dialogical Dialectic Theory, which is meant as the framework for the analysis of psychotherapy.

  16. A Guided Tour of Mathematical Methods - 2nd Edition

    NASA Astrophysics Data System (ADS)

    Snieder, Roel

    2004-09-01

    Mathematical methods are essential tools for all physical scientists. This second edition provides a comprehensive tour of the mathematical knowledge and techniques that are needed by students in this area. In contrast to more traditional textbooks, all the material is presented in the form of problems. Within these problems the basic mathematical theory and its physical applications are well integrated. The mathematical insights that the student acquires are therefore driven by their physical insight. Topics that are covered include vector calculus, linear algebra, Fourier analysis, scale analysis, complex integration, Green's functions, normal modes, tensor calculus, and perturbation theory. The second edition contains new chapters on dimensional analysis, variational calculus, and the asymptotic evaluation of integrals. This book can be used by undergraduates and lower-level graduate students in the physical sciences. It can serve as a stand-alone text, or as a source of problems and examples to complement other textbooks.

  17. Independent component analysis for automatic note extraction from musical trills

    NASA Astrophysics Data System (ADS)

    Brown, Judith C.; Smaragdis, Paris

    2004-05-01

    The method of principal component analysis, which is based on second-order statistics (or linear independence), has long been used for redundancy reduction of audio data. The more recent technique of independent component analysis, enforcing much stricter statistical criteria based on higher-order statistical independence, is introduced and shown to be far superior in separating independent musical sources. This theory has been applied to piano trills and a database of trill rates was assembled from experiments with a computer-driven piano, recordings of a professional pianist, and commercially available compact disks. The method of independent component analysis has thus been shown to be an outstanding, effective means of automatically extracting interesting musical information from a sea of redundant data.
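
    The contrast between second-order PCA (whitening) and higher-order ICA can be sketched in a few lines of numpy. The example below mixes two synthetic "notes" (a sinusoid and a square wave), whitens them, and then unmixes them with a small FastICA-style fixed-point iteration; the signals, mixing matrix, and ICA variant are illustrative assumptions, not the paper's piano data or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 8.0, 4000)
s1 = np.sin(2 * np.pi * 5 * t)             # synthetic "note" 1: sinusoid
s2 = np.sign(np.sin(2 * np.pi * 3 * t))    # synthetic "note" 2: square wave
S = np.c_[s1, s2]
A = np.array([[1.0, 0.6], [0.5, 1.0]])     # assumed mixing matrix
X = S @ A.T                                # two observed mixtures

# PCA step: center and whiten (uses only second-order statistics)
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / len(Xc)
d, E = np.linalg.eigh(cov)
Z = Xc @ (E @ np.diag(d ** -0.5) @ E.T).T  # whitened data

# FastICA-style fixed point: tanh nonlinearity, symmetric decorrelation
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(Z @ W.T)
    W_new = (G.T @ Z) / len(Z) - np.diag((1 - G**2).mean(axis=0)) @ W
    U, _, Vt = np.linalg.svd(W_new)        # W <- (W W^T)^(-1/2) W
    W = U @ Vt
Y = Z @ W.T                                # recovered sources (up to sign/order)
```

    Whitening alone leaves the sources mixed by an unknown rotation; the higher-order ICA step resolves that rotation, which is why ICA can separate the individual notes where PCA cannot.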

  18. A review of the theory of trailing edge noise

    NASA Technical Reports Server (NTRS)

    Howe, M. S.

    1978-01-01

    Literature on the theory of the generation of sound by the interaction of low Mach number turbulent flow with the edge of a semi-infinite rigid plate is critically reviewed. Three different approaches to the subject are identified, consisting of theories based on (1) Lighthill's acoustic analogy; (2) the solution of special, linearized hydroacoustic problems; and (3) ad hoc aerodynamic source models. When appropriately interpreted, all relevant theories produce essentially identical predictions in the limit of very small Mach numbers. None of the theories discusses the implications of the Kutta condition, however, nor of the effect of forward flight and source motion relative to the trailing edge. An outline of a redevelopment of the theory is included to give a unified view of the problem, exhibit the significance of the various approximations, and incorporate the effect of mean motion and of the Kutta condition.

  19. Toward a brain-based theory of beauty.

    PubMed

    Ishizu, Tomohiro; Zeki, Semir

    2011-01-01

    We wanted to learn whether activity in the same area(s) of the brain correlate with the experience of beauty derived from different sources. 21 subjects took part in a brain-scanning experiment using functional magnetic resonance imaging. Prior to the experiment, they viewed pictures of paintings and listened to musical excerpts, both of which they rated on a scale of 1-9, with 9 being the most beautiful. This allowed us to select three sets of stimuli--beautiful, indifferent and ugly--which subjects viewed and heard in the scanner, and rated at the end of each presentation. The results of a conjunction analysis of brain activity showed that, of the several areas that were active with each type of stimulus, only one cortical area, located in the medial orbito-frontal cortex (mOFC), was active during the experience of musical and visual beauty, with the activity produced by the experience of beauty derived from either source overlapping almost completely within it. The strength of activation in this part of the mOFC was proportional to the strength of the declared intensity of the experience of beauty. We conclude that, as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources--musical and visual--and probably by other sources as well. This has led us to formulate a brain-based theory of beauty.

  20. Study of noise sources in a subsonic fan using measured blade pressures and acoustic theory

    NASA Technical Reports Server (NTRS)

    Hanson, D. B.

    1975-01-01

    Sources of noise in a 1.4 m (4.6 ft) diameter subsonic tip speed propulsive fan running statically outdoors are studied using a combination of techniques. Signals measured with pressure transducers on a rotor blade are plotted in a format showing the space-time history of inlet distortion. Study of these plots visually and with statistical correlation analysis confirms that the inlet flow contains long, thin eddies of turbulence. Turbulence generated in the boundary layer of the shroud upstream of the rotor tips was not found to be an important noise source. Fan noise is diagnosed by computing narrowband spectra of rotor and stator sound power and comparing these with measured sound power spectra. Rotor noise is computed from spectra of the measured blade pressures and stator noise is computed using the author's stator noise theory. It is concluded that the rotor and stator sources contribute about equally at frequencies in the vicinity of the first three harmonics of blade passing frequency. At higher frequencies, the stator contribution diminishes rapidly and the rotor/inlet turbulence mechanism dominates. Two parametric studies are performed by using the rotor noise calculation procedure which was correlated with test. In the first study, the effects on noise spectrum and directivity are calculated for changes in turbulence properties, rotational Mach number, number of blades, and stagger angle. In the second study the influences of design tip speed and blade number on noise are evaluated.

  1. Local spectrum analysis of field propagation in an anisotropic medium. Part II. Time-dependent fields.

    PubMed

    Tinkelman, Igor; Melamed, Timor

    2005-06-01

    In Part I of this two-part investigation [J. Opt. Soc. Am. A 22, 1200 (2005)], we presented a theory for phase-space propagation of time-harmonic electromagnetic fields in an anisotropic medium characterized by a generic wave-number profile. In this Part II, these investigations are extended to transient fields, setting a general analytical framework for local analysis and modeling of radiation from time-dependent extended-source distributions. In this formulation the field is expressed as a superposition of pulsed-beam propagators that emanate from all space-time points in the source domain and in all directions. Using time-dependent quadratic-Lorentzian windows, we represent the field by a phase-space spectral distribution in which the propagating elements are pulsed beams, which are formulated by a transient plane-wave spectrum over the extended-source plane. By applying saddle-point asymptotics, we extract the beam phenomenology in the anisotropic environment resulting from short-pulsed processing. Finally, the general results are applied to the special case of uniaxial crystal and compared with a reference solution.

  2. Neural correlates of confidence during item recognition and source memory retrieval: evidence for both dual-process and strength memory theories.

    PubMed

    Hayes, Scott M; Buchler, Norbou; Stokes, Jared; Kragel, James; Cabeza, Roberto

    2011-12-01

    Although the medial-temporal lobes (MTL), PFC, and parietal cortex are considered primary nodes in the episodic memory network, there is much debate regarding the contributions of MTL, PFC, and parietal subregions to recollection versus familiarity (dual-process theory) and the feasibility of accounts on the basis of a single memory strength process (strength theory). To investigate these issues, the current fMRI study measured activity during retrieval of memories that differed quantitatively in terms of strength (high vs. low-confidence trials) and qualitatively in terms of recollection versus familiarity (source vs. item memory tasks). Support for each theory varied depending on which node of the episodic memory network was considered. Results from MTL best fit a dual-process account, as a dissociation was found between a right hippocampal region showing high-confidence activity during the source memory task and bilateral rhinal regions showing high-confidence activity during the item memory task. Within PFC, several left-lateralized regions showed greater activity for source than item memory, consistent with recollective orienting, whereas a right-lateralized ventrolateral area showed low-confidence activity in both tasks, consistent with monitoring processes. Parietal findings were generally consistent with strength theory, with dorsal areas showing low-confidence activity and ventral areas showing high-confidence activity in both tasks. This dissociation fits with an attentional account of parietal functions during episodic retrieval. The results suggest that both dual-process and strength theories are partly correct, highlighting the need for an integrated model that links to more general cognitive theories to account for observed neural activity during episodic memory retrieval.

  3. Multiscale modeling of lithium ion batteries: thermal aspects

    PubMed Central

    Zausch, Jochen

    2015-01-01

    The thermal behavior of lithium ion batteries has a huge impact on their lifetime and on the initiation of degradation processes. The development of hot spots or large local overpotentials leading, e.g., to lithium metal deposition depends on material properties as well as on the nano- and microstructure of the electrodes. In recent years a theoretical structure has emerged that opens the possibility of establishing a systematic modeling strategy from the atomistic to the continuum scale, capturing and coupling the relevant phenomena on each scale. We outline the building blocks for such a systematic approach and discuss in detail a rigorous approach for the continuum scale based on rational thermodynamics and homogenization theories. Our focus is on the development of a systematic thermodynamically consistent theory for thermal phenomena in batteries at the microstructure scale and at the cell scale. We discuss the importance of carefully defining the continuum fields for being able to compare seemingly different phenomenological theories and for obtaining rules to determine unknown parameters of the theory by experiments or lower-scale theories. The resulting continuum models for the microscopic and the cell scale are numerically solved in full 3D resolution. The complex, very localized distributions of heat sources in the microstructure of a battery, and the problems of mapping these localized sources onto an averaged porous electrode model, are discussed by comparing detailed 3D microstructure-resolved simulations of the heat distribution with the result of the upscaled porous electrode model. It is shown that not all heat sources that exist on the microstructure scale are represented in the averaged theory, due to subtle cancellation effects of interface and bulk heat sources. Nevertheless, we find that in special cases the averaged thermal behavior can be captured very well by porous electrode theory. PMID:25977870

  4. Neural Correlates of Confidence during Item Recognition and Source Memory Retrieval: Evidence for Both Dual-process and Strength Memory Theories

    PubMed Central

    Hayes, Scott M.; Buchler, Norbou; Stokes, Jared; Kragel, James; Cabeza, Roberto

    2012-01-01

    Although the medial-temporal lobes (MTL), PFC, and parietal cortex are considered primary nodes in the episodic memory network, there is much debate regarding the contributions of MTL, PFC, and parietal subregions to recollection versus familiarity (dual-process theory) and the feasibility of accounts on the basis of a single memory strength process (strength theory). To investigate these issues, the current fMRI study measured activity during retrieval of memories that differed quantitatively in terms of strength (high vs. low-confidence trials) and qualitatively in terms of recollection versus familiarity (source vs. item memory tasks). Support for each theory varied depending on which node of the episodic memory network was considered. Results from MTL best fit a dual-process account, as a dissociation was found between a right hippocampal region showing high-confidence activity during the source memory task and bilateral rhinal regions showing high-confidence activity during the item memory task. Within PFC, several left-lateralized regions showed greater activity for source than item memory, consistent with recollective orienting, whereas a right-lateralized ventrolateral area showed low-confidence activity in both tasks, consistent with monitoring processes. Parietal findings were generally consistent with strength theory, with dorsal areas showing low-confidence activity and ventral areas showing high-confidence activity in both tasks. This dissociation fits with an attentional account of parietal functions during episodic retrieval. The results suggest that both dual-process and strength theories are partly correct, highlighting the need for an integrated model that links to more general cognitive theories to account for observed neural activity during episodic memory retrieval. PMID:21736454

  5. Experimental derivation of the fluence non-uniformity correction for air kerma near brachytherapy linear sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vianello, E. A.; Almeida, C. E. de

    2008-07-15

    In brachytherapy, one of the elements to take into account for measurements free in air is the non-uniformity of the photon fluence due to the beam divergence that causes a steep dose gradient near the source. The correction factors for this phenomenon have usually been evaluated by two available theories, by Kondo and Randolph [Radiat. Res. 13, 37-60 (1960)] and Bielajew [Phys. Med. Biol. 35, 517-538 (1990)], both conceived for point sources. This work presents the experimental validation of the Monte Carlo calculations made by Rodriguez and deAlmeida [Phys. Med. Biol. 49, 1705-1709 (2004)] for the non-uniformity correction, specifically for a Cs-137 linear source measured using a Farmer type ionization chamber. The experimental values agree very well with the Monte Carlo calculations and differ from the results predicted by both widely used theoretical models. This confirms that for linear sources there are important differences at short distances from the source, and that the point-source theories should not be applied to them. Considering the difficulties and uncertainties associated with the experimental measurements, it is recommended to use the Monte Carlo data to assess the non-uniformity factors for linear sources in situations that require this knowledge.

  6. Nanoseismic sources made in the laboratory: source kinematics and time history

    NASA Astrophysics Data System (ADS)

    McLaskey, G.; Glaser, S. D.

    2009-12-01

    When studying seismic signals in the field, the analysis of source mechanisms is always obscured by propagation effects such as scattering and reflections due to the inhomogeneous nature of the earth. To get around this complication, we measure seismic waves (wavelengths from 2 mm to 300 mm) in laboratory-sized specimens of extremely homogeneous isotropic materials. We are able to study the focal mechanism and time history of nanoseismic sources produced by fracture, impact, and sliding friction, roughly six orders of magnitude smaller and more rapid than typical earthquakes. Using very sensitive broadband conical piezoelectric sensors, we are able to measure surface normal displacements down to a few pm (10^-12 m) in amplitude. Thick plate specimens of homogeneous materials such as glass, steel, gypsum, and polymethylmethacrylate (PMMA) are used as propagation media in the experiments. Recorded signals are in excellent agreement with theoretically determined Green's functions obtained from a generalized ray theory code for an infinite plate geometry. Extremely precise estimates of the source time history are made via full waveform inversion from the displacement time histories recorded by an array of at least ten sensors. Each channel is sampled at a rate of 5 MHz. The system is absolutely calibrated using the normal impact of a tiny (~1 mm) ball on the surface of the specimen. The ball impact induces a force pulse into the specimen a few microseconds in duration. The amplitude, duration, and shape of the force pulse were found to be well approximated by Hertzian impact theory, while the total change in momentum of the ball is independently measured from its incoming and rebound velocities.
Another calibration source, the sudden fracture of a thin-walled glass capillary tube laid on its side and loaded against the surface of the specimen produces a similar point force, this time with a source function very nearly a step in time with rise time of less than 500 ns. The force at which the capillary breaks is recorded using a force sensor and is used for absolute calibration. A third set of nanoseismic sources were generated from frictional sliding. In this case, the location and spatial extent of the source along the cm-scale fault is not precisely known and must be determined. These sources are much more representative of earthquakes and the determination of their focal mechanisms is the subject of ongoing research. Sources of this type have been observed on a great range of time scales with rise times ranging from 500 ns to hundreds of ms. This study tests the generality of the seismic source representation theory. The unconventional scale, geometry, and experimental arrangement facilitates the discussion of issues such as the point source approximation, the origin of uncertainty in moment tensor inversions, the applicability of magnitude calculations for non-double-couple sources, and the relationship between momentum and seismic moment.
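
    The duration of the ball-impact calibration pulse can be estimated from Hertzian contact theory, which for a small elastic ball striking a massive flat plate gives t_c ≈ 2.87 (m² / (R E_eff² v))^(1/5). The sketch below uses generic handbook values for a 1 mm steel ball on glass at an assumed impact speed, not the study's measured parameters.

```python
import math

rho_steel = 7800.0          # kg/m^3 (handbook value)
R = 0.5e-3                  # ball radius, m (1 mm diameter)
v = 0.5                     # impact speed, m/s (assumed)
m = (4.0 / 3.0) * math.pi * R**3 * rho_steel

# Effective contact modulus: 1/E_eff = (1-nu1^2)/E1 + (1-nu2^2)/E2
E1, nu1 = 200e9, 0.29       # steel (handbook values)
E2, nu2 = 70e9, 0.22        # glass (handbook values)
E_eff = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)

# Hertzian contact duration
t_c = 2.87 * (m**2 / (R * E_eff**2 * v)) ** 0.2
print(f"contact time ~ {t_c * 1e6:.1f} microseconds")
```

    For parameters in this range the pulse lasts on the order of microseconds, which is why millimeter-scale ball impacts make useful broadband calibration sources for the MHz-sampled sensor array described above.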

  7. The contrasting roles of Planck's constant in classical and quantum theories

    NASA Astrophysics Data System (ADS)

    Boyer, Timothy H.

    2018-04-01

    We trace the historical appearance of Planck's constant in physics, and we note that initially the constant did not appear in connection with quanta. Furthermore, we emphasize that Planck's constant can appear in both classical and quantum theories. In both theories, Planck's constant sets the scale of atomic phenomena. However, the roles played in the foundations of the theories are sharply different. In quantum theory, Planck's constant is crucial to the structure of the theory. On the other hand, in classical electrodynamics, Planck's constant is optional, since it appears only as the scale factor for the (homogeneous) source-free contribution to the general solution of Maxwell's equations. Since classical electrodynamics can be solved while taking the homogeneous source-free contribution in the solution as zero or non-zero, there are naturally two different theories of classical electrodynamics, one in which Planck's constant is taken as zero and one where it is taken as non-zero. The textbooks of classical electromagnetism present only the version in which Planck's constant is taken to vanish.

  8. Recognition memory, self-other source memory, and theory-of-mind in children with autism spectrum disorder.

    PubMed

    Lind, Sophie E; Bowler, Dermot M

    2009-09-01

    This study investigated semantic and episodic memory in autism spectrum disorder (ASD), using a task which assessed recognition and self-other source memory. Children with ASD showed undiminished recognition memory but significantly diminished source memory, relative to age- and verbal ability-matched comparison children. Both children with and without ASD showed an "enactment effect", demonstrating significantly better recognition and source memory for self-performed actions than other-person-performed actions. Within the comparison group, theory-of-mind (ToM) task performance was significantly correlated with source memory, specifically for other-person-performed actions (after statistically controlling for verbal ability). Within the ASD group, ToM task performance was not significantly correlated with source memory (after controlling for verbal ability). Possible explanations for these relations between source memory and ToM are considered.

  9. Human brain networks in physiological aging: a graph theoretical analysis of cortical connectivity from EEG data.

    PubMed

    Vecchio, Fabrizio; Miraglia, Francesca; Bramanti, Placido; Rossini, Paolo Maria

    2014-01-01

    Modern analysis of electroencephalographic (EEG) rhythms provides information on dynamic brain connectivity. To test the hypothesis that aging processes modulate the brain connectivity network, EEG recording was conducted on 113 healthy volunteers. They were divided into three groups in accordance with their ages: 36 Young (15-45 years), 46 Adult (50-70 years), and 31 Elderly (>70 years). To evaluate the stability of the investigated parameters, a subgroup of 10 subjects underwent a second EEG recording two weeks later. Graph theory functions were applied to the undirected and weighted networks obtained by the lagged linear coherence evaluated by eLORETA on cortical sources. EEG frequency bands of interest were: delta (2-4 Hz), theta (4-8 Hz), alpha1 (8-10.5 Hz), alpha2 (10.5-13 Hz), beta1 (13-20 Hz), beta2 (20-30 Hz), and gamma (30-40 Hz). The spectral connectivity analysis of cortical sources showed that the normalized Characteristic Path Length (λ) presented the pattern Young > Adult > Elderly in the higher alpha band. Elderly also showed a greater increase in delta and theta bands than Young. The correlation between age and λ showed that higher ages corresponded to higher λ in delta and theta and lower λ in the alpha2 band; this pattern reflects the age-related modulation of higher (alpha) and decreased (delta) connectivity. The Normalized Clustering coefficient (γ) and small-world network modeling (σ) showed non-significant age-modulation. Evidence from the present study suggests that graph theory can aid in the analysis of connectivity patterns estimated from EEG and can facilitate the study of the physiological and pathological brain aging features of functional connectivity networks.
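
    The two graph metrics behind λ and γ, characteristic path length and clustering coefficient, can be computed in a few lines. The sketch below evaluates both on a toy unweighted graph; the study itself works on weighted lagged-coherence networks and normalizes each metric against degree-matched random networks (λ = L/L_rand, γ = C/C_rand), a step omitted here.

```python
import itertools

def characteristic_path_length(adj):
    """Mean shortest-path length over all node pairs (Floyd-Warshall)."""
    n = len(adj)
    INF = float("inf")
    d = [[0 if i == j else (1 if adj[i][j] else INF) for j in range(n)]
         for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    pairs = [d[i][j] for i in range(n) for j in range(n) if i < j]
    return sum(pairs) / len(pairs)

def clustering_coefficient(adj):
    """Mean fraction of a node's neighbor pairs that are themselves linked."""
    n = len(adj)
    cc = []
    for i in range(n):
        nbrs = [j for j in range(n) if adj[i][j]]
        k = len(nbrs)
        if k < 2:
            cc.append(0.0)
            continue
        links = sum(1 for a, b in itertools.combinations(nbrs, 2) if adj[a][b])
        cc.append(2.0 * links / (k * (k - 1)))
    return sum(cc) / n

# Toy network: a 4-cycle with one chord (edge 0-2)
adj = [[0, 1, 1, 1],
       [1, 0, 1, 0],
       [1, 1, 0, 1],
       [1, 0, 1, 0]]
L = characteristic_path_length(adj)   # 7/6 for this graph
C = clustering_coefficient(adj)       # 5/6 for this graph
```

    A higher λ indicates a less integrated network (longer average routes), which is the direction of the age effect reported above for the delta and theta bands.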

  10. Recent advances in coding theory for near error-free communications

    NASA Technical Reports Server (NTRS)

    Cheung, K.-M.; Deutsch, L. J.; Dolinar, S. J.; Mceliece, R. J.; Pollara, F.; Shahshahani, M.; Swanson, L.

    1991-01-01

    Channel and source coding theories are discussed. The following subject areas are covered: large constraint length convolutional codes (the Galileo code); decoder design (the big Viterbi decoder); Voyager's and Galileo's data compression scheme; current research in data compression for images; neural networks for soft decoding; neural networks for source decoding; finite-state codes; and fractals for data compression.

  11. Radicalization: An Overview and Annotated Bibliography of Open-Source Literature

    DTIC Science & Technology

    2006-12-15

    particularly with liberal, democratic, and humanistic Muslims. Phares points to Jihadism as the main root cause of terrorism and suggests that defending... of ambiguity), epistemic and existential needs theory (need for closure)... This book presents Terror Management Theory, which addresses behavioral and psychological responses to terrorist events. An existential...

  12. The occultation of Epsilon Geminorum by Mars - Analysis of McDonald data. [turbulent scintillation in light curves

    NASA Technical Reports Server (NTRS)

    Africano, J.; De Vaucouleurs, G.; Evans, D. S.; Finkel, B. E.; Nather, R. E.; Palm, C.; Silverberg, E.; Wiant, J.; Hubbard, W. B.; Jokipii, J. R.

    1977-01-01

    An analysis of observations of the occultation of Epsilon Gem by Mars on April 8, 1976, is presented. The data were obtained by three neighboring telescopes at McDonald Observatory. Intensity fluctuations on time scales of the order of 100 ms were observed simultaneously at the three telescopes. As the observations compare well with predictions of turbulent scintillation theory, it is concluded that such fluctuations were probably largely the effect of stellar scintillations in the Martian atmosphere. The stellar diameter is included as a parameter in the theory but in a way which differs from previously published interpretations of occultations of extended sources by planetary atmospheres. Scintillations govern the experimental uncertainty in the deduction of the scale height of the high Martian atmosphere. A density scale height of 9.9 ± 2.5 km is obtained at an altitude of 74 ± 8 km above the mean surface. For CO2 gas, this result corresponds to a temperature of 190 ± 50 K.
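
    The quoted temperature follows from the isothermal hydrostatic relation H = k_B T / (m g), so the numbers can be checked directly. The sketch below uses standard values for the Mars radius and surface gravity (assumed here, not taken from the paper) to evaluate g at the 74 km measurement altitude.

```python
# Consistency check: scale height -> temperature via H = k_B * T / (m * g)
k_B = 1.380649e-23          # Boltzmann constant, J/K
m_u = 1.660539e-27          # atomic mass unit, kg
m_co2 = 44.01 * m_u         # CO2 molecular mass, kg

R_mars = 3389.5e3           # mean Mars radius, m (standard value, assumed)
g_surface = 3.71            # surface gravity, m/s^2 (standard value, assumed)
h = 74e3                    # altitude of the measurement, m
g = g_surface * (R_mars / (R_mars + h)) ** 2

H = 9.9e3                   # reported density scale height, m
T = m_co2 * g * H / k_B
print(f"T ~ {T:.0f} K")     # lands within the reported 190 +/- 50 K
```

    The central value comes out close to the reported 190 K, confirming that the quoted scale height and temperature are mutually consistent for a CO2 atmosphere.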

  13. Tackling wicked problems: how theories of agency can provide new insights.

    PubMed

    Varpio, Lara; Aschenbrener, Carol; Bates, Joanna

    2017-04-01

    This paper reviews why and how theories of agency can be used as analytical lenses to help health professions education (HPE) scholars address our community's wicked problems. Wicked problems are those that resist clear problem statements, defy traditional analysis approaches, and refuse definitive resolution (e.g. student remediation, assessments of professionalism, etc.). We illustrate how theories of agency can provide new insights into such challenges by examining the application of these theories to one particular wicked problem in HPE: interprofessional education (IPE). After searching the HPE literature and finding that theories of agency had received little attention, we borrowed techniques from narrative literature reviews to search databases indexing a broad scope of disciplines (i.e. ERIC, Web of Science, Scopus, MEDLINE and PubMed) for publications (1994-2014) that: (i) examined agency, or (ii) incorporated an agency-informed analytical perspective. The lead author identified the theories of agency used in these articles, and reviewed the texts on agency cited therein and the original sources of each theory. We identified 10 theories of agency that we considered to be applicable to HPE's wicked problems. To select a subset of theories for presentation in this paper, we discussed each theory in relation to some of HPE's wicked problems. Through debate and reflection, we unanimously agreed on the applicability of a subset of theories for illuminating HPE's wicked problems. This subset is described in this paper. We present four theories of agency: Butler's post-structural formulation; Giddens' sociological formulation; cultural historical activity theory's formulation, and Bandura's social cognitive psychology formulation. We introduce each theory and apply each to the challenges of engaging in IPE. Theories of agency can inform HPE scholarship in novel and generative ways. 
Each theory offers new insights into the roots of wicked problems and means for contending with them. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  14. [Effects of attitude formation, persuasive message, and source expertise on attitude change: an examination based on the Elaboration Likelihood Model and the Attitude Formation Theory].

    PubMed

    Nakamura, M; Saito, K; Wakabayashi, M

    1990-04-01

    The purpose of this study was to investigate how attitude change is generated by the recipient's degree of attitude formation, evaluative-emotional elements contained in the persuasive messages, and source expertise as a peripheral cue in the persuasion context. Hypotheses based on the Attitude Formation Theory of Mizuhara (1982) and the Elaboration Likelihood Model of Petty and Cacioppo (1981, 1986) were examined. Eighty undergraduate students served as subjects in the experiment, the first stage of which involved manipulating the degree of attitude formation with respect to nuclear power development. Then, the experimenter presented persuasive messages with varying combinations of evaluative-emotional elements from a source with either high or low expertise on the subject. Results revealed a significant interaction effect on attitude change among attitude formation, persuasive message, and the expertise of the message source. That is, high attitude formation subjects resisted evaluative-emotional persuasion from the high expertise source, while low attitude formation subjects changed their attitude when exposed to the same persuasive message from a low expertise source. Results exceeded initial predictions based on the Attitude Formation Theory and the Elaboration Likelihood Model.

  15. Elemental, isotopic, and geochronological variability in Mogollon-Datil volcanic province archaeological obsidian, southwestern USA: Solving issues of intersource discrimination

    USGS Publications Warehouse

    Shackley, M. Steven; Morgan, Leah; Pyle, Douglas

    2017-01-01

    Solving issues of intersource discrimination in archaeological obsidian is a recurring problem in geoarchaeological investigation, particularly since the number of known sources of archaeological obsidian worldwide has grown nearly exponentially in the last few decades, and the complexity of archaeological questions asked has grown equally so. These two parallel aspects of archaeological investigation have required a more exacting understanding of the geological relationship between sources and more accurate analysis of these sources of archaeological obsidian. This is particularly the case in the North American Southwest, where the frequency of archaeological investigation is some of the highest in the world and the theory and method used to interpret that record have become increasingly nuanced. Here, we attempt to unravel the elemental similarity of archaeological obsidian in the Mogollon-Datil volcanic province of southwestern New Mexico, where some of the most important and extensively distributed sources are located and the elemental similarity between the sources is great even though the distance between them is large. Uniting elemental, isotopic, and geochronological analyses as an intensive pilot study, we unpack this complexity to provide greater understanding of these important sources of archaeological obsidian.

  16. Interference of Photons from a Weak Laser and a Quantum Dot

    NASA Astrophysics Data System (ADS)

    Ritchie, David; Bennett, Anthony; Patel, Raj; Nicoll, Christine; Shields, Andrew

    2010-03-01

    We demonstrate two-photon interference from two unsynchronized sources operating via different physical processes [1]. One source is spontaneous emission from the X^- state of an electrically-driven InAs/GaAs single quantum dot with μeV linewidth, the other stimulated emission from a laser with a neV linewidth. We mix the emission from these sources on a balanced non-polarising beam splitter and measure correlations in the photons that exit using Si-avalanche photodiodes and a time-correlated counting card. By periodically switching the polarisation state of the weak laser we simultaneously measure the correlation for parallel and orthogonally polarised sources, corresponding to maximum and minimum degrees of interference. When the two sources have the same intensity, a reduction in the correlation function at time zero for the case of parallel photon sources clearly indicates this interference effect. To quantify the degree of interference, we develop a theory that predicts the correlation function. Data and theory are then compared for a range of intensity ratios. Based on this analysis we infer a wave-function overlap of 91%, which is remarkable given the fundamental differences between the two sources. [1] Bennett A. J. et al., Nature Physics, 5, 715-717 (2009).

  17. Performance of a permanent-magnet helicon source at 27 and 13 MHz

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Francis F.

    2012-09-15

    A small helicon source is used to create dense plasma and inject it into a large chamber. A permanent magnet is used for the dc magnetic field (B-field), making the system very simple and compact. Though theory predicts that better antenna coupling will occur at 27.12 MHz, it was found that 13.56 MHz surprisingly gives even higher density due to practical effects not included in theory. Complete density n and electron temperature T_e profiles are measured at three distances below the source. The plasma inside the source is also measured with a special probe, even under the antenna. The density there is lower than expected because the plasma created is immediately ejected, filling the experimental chamber. The advantage of helicons over inductively coupled plasmas (with no B-field) increases with RF power. At high B-fields, edge ionization by the Trivelpiece-Gould mode can be seen. These results are useful for the design of multiple-tube, large-area helicon sources for plasma etching and deposition because problems are encountered which cannot be foreseen by theory alone.

  18. Coalescent: an open-source and scalable framework for exact calculations in coalescent theory

    PubMed Central

    2012-01-01

    Background Currently, there is no open-source, cross-platform and scalable framework for coalescent analysis in population genetics. There is no scalable GUI-based user application either. Such a framework and application would not only drive the creation of more complex and realistic models but also make them truly accessible. Results As a first attempt, we built a framework and user application for the domain of exact calculations in coalescent analysis. The framework provides an API with the concepts of model, data, statistic, phylogeny, gene tree and recursion. Infinite-alleles and infinite-sites models are considered. It defines pluggable computations such as counting and listing all the ancestral configurations and genealogies and computing the exact probability of data. It can visualize a gene tree, trace and visualize the internals of the recursion algorithm for further improvement, and dynamically attach a number of output processors. The user application defines jobs in a plug-in like manner so that they can be activated, deactivated, installed or uninstalled on demand. Multiple jobs can be run and their inputs edited. Job inputs are persisted across restarts and running jobs can be cancelled where applicable. Conclusions Coalescent theory plays an increasingly important role in analysing molecular population genetic data. Models involved are mathematically difficult and computationally challenging. An open-source, scalable framework that lets users immediately take advantage of the progress made by others will enable exploration of yet more difficult and realistic models. As models become more complex and mathematically less tractable, the need for an integrated computational approach is obvious. Object-oriented designs, though they have upfront costs, are practical now and can provide such an integrated approach. PMID:23033878
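    A textbook instance of an exact calculation under the infinite-alleles model is the Ewens sampling formula, which gives the exact probability of an allelic configuration; the sketch below is purely illustrative and does not use the framework's own API:

```python
from math import factorial

def ewens_probability(config, theta):
    """Exact probability of an allelic configuration under the
    infinite-alleles model (Ewens sampling formula).

    config[j] = number of allele types carried by exactly j+1 individuals;
    theta is the scaled mutation rate.
    """
    n = sum((j + 1) * a for j, a in enumerate(config))
    rising = 1.0
    for k in range(n):
        rising *= theta + k              # rising factorial theta^(n)
    p = factorial(n) / rising
    for j, a in enumerate(config, start=1):
        p *= theta**a / (j**a * factorial(a))
    return p

# sample of n = 3: one singleton and one doubleton, theta = 1
print(ewens_probability([1, 1, 0], 1.0))  # → 0.5
```

    For theta = 1 the three possible configurations of n = 3 have probabilities 1/6, 1/2 and 1/3, which sum to one, a quick sanity check on the formula.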

  19. Using Network Theory to Understand Seismic Noise in Dense Arrays

    NASA Astrophysics Data System (ADS)

    Riahi, N.; Gerstoft, P.

    2015-12-01

    Dense seismic arrays offer an opportunity to study anthropogenic seismic noise sources with unprecedented detail. Man-made sources typically have high frequency, low intensity, and propagate as surface waves. As a result attenuation restricts their measurable footprint to a small subset of sensors. Medium heterogeneities can further introduce wave front perturbations that limit processing based on travel time. We demonstrate a non-parametric technique that can reliably identify very local events within the array as a function of frequency and time without using travel-times. The approach estimates the non-zero support of the array covariance matrix and then uses network analysis tools to identify clusters of sensors that are sensing a common source. We verify the method on simulated data and then apply it to the Long Beach (CA) geophone array. The method exposes a helicopter traversing the array, oil production facilities with different characteristics, and the fact that noise sources near roads tend to be around 10-20 Hz.
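    The clustering step described above can be sketched as follows: threshold the estimated covariance (here correlation) support and extract connected components of the resulting sensor graph. This is a minimal illustration, not the authors' implementation; the threshold value and the toy data are assumptions.

```python
import numpy as np

def source_clusters(X, threshold=0.5):
    """Group sensors whose recordings share a common local source.

    X: (n_sensors, n_samples) array of sensor time series.
    Returns connected components of the thresholded correlation graph.
    """
    C = np.corrcoef(X)                   # normalized covariance matrix
    A = np.abs(C) > threshold            # estimated non-zero support
    np.fill_diagonal(A, False)
    n = A.shape[0]
    seen, clusters = set(), []
    for start in range(n):               # connected components via DFS
        if start in seen:
            continue
        comp, stack = [], [start]
        seen.add(start)
        while stack:
            i = stack.pop()
            comp.append(i)
            for j in np.flatnonzero(A[i]):
                if int(j) not in seen:
                    seen.add(int(j))
                    stack.append(int(j))
        clusters.append(sorted(comp))
    return clusters

# toy example: sensors 0-2 record one source, sensors 3-4 another
rng = np.random.default_rng(0)
s1, s2 = rng.normal(size=1000), rng.normal(size=1000)
noise = rng.normal(scale=0.1, size=(5, 1000))
X = np.vstack([s1, s1, s1, s2, s2]) + noise
print(source_clusters(X))  # → [[0, 1, 2], [3, 4]]
```

    No travel times enter the computation, which is the point of the approach: attenuation-limited footprints appear directly as small clusters in the support graph.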

  20. Self-care Concept Analysis in Cancer Patients: An Evolutionary Concept Analysis.

    PubMed

    Hasanpour-Dehkordi, Ali

    2016-01-01

    Self-care is a frequently used concept in both the theory and the clinical practice of nursing and is considered an element of nursing theory by Orem. The aim of this paper is to identify the core attributes of the self-care concept in cancer patients. We used Rodgers' evolutionary method of concept analysis. The articles published in English language from 1980 to 2015 on nursing and non-nursing disciplines were analyzed. Finally, 85 articles, an MSc thesis, and a PhD thesis were selected, examined, and analyzed in-depth. Two experts checked the process of analysis and monitored and reviewed the articles. The analysis showed that self-care concept is determined by four attributes of education, interaction, self-control, and self-reliance. Three types of antecedents in the present study were client-related (self-efficacy, self-esteem), system-related (adequate sources, social networks, and cultural factors), and healthcare professionals-related (participation). The self-care concept has considerably evolved among patients with chronic diseases, particularly cancer, over the past 35 years, and nurses have managed to enhance their knowledge about self-care remarkably for the clients so that the nurses in healthcare teams have become highly efficient and able to assume the responsibility for self-care teams.

  1. Toward a more comprehensive analysis of the role of organizational culture in child sexual abuse in institutional contexts.

    PubMed

    Palmer, Donald; Feldman, Valerie

    2017-12-01

    This article draws on a report prepared for the Australian Royal Commission into Institutional Responses to Child Sexual Abuse (Palmer et al., 2016) to develop a more comprehensive analysis of the role that organizational culture plays in child sexual abuse in institutional contexts, where institutional contexts are taken to be formal organizations that include children among their members (referred to here as "youth-serving organizations"). We begin by integrating five strains of theory and research on organizational culture from organizational sociology and management theory into a unified framework for analysis. We then elaborate the main paths through which organizational culture can influence child sexual abuse in youth-serving organizations. We then use our unified analytic framework and our understanding of the main paths through which organizational culture can influence child sexual abuse in youth-serving organizations to analyze the role that organizational culture plays in the perpetration, detection, and response to child sexual abuse in youth-serving organizations. We selectively illustrate our analysis with case materials compiled by the Royal Commission into Institutional Responses to Child Sexual Abuse and reports of child sexual abuse published in a variety of other sources. We conclude with a brief discussion of the policy implications of our analysis. Copyright © 2017. Published by Elsevier Ltd.

  2. Self-care Concept Analysis in Cancer Patients: An Evolutionary Concept Analysis

    PubMed Central

    Hasanpour-Dehkordi, Ali

    2016-01-01

    Background: Self-care is a frequently used concept in both the theory and the clinical practice of nursing and is considered an element of nursing theory by Orem. The aim of this paper is to identify the core attributes of the self-care concept in cancer patients. Materials and Methods: We used Rodgers’ evolutionary method of concept analysis. The articles published in English language from 1980 to 2015 on nursing and non-nursing disciplines were analyzed. Finally, 85 articles, an MSc thesis, and a PhD thesis were selected, examined, and analyzed in-depth. Two experts checked the process of analysis and monitored and reviewed the articles. Results: The analysis showed that self-care concept is determined by four attributes of education, interaction, self-control, and self-reliance. Three types of antecedents in the present study were client-related (self-efficacy, self-esteem), system-related (adequate sources, social networks, and cultural factors), and healthcare professionals-related (participation). Conclusion: The self-care concept has considerably evolved among patients with chronic diseases, particularly cancer, over the past 35 years, and nurses have managed to enhance their knowledge about self-care remarkably for the clients so that the nurses in healthcare teams have become highly efficient and able to assume the responsibility for self-care teams. PMID:27803559

  3. Cyclostratigraphy, sequence stratigraphy and organic matter accumulation mechanism

    NASA Astrophysics Data System (ADS)

    Cong, F.; Li, J.

    2016-12-01

    The first member of the Maokou Formation of the Sichuan basin is composed of well-preserved carbonate ramp couplets of limestone and marlstone/shale. It acts as one of the potential shale gas source rocks and is suitable for time-series analysis. We conducted time-series analysis to identify high-frequency sequences, reconstruct high-resolution sedimentation rates, estimate detailed primary productivity for the first time in the study intervals, and discuss the organic matter accumulation mechanism of the source rock under a sequence stratigraphic framework. Using the theory of cyclostratigraphy and sequence stratigraphy, the high-frequency sequences of one outcrop profile and one drilling well are identified. Two third-order sequences and eight fourth-order sequences are distinguished on the outcrop profile based on cycle stacking patterns. For the drilling well, the sequence boundary and four system tracts are distinguished by "integrated prediction error filter analysis" (INPEFA) of gamma-ray logging data, and eight fourth-order sequences are identified by the 405 ka long-eccentricity curve in the depth domain, which is quantified and filtered by integrated analysis of MTM spectral analysis, evolutive harmonic analysis (EHA), evolutive average spectral misfit (eASM) and band-pass filtering. This suggests that high-frequency sequences correlate well with Milankovitch orbital signals recorded in sediments, and that cyclostratigraphy theory is applicable to dividing high-frequency (4th- to 6th-order) sequence stratigraphy. High-resolution sedimentation rates are reconstructed through the study interval by tracking the highly statistically significant short-eccentricity component (123 ka) revealed by EHA. Based on sedimentation rate, measured TOC, and density data, the burial flux, delivery flux, and primary productivity of organic carbon were estimated.
By integrating redox proxies, we can discuss the controls on organic matter accumulation by primary production and preservation under the high-resolution sequence stratigraphic framework. Results show that high average organic carbon contents in the study interval are mainly attributed to high primary production. The results also show a good correlation between high organic carbon accumulation and intervals of transgression.

  4. Selling science 2.0: What scientific projects receive crowdfunding online?

    PubMed

    Schäfer, Mike S; Metag, Julia; Feustle, Jessica; Herzog, Livia

    2016-09-19

    Crowdfunding has emerged as an additional source for financing research in recent years. The study at hand identifies and tests explanatory factors influencing the success of scientific crowdfunding projects by drawing on news value theory, the "reputation signaling" approach, and economic theories of online payment. A standardized content analysis of 371 projects on English- and German-language platforms reveals that each theory provides factors influencing crowdfunding success. It shows that projects presented on science-only crowdfunding platforms have a higher success rate. Projects are also more likely to be successful when their presentation includes visualizations and humor, when their targeted funding is lower, when potential donors have to relinquish less personal data, and when more interaction between researchers and donors is possible. This suggests that after donors decide to visit a scientific crowdfunding platform, factors unrelated to science matter more for subsequent funding decisions, raising questions about the potential and implications of crowdfunding science. © The Author(s) 2016.

  5. The motivation to breastfeed: a fit to the opponent-process theory?

    PubMed

    Myers, H H; Siegel, P S

    1985-07-01

    The opponent-process theory, a dynamic model of acquired motivation presented by Solomon and Corbit (1974), was applied to the process of breastfeeding. A modified form of the Nowlis Mood Adjective Checklist (MACL, Nowlis, 1965, 1970) and a discomfort measure were used in assessing through recall the affective course predicted by the theory. The data were analyzed using multivariate analysis of variance (MANOVA) and correlational procedures. Results were highly significant: Women who breastfed for relatively long periods recalled positive affective responses while the baby was at breast and a subsequent negative or dysphoric response. The additional characteristics of acquired motivation, habituation, and withdrawal, were also evidenced in the data. As a control for possible confounding demand characteristics inherent in the methodology, a sample of childless women was surveyed using an "as-if" form of the same questionnaire. Very little similarity to the breastfeeders was found in the pattern of responses yielded by this group. It was concluded that our major findings are quite likely free of influence from this source.

  6. Exploring nursing educators' use of theory and methods in search for evidence based credibility in nursing education.

    PubMed

    Beccaria, Lisa; Kek, Megan Y C A; Huijser, Henk

    2018-06-01

    In this paper, a review of nursing education literature is employed to ascertain the extent to which nursing educators apply theory to their research, as well as the types of theory they employ. In addition, the use of research methodologies in the nursing education literature is explored. An integrative review. A systematic search was conducted for English-language, peer reviewed publications of any research design via Academic Search Complete, Science Direct, CINAHL, and Health Source: Nursing/Academic Edition databases from 2001 to 2016, of which 140 were reviewed. The findings suggest that within current nursing education literature the scholarship of discovery, and the exploration of epistemologies other than nursing, in particular as they relate to teaching and learning, shows significant potential for expansion and diversification. The analysis highlights opportunities for nursing educators to incorporate broader theoretical, pedagogical, methodological and philosophical perspectives within teaching and the scholarship of teaching. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Simplified analysis about horizontal displacement of deep soil under tunnel excavation

    NASA Astrophysics Data System (ADS)

    Tian, Xiaoyan; Gu, Shuancheng; Huang, Rongbin

    2017-11-01

    Most domestic scholars have focused on the law of soil settlement caused by subway tunnel excavation; studies on the law of horizontal displacement are lacking, and it is difficult to obtain horizontal displacement data at arbitrary depth in practice. At present, there are many formulas for calculating the settlement of soil layers; compared with the integral solutions of Mindlin's classical elastic theory, stochastic medium theory, and source-sink theory, the Peck empirical formula is relatively simple and has strong applicability domestically. Considering the incompressibility of the rock and soil mass and based on the plane-strain assumption, a formula for the horizontal displacement of the soil along the cross-section of the tunnel was derived from the Peck settlement formula. The applicability of the formula is verified by comparison with existing engineering cases, and a simple and rapid analytical method for predicting horizontal displacement is presented.
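    The Peck-based approach can be illustrated under a common incompressibility assumption (displacement vectors directed toward the tunnel axis, as in O'Reilly and New); the sketch below is an illustration under that assumption, not the formula derived in the paper, and all parameter values are hypothetical:

```python
import math

def peck_settlement(x, s_max, i):
    """Peck Gaussian settlement trough at transverse offset x (m)."""
    return s_max * math.exp(-x**2 / (2 * i**2))

def horizontal_displacement(x, s_max, i, z0):
    """Horizontal soil movement assuming displacement vectors point
    toward the tunnel axis at depth z0: u(x) = S(x) * x / z0."""
    return peck_settlement(x, s_max, i) * x / z0

# example: 30 mm max settlement, trough width i = 10 m, axis depth 20 m
u = horizontal_displacement(10.0, 0.030, 10.0, 20.0)
print(round(u * 1000, 2))  # horizontal displacement in mm at x = i → 9.1
```

    Under this assumption the horizontal movement vanishes on the centerline and peaks near the inflection point of the trough, consistent with the Gaussian settlement shape.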

  8. Algorithms for System Identification and Source Location.

    NASA Astrophysics Data System (ADS)

    Nehorai, Arye

    This thesis deals with several topics in least squares estimation and applications to source location. It begins with a derivation of a mapping between Wiener theory and Kalman filtering for nonstationary autoregressive moving average (ARMA) processes. Applying time domain analysis, connections are found between time-varying state space realizations and input-output impulse response by matrix fraction description (MFD). Using these connections, the whitening filters are derived by the two approaches, and the Kalman gain is expressed in terms of Wiener theory. Next, fast estimation algorithms are derived in a unified way as special cases of the Conjugate Direction Method. The fast algorithms included are the block Levinson, fast recursive least squares, ladder (or lattice) and fast Cholesky algorithms. The results give a novel derivation and interpretation for all these methods, which are efficient alternatives to available recursive system identification algorithms. Multivariable identification algorithms are usually designed only for left MFD models. In this work, recursive multivariable identification algorithms are derived for right MFD models with diagonal denominator matrices. The algorithms are of prediction error and model reference type. Convergence analysis results obtained by the Ordinary Differential Equation (ODE) method are presented along with simulations. Sources of energy can be located by estimating time differences of arrival (TDOA's) of waves between the receivers. A new method for TDOA estimation is proposed for multiple unknown ARMA sources and additive correlated receiver noise. The method is based on a formula that uses only the receiver cross-spectra and the source poles. Two algorithms are suggested that allow tradeoffs between computational complexity and accuracy. A new time delay model is derived and used to show the applicability of the methods for non-integer TDOA's.
Results from simulations illustrate the performance of the algorithms. The last chapter analyzes the response of exact least squares predictors for enhancement of sinusoids with additive colored noise. Using the matrix inversion lemma and the Christoffel-Darboux formula, the frequency response and amplitude gain of the sinusoids are expressed as functions of the signal and noise characteristics. The results generalize the available white noise case.
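    The cross-spectral TDOA method above is specialized; as a simpler baseline for comparison, the classical cross-correlation TDOA estimate between two receivers can be sketched as follows (illustrative only, with synthetic data; not the thesis's algorithm):

```python
import numpy as np

def tdoa(x_ref, x_delayed, fs):
    """Estimate the time difference of arrival (seconds) of x_delayed
    relative to x_ref from the peak of their full cross-correlation."""
    xc = np.correlate(x_delayed, x_ref, mode="full")
    lag = int(np.argmax(xc)) - (len(x_ref) - 1)   # peak index → lag in samples
    return lag / fs

# synthetic test: a broadband source delayed by 7 samples at fs = 1 kHz
fs = 1000.0
rng = np.random.default_rng(1)
s = rng.normal(size=500)
delay = 7
x1 = np.concatenate([s, np.zeros(delay)])   # reference receiver
x2 = np.concatenate([np.zeros(delay), s])   # delayed receiver
est = tdoa(x1, x2, fs)
print(est)  # → 0.007
```

    Cross-correlation only resolves integer-sample delays; handling non-integer TDOA's, as the thesis does, requires interpolation or a frequency-domain model.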

  9. Analysis of amorphous indium-gallium-zinc-oxide thin-film transistor contact metal using Pilling-Bedworth theory and a variable capacitance diode model

    NASA Astrophysics Data System (ADS)

    Kiani, Ahmed; Hasko, David G.; Milne, William I.; Flewitt, Andrew J.

    2013-04-01

    It is widely reported that threshold voltage and on-state current of amorphous indium-gallium-zinc-oxide bottom-gate thin-film transistors are strongly influenced by the choice of source/drain contact metal. Electrical characterisation of thin-film transistors indicates that the electrical properties depend on the type and thickness of the metal(s) used. Electron transport mechanisms and possibilities for control of the defect state density are discussed. Pilling-Bedworth theory for metal oxidation explains the interaction between contact metal and amorphous indium-gallium-zinc-oxide, which leads to significant trap formation. Charge trapping within these states leads to variable capacitance diode-like behavior and is shown to explain the thin-film transistor operation.

  10. Non Locality Proofs in Quantum Mechanics Analyzed by Ordinary Mathematical Logic

    NASA Astrophysics Data System (ADS)

    Nisticò, Giuseppe

    2014-10-01

    The so-called non-locality theorems aim to show that Quantum Mechanics is not consistent with the Locality Principle. Their proofs require, besides the standard postulates of Quantum Theory, further conditions, as for instance the Criterion of Reality, which cannot be formulated in the language of Standard Quantum Theory; this difficulty makes the proofs not verifiable according to usual logico-mathematical methods, and therefore it is a source of the controversial debate about the real implications of these theorems. The present work addresses this difficulty for Bell-type and Stapp's arguments of non-locality. We supplement the formalism of Quantum Mechanics with formal statements inferred from the further conditions in the two different cases. Then an analysis of the two arguments is performed according to ordinary mathematical logic.

  11. Development of guidelines for the definition of the relevant information content in data classes

    NASA Technical Reports Server (NTRS)

    Schmitt, E.

    1973-01-01

    The problem of experiment design is defined as an information system consisting of information source, measurement unit, environmental disturbances, data handling and storage, and the mathematical analysis and usage of data. Based on today's concept of effective computability, general guidelines for the definition of the relevant information content in data classes are derived. The lack of a universally applicable information theory and corresponding mathematical or system structure restricts the solvable problem classes to a small set. It is expected that a new relativity theory of information, generally described by a universal algebra of relations, will lead to new mathematical models and system structures capable of modeling any well-defined practical problem isomorphic to an equivalence relation at any corresponding level of abstractness.

  12. Analysis of aperture averaging measurements. [laser scintillation data on the effect of atmospheric turbulence on signal fluctuations

    NASA Technical Reports Server (NTRS)

    Fried, D. L.

    1975-01-01

    Laser scintillation data obtained by the NASA Goddard Space Flight Center balloon flight no. 5 from White Sands Missile Range on 19 October 1973 are analyzed. The measurement data, taken with various size receiver apertures, were related to predictions of aperture averaging theory, and it is concluded that the data are in reasonable agreement with theory. The following parameters are assigned to the vertical distribution of the strength of turbulence during the period of the measurements (daytime), for λ = 0.633 μm and the source at the zenith: the aperture averaging length is d_0 = 0.125 m, and the log-amplitude variance is (β_l)^2 = 0.084 square nepers. This corresponds to a normalized point intensity variance of 0.40.
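    The two quoted variances are mutually consistent under the standard weak-scintillation lognormal relation, sigma_I^2 = exp(4 (β_l)^2) - 1; the check below is ours, not part of the original analysis:

```python
import math

beta_l_sq = 0.084                           # log-amplitude variance (nepers^2)
sigma_I_sq = math.exp(4 * beta_l_sq) - 1    # lognormal weak-scintillation relation
print(round(sigma_I_sq, 2))                 # → 0.4
```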

  13. Residents’ Waste Separation Behaviors at the Source: Using SEM with the Theory of Planned Behavior in Guangzhou, China

    PubMed Central

    Zhang, Dongliang; Huang, Guangqing; Yin, Xiaoling; Gong, Qinghua

    2015-01-01

    Understanding the factors that affect residents’ waste separation behaviors helps in constructing effective environmental campaigns for a community. Using the theory of planned behavior (TPB), this study examines factors associated with waste separation behaviors by analyzing responses to questionnaires distributed in Guangzhou, China. Data drawn from 208 of 1000 field questionnaires were used to assess socio-demographic factors and the TPB constructs (i.e., attitudes, subjective norms, perceived behavioral control, intentions, and situational factors). The questionnaire data revealed that attitudes, subjective norms, perceived behavioral control, intentions, and situational factors significantly predicted household waste behaviors in Guangzhou, China. Through a structural equation modeling analysis, we concluded that campaigns targeting moral obligations may be particularly effective for increasing the participation rate in waste separation behaviors. PMID:26274969

  14. Locating relationship and communication issues among stressors associated with breast cancer.

    PubMed

    Weber, Kirsten M; Solomon, Denise Haunani

    2008-11-01

    This article clarifies how the social contexts in which breast cancer survivors live can contribute to the stress they experience because of the disease. Guided by Solomon and Knobloch's (2004) relational turbulence model and Petronio's (2002) communication privacy management theory, this study explores personal relationship and communication boundary issues within stressors that are associated with the diagnosis, treatment, and early survivorship of breast cancer. A qualitative analysis of discourse posted on breast cancer discussion boards and weblogs using the constant comparative method and open-coding techniques revealed 12 sources of stress. Using axial coding methods and probing these topics for underlying relationship and communication issues yielded 5 themes. The discussion highlights the implications of the findings for the theories that guided this investigation and for breast cancer survivorship more generally.

  15. The resolution of point sources of light as analyzed by quantum detection theory

    NASA Technical Reports Server (NTRS)

    Helstrom, C. W.

    1972-01-01

    The resolvability of point sources of incoherent light is analyzed by quantum detection theory in terms of two hypothesis-testing problems. In the first, the observer must decide whether there are two sources of equal radiant power at given locations, or whether there is only one source of twice the power located midway between them. In the second problem, either one, but not both, of two point sources is radiating, and the observer must decide which it is. The decisions are based on optimum processing of the electromagnetic field at the aperture of an optical instrument. In both problems the density operators of the field under the two hypotheses do not commute. The error probabilities, determined as functions of the separation of the points and the mean number of received photons, characterize the ultimate resolvability of the sources.
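The hypothesis-testing framework above has a simple closed form for two pure states: the Helstrom bound. The sketch below is a textbook illustration rather than the paper's full field-operator calculation, and the coherent-state overlap formula is an assumption standing in for the detailed optical field states:

```python
import numpy as np

def helstrom_error(overlap_sq, p0=0.5, p1=0.5):
    """Minimum error probability for discriminating two pure states with
    prior probabilities p0, p1 and squared overlap |<psi0|psi1>|^2."""
    return 0.5 * (1.0 - np.sqrt(1.0 - 4.0 * p0 * p1 * overlap_sq))

# For coherent states |alpha>, |beta>: |<alpha|beta>|^2 = exp(-|alpha - beta|^2),
# so a larger separation of the two states means a smaller overlap and a
# lower probability of deciding wrongly between the two hypotheses.
for sep in (0.5, 1.0, 2.0):
    print(sep, helstrom_error(np.exp(-sep ** 2)))
```

Orthogonal states (`overlap_sq = 0`) give zero error, while identical states give the coin-flip value 0.5, mirroring the paper's use of error probability as the measure of ultimate resolvability.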

  16. General analytical solutions for DC/AC circuit-network analysis

    NASA Astrophysics Data System (ADS)

    Rubido, Nicolás; Grebogi, Celso; Baptista, Murilo S.

    2017-06-01

    In this work, we present novel general analytical solutions for the currents that are developed in the edges of network-like circuits when some nodes of the network act as sources/sinks of DC or AC current. We assume that Ohm's law is valid at every edge and that charge at every node is conserved (with the exception of the source/sink nodes). The resistive, capacitive, and/or inductive properties of the lines in the circuit define a complex network structure with given impedances for each edge. Our solution for the currents at each edge is derived in terms of the eigenvalues and eigenvectors of the Laplacian matrix of the network defined from the impedances. This derivation also allows us to compute the equivalent impedance between any two nodes of the circuit and relate it to currents in a closed circuit which has a single voltage generator instead of many input/output source/sink nodes. This simplifies the treatment that could be done via Thévenin's theorem. In contrast to solving Kirchhoff's equations directly, our derivation allows us to easily calculate the redistribution of currents that occurs when the location of sources and sinks changes within the network. Finally, we show that our solutions are identical to the ones found from Circuit Theory nodal analysis.
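The Laplacian construction described in this abstract is easy to sketch numerically. The helper below is hypothetical (not the authors' code); it uses NumPy's Moore-Penrose pseudoinverse in place of the explicit eigenvalue/eigenvector expansion, to which it is equivalent on the subspace orthogonal to the constant vector:

```python
import numpy as np

def edge_currents(n_nodes, edges, impedances, source, sink, injected=1.0):
    """Edge currents in an impedance network when `injected` amperes enter
    at `source` and leave at `sink`.

    Builds the weighted graph Laplacian from edge admittances 1/Z and
    solves L V = J for the node potentials with the pseudoinverse."""
    L = np.zeros((n_nodes, n_nodes), dtype=complex)
    for (i, j), Z in zip(edges, impedances):
        y = 1.0 / Z                      # admittance of edge (i, j)
        L[i, i] += y
        L[j, j] += y
        L[i, j] -= y
        L[j, i] -= y
    J = np.zeros(n_nodes, dtype=complex)
    J[source], J[sink] = injected, -injected
    V = np.linalg.pinv(L) @ J            # node potentials (up to a constant)
    # Ohm's law on every edge, oriented from i to j.
    return [(V[i] - V[j]) / Z for (i, j), Z in zip(edges, impedances)]

# Two 1-ohm resistors in series between nodes 0 and 2:
# the full 1 A flows through both edges.
currents = edge_currents(3, [(0, 1), (1, 2)], [1.0, 1.0], source=0, sink=2)
```

Extending the helper to return `V[source] - V[sink]` gives the equivalent impedance between the two nodes, and moving `source`/`sink` recomputes the current redistribution without rebuilding anything.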

  17. USU Center of Excellence in Theory and Analysis of the Geo-Plasma Environment

    DTIC Science & Technology

    1992-05-25

    ...OTH radars, communications, and orbiting space structures. The overall goal of the research is to obtain a better understanding of the basic chemical and physical processes

  18. Analysis of the United States Marine Corps’ Utilization of Defense Logistics Agency Disposition Services as a Source of Supply

    DTIC Science & Technology

    2011-12-03

    difficult within the DoD because “most incentives and motivations are not apparent for either government or industry” (p. 84). Doane and Spencer (1997...stated that industry incentives and motivation seem to be based on the same profit and loss theories that were present before acquisition reform...purchased items needed for the unit’s day-to-day operations, such as digital cameras, commercial tactical clothing and eyewear, plasma televisions, lawn

  19. Unified aeroacoustics analysis for high speed turboprop aerodynamics and noise. Volume 5: Propagation of propeller tone noise through a fuselage boundary layer

    NASA Technical Reports Server (NTRS)

    Magliozzi, B.; Hanson, D. B.

    1991-01-01

    An analysis of tone noise propagation through a boundary layer and fuselage scattering effects was derived. The analysis is three-dimensional, and the complete wave field is solved by matching analytical expressions for the incident and scattered waves in the outer flow to a numerical solution in the boundary layer flow. The outer wave field is constructed analytically from an incident wave appropriate to the source and a scattered wave in the standard Hankel function form. For the incident wave, an existing frequency-domain propeller noise radiation theory is used. In the boundary layer region, the wave equation is solved by numerical methods. The theoretical analysis is embodied in a computer program which allows the calculation of correction factors for the fuselage scattering and boundary layer refraction effects. The effects depend on the boundary layer profile, flight speed, and frequency. Corrections can be derived for any point on the fuselage, including those on the opposite side from the source. The theory was verified using limited cases and by comparing calculations with available measurements from JetStar tests of model prop-fans. For the JetStar model scale, the boundary layer refraction effects produce moderate fuselage pressure reinforcements aft of and near the plane of rotation and significant attenuation forward of the plane of rotation at high flight speeds. At lower flight speeds, the calculated boundary layer effects result in moderate amplification over the fuselage area of interest. Apparent amplification forward of the plane of rotation is a result of effective changes in the source directivity due to boundary layer refraction effects. Full-scale effects are calculated to be moderate, providing fuselage pressure amplification of about 5 dB at the peak noise location. Evaluation using available noise measurements was made under high-speed, high-altitude flight conditions.
Comparisons of calculations made of free field noise, using a current frequency-domain propeller noise prediction method, and fuselage effects using this new procedure show good agreement with fuselage measurements over a wide range of flight speeds and frequencies. Correction factors for the JetStar measurements made on the fuselage are provided in an Appendix.

  20. Mode selection of magneto-Rayleigh-Taylor instability from the point of view of Landau phase transition theory

    NASA Astrophysics Data System (ADS)

    Dan, Jia Kun; Huang, Xian Bin; Ren, Xiao Dong; Wei, Bing

    2017-08-01

    A theoretical model referring to mode selection of Z-pinch-driven magneto-Rayleigh-Taylor (MRT) instability, which explains the generation of the fundamental instability mode and the evolution of the fundamental wavelength in experiments, is proposed on the basis of the Landau theory of phase transition. The basic idea of this phase transition model is that the appearance of the MRT instability pattern can be considered a consequence of the spontaneous generation of interfacial structure, like the spontaneous magnetization in a ferromagnetic system. It is demonstrated that the amplitude of the instability plays the role of the order parameter in the Landau theory of phase transition and that the fundamental wavelength appears to play a role analogous to inverse temperature in thermodynamics. Further analysis indicates that the MRT instability is characterized by a first-order phase transition and that the fundamental wavelength is proportional to the square root of the energy entering the system from the driving source. The theory predicts that the fundamental wavelength grows rapidly and saturates, reaching a limiting wavelength of the order of the liner's final outer radius. The results given by this theory show qualitative agreement with the available experimental data on MRT instability of liner implosions conducted on the Sandia Z machine as well as the Primary Test Stand facility at the Institute of Fluid Physics.

  1. The Strategic Exercise of Options Using Government Subsidies: An Analysis of Production Subsidies for the Ground Source Heat Pump

    NASA Astrophysics Data System (ADS)

    Lin, Sheng-Hau; Li, Jia-Hsun; Hsu, Chih-Chen; Hsieh, Jing-Chzi; Liao, Pin-Chao

    2018-04-01

    This study utilizes consolidation investment theory, together with business strategies and government subsidies, to develop a strategic options-exercise model. The empirical investigation examines the ground source heat pump (GSHP) government subsidy program, part of China’s 12th Five-Year Plan. The model is applied to explain business investment behaviour with regard to strategic investment timing, option values, and the influence of government subsidies in duopolistic real-world investment decisions. The results indicate that subsidy policy can narrow the differences in investment timing among GSHP investors, and they clearly evidence a positive benefit–cost ratio of the government subsidy, which facilitates the development of China’s GSHP industry.

  2. Dynamical Analysis of a Cylindrical Piezoelectric Transducer

    NASA Astrophysics Data System (ADS)

    LU, P.; LEE, K. H.; LIM, S. P.

    2003-01-01

    In the present paper, the vibration of a cylindrical piezoelectric transducer induced by an applied voltage, which can be used as the stator transducer of a cylindrical micromotor, is studied based on shell theory. The transducer is modelled as a thin elastic cylinder. The properties of the vibration modes of the transducer, such as mode frequencies and amplitude ratios of the mode shapes, are determined using the Galerkin method. The response of the transducer under four electric sources with 90° phase difference is then obtained by the modal summation method. With these results, the performance of the transducer under the electric sources can be estimated. The present work provides a general and precise theoretical model of the dynamic behaviour of the transducer.

  3. Wireless Power Transfer for Space Applications

    NASA Technical Reports Server (NTRS)

    Ramos, Gabriel Vazquez; Yuan, Jiann-Shiun

    2011-01-01

    This paper introduces an implementation of magnetic-resonance wireless power transfer for space applications. The analysis includes an equivalent-impedance study, loop material characterization, a source/load resonance coupling technique, and the system response to load variability. System characterization is accomplished through circuit design from analytical equations and simulations in Matlab and SPICE. The theory was validated by a combination of experiments covering loop materials, resonance coupling circuits, electrical loads, and a small-scale proof-of-concept prototype. The experimental results show successful wireless power transfer for all the cases studied. The prototype delivered about 4.5 W to the load at a separation of about 5 cm from the source using a power amplifier rated for 7 W.

  4. Near-field interferometry of a free-falling nanoparticle from a point-like source

    NASA Astrophysics Data System (ADS)

    Bateman, James; Nimmrichter, Stefan; Hornberger, Klaus; Ulbricht, Hendrik

    2014-09-01

    Matter-wave interferometry performed with massive objects elucidates their wave nature and thus tests the quantum superposition principle at large scales. Whereas standard quantum theory places no limit on particle size, alternative, yet untested theories—conceived to explain the apparent quantum to classical transition—forbid macroscopic superpositions. Here we propose an interferometer with a levitated, optically cooled and then free-falling silicon nanoparticle in the mass range of one million atomic mass units, delocalized over >150 nm. The scheme employs the near-field Talbot effect with a single standing-wave laser pulse as a phase grating. Our analysis, which accounts for all relevant sources of decoherence, indicates that this is a viable route towards macroscopic high-mass superpositions using available technology.

  5. Magnetic Reconnection Driven by Thermonuclear Burning

    NASA Astrophysics Data System (ADS)

    Gatto, R.; Coppi, B.

    2017-10-01

    Considering that fusion reaction products (e.g. α-particles) deposit their energy on the electrons, the relevant thermal energy balance equation is characterized by a fusion source term, a relatively large longitudinal thermal conductivity and an appropriate transverse thermal conductivity. Then, looking for modes that are radially localized around rational surfaces, reconnected field configurations are found that can be sustained by the electron thermal energy source due to fusion reactions. This process can thus be included in the category of endogenous reconnection processes and may be viewed as a form of the thermonuclear instability that can develop in an ignited inhomogeneous plasma. A complete analysis of the equations supporting the relevant theory is reported. Sponsored in part by the U.S. DoE.

  6. THEORY IN RELIGION AND AGING: AN OVERVIEW

    PubMed Central

    Levin, Jeff; Chatters, Linda M.; Taylor, Robert Joseph

    2011-01-01

    This paper provides an overview of theory in religion, aging, and health. It offers both a primer on theory and a roadmap for researchers. Four “tenses” of theory are described—distinct ways that theory comes into play in this field: grand theory, mid-range theory, use of theoretical models, and positing of constructs which mediate or moderate putative religious effects. Examples are given of both explicit and implicit uses of theory. Sources of theory for this field are then identified, emphasizing perspectives of sociologists and psychologists, and discussion is given to limitations of theory. Finally, reflections are offered as to why theory matters. PMID:20087662

  7. Investigating the purpose of trigonometry in the modern sciences

    NASA Astrophysics Data System (ADS)

    Hertel, Joshua T.

    This dissertation reports the results of a qualitative research project that aimed to develop a research-based perspective on the purpose of trigonometry in the modern sciences. The investigation was guided by three objectives. First, the study sought to identify the purpose of trigonometry as described by educators and high school textbooks. Second, the research investigated the perspectives these sources held about definitions of the trigonometric functions. Third, the investigation examined the potential benefits and drawbacks of a line-segment definition of the trigonometric functions. The study followed a grounded theory methodology with data collection and analysis intertwined. Participants included faculty from two large Midwestern research universities, high school teachers, and authors of standards documents. Textbooks were drawn from introductory algebra, geometry, advanced algebra, precalculus, and calculus texts. Data collected included surveys, interviews, and textbook excerpts. Analysis used the constant comparative method (Corbin & Strauss, 2008; Glaser & Strauss, 2006/1967). Analysis resulted in the emergence of a grounded theory, the tensions of trigonometry, which described three interrelated themes within the data: definition, application, and role. Two ideas emerged that connected the tensions of trigonometry, the regions of interaction, which described the interplay between the three tensions, and the idealized dichotomy of trigonometry education, which outlined opposing perspectives on trigonometry: trigonometry for all and trigonometry for some. The grounded theory outlines a range of competing purposes for trigonometry in the modern sciences. It suggests that educators are engaged in a process of continual negotiation that results in the formation of a localized purpose of trigonometry. The benefits and drawbacks of different definitions are not based on mathematical sophistication, but are situational. 
Furthermore, the theory suggests that the line-segment definition faces a number of obstacles if it is to be adopted. Implications for future research on the teaching and learning of trigonometry are discussed.

  8. Recollection is a continuous process: implications for dual-process theories of recognition memory.

    PubMed

    Mickes, Laura; Wais, Peter E; Wixted, John T

    2009-04-01

    Dual-process theory, which holds that recognition decisions can be based on recollection or familiarity, has long seemed incompatible with signal detection theory, which holds that recognition decisions are based on a singular, continuous memory-strength variable. Formal dual-process models typically regard familiarity as a continuous process (i.e., familiarity comes in degrees), but they construe recollection as a categorical process (i.e., recollection either occurs or does not occur). A continuous process is characterized by a graded relationship between confidence and accuracy, whereas a categorical process is characterized by a binary relationship such that high confidence is associated with high accuracy but all lower degrees of confidence are associated with chance accuracy. Using a source-memory procedure, we found that the relationship between confidence and source-recollection accuracy was graded. Because recollection, like familiarity, is a continuous process, dual-process theory is more compatible with signal detection theory than previously thought.
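The graded confidence-accuracy relation that signals a continuous process can be illustrated with a toy equal-variance signal-detection simulation (an illustration of the authors' argument, not their procedure; the effect size, sample size, and quartile binning are arbitrary assumptions):

```python
import numpy as np

# Source-memory strength: Gaussian around +mu for items studied in
# source A and around -mu for source B; the sign of the retrieved
# strength gives the source decision, its magnitude the confidence.
rng = np.random.default_rng(0)
mu, n = 1.0, 200_000

true_a = rng.random(n) < 0.5
d = np.where(true_a, mu, -mu) + rng.standard_normal(n)

correct = (d > 0) == true_a
confidence = np.abs(d)

# Accuracy within confidence quartiles: in a continuous model it rises
# smoothly with confidence rather than jumping from chance to ceiling.
edges = np.quantile(confidence, [0.25, 0.5, 0.75])
bins = np.digitize(confidence, edges)
accuracy = [correct[bins == b].mean() for b in range(4)]
```

Accuracy climbs steadily across the quartiles; a categorical (all-or-none) recollection process would instead show chance accuracy in every bin below the highest confidence level.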

  9. Renormalization group theory for percolation in time-varying networks.

    PubMed

    Karschau, Jens; Zimmerling, Marco; Friedrich, Benjamin M

    2018-05-22

    Motivated by multi-hop communication in unreliable wireless networks, we present a percolation theory for time-varying networks. We develop a renormalization group theory for a prototypical network on a regular grid, where individual links switch stochastically between active and inactive states. The question whether a given source node can communicate with a destination node along paths of active links is equivalent to a percolation problem. Our theory maps the temporal existence of multi-hop paths on an effective two-state Markov process. We show analytically how this Markov process converges towards a memoryless Bernoulli process as the hop distance between source and destination node increases. Our work extends classical percolation theory to the dynamic case and elucidates temporal correlations of message losses. Quantification of temporal correlations has implications for the design of wireless communication and control protocols, e.g. in cyber-physical systems such as self-organized swarms of drones or smart traffic networks.
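The convergence toward a memoryless Bernoulli process can be checked analytically in a simplified setting: a single chain of independent two-state Markov links rather than the paper's full grid with renormalization. The function below (names and parameters are my own) gives the lag-1 autocorrelation of the path-existence indicator and shows it decaying with hop distance:

```python
def path_lag1_correlation(n_hops, p_on, a):
    """Lag-1 autocorrelation of the 'path exists' indicator for a chain of
    n_hops independent two-state Markov links (a toy model, simpler than
    the paper's grid).

    p_on : stationary probability that a link is active
    a    : P(active at t+1 | active at t), with a >= p_on
    """
    mean = p_on ** n_hops                 # P(path exists at any instant)
    # For one link, P(active at t and t+1) = p_on * a; links are independent.
    joint = (p_on * a) ** n_hops          # P(path exists at t and t+1)
    var = mean * (1.0 - mean)
    return (joint - mean ** 2) / var

# Temporal correlation of path availability decays as the hop distance
# between source and destination grows, approaching the Bernoulli limit.
decay = [path_lag1_correlation(n, 0.5, 0.9) for n in (1, 2, 4, 8)]
```

Setting `a = p_on` (already-memoryless links) gives zero correlation at every distance, which is a quick sanity check on the formula.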

  10. Theory of Gamma-Ray Burst Sources

    NASA Astrophysics Data System (ADS)

    Ramirez-Ruiz, Enrico

    In the sections which follow, we shall be concerned predominantly with the theory of γ-ray burst sources. If the concepts there proposed are indeed relevant to an understanding of the nature of these sources, then their existence becomes inextricably linked to the metabolic pathways through which gravity, spin, and energy can combine to form collimated, ultrarelativistic outflows. These threads are few and fragile, as we are still wrestling with trying to understand non-relativistic processes, most notably those associated with the electromagnetic field and gas dynamics. If we are to improve our picture-making we must make more and stronger ties of physical theory. But in reconstructing the creature, we must be guided by our eyes and their extensions. In this introductory section we have therefore attempted to summarise the observed properties of these ultra-energetic phenomena.

  11. Towards the theory of pollinator-mediated gene flow.

    PubMed Central

    Cresswell, James E

    2003-01-01

    I present a new exposition of a model of gene flow by animal-mediated pollination between a source population and a sink population. The model's parameters describe two elements: (i) the expected portion of the source's paternity that extends to the sink population; and (ii) the dilution of this portion by within-sink pollinations. The model is termed the portion-dilution model (PDM). The PDM is a parametric restatement of the conventional view of animal-mediated pollination. In principle, it can be applied to plant species in general. I formulate a theoretical value of the portion parameter that maximizes gene flow and prescribe this as a benchmark against which to judge the performance of real systems. Existing foraging theory can be used in solving part of the PDM, but a theory for source-to-sink transitions by pollinators is currently elusive. PMID:12831465

  12. Nonlinear theory of shocked sound propagation in a nearly choked duct flow

    NASA Technical Reports Server (NTRS)

    Myers, M. K.; Callegari, A. J.

    1982-01-01

    The development of shocks in the sound field propagating through a nearly choked duct flow is analyzed by extending a quasi-one dimensional theory. The theory is applied to the case in which sound is introduced into the flow by an acoustic source located in the vicinity of a near-sonic throat. Analytical solutions for the field are obtained which illustrate the essential features of the nonlinear interaction between sound and flow. Numerical results are presented covering ranges of variation of source strength, throat Mach number, and frequency. It is found that the development of shocks leads to appreciable attenuation of acoustic power transmitted upstream through the near-sonic flow. It is possible, for example, that the power loss in the fundamental harmonic can be as much as 90% of that introduced at the source.

  13. Development of a noncompact source theory with applications to helicopter rotors

    NASA Technical Reports Server (NTRS)

    Farassat, F.; Brown, T. J.

    1976-01-01

    A new formulation for determining the acoustic field of moving bodies, based on the acoustic analogy, is derived. The acoustic pressure is given as the sum of two integrals, one of which has a derivative with respect to time. The integrands are functions of the normal velocity and surface pressure of the body. A computer program based on this formulation was used to calculate acoustic pressure signatures for several helicopter rotors from experimental surface pressure data. Results are compared with those from compact source calculations. It is shown that noncompactness of steady sources on the rotor can account for the high harmonics of the pressure signature. Thickness noise is shown to be a significant source of sound, especially for blunt airfoils in regions where noncompact source theory should be applied.

  14. Seismic wavefield propagation in 2D anisotropic media: Ray theory versus wave-equation simulation

    NASA Astrophysics Data System (ADS)

    Bai, Chao-ying; Hu, Guang-yi; Zhang, Yan-teng; Li, Zhong-sheng

    2014-05-01

    Although ray theory rests on the high-frequency assumption of the elastic wave equation, ray-theoretical and wave-equation simulation methods should serve as mutual checks and hence be developed jointly; in practice, however, they have progressed independently and in parallel. For this reason, in this paper we take an alternative approach and mutually verify the computational accuracy and solution correctness of both ray theory (the multistage irregular shortest-path method) and wave-equation simulation (both the staggered finite-difference method and the pseudo-spectral method) in anisotropic VTI and TTI media. Through the analysis and comparison of wavefield snapshots, common-source gather profiles, and synthetic seismograms, we are able not only to verify the accuracy and correctness of each method, at least for kinematic features, but also to understand thoroughly the kinematic and dynamic features of wave propagation in anisotropic media. The results show that the staggered finite-difference method and the pseudo-spectral method yield the same results even for complex anisotropic media (such as a fault model), and that the multistage irregular shortest-path method predicts kinematic features similar to those of the wave-equation simulations, so the two approaches can be used to test each other for methodological accuracy and solution correctness. In addition, with the aid of the ray-tracing results, it is easy to identify the multiple phases (or multiples) in the wavefield snapshots, common-source-point gather sections, and synthetic seismograms predicted by the wave-equation simulation, which is a key issue for later seismic applications.
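To give a flavor of the wave-equation side of such a comparison, here is a minimal 1D staggered-grid velocity-stress acoustic solver (an isotropic toy sketch, far simpler than the paper's 2D anisotropic VTI/TTI codes; the Ricker wavelet and all parameter values are assumptions):

```python
import numpy as np

def ricker(t, f0, t0):
    """Ricker wavelet source time function (a common choice in seismic
    modelling; the paper does not specify its wavelet)."""
    a = (np.pi * f0 * (t - t0)) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def staggered_fd_1d(nx=301, nt=200, c=1.0, dx=1.0, cfl=0.5, f0=0.05):
    """Minimal 1D velocity-stress staggered-grid acoustic solver:
    pressure p lives on integer grid points, particle velocity v on
    half points, and the two fields are leapfrogged in time."""
    dt = cfl * dx / c                          # satisfies the CFL condition
    rho = 1.0
    K = rho * c ** 2                           # bulk modulus
    p = np.zeros(nx)
    v = np.zeros(nx - 1)
    src, t0 = nx // 2, 1.5 / f0                # source at the grid centre
    for it in range(nt):
        v += (dt / rho) * np.diff(p) / dx      # velocity update at t + dt/2
        p[1:-1] += dt * K * np.diff(v) / dx    # pressure update at t + dt
        p[src] += ricker(it * dt, f0, t0) * dt # inject the source wavelet
    return p

snapshot = staggered_fd_1d()   # wavefield snapshot, symmetric about the source
```

In a homogeneous medium the two wavefronts leave the centred source symmetrically, which makes a convenient correctness check; the papers' 2D anisotropic versions replace the scalars `rho` and `K` with direction-dependent stiffnesses.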

  15. A Need for a Theory of Visual Literacy.

    ERIC Educational Resources Information Center

    Hortin, John A.

    1982-01-01

    Examines sources available for developing a theory of visual literacy and attempts to clarify the meaning of the term. Suggests that visual thinking, a concept supported by recent research on mental imagery, visualization, and dual coding, ought to be the emphasis for future theory development. (FL)

  16. Electrophysiological Correlates of Emotional Source Memory in High-Trait-Anxiety Individuals

    PubMed Central

    Cui, Lixia; Shi, Guangyuan; He, Fan; Zhang, Qin; Oei, Tian P. S.; Guo, Chunyan

    2016-01-01

    The interaction between recognition memory and emotion has become a research hotspot in recent years. Dual-process theory posits that familiarity and recollection are two separate processes contributing to recognition memory, but further experimental evidence is needed. The present study explored emotional context effects on successful and unsuccessful source retrieval in 15 high-trait-anxiety college students using event-related potential (ERP) measurements. During study, a happy, fearful, or neutral face picture was displayed first; then a Chinese word was superimposed centrally on the picture, and subjects were asked to remember the word and the corresponding type of picture. During the test, participants pressed one of four buttons to indicate whether the displayed word was old or new and then, for old words, whether the word had been shown with a fearful, happy, or neutral face during the study. ERPs were generally more positive for remembered words than for new words, and this ERP difference is termed an old/new effect. For successful source retrieval (i.e., both the item and its source were remembered accurately) between 500 and 700 ms (corresponding to a late positive component, LPC), there were significant old/new effects in all contexts. However, for unsuccessful source retrieval (i.e., old items correctly recognized but attributed to the wrong source), there were no significant old/new effects in happy and neutral contexts, though significant old/new effects were observed in the fearful context. Between 700 and 1200 ms (corresponding to a late slow wave, LSW), there were significant old/new effects for successful source retrieval in happy and neutral contexts. In the fearful context, however, the old/new effects were reversed: ERPs were more negative for successful source retrieval than for correct rejections. 
Moreover, there were significant emotion effects for successful source retrieval in this time window. Further analysis showed that ERPs for old items were more negative in the fearful context than in the neutral context. The results showed that early unsuccessful fearful source retrieval processes (related to familiarity) were enhanced, but late successful fearful source retrieval processes during source retrieval monitoring (related to recollection) were weakened. This provides preliminary evidence for dual-process theory. PMID:27462288

  17. Getting over Epistemology and Treating Theory as a Recyclable Source of "Things"

    ERIC Educational Resources Information Center

    Kusznirczuk, John

    2012-01-01

    This paper challenges the way in which we are inclined to treat theory and suggests that our tendency to privilege it over method is counterproductive. Some consequences of privileging theory are pointed out and a remedy is proposed. The remedy entails a number of "reversals" in the way we treat theory and method in maths education research, the…

  18. Online information search behaviour of physicians.

    PubMed

    Mikalef, Patrick; Kourouthanassis, Panos E; Pateli, Adamantia G

    2017-03-01

    Although doctors increasingly engage in online information seeking to complement their medical practice, little is known regarding what online information sources are used and how effective they are. Grounded in self-determination and needs theory, this study posits that doctors tend to use online information sources to fulfil their information requirements in three pre-defined areas: patient care, knowledge development and research activities. Fulfilling these information needs is argued to improve doctors' perceived medical practice competence. A conceptual model is empirically tested through PLS-SEM analysis of primary survey data from 303 medical doctors practicing in four major Greek hospitals. Using authoritative online information sources was found to fulfil all types of information needs. Contrarily, using non-authoritative information sources had no significant effect. Satisfying information requirements relating to patient care and research activities enhanced doctors' perceptions about their medical practice competence. In contrast, meeting knowledge development information needs had the opposite result. Consistent with past studies, outcomes indicate that doctors tend to use non-authoritative online information sources; yet their use was found to have no significant value in fulfilling their information requirements. Authoritative online information sources are found to improve perceived medical practice competence by satisfying doctors' diverse information requirements. © 2017 Health Libraries Group.

  19. Is amplitude loss of sonic waveforms due to intrinsic attenuation or source coupling to the medium?

    USGS Publications Warehouse

    Lee, Myung W.

    2006-01-01

    Sonic waveforms acquired in gas-hydrate-bearing sediments indicate strong amplitude loss associated with an increase in sonic velocity. Because the gas hydrate increases sonic velocities, the amplitude loss has been interpreted as due to intrinsic attenuation caused by the gas hydrate in the pore space, which apparently contradicts conventional wave propagation theory. For a sonic source in a fluid-filled borehole, the signal amplitude transmitted into the formation depends on the physical properties of the formation, including any pore contents, in the immediate vicinity of the source. A signal in acoustically fast material, such as gas-hydrate-bearing sediments, has a smaller amplitude than a signal in acoustically slower material. Therefore, it is reasonable to interpret the amplitude loss in the gas-hydrate-bearing sediments in terms of source coupling to the surrounding medium as well as intrinsic attenuation. An analysis of sonic waveforms measured at the Mallik 5L-38 well, Northwest Territories, Canada, indicates that a significant part of the sonic waveform's amplitude loss is due to a source-coupling effect. All amplitude analyses of sonic waveforms should include the effect of source coupling in order to accurately characterize the formation's intrinsic attenuation.

  20. Possible effects of free convection on fire behavior - laminar and turbulent line and point sources of heat

    Treesearch

    S. Scesa; F. M. Sauer

    1954-01-01

    The transfer theory is applied to the problem of atmospheric diffusion of momentum and heat induced by line and point sources of heat on the surface of the earth. In order that the validity of the approximations of the boundary layer theory be realized, the thickness of the layer in which the temperatures and velocities differ appreciably from the values at...

  1. Experimental verification of enhanced sound transmission from water to air at low frequencies.

    PubMed

    Calvo, David C; Nicholas, Michael; Orris, Gregory J

    2013-11-01

Laboratory measurements of enhanced sound transmission from water to air at low frequencies are presented. The pressure at a monitoring hydrophone is found to decrease for shallow source depths in agreement with the classical theory of a monopole source in proximity to a pressure release interface. On the other hand, for source depths below 1/10 of an acoustic wavelength in water, the radiation pattern in the air measured by two microphones becomes progressively omnidirectional, in contrast to the classical geometrical acoustics picture in which sound is contained within a cone of 13.4° half angle. The measured directivities agree with wavenumber integration results for a point source over a range of frequencies and source depths. The wider radiation pattern owes itself to the conversion of evanescent waves in the water into propagating waves in the air that fill the angular space outside the cone. A ratio of pressure measurements made using an on-axis microphone and a near-axis hydrophone is also reported and compared with theory. Collectively, these pressure measurements are consistent with the theory of anomalous transparency of the water-air interface, in which a large fraction of the acoustic power emitted by a shallow source is radiated into the air.
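The depth dependence described above follows from the classical image-source (Lloyd's mirror) picture: the pressure-release surface contributes a reflection with coefficient −1 that progressively cancels the direct wave of a shallow monopole. A minimal numpy sketch (illustrative only; the frequency and sound speed below are assumed values, not taken from the measurements):

```python
import numpy as np

def image_source_amplitude(depth_m, freq_hz, c_water=1482.0, theta_rad=0.0):
    """Far-field amplitude of a monopole at depth_m below a pressure-release
    surface, relative to a free-field monopole: the direct wave plus a
    surface image with reflection coefficient -1."""
    k = 2.0 * np.pi * freq_hz / c_water          # wavenumber in water
    return 2.0 * np.abs(np.sin(k * depth_m * np.cos(theta_rad)))

# Shallower sources radiate less into the far field (destructive interference
# with the -1 surface image), consistent with the hydrophone observations.
depths = [0.50, 0.20, 0.05, 0.01]                # metres (hypothetical)
amps = [image_source_amplitude(d, freq_hz=1000.0) for d in depths]
```

For kd « 1 the amplitude reduces to 2kd, i.e. it vanishes linearly as the source approaches the surface.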

  2. Thermoelectric DC conductivities in hyperscaling violating Lifshitz theories

    NASA Astrophysics Data System (ADS)

    Cremonini, Sera; Cvetič, Mirjam; Papadimitriou, Ioannis

    2018-04-01

We analytically compute the thermoelectric conductivities at zero frequency (DC) in the holographic dual of a four dimensional Einstein-Maxwell-Axion-Dilaton theory that admits a class of asymptotically hyperscaling violating Lifshitz backgrounds with a dynamical exponent z and hyperscaling violating parameter θ. We show that the heat current in the dual Lifshitz theory involves the energy flux, which is an irrelevant operator for z > 1. The linearized fluctuations relevant for computing the thermoelectric conductivities turn on a source for this irrelevant operator, leading to several novel and non-trivial aspects in the holographic renormalization procedure and the identification of the physical observables in the dual theory. Moreover, imposing Dirichlet or Neumann boundary conditions on the spatial components of one of the two Maxwell fields present leads to different thermoelectric conductivities. Dirichlet boundary conditions reproduce the thermoelectric DC conductivities obtained from the near horizon analysis of Donos and Gauntlett, while Neumann boundary conditions result in a new set of DC conductivities. We make preliminary analytical estimates for the temperature behavior of the thermoelectric matrix in appropriate regions of parameter space. In particular, at large temperatures we find that the only case which could lead to a linear resistivity ρ ∼ T corresponds to z = 4/3.

  3. How to test the threat-simulation theory.

    PubMed

    Revonsuo, Antti; Valli, Katja

    2008-12-01

    Malcolm-Smith, Solms, Turnbull and Tredoux [Malcolm-Smith, S., Solms, M.,Turnbull, O., & Tredoux, C. (2008). Threat in dreams: An adaptation? Consciousness and Cognition, 17, 1281-1291.] have made an attempt to test the Threat-Simulation Theory (TST), a theory offering an evolutionary psychological explanation for the function of dreaming [Revonsuo, A. (2000a). The reinterpretation of dreams: An evolutionary hypothesis of the function of dreaming. Behavioral and Brain Sciences, 23(6), 877-901]. Malcolm-Smith et al. argue that empirical evidence from their own study as well as from some other studies in the literature does not support the main predictions of the TST: that threatening events are frequent and overrepresented in dreams, that exposure to real threats activates the threat-simulation system, and that dream threats contain realistic rehearsals of threat avoidance responses. Other studies, including our own, have come up with results and conclusions that are in conflict with those of Malcolm-Smith et al. In this commentary, we provide an analysis of the sources of these disagreements, and their implications to the TST. Much of the disagreement seems to stem from differing interpretations of the theory and, consequently, of differing methods to test it.

  4. On the detection of different chlorine bearing molecules in ISM through Herschel/HIFI

    NASA Astrophysics Data System (ADS)

    Majumdar, Liton; Chakrabarti, Sandip Kumar; Das, Ankan

The main focus of this work is to explore the possibility of finding two deuterated isotopomers of H2Cl+ (chloronium) in and around the interstellar medium. The presence of the chloronium ion has recently been confirmed by the Herschel Space Observatory's Heterodyne Instrument for the Far-Infrared (HIFI), which observed para-chloronium towards six sources in the Galaxy. To date, the existence of its deuterated isotopomers (HDCl+ and D2Cl+) has not been discussed in the literature. We find that these deuterated gas-phase ions could be destroyed by various ion-molecular reactions, dissociative recombination (DR), and cosmic rays (CRs). We compute all ion-molecular (polar) reaction rates by using parametrized trajectory theory and ion-molecular (non-polar) reaction rates by using the Langevin theory. For DR- and CR-induced reactions, we adopt two well-behaved rate formulas. We also include these rate coefficients in our large gas-grain chemical network to study the chemical evolution of these species around the outer edge of a cold, dense cloud. To study the spectral properties of the chloronium ion and its two deuterated isotopomers, we have carried out quantum chemical simulations. We calculate the ground-state properties of these species by employing second-order Moller-Plesset perturbation theory (MP2) along with the quadruple-zeta correlation-consistent (aug-cc-pVQZ) basis set. Infrared and electronic absorption spectra of these species are calculated at the same level of theory. The MP2/aug-cc-pVQZ level of theory is used to report the different spectroscopic constants of these gas-phase species, which are essential for predicting their rotational transitions. Our predicted column densities of D2Cl+ and HDCl+, along with the spectral information, may enable their future identification around sources like NGC 6334I and Sgr B2(S) using Herschel. We expect that our theoretical modelling results, together with the spectroscopic analysis, may enable HIFI to detect new interstellar halogen molecules and their complexes.

  5. An asymptotic theory of supersonic propeller noise

    NASA Technical Reports Server (NTRS)

    Envia, Edmane

    1992-01-01

    A theory for predicting the noise field of a propeller with a realistic blade geometry is presented. The theory, which utilizes a large blade count approximation, provides an efficient formula for predicting the radiation of sound from all three sources of propeller noise. Comparisons with full numerical integration indicate that the noise levels predicted by this formula are quite accurate. Calculations based on this method also show that the radiation from the Lighthill quadrupole source is rather substantial when compared with thickness and loading noise for high speed propellers. A preliminary application of the theory to the problem of the sensitivity of the peak noise levels generated by a supersonic propeller to the variations in its tip helical Mach number has produced a trend that is in qualitative agreement with the experimental observations.

  6. Using Power as a Negative Cue: How Conspiracy Mentality Affects Epistemic Trust in Sources of Historical Knowledge.

    PubMed

    Imhoff, Roland; Lamberty, Pia; Klein, Olivier

    2018-04-01

Classical theories of attitude change point to the positive effect of source expertise on perceived source credibility and persuasion, but there is an ongoing societal debate about the rise of anti-elitist sentiments and conspiracy theories regarding an allegedly untrustworthy power elite. In one correlational (N = 275) and three experimental studies (N = 195, N = 464, N = 225), we tested the novel idea that people who endorse a conspiratorial mind-set (conspiracy mentality) indeed exhibit markedly different reactions to cues of epistemic authoritativeness than those who do not: whereas the perceived credibility of powerful sources decreased with the recipients' conspiracy mentality, that of powerless sources increased, independent of and incremental to other biases such as the need to see the ingroup in a particularly positive light. The discussion raises the question of whether a certain extent of source-based bias is necessary for the social fabric of a highly complex society.

  7. Effects of measurement errors on psychometric measurements in ergonomics studies: Implications for correlations, ANOVA, linear regression, factor analysis, and linear discriminant analysis.

    PubMed

    Liu, Yan; Salvendy, Gavriel

    2009-05-01

This paper aims to demonstrate the effects of measurement errors on psychometric measurements in ergonomics studies. A variety of sources can cause random measurement errors in ergonomics studies, and these errors can distort virtually every statistic computed and lead investigators to erroneous conclusions. The effects of measurement errors on the five most widely used statistical analysis tools are discussed and illustrated: correlation, ANOVA, linear regression, factor analysis, and linear discriminant analysis. It has been shown that measurement errors can greatly attenuate correlations between variables, reduce the statistical power of ANOVA, distort (overestimate, underestimate, or even change the sign of) regression coefficients, underrate the contributions of the most important factors in factor analysis, and depreciate the significance of the discriminant function and the discrimination abilities of individual variables in discriminant analysis. The discussion is restricted to subjective scales and survey methods and their reliability estimates. Other methods applied in ergonomics research, such as physical and electrophysiological measurements and chemical and biomedical analysis methods, also have issues of measurement error, but they are beyond the scope of this paper. As there has been increasing interest in the development and testing of theories in ergonomics research, it has become very important for ergonomics researchers to understand the effects of measurement errors on their experimental results, which the authors believe is critical to progress in theory development and cumulative knowledge in the ergonomics field.
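The attenuation of correlations described above follows the classical Spearman formula, r_obs ≈ r_true · √(rel_x · rel_y), where rel_x and rel_y are the reliabilities of the two measures. A small simulated illustration (hypothetical data, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20_000

# Latent "true" scores with a population correlation of 0.6.
true_r = 0.6
x_true = rng.standard_normal(n)
y_true = true_r * x_true + np.sqrt(1 - true_r**2) * rng.standard_normal(n)

# Add random measurement error; each observed scale has reliability 0.5
# (true-score variance 1 out of observed variance 2).
x_obs = x_true + rng.standard_normal(n)
y_obs = y_true + rng.standard_normal(n)

r_true_hat = np.corrcoef(x_true, y_true)[0, 1]
r_obs = np.corrcoef(x_obs, y_obs)[0, 1]
# Spearman attenuation: r_obs ≈ r_true * sqrt(rel_x * rel_y) = 0.6 * 0.5
r_predicted = r_true_hat * np.sqrt(0.5 * 0.5)
```

With reliabilities of only .5, the observed correlation is roughly half the latent one, which is why reliability estimates matter when interpreting survey-based results.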

  8. Collaborative Learning: Theoretical Foundations and Applicable Strategies to University

    ERIC Educational Resources Information Center

    Roselli, Nestor D.

    2016-01-01

    Collaborative learning is a construct that identifies a current strong field, both in face-to-face and virtual education. Firstly, three converging theoretical sources are analyzed: socio-cognitive conflict theory, intersubjectivity theory and distributed cognition theory. Secondly, a model of strategies that can be implemented by teachers to…

  9. Kolb's Experiential Learning Theory in Athletic Training Education: A Literature Review

    ERIC Educational Resources Information Center

    Schellhase, Kristen C.

    2008-01-01

    Objective: Kolb's Experiential Learning Theory offers insight into the development of learning styles, classification of learning styles, and how students learn through experience. Discussion is presented on the value of Kolb's Experiential Learning Theory for Athletic Training Education. Data Sources: This article reviews research related to…

  10. The Implicit Leadership Theories of College and University Presidents. ASHE Annual Meeting Paper.

    ERIC Educational Resources Information Center

    Birnbaum, Robert

    Theories implicit in college presidents' definitions of leadership are examined, since understanding presidents' leadership models may affect how they interpret their roles and the events they encounter. The source of the theory that is analyzed is the organizational leadership literature. Research traditions in organizational leadership are…

  11. Fan noise caused by the ingestion of anisotropic turbulence - A model based on axisymmetric turbulence theory

    NASA Technical Reports Server (NTRS)

    Kerschen, E. J.; Gliebe, P. R.

    1980-01-01

    An analytical model of fan noise caused by inflow turbulence, a generalization of earlier work by Mani, is presented. Axisymmetric turbulence theory is used to develop a statistical representation of the inflow turbulence valid for a wide range of turbulence properties. Both the dipole source due to rotor blade unsteady forces and the quadrupole source resulting from the interaction of the turbulence with the rotor potential field are considered. The effects of variations in turbulence properties and fan operating conditions are evaluated. For turbulence axial integral length scales much larger than the blade spacing, the spectrum is shown to consist of sharp peaks at the blade passing frequency and its harmonics, with negligible broadband content. The analysis can then be simplified considerably and the total sound power contained within each spectrum peak becomes independent of axial length scale, while the width of the peak is inversely proportional to this parameter. Large axial length scales are characteristic of static fan test facilities, where the transverse contraction of the inlet flow produces highly anisotropic turbulence. In this situation, the rotor/turbulence interaction noise is mainly caused by the transverse component of turbulent velocity.

  12. Asthma patient education opportunities in predominantly minority urban communities.

    PubMed

    Zayas, Luis E; McLean, Don

    2007-12-01

Disenfranchised ethnic minority communities in the urban United States experience a high burden of asthma. Conventional office-based patient education often is insufficient to promote proper asthma management and coping practices responsive to minority patients' environments. This paper explores existing and alternative asthma information and education sources in three urban minority communities in western New York State to help design other practical educational interventions. Four focus groups (n = 59) and four town hall meetings (n = 109) were conducted in one Hispanic and two black communities. Focus groups included adult asthmatics or caretakers of asthmatics, and town meetings were open to all residents. A critical theory perspective informed the study. Asthma information and education sources, perceptions of asthma and ways of coping were elicited through semi-structured interviews. Data analysis followed a theory-driven immersion-crystallization approach. Several asthma education and information resources from the health care system, media, public institutions and communities were identified. Intervention recommendations highlighted asthma workshops that recognize participants as teachers and learners, offer social support, promote advocacy, are culturally appropriate and community-based and include health care professionals. Community-based, group health education couched in people's experiences and societal conditions offers unique opportunities for patient asthma care empowerment in minority urban communities.

  13. Spectroscopic characterization of low dose rate brachytherapy sources

    NASA Astrophysics Data System (ADS)

    Beach, Stephen M.

The low dose rate (LDR) brachytherapy seeds employed in permanent radioactive-source implant treatments usually use one of two radionuclides, 125I or 103Pd. The theoretically expected spectroscopic output from these sources can be obtained via Monte Carlo calculation based upon the seed dimensions and materials as well as the bare-source photon emissions for the specific radionuclide. However, inconsistencies in source manufacturing within model groups, together with simplified Monte Carlo calculational geometries, ultimately result in undesirably large uncertainties in the Monte Carlo calculated values. This dissertation describes experimentally attained spectroscopic outputs of the clinically used brachytherapy sources in air and in liquid water. Such knowledge can then be applied to characterize these sources by a more fundamental and metrologically pure classification, that of energy-based dosimetry. The spectroscopic results contained within this dissertation can be utilized in the verification and benchmarking of Monte Carlo calculational models of these brachytherapy sources. This body of work was undertaken to establish a usable spectroscopy system and analysis methods for the meaningful study of LDR brachytherapy seeds. The development of a correction algorithm and the analysis of the resultant spectroscopic measurements are presented. The characterization of the spectrometer, and the subsequent deconvolution of the measured spectrum to obtain the true spectrum free of any perturbations caused by the spectrometer itself, is an important contribution of this work. The approach of spectroscopic deconvolution applied in this work is derived in detail and applied to the physical measurements. In addition, the spectroscopically based analogs to the LDR dosimetry parameters currently employed are detailed, as well as the development of the theory and measurement methods used to arrive at these analogs. Several dosimetrically relevant water-equivalent plastics were also investigated for their transmission properties within a liquid water environment, as well as in air. The framework for the accurate spectrometry of LDR sources is established as a result of this dissertation work. In addition to the measurement and analysis methods, this work presents the basic measured spectroscopic characteristics of each LDR seed currently in clinical use.
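The deconvolution step described above can be illustrated with a toy model (not the dissertation's algorithm): the detector response is represented as a Gaussian smearing matrix acting on a line spectrum, and the smearing is inverted by least squares on noiseless data. All numbers below are invented for illustration.

```python
import numpy as np

n_bins = 60
energies = np.linspace(20.0, 40.0, n_bins)       # keV grid (hypothetical)

# Toy "true" spectrum: two narrow emission lines on zero background.
true_spec = np.zeros(n_bins)
true_spec[15] = 1.0
true_spec[35] = 0.6

# Detector response matrix: each true energy is smeared by a Gaussian
# resolution; columns are normalized so counts are conserved.
sigma = 0.4                                       # keV (assumed resolution)
R = np.exp(-0.5 * ((energies[:, None] - energies[None, :]) / sigma) ** 2)
R /= R.sum(axis=0, keepdims=True)

measured = R @ true_spec                          # the smeared measurement
recovered, *_ = np.linalg.lstsq(R, measured, rcond=None)
```

Real measured spectra contain noise, which makes the inversion ill-conditioned; that is why a carefully derived deconvolution (and regularization) procedure, as in the dissertation, is needed in practice.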

  14. Working together versus working autonomously: a new power-dependence perspective on the individual-level of analysis.

    PubMed

    de Jong, Simon B

    2014-01-01

Recent studies have indicated that it is important to investigate the interaction between task interdependence and task autonomy because this interaction can affect team effectiveness. However, only a limited number of studies have been conducted, and those studies focused solely on the team level of analysis. There has also been a dearth of theoretical development. Therefore, this study develops and tests an alternative theoretical perspective in an attempt to understand if, and if so why, this interaction is important at the individual level of analysis. Based on interdependence theory and power-dependence theory, we expected that highly task-interdependent individuals who reported high task autonomy would be more powerful and better performers. In contrast, we expected that similarly task-interdependent individuals who reported less task autonomy would be less powerful and weaker performers. These expectations were supported by multi-level and bootstrapping analyses performed on a multi-source dataset (self-, peer-, and manager-ratings) comprising 182 employees drawn from 37 teams. More specifically, the interaction between task interdependence and task autonomy was γ = .128, p < .05 for power and γ = .166, p < .05 for individual performance. The 95% bootstrap interval ranged from .0038 to .0686.
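The bootstrapped moderation test reported above can be sketched on simulated data (the coefficients and noise level below are invented; only the sample size matches the study). The idea is to resample cases with replacement and re-estimate the interaction coefficient each time, then read a percentile confidence interval off the bootstrap distribution:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 182  # matches the study's sample size; the data are simulated

# Simulate task interdependence (ti), task autonomy (ta), and an outcome
# whose slope on ti grows with ta, i.e. a positive interaction.
ti = rng.standard_normal(n)
ta = rng.standard_normal(n)
y = 0.3 * ti + 0.2 * ta + 0.3 * ti * ta + 0.5 * rng.standard_normal(n)

def interaction_coef(ti, ta, y):
    """OLS estimate of the ti x ta interaction coefficient."""
    X = np.column_stack([np.ones_like(ti), ti, ta, ti * ta])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[3]

# Percentile bootstrap for the interaction term.
boot = np.empty(2000)
for b in range(boot.size):
    idx = rng.integers(0, n, size=n)
    boot[b] = interaction_coef(ti[idx], ta[idx], y[idx])
lo, hi = np.percentile(boot, [2.5, 97.5])
```

An interval that excludes zero, as in the study, supports the moderation hypothesis without assuming normality of the coefficient's sampling distribution.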

  15. Theories of transporting processes of Cu in Jiaozhou Bay

    NASA Astrophysics Data System (ADS)

    Yang, Dongfang; Su, Chunhua; Zhu, Sixi; Wu, Yunjie; Zhou, Wei

    2018-02-01

Many marine bays have become polluted along with the rapid development of industry and growth in population, and understanding the transport processes of pollutants is essential to pollution control. To better understand the transport of pollutants in the marine environment, this paper presents a comprehensive study of the theories describing the transport of Cu in Jiaozhou Bay. Results showed that the transport of Cu in this bay can be summarized by seven key theories: the homogeneous theory, environmental dynamic theory, horizontal loss theory, source-to-waters transport theory, sedimentation transport theory, migration trend theory, and vertical transport theory. These theories help to better understand the migration of pollutants in marine bays.

  16. Classical field configurations and infrared slavery

    NASA Astrophysics Data System (ADS)

    Swanson, Mark S.

    1987-09-01

    The problem of determining the energy of two spinor particles interacting through massless-particle exchange is analyzed using the path-integral method. A form for the long-range interaction energy is obtained by analyzing an abridged vertex derived from the parent theory. This abridged vertex describes the radiation of zero-momentum particles by pointlike sources. A path-integral formalism for calculating the energy of the radiation field associated with this abridged vertex is developed and applications are made to determine the energy necessary for adiabatic separation of two sources in quantum electrodynamics and for an SU(2) Yang-Mills theory. The latter theory is shown to be consistent with confinement via infrared slavery.

  17. Location error uncertainties - an advanced using of probabilistic inverse theory

    NASA Astrophysics Data System (ADS)

    Debski, Wojciech

    2016-04-01

The spatial location of the sources of seismic waves is one of the first tasks when transient waves from natural (uncontrolled) sources are analyzed in many branches of physics, including seismology and oceanology. Source activity and its spatial variability in time, the geometry of the recording network, and the complexity and heterogeneity of the wave velocity distribution are all factors influencing the performance of location algorithms and the accuracy of the achieved results. While estimating the location of earthquake foci is relatively simple, quantitatively estimating the location accuracy is a challenging task even when the probabilistic inverse method is used, because it requires knowledge of the statistics of observational, modelling, and a priori uncertainties. In this presentation we address this task when the statistics of observational and/or modelling errors are unknown. This common situation requires the introduction of a priori constraints on the likelihood (misfit) function, which significantly influence the estimated errors. Based on the results of an analysis of 120 seismic events from the Rudna copper mine operating in southwestern Poland, we illustrate an approach based on the Shannon entropy calculated for the a posteriori distribution. We show that this meta-characteristic of the a posteriori distribution carries information on the uncertainties of the solution found.
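The entropy meta-characteristic described above can be sketched for a discretized posterior: a sharper (better-constrained) location posterior has lower Shannon entropy than a diffuse one. A toy 1-D example (hypothetical posteriors, not the mine data):

```python
import numpy as np

def shannon_entropy(posterior):
    """Shannon entropy of a discretized (unnormalized) posterior PDF."""
    p = posterior / posterior.sum()
    p = p[p > 0]                       # 0 * log 0 is taken as 0
    return -np.sum(p * np.log(p))

# Two hypothetical location posteriors on a 1-D grid of candidate positions.
x = np.linspace(-5.0, 5.0, 1001)
sharp = np.exp(-0.5 * (x / 0.3) ** 2)   # well-constrained location
broad = np.exp(-0.5 * (x / 2.0) ** 2)   # poorly constrained location
```

Comparing entropies of a posteriori distributions in this way gives a single scalar summary of location uncertainty even when no Gaussian error ellipse applies.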

  18. Accounting for uncertain fault geometry in earthquake source inversions - I: theory and simplified application

    NASA Astrophysics Data System (ADS)

    Ragon, Théa; Sladen, Anthony; Simons, Mark

    2018-05-01

The ill-posed nature of earthquake source estimation derives from several factors, including the quality and quantity of available observations and the fidelity of our forward theory. Observational errors are usually accounted for in the inversion process. Epistemic errors, which stem from our simplified description of the forward problem, are rarely dealt with despite their potential to bias the estimate of a source model. In this study, we explore the impact of uncertainties related to the choice of a fault geometry in source inversion problems. The geometry of a fault structure is generally reduced to a set of parameters, such as position, strike and dip, for one or a few planar fault segments. While some of these parameters can be solved for, more often they are fixed to an uncertain value. We propose a practical framework to address this limitation by following a previously implemented method exploring the impact of uncertainties on the elastic properties of our models. We develop a sensitivity analysis to small perturbations of fault dip and position. The uncertainties in fault geometry are included in the inverse problem under the formulation of the misfit covariance matrix that combines both prediction and observation uncertainties. We validate this approach with the simplified case of a fault that extends infinitely along strike, using both Bayesian and optimization formulations of a static inversion. If epistemic errors are ignored, predictions are overconfident in the data and source parameters are not reliably estimated. In contrast, inclusion of uncertainties in fault geometry allows us to infer a robust posterior source model. Epistemic uncertainties can be many orders of magnitude larger than observational errors for great earthquakes (Mw > 8). Not accounting for uncertainties in fault geometry may partly explain observed shallow slip deficits for continental earthquakes. Similarly, ignoring the impact of epistemic errors can also bias estimates of near-surface slip and predictions of tsunamis induced by megathrust earthquakes.
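The role of the combined misfit covariance can be illustrated with a toy linear inversion: adding an epistemic prediction term C_p to the observational covariance C_d widens the Gaussian posterior on the source parameter, making the estimate appropriately less confident. The diagonal covariances and all numbers below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear forward problem d = G m for a single slip parameter.
G = rng.standard_normal((20, 1))        # 20 observations, 1 model parameter
m_true = np.array([1.0])
C_d = 0.05 * np.eye(20)                 # observational error covariance
C_p = 0.50 * np.eye(20)                 # epistemic (geometry) prediction errors
d = G @ m_true + rng.multivariate_normal(np.zeros(20), C_d)

def gaussian_posterior_var(G, C):
    """Posterior variance of m for a Gaussian misfit with covariance C."""
    return np.linalg.inv(G.T @ np.linalg.inv(C) @ G)

var_obs_only = gaussian_posterior_var(G, C_d)[0, 0]
var_combined = gaussian_posterior_var(G, C_d + C_p)[0, 0]
```

With C_d alone the posterior is overconfident; using C_d + C_p is the one-parameter analogue of the misfit covariance formulation in the study.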

  19. Why Is an Application of Multiple Intelligences Theory Important for Language Learning and Teaching Speaking Ability?

    ERIC Educational Resources Information Center

    Boonma, Malai; Phaiboonnugulkij, Malinee

    2014-01-01

    This article calls for a strong need to propose the theoretical framework of the Multiple Intelligences theory (MI) and provide a suitable answer of the doubt in part of foreign language teaching. The article addresses the application of MI theory following various sources from Howard Gardner and the authors who revised this theory for use in the…

  20. Quasi-experimental study designs series-paper 9: collecting data from quasi-experimental studies.

    PubMed

    Aloe, Ariel M; Becker, Betsy Jane; Duvendack, Maren; Valentine, Jeffrey C; Shemilt, Ian; Waddington, Hugh

    2017-09-01

To identify variables that must be coded when synthesizing primary studies that use quasi-experimental (QE) designs. All quasi-experimental designs. When designing a systematic review of QE studies, potential sources of heterogeneity, both theory-based and methodological, must be identified. We outline key components of inclusion criteria for syntheses of quasi-experimental studies. We provide recommendations for coding content-relevant and methodological variables and outline the distinction between bivariate effect sizes and partial (i.e., adjusted) effect sizes. The designs and controls used are viewed as being of greatest importance. Potential sources of bias and confounding are also addressed. Careful consideration must be given to inclusion criteria and the coding of theoretical and methodological variables during the design phase of a synthesis of quasi-experimental studies. The success of a meta-regression analysis relies on the data available to the meta-analyst. Omission of critical moderator variables (i.e., effect modifiers) will undermine the conclusions of a meta-analysis. Copyright © 2017 Elsevier Inc. All rights reserved.
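As a brief illustration of effect-size coding, bivariate correlation effect sizes extracted from primary studies are commonly pooled after a Fisher transformation with inverse-variance weights. A minimal fixed-effect sketch (the study values below are invented, and this is standard meta-analysis practice rather than the paper's own procedure):

```python
import math

def fisher_z(r):
    """Fisher transformation of a correlation effect size."""
    return 0.5 * math.log((1 + r) / (1 - r))

def pooled_effect(effects):
    """Fixed-effect inverse-variance pooling of (r, n) study results;
    var(z) = 1/(n - 3), so the weight is n - 3."""
    num = den = 0.0
    for r, n in effects:
        z, w = fisher_z(r), n - 3
        num += w * z
        den += w
    return math.tanh(num / den)         # back-transform to the r metric

studies = [(0.30, 50), (0.45, 120), (0.25, 80)]   # hypothetical (r, n) pairs
pooled = pooled_effect(studies)
```

Partial (adjusted) effect sizes are not directly poolable with bivariate ones, which is why the distinction flagged in the paper must be coded at extraction time.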

  1. A blended learning approach for teaching computer programming: design for large classes in Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Bayu Bati, Tesfaye; Gelderblom, Helene; van Biljon, Judy

    2014-01-01

The challenge of teaching programming in higher education is complicated by problems associated with large class teaching, a prevalent situation in many developing countries. This paper reports on an investigation into the use of a blended learning approach to the teaching and learning of programming in a class of more than 200 students. A course and learning environment were designed by integrating the constructivist learning models of Constructive Alignment, the Conversational Framework and the Three-Stage Learning Model. Design science research is used for the course redesign and development of the learning environment, and action research is integrated to undertake participatory evaluation of the intervention. The action research involved the Students' Approach to Learning survey, a comparative analysis of students' performance, and qualitative analysis of data gathered from various sources. The paper makes a theoretical contribution in presenting a design of a blended learning solution for large class teaching of programming grounded in constructivist learning theory and the use of free and open source technologies.

  2. Linking data to decision-making: applying qualitative data analysis methods and software to identify mechanisms for using outcomes data.

    PubMed

    Patel, Vaishali N; Riley, Anne W

    2007-10-01

    A multiple case study was conducted to examine how staff in child out-of-home care programs used data from an Outcomes Management System (OMS) and other sources to inform decision-making. Data collection consisted of thirty-seven semi-structured interviews with clinicians, managers, and directors from two treatment foster care programs and two residential treatment centers, and individuals involved with developing the OMS; and observations of clinical and quality management meetings. Case study and grounded theory methodology guided analyses. The application of qualitative data analysis software is described. Results show that although staff rarely used data from the OMS, they did rely on other sources of systematically collected information to inform clinical, quality management, and program decisions. Analyses of how staff used these data suggest that improving the utility of OMS will involve encouraging staff to participate in data-based decision-making, and designing and implementing OMS in a manner that reflects how decision-making processes operate.

  3. Primary socialization theory. The influence of the community on drug use and deviance. III.

    PubMed

    Oetting, E R; Donnermeyer, J F; Deffenbacher, J L

    1998-06-01

    Primary socialization theory states that drug use and deviance are social behaviors learned predominantly through three sources, the family, the school, and peer clusters. This paper shows that the theory provides a parsimonious explanation of how characteristics of both the local community and the larger extended community influence drug use and deviance. These characteristics affect deviance because they either strengthen or weaken bonding with the three primary socialization sources, or affect the norms that are transmitted through the primary socialization process. The paper considers the following social structure characteristics of the local neighborhood or community: physical characteristics, rurality, ethnicity, heterogeneity, occupational type, mobility, poverty, neighborhood deviance, and age distribution. It also examines how other secondary socialization sources, the extended family, associational groups, religion, the peer environment, and the media influence the primary socialization process and, in turn, drug use and deviance.

  4. Light scattering regimes along the optical axis in turbid media

    NASA Astrophysics Data System (ADS)

    Campbell, S. D.; O'Connell, A. K.; Menon, S.; Su, Q.; Grobe, R.

    2006-12-01

We inject an angularly collimated laser beam into a scattering medium of a nondairy creamer-water solution and examine the distribution of the scattered light along the optical axis as a function of the source-detector spacing. The experimental data, together with simulated data obtained from a Monte Carlo simulation, suggest four regimes characterizing the transition from unscattered to diffusive light. We also compare the data with theoretical predictions based on a first-order scattering theory for regions close to the source, and with diffusion-like theories for larger source-detector spacings. We demonstrate the impact of the measurement process and the effect of the unavoidable absorption of photons by the detection fiber on the light distribution inside the medium. We show that the range of validity of these theories can depend on experimental parameters such as the diameter and acceptance angle of the detection fiber.
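The unscattered (ballistic) regime near the source can be sketched with a one-line Monte Carlo check of the Beer-Lambert law: the fraction of photons whose first scattering event lies beyond depth d is exp(-d/l_s), where l_s is the scattering mean free path. Units and parameters below are arbitrary, not the creamer-water values.

```python
import numpy as np

rng = np.random.default_rng(1)

l_s = 1.0            # scattering mean free path (arbitrary units)
d = 2.0              # depth along the optical axis
n_photons = 200_000

# Free path to the first scattering event is exponentially distributed.
first_scatter = rng.exponential(l_s, n_photons)
ballistic_fraction = np.mean(first_scatter > d)   # should approach exp(-d/l_s)
```

Beyond a few mean free paths this ballistic component is negligible and diffusion-like descriptions take over, which is the transition the four regimes characterize.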

  5. A benign property of the ghost mode in massive theory of gravitation

    NASA Astrophysics Data System (ADS)

    Chugreev, Yu. V.

    2018-01-01

    It was shown, in the framework of massive gravitation theories whose linearized Lagrangian contains the mass term m^2(φ^{αβ}φ_{αβ} - (1/2)φ^2), that spherically symmetric static sources created some time ago should possess inside their light cone not only a Yukawa potential but also a nonstationary component. This leads to a long (of order 1/m) period of gravitational evaporation of such sources, with mass loss Ṁ ∼ m^2 M^2. The magnitude of this flux is c^4/v^4 times (c is the speed of light, v the velocity of the source particles) larger than the negative gravitational radiation flux corresponding to the ghost scalar mode in the spectrum of such a gravitational field, thereby stabilizing the source.

  6. Dualities and Topological Field Theories from Twisted Geometries

    NASA Astrophysics Data System (ADS)

    Markov, Ruza

    I will present three studies of string theory on twisted geometries. In the first calculation included in this dissertation, we use gauge/gravity duality to study the Coulomb branch of an unusual type of nonlocal field theory called Puff Field Theory. On the gravity side, this theory is given in terms of D3-branes in type IIB string theory with a geometric twist, while the field theory description, available in the IR limit, is a deformation of Yang-Mills gauge theory by an order-seven operator, which we compute here. In the rest of this dissertation we explore N = 4 super Yang-Mills (SYM) theory compactified on a circle with S-duality and R-symmetry twists that preserve N = 6 supersymmetry in 2 + 1D. It was shown that the abelian theory on a flat manifold gives Chern-Simons theory in the low-energy limit, and here we are interested in the non-abelian counterpart. To that end, we introduce external static supersymmetric quark and anti-quark sources into the theory and calculate the Witten Index of the resulting Hilbert space of ground states on a two-torus. Using these results, we compute the action of simple Wilson loops on the Hilbert space of ground states without sources. In some cases we find disagreement between our results for the Wilson loop eigenvalues and previous conjectures about a connection with Chern-Simons theory. The last result discussed in this dissertation demonstrates a connection between gravitational Chern-Simons theory and N = 4 four-dimensional SYM theory compactified on an S-duality-twisted circle where the remaining three-manifold is not flat, starting from the explicit geometric realization of S-duality in terms of the (2, 0) theory.

  7. Atmospheric simulation using a liquid crystal wavefront-controlling device

    NASA Astrophysics Data System (ADS)

    Brooks, Matthew R.; Goda, Matthew E.

    2004-10-01

    Test and evaluation of laser warning devices is important due to the increased use of laser devices in aerial applications. This research consists of an atmospheric aberrating system to enable in-lab testing of various detectors and sensors. The system employs 632.8 nm laser light from a Helium-Neon source and a spatial light modulator (SLM) that induces phase changes using a birefringent liquid crystal material. Measuring the outgoing radiation from the SLM with a CCD target board and a Shack-Hartmann wavefront sensor reveals an acceptable resemblance of the system output to expected atmospheric theory. Over three turbulence scenarios, an error analysis shows that the turbulence data match theory. A wave-optics computer simulation analogous to the lab-bench design was created. Phase data, intensity data, and the computer simulation affirm the lab-bench results, so the aberrating SLM system can be operated confidently.

  8. Spectral analysis of variable-length coded digital signals

    NASA Astrophysics Data System (ADS)

    Cariolaro, G. L.; Pierobon, G. L.; Pupolin, S. G.

    1982-05-01

    A spectral analysis is conducted for the variable-length word sequence produced by an encoder driven by a stationary memoryless source. A finite-state sequential machine is taken as the model of the line encoder, and the spectral analysis of the encoded message is performed under the assumption that the sourceword sequence is composed of independent identically distributed words. Closed-form expressions for both the continuous and discrete parts of the spectral density are derived in terms of the encoder law and the sourceword statistics. The discrete part exhibits spectral lines at integer multiples of 1/(λ₀T), where λ₀ is the greatest common divisor of the possible codeword lengths and T is the symbol period. The derivation of the continuous part can be conveniently factorized, and the theory is applied to the spectral analysis of BnZS and HDBn codes.
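    The spacing of the discrete spectral lines described above follows directly from the gcd of the codeword lengths. A minimal sketch (the codeword lengths and symbol period below are hypothetical examples, not values from the paper):

```python
from math import gcd
from functools import reduce

def discrete_line_spacing(codeword_lengths, symbol_period):
    """Spacing (in Hz) of the discrete spectral lines for a
    variable-length code: lines fall at integer multiples of
    1 / (lambda0 * T), where lambda0 is the greatest common
    divisor of the possible codeword lengths and T is the
    symbol period."""
    lambda0 = reduce(gcd, codeword_lengths)
    return 1.0 / (lambda0 * symbol_period)

# Example: codeword lengths of 4, 6, and 8 symbols, T = 1 microsecond.
# lambda0 = gcd(4, 6, 8) = 2, so lines appear every 1/(2T) = 500 kHz.
spacing = discrete_line_spacing([4, 6, 8], 1e-6)
```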

  9. Source analysis of electrophysiological correlates of beat induction as sensory-guided action

    PubMed Central

    Todd, Neil P. M.; Lee, Christopher S.

    2015-01-01

    In this paper we present a reanalysis of electrophysiological data originally collected to test a sensory-motor theory of beat induction (Todd et al., 2002; Todd and Seiss, 2004; Todd and Lee, 2015). The reanalysis is conducted in the light of more recent findings and in particular the demonstration that auditory evoked potentials contain a vestibular dependency. At the core of the analysis is a model which predicts brain dipole source current activity over time in temporal and frontal lobe areas during passive listening to a rhythm, or active synchronization, where it dissociates the frontal activity into distinct sources which can be identified as respectively pre-motor and motor in origin. The model successfully captures the main features of the rhythm in showing that the metrical structure is manifest in an increase in source current activity during strong compared to weak beats. In addition the outcomes of modeling suggest that: (1) activity in both temporal and frontal areas contribute to the metrical percept and that this activity is distributed over time; (2) transient, time-locked activity associated with anticipated beats is increased when a temporal expectation is confirmed following a previous violation, such as a syncopation; (3) two distinct processes are involved in auditory cortex, corresponding to tangential and radial (possibly vestibular dependent) current sources. We discuss the implications of these outcomes for the insights they give into the origin of metrical structure and the power of syncopation to induce movement and create a sense of groove. PMID:26321991

  10. A Grounded Theory of Master's-Level Counselor Research Identity

    ERIC Educational Resources Information Center

    Jorgensen, Maribeth F.; Duncan, Kelly

    2015-01-01

    A grounded theory approach was used to examine the research identity of 17 master's-level counseling trainees and practitioners. The emergent theory gave an understanding of sources of variation in the process and outcome of research identity. The authors provide recommendations for counselor educators to use with current and former students.

  11. A Critical Comparison of Classical and Domain Theory: Some Implications for Character Education

    ERIC Educational Resources Information Center

    Keefer, Matthew Wilks

    2006-01-01

    Contemporary approaches to moral education are influenced by the "domain theory" approach to understanding moral development (Turiel, 1983; 1998; Nucci, 2001). Domain theory holds there are distinct conventional, personal and moral domains; each constituting a cognitive "structured-whole" with its own normative source and sphere of influence. One…

  12. Polarized optical scattering by inhomogeneities and surface roughness in an anisotropic thin film

    DOE PAGES

    Germer, Thomas A.; Sharma, Katelynn A.; Brown, Thomas G.; ...

    2017-10-18

    We extend the theory for scattering by oblique columnar structure thin films to include the induced form birefringence and the propagation of radiation in those films. We generalize the 4 × 4 matrix theory to include arbitrary sources in the layer, which are necessary to determine the Green function for the inhomogeneous wave equation. We further extend first-order vector perturbation theory for scattering by roughness in the smooth surface limit, when the layer is anisotropic. Scattering by an inhomogeneous medium is approximated by a distorted Born approximation, where effective medium theory is used to determine the effective properties of the medium and strong fluctuation theory is used to determine the inhomogeneous sources. In this manner, we develop a model for scattering by inhomogeneous films, with anisotropic correlation functions. Here, the results are compared to Mueller matrix bidirectional scattering distribution function measurements for a glancing-angle deposition (GLAD) film. While the results are applied to the GLAD film example, the development of the theory is general enough that it can guide simulations for scattering in other anisotropic thin films.

  13. Mathematical theory of cylindrical isothermal blast waves in a magnetic field. [with application to supernova remnant evolution

    NASA Technical Reports Server (NTRS)

    Lerche, I.

    1981-01-01

    An analysis is conducted regarding the properties of cylindrically symmetric self-similar blast waves propagating away from a line source into a medium whose density and magnetic field (with components in both the phi and z directions) both vary as r to the -(omega) power (with omega less than 1) ahead of the blast wave. The main results of the analysis can be divided into two classes, related to a zero azimuthal field and a zero longitudinal field. In the case of the zero longitudinal field it is found that there are no physically acceptable solutions with continuous postshock variations of flow speed and gas density.

  14. [Efficacy analysis and theoretical study on Chinese herbal properties of Açaí (Euterpe oleracea)].

    PubMed

    Zhang, Jian-jun; Chen, Shao-hong; Zhu, Ying-li; Wang, Chun; Wang, Jing-xia; Wang, Lin-yuan; Gao, Xue-min

    2015-06-01

    Açaí (Euterpe oleracea) has a long history as a source of herbal medicine in South America; it has been approved for use in China by the Ministry of Health and has been introduced for cultivation in Guangdong and Taiwan. This article summarizes the applied history of Açaí and its present status in China. A theoretical study of the Chinese herbal properties of Açaí was conducted, based on traditional Chinese philosophical culture, to preliminarily analyze its functions and indications, drawing on records of medical use, chemical components, and biological activity. The aim is to establish a theoretical foundation for its application under the guidance of TCM theory.

  15. Department of Cybernetic Acoustics

    NASA Astrophysics Data System (ADS)

    The development of the theory, instrumentation, and applications of methods and systems for the measurement, analysis, processing, and synthesis of acoustic signals within the audio frequency range was discussed, particularly the speech signal and the vibro-acoustic signals emitted by technical and industrial equipment treated as noise and vibration sources. The research work, both theoretical and experimental, aims at applications in various branches of science and medicine, such as: acoustical diagnostics and phoniatric rehabilitation of pathological and postoperative states of the speech organ; bilateral "man-machine" speech communication based on the analysis, recognition, and synthesis of the speech signal; and vibro-acoustical diagnostics and continuous monitoring of the state of machines, technical equipment, and technological processes.

  16. Teachers' Source Evaluation Self-Efficacy Predicts Their Use of Relevant Source Features When Evaluating the Trustworthiness of Web Sources on Special Education

    ERIC Educational Resources Information Center

    Andreassen, Rune; Bråten, Ivar

    2013-01-01

    Building on prior research and theory concerning source evaluation and the role of self-efficacy in the context of online learning, this study investigated the relationship between teachers' beliefs about their capability to evaluate the trustworthiness of sources and their reliance on relevant source features when judging the trustworthiness…

  17. A new dump system design for stray light reduction of Thomson scattering diagnostic system on EAST.

    PubMed

    Xiao, Shumei; Zang, Qing; Han, Xiaofeng; Wang, Tengfei; Yu, Jin; Zhao, Junyu

    2016-07-01

    Thomson scattering (TS) diagnostics are important for measuring electron temperature and density during plasma discharge. However, the measurement of the Thomson scattering signal is easily disturbed by stray light. The stray light sources in the Experimental Advanced Superconducting Tokamak (EAST) TS diagnostic system were analyzed with a simulation model of the diagnostic system, and the simulation results show that the dump system is the primary stray light source. Based on optics theory and the simulation analysis, a novel dump system including an improved beam trap was proposed and installed. The measurement results indicate that the new dump system reduces the stray light in the diagnostic system by more than 60%, and the influence of stray light on the error of the measured density decreases.

  18. Addendum to foundations of multidimensional wave field signal theory: Gaussian source function

    NASA Astrophysics Data System (ADS)

    Baddour, Natalie

    2018-02-01

    Many important physical phenomena are described by wave or diffusion-wave type equations. Recent work has shown that a transform domain signal description from linear system theory can give meaningful insight to multi-dimensional wave fields. In N. Baddour [AIP Adv. 1, 022120 (2011)], certain results were derived that are mathematically useful for the inversion of multi-dimensional Fourier transforms, but more importantly provide useful insight into how source functions are related to the resulting wave field. In this short addendum to that work, it is shown that these results can be applied with a Gaussian source function, which is often useful for modelling various physical phenomena.

  19. The influence of stereotype threat on immigrants: review and meta-analysis

    PubMed Central

    Appel, Markus; Weber, Silvana; Kronberger, Nicole

    2015-01-01

    In many regions around the world, students with certain immigrant backgrounds underachieve in educational settings. This paper provides a review and meta-analysis of one potential source of the immigrant achievement gap: stereotype threat, a situational predicament that may prevent students from performing up to their full abilities. A meta-analysis of 19 experiments suggests an overall mean effect size of 0.63 (random effects model) in support of stereotype threat theory. The results are complemented by moderator analyses with regard to circulation (published or unpublished research), cultural context (US versus Europe), age of immigrants, type of stereotype threat manipulation, dependent measures, and means for identification of immigrant status; evidence on the role of ethnic identity strength is reviewed. Theoretical and practical implications of the findings are discussed. PMID:26217256
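    The pooled random-effects mean the abstract reports is conventionally computed with an estimator such as DerSimonian-Laird. A minimal sketch of that estimator (the effect sizes and variances below are hypothetical illustrations, not the 19 experiments analyzed in the paper):

```python
def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling: estimate the
    between-study variance tau^2 from the fixed-effect Q statistic,
    then re-weight each study by 1 / (v_i + tau^2)."""
    k = len(effects)
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    sw = sum(w)
    mu_fe = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - mu_fe) ** 2 for wi, yi in zip(w, effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                    # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    mu_re = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    return mu_re, tau2

# Hypothetical standardized mean differences and sampling variances.
effects = [0.7, 0.4, 0.9, 0.5]
variances = [0.05, 0.08, 0.06, 0.04]
pooled, tau2 = dersimonian_laird(effects, variances)
```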

  20. Evaluating performance feedback: a research study into issues of credibility and utility for nursing clinicians.

    PubMed

    Fereday, Jennifer; Muir-Cochrane, Eimear

    2004-01-01

    Performance feedback is information provided to employees about how well they are performing in their work role. The nursing profession has a long history of providing formal, written performance reviews, traditionally from manager to subordinate, with less formal feedback sources including peers, clients, and multidisciplinary team members. This paper is based on one aspect of a PhD research study exploring the dynamics of performance feedback, primarily from the nursing clinicians' perspective. The research reported here discusses the impact of the social relationship (between the source and recipient of performance feedback) on the recipient's evaluation of feedback as 'credible' and 'useful' for self-assessment. Focus group interviews were utilised to ascertain the nursing clinicians' perspectives on performance feedback. Thematic analysis of the data was informed by the social phenomenology of Alfred Schutz (1967), specifically his theories of intersubjective understanding. Findings supported the level of familiarity between the feedback source and the nursing clinician as a significant criterion influencing the acceptance or rejection of feedback. Implications for the selection of performance feedback sources and processes within nursing are discussed.

  1. U(1) Wilson lattice gauge theories in digital quantum simulators

    NASA Astrophysics Data System (ADS)

    Muschik, Christine; Heyl, Markus; Martinez, Esteban; Monz, Thomas; Schindler, Philipp; Vogell, Berit; Dalmonte, Marcello; Hauke, Philipp; Blatt, Rainer; Zoller, Peter

    2017-10-01

    Lattice gauge theories describe fundamental phenomena in nature, but calculating their real-time dynamics on classical computers is notoriously difficult. In a recent publication (Martinez et al 2016 Nature 534 516), we proposed and experimentally demonstrated a digital quantum simulation of the paradigmatic Schwinger model, a U(1)-Wilson lattice gauge theory describing the interplay between fermionic matter and gauge bosons. Here, we provide a detailed theoretical analysis of the performance and the potential of this protocol. Our strategy is based on analytically integrating out the gauge bosons, which preserves exact gauge invariance but results in complicated long-range interactions between the matter fields. Trapped-ion platforms are naturally suited to implementing these interactions, allowing for an efficient quantum simulation of the model, with a number of gate operations that scales polynomially with system size. Employing numerical simulations, we illustrate that relevant phenomena can be observed in larger experimental systems, using as an example the production of particle-antiparticle pairs after a quantum quench. We investigate theoretically the robustness of the scheme towards generic error sources, and show that near-future experiments can reach regimes where finite-size effects are insignificant. We also discuss the challenges in quantum simulating the continuum limit of the theory. Using our scheme, fundamental phenomena of lattice gauge theories can be probed using a broad set of experimentally accessible observables, including the entanglement entropy and the vacuum persistence amplitude.

  2. SL(2, C) group action on cohomological field theories

    NASA Astrophysics Data System (ADS)

    Basalaev, Alexey

    2018-01-01

    We introduce the SL(2, C) group action on a partition function of a cohomological field theory via a certain Givental's action. Restricted to the small phase space, we describe the action via explicit formulae on a CohFT genus-g potential. We prove that, applied to the total ancestor potential of a simple-elliptic singularity, the action introduced coincides with the transformation of Milanov-Ruan changing the primitive form (cf. Milanov and Ruan in Gromov-Witten theory of elliptic orbifold P1 and quasi-modular forms, arXiv:1106.2321, 2011).

  3. Theory of the Bloch oscillating transistor

    NASA Astrophysics Data System (ADS)

    Hassel, J.; Seppä, H.

    2005-01-01

    The Bloch oscillating transistor (BOT) is a device in which single electron current through a normal tunnel junction enhances Cooper pair current in a mesoscopic Josephson junction, leading to signal amplification. In this article we develop a theory in which the BOT dynamics is described as a two-level system. The theory is used to predict current-voltage characteristics and small-signal response. The transition from stable operation into the hysteretic regime is studied. By identifying the two-level switching noise as the main source of fluctuations, the expressions for equivalent noise sources and the noise temperature are derived. The validity of the model is tested by comparing the results with simulations and experiments.

  4. Toward A Brain-Based Theory of Beauty

    PubMed Central

    Ishizu, Tomohiro; Zeki, Semir

    2011-01-01

    We wanted to learn whether activity in the same area(s) of the brain correlates with the experience of beauty derived from different sources. 21 subjects took part in a brain-scanning experiment using functional magnetic resonance imaging. Prior to the experiment, they viewed pictures of paintings and listened to musical excerpts, both of which they rated on a scale of 1-9, with 9 being the most beautiful. This allowed us to select three sets of stimuli (beautiful, indifferent, and ugly) which subjects viewed and heard in the scanner, and rated at the end of each presentation. The results of a conjunction analysis of brain activity showed that, of the several areas that were active with each type of stimulus, only one cortical area, located in the medial orbito-frontal cortex (mOFC), was active during the experience of musical and visual beauty, with the activity produced by the experience of beauty derived from either source overlapping almost completely within it. The strength of activation in this part of the mOFC was proportional to the strength of the declared intensity of the experience of beauty. We conclude that, as far as activity in the brain is concerned, there is a faculty of beauty that is not dependent on the modality through which it is conveyed but which can be activated by at least two sources (musical and visual), and probably by other sources as well. This has led us to formulate a brain-based theory of beauty. PMID:21755004

  5. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1984-01-01

    Several short summaries of the work performed during this reporting period are presented. Topics discussed in this document include: (1) resilient seeded errors via simple techniques; (2) knowledge representation for engineering design; (3) analysis of faults in a multiversion software experiment; (4) implementation of parallel programming environment; (5) symbolic execution of concurrent programs; (6) two computer graphics systems for visualization of pressure distribution and convective density particles; (7) design of a source code management system; (8) vectorizing incomplete conjugate gradient on the Cyber 203/205; (9) extensions of domain testing theory and; (10) performance analyzer for the pisces system.

  6. Procedure for detection and measurement of interfaces in remotely acquired data using a digital computer

    NASA Technical Reports Server (NTRS)

    Faller, K. H.

    1976-01-01

    A technique for the detection and measurement of surface feature interfaces in remotely acquired data was developed and evaluated. A computer implementation of this technique was effected to automatically process classified data derived from various sources such as the LANDSAT multispectral scanner and other scanning sensors. The basic elements of the operational theory of the technique are described, followed by the details of the procedure. An example of an application of the technique to the analysis of tidal shoreline length is given with a breakdown of manpower requirements.

  7. The theory of music, mood and movement to improve health outcomes

    PubMed Central

    Murrock, Carolyn J.; Higgins, Patricia A.

    2013-01-01

    Aim: This paper presents a discussion of the development of a middle-range nursing theory of the effects of music on physical activity and improved health outcomes. Background: Due to the high rate of physical inactivity and the associated negative health outcomes worldwide, nurses need new evidence-based theories and interventions to increase physical activity. Data sources: The theory of music, mood and movement (MMM) was developed from physical activity guidelines and music theory using the principles of statement and theory synthesis. The concepts of music, physical activity and health outcomes were searched using the CINAHL, MEDLINE, ProQuest Nursing and Allied Health Source, PsycINFO and Cochrane Library databases covering the years 1975-2008. Discussion: The theory of MMM was synthesized by combining the psychological and physiological responses to music to increase physical activity and improve health outcomes. It proposes that music alters mood, is a cue for movement, and makes physical activity more enjoyable, leading to improved health outcomes of weight, blood pressure, blood sugar and cardiovascular risk factor management, and improved quality of life. Conclusion: As it was developed from the physical activity guidelines, the middle-range theory is prescriptive, produces testable hypotheses, and can guide nursing research and practice. The middle-range theory needs to be tested to determine its usefulness for nurses to develop physical activity programmes to improve health outcomes across various cultures. PMID:20568327

  8. Analysis of field of view limited by a multi-line X-ray source and its improvement for grating interferometry.

    PubMed

    Du, Yang; Huang, Jianheng; Lin, Danying; Niu, Hanben

    2012-08-01

    X-ray phase-contrast imaging based on grating interferometry is a technique with the potential to provide absorption, differential phase contrast, and dark-field signals simultaneously. The multi-line X-ray source used recently in grating interferometry has the advantage of high-energy X-rays for imaging of thick samples for most clinical and industrial investigations. However, it has a drawback of limited field of view (FOV), because of the axial extension of the X-ray emission area. In this paper, we analyze the effects of axial extension of the multi-line X-ray source on the FOV and its improvement in terms of Fresnel diffraction theory. Computer simulation results show that the FOV limitation can be overcome by use of an alternative X-ray tube with a specially designed multi-step anode. The FOV of this newly designed X-ray source can be approximately four times larger than that of the multi-line X-ray source in the same emission area. This might be beneficial for the applications of X-ray phase contrast imaging in materials science, biology, medicine, and industry.

  9. On oscillatory magnetoconvection in a nanofluid layer in the presence of internal heat source and Soret effect

    NASA Astrophysics Data System (ADS)

    Khalid, Izzati Khalidah; Mokhtar, Nor Fadzillah Mohd; Bakri, Nur Amirah; Siri, Zailan; Ibrahim, Zarina Bibi; Gani, Siti Salwa Abd

    2017-11-01

    The onset of oscillatory magnetoconvection in an infinite horizontal nanofluid layer heated from below, subjected to the Soret effect and an internal heat source, is examined theoretically using linear stability theory. Two important properties, thermophoresis and Brownian motion, are included in the model, and three types of lower-upper bounding systems are examined: rigid-rigid, rigid-free, and free-free boundaries. Eigenvalue equations are obtained from a normal mode analysis and solved using the Galerkin technique. The effects of the magnetic field, internal heat source, Soret effect, and other nanofluid parameters on the oscillatory convection are presented graphically. For the oscillatory mode, it is found that the effect of the internal heat source is quite significant for small values of the non-dimensional parameter, and increasing the internal heat source speeds up the onset of convection. Meanwhile, increasing the strength of the magnetic field in the nanofluid layer reduces the rate of thermal instability and sustains the stabilization of the system. For the Soret effect, the onset of convection in the system is accelerated as the value of the Soret parameter is increased.

  10. Stochastic Growth Theory of Type 3 Solar Radio Emission

    NASA Technical Reports Server (NTRS)

    Robinson, P. A.; Carins, I. H.

    1993-01-01

    The recently developed stochastic growth theory of type 3 radio sources is extended to predict their electromagnetic volume emissivities and brightness temperatures. Predicted emissivities are consistent with spacecraft observations and independent theoretical constraints.

  11. Julian Schwinger and the Source Theory

    Science.gov Websites

    existing (operator) field theory to describe the new experimental discoveries in high energy particle physics.

  12. Stability of measures from children's interviews: the effects of time, sample length, and topic.

    PubMed

    Heilmann, John; DeBrock, Lindsay; Riley-Tillman, T Chris

    2013-08-01

    The purpose of this study was to examine the reliability of, and sources of variability in, language measures from interviews collected from young school-age children. Two 10-min interviews were collected from 20 at-risk kindergarten children by an examiner using a standardized set of questions. Test-retest reliability coefficients were calculated for 8 language measures. Generalizability theory (G-theory) analyses were completed to document the variability introduced into the measures from the child, session, sample length, and topic. Significant and strong reliability correlation coefficients were observed for most of the language sample measures. The G-theory analyses revealed that most of the variance in the language measures was attributed to the child. Session, sample length, and topic accounted for negligible amounts of variance in most of the language measures. Measures from interviews were reliable across sessions, and the sample length and topic did not have a substantial impact on the reliability of the language measures. Implications regarding the clinical feasibility of language sample analysis for assessment and progress monitoring are discussed.
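    Test-retest reliability of the kind reported above is conventionally quantified as the Pearson correlation between the two sessions' scores. A minimal sketch (the session scores below are hypothetical, not the study's data):

```python
from statistics import mean

def pearson_r(x, y):
    """Test-retest reliability as the Pearson correlation between
    the same measure scored at two sessions for the same children."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical scores on one language measure at sessions 1 and 2.
session1 = [52, 61, 47, 70, 58]
session2 = [55, 59, 50, 72, 56]
r = pearson_r(session1, session2)
```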

  13. Analysis of energy states where electrons and holes coexist in pseudomorphically strained InAs high-electron-mobility transistors

    NASA Astrophysics Data System (ADS)

    Nishio, Yui; Sato, Takato; Hirayama, Naomi; Iida, Tsutomu; Takanashi, Yoshifumi

    2016-04-01

    In strained high-electron-mobility transistors (HEMTs) with InAs as the channel, excess electrons and holes are generated in the drain region by impact ionization. In the source region, electrons are injected to recombine with accumulated holes via the Auger process. This causes a shift of the gate potential, V_GS,shift, in HEMTs. For a system where electrons and holes coexist, we established a theory taking into account the nonparabolicity of the conduction band in the InAs channel. This theory enables us to rigorously determine not only the energy states and the concentration profiles for both carriers but also the V_GS,shift due to an accumulation of holes. We have derived an Auger recombination theory that takes into account Fermi-Dirac statistics and is applicable to an arbitrary shape of potential energy. The Auger recombination lifetime τ_A for InAs PHEMTs was estimated as a function of the sheet hole concentration p_s, and τ_A was on the order of picoseconds for p_s exceeding 10^12 cm^-2.

  14. Sources of Response Bias in Older Ethnic Minorities: A Case of Korean American Elderly

    PubMed Central

    Kim, Miyong T.; Ko, Jisook; Yoon, Hyunwoo; Kim, Kim B.; Jang, Yuri

    2015-01-01

    The present study was undertaken to investigate potential sources of response bias in empirical research involving older ethnic minorities and to identify prudent strategies to reduce those biases, using Korean American elderly (KAE) as an example. Data were obtained from three independent studies of KAE (N=1,297; age ≥60) in three states (Florida, New York, and Maryland) from 2000 to 2008. Two common measures, Pearlin’s Mastery Scale and the CES-D scale, were selected for a series of psychometric tests based on classical measurement theory. Survey items were analyzed in depth, using psychometric properties generated from both exploratory factor analysis and confirmatory factor analysis as well as correlational analysis. Two types of potential sources of bias were identified as the most significant contributors to increases in error variances for these psychological instruments. Error variances were most prominent when (1) items were not presented in a manner that was culturally or contextually congruent with respect to the target population and/or (2) the response anchors for items were mixed (e.g., positive vs. negative). The systemic patterns and magnitudes of the biases were also cross-validated for the three studies. The results demonstrate sources and impacts of measurement biases in studies of older ethnic minorities. The identified response biases highlight the need for re-evaluation of current measurement practices, which are based on traditional recommendations that response anchors should be mixed or that the original wording of instruments should be rigidly followed. Specifically, systematic guidelines for accommodating cultural and contextual backgrounds into instrument design are warranted. PMID:26049971

  15. Sources of Response Bias in Older Ethnic Minorities: A Case of Korean American Elderly.

    PubMed

    Kim, Miyong T; Lee, Ju-Young; Ko, Jisook; Yoon, Hyunwoo; Kim, Kim B; Jang, Yuri

    2015-09-01

    The present study was undertaken to investigate potential sources of response bias in empirical research involving older ethnic minorities and to identify prudent strategies to reduce those biases, using Korean American elderly (KAE) as an example. Data were obtained from three independent studies of KAE (N = 1,297; age ≥60) in three states (Florida, New York, and Maryland) from 2000 to 2008. Two common measures, Pearlin's Mastery Scale and the CES-D scale, were selected for a series of psychometric tests based on classical measurement theory. Survey items were analyzed in depth, using psychometric properties generated from both exploratory factor analysis and confirmatory factor analysis as well as correlational analysis. Two types of potential sources of bias were identified as the most significant contributors to increases in error variances for these psychological instruments. Error variances were most prominent when (1) items were not presented in a manner that was culturally or contextually congruent with respect to the target population and/or (2) the response anchors for items were mixed (e.g., positive vs. negative). The systemic patterns and magnitudes of the biases were also cross-validated for the three studies. The results demonstrate sources and impacts of measurement biases in studies of older ethnic minorities. The identified response biases highlight the need for re-evaluation of current measurement practices, which are based on traditional recommendations that response anchors should be mixed or that the original wording of instruments should be rigidly followed. Specifically, systematic guidelines for accommodating cultural and contextual backgrounds into instrument design are warranted.
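    The classical-measurement-theory checks described above typically begin with an internal-consistency estimate. As an illustrative sketch only (not the authors' actual analysis pipeline), Cronbach's alpha can be computed directly from raw item scores:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item columns (each a list of
    respondent scores): alpha = k/(k-1) * (1 - sum(item var)/var(total))."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Total score per respondent across all k items.
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
```

    Mixed response anchors of the kind the authors flag would show up here as depressed inter-item covariance, and hence a lower alpha.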

  16. Computing payment for ecosystem services in watersheds: an analysis of the Middle Route Project of South-to-North Water Diversion in China.

    PubMed

    Dong, Zhengju; Yan, Yan; Duan, Jing; Fu, Xiao; Zhou, Qingrong; Huang, Xiang; Zhu, Xiangen; Zhao, Jingzhu

    2011-01-01

    Payment for ecosystem services (PES) has recently attracted considerable attention as an economic incentive for promoting natural resource management. As the central government has placed emphasis on incentive-based mechanisms, rapid progress in PES research and practice has been achieved. However, PES still faces many difficulties. A key issue is the lack of a fully fledged theory and method to clearly define the design scope, accounting, and feasibility of PES criteria. An improved watershed criteria model was developed in light of research on PES practices in China, investigations of the water source area for the Middle Route Project of South-to-North Water Diversion, and the theory of ecosystem service outflows. The basic principle of the assessment is to take the direct and opportunity costs of ecological conservation and environmental protection in the water source area and deduct nationally financed PES and the internal effect. The scope and the criteria methods were then determined, and the internal effect was introduced to define the benefits derived from the water source area. Finally, Shiyan City, the main water source area for the water diversion project, was analyzed with this model and its payment was calculated. The results showed that: (1) during 2003-2050, the total direct cost and opportunity cost would reach 262.70 billion and 256.33 billion Chinese Yuan (CNY, 2000 constant prices), i.e., 50.61% and 49.38% of the total cost, respectively; (2) Shiyan City would be paid 0.23, 0.06, and 0.03 CNY/m3 in 2014-2020, 2021-2030, and 2031-2050, respectively.
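    The reported cost split can be verified with one line of arithmetic; the sketch below reproduces the percentages from the two cost totals (rounding yields 49.39% for the opportunity share, so the abstract's 49.38% appears to be truncated rather than rounded):

```python
# Reported costs for Shiyan City, 2003-2050 (billion CNY, 2000 constant prices).
direct, opportunity = 262.70, 256.33
total = direct + opportunity
direct_share = 100 * direct / total
opportunity_share = 100 * opportunity / total
print(f"total = {total:.2f} billion CNY")
print(f"direct = {direct_share:.2f}%, opportunity = {opportunity_share:.2f}%")
```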

  17. How to Use the Pop-Screen in Literary Studies

    ERIC Educational Resources Information Center

    Reuber, Alexandra

    2010-01-01

    Teaching literary theory is fascinating for those who love the application of theory to a literary text, difficult for those who are of the opinion that theory destroys the actual beauty and value of the fictional source, and unfortunately often boring for those who are taught. This article, however, provides a popular approach to the introduction…

  18. The Neuroscience of Self-Efficacy: Vertically Integrated Leisure Theory and Its Implications for Theory-Based Programming

    ERIC Educational Resources Information Center

    Stone, Garrett Anderson

    2018-01-01

    The purpose of this paper is to explain and establish a link between social-psychological and biological explanations of self-efficacy theory. Specifically, the paper uses a hypothetical rock climbing program to illustrate how a practitioner could enhance the four sources of self-efficacious beliefs (enactive attainment, vicarious experience,…

  19. Youth Advocacy Training Resource. Volume IV. A Review of Theory and Applications for the Education of Troubled Youth.

    ERIC Educational Resources Information Center

    Evaluation Technologies, Inc., Arlington, VA.

    This volume serves as a source of information about the relationship of Teacher Corps Youth Advocacy Project activities to the field of secondary school reform for troubled youth. This document presents major theories about educating troubled youth, theoretically-based programs, and research and evaluation on their effectiveness. Theories are…

  20. A Complete Multimode Equivalent-Circuit Theory for Electrical Design

    PubMed Central

    Williams, Dylan F.; Hayden, Leonard A.; Marks, Roger B.

    1997-01-01

    This work presents a complete equivalent-circuit theory for lossy multimode transmission lines. Its voltages and currents are based on general linear combinations of standard normalized modal voltages and currents. The theory includes new expressions for transmission line impedance matrices, symmetry and lossless conditions, source representations, and the thermal noise of passive multiports. PMID:27805153

  1. Multivariate Cholesky models of human female fertility patterns in the NLSY.

    PubMed

    Rodgers, Joseph Lee; Bard, David E; Miller, Warren B

    2007-03-01

    Substantial evidence now exists that variables measuring or correlated with human fertility outcomes have a heritable component. In this study, we define a series of age-sequenced fertility variables, and fit multivariate models to account for underlying shared genetic and environmental sources of variance. We make predictions based on a theory developed by Udry [(1996) Biosocial models of low-fertility societies. In: Casterline, JB, Lee RD, Foote KA (eds) Fertility in the United States: new patterns, new theories. The Population Council, New York] suggesting that biological/genetic motivations can be more easily realized and measured in settings in which fertility choices are available. Udry's theory, along with principles from molecular genetics and certain tenets of life history theory, allow us to make specific predictions about biometrical patterns across age. Consistent with predictions, our results suggest that there are different sources of genetic influence on fertility variance at early compared to later ages, but that there is only one source of shared environmental influence that occurs at early ages. These patterns are suggestive of the types of gene-gene and gene-environment interactions for which we must account to better understand individual differences in fertility outcomes.
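    A Cholesky model in behavior genetics decomposes genetic and environmental covariance among age-sequenced variables through triangular paths; the algebra underneath is the ordinary Cholesky factorization, sketched here in plain Python (illustrative only, not the authors' biometrical modeling software):

```python
def cholesky(a):
    """Lower-triangular L with L @ L.T == a, for a symmetric
    positive-definite matrix given as nested lists."""
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                # Diagonal: variance left over after earlier factors.
                L[i][j] = (a[i][i] - s) ** 0.5
            else:
                # Off-diagonal: covariance carried by earlier factors.
                L[i][j] = (a[i][j] - s) / L[j][j]
    return L
```

    In the biometrical setting, separate factorizations of the genetic and shared-environment covariance matrices give the age-sequenced paths the authors interpret.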

  2. Psychometric considerations in the measurement of event-related brain potentials: Guidelines for measurement and reporting.

    PubMed

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Failing to consider psychometric issues related to reliability and validity, differential deficits, and statistical power potentially undermines the conclusions of a study. In research using event-related brain potentials (ERPs), numerous contextual factors (population sampled, task, data recording, analysis pipeline, etc.) can impact the reliability of ERP scores. The present review considers the contextual factors that influence ERP score reliability and the downstream effects that reliability has on statistical analyses. Given the context-dependent nature of ERPs, it is recommended that ERP score reliability be formally assessed on a study-by-study basis. Recommended guidelines for ERP studies include 1) reporting the threshold of acceptable reliability and reliability estimates for observed scores, 2) specifying the approach used to estimate reliability, and 3) justifying how trial-count minima were chosen. A reliability threshold for internal consistency of at least 0.70 is recommended, and a threshold of 0.80 is preferred. The review also advocates the use of generalizability theory for estimating score dependability (the generalizability theory analog to reliability) as an improvement on classical test theory reliability estimates, suggesting that the latter is less well suited to ERP research. To facilitate the calculation and reporting of dependability estimates, an open-source Matlab program, the ERP Reliability Analysis Toolbox, is presented. Copyright © 2016 Elsevier B.V. All rights reserved.
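    The recommendation to justify trial-count minima can be illustrated with the simplest single-facet design, where dependability rises as error variance is averaged over trials. The function names and variance values below are hypothetical, not taken from the ERA Toolbox:

```python
def dependability(var_person, var_error, n_trials):
    """Single-facet dependability: person variance over person variance
    plus error variance averaged across n trials."""
    return var_person / (var_person + var_error / n_trials)

def min_trials(var_person, var_error, threshold=0.70):
    """Smallest trial count whose dependability clears the threshold."""
    n = 1
    while dependability(var_person, var_error, n) < threshold:
        n += 1
    return n
```

    With hypothetical person variance 1.0 and error variance 4.0, ten trials are needed to reach the recommended 0.70 threshold; this is the kind of justification the guidelines ask authors to report.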

  3. [A non-classical approach to medical practices: Michel Foucault and Actor-Network Theory].

    PubMed

    Bińczyk, E

    2001-01-01

    The text presents an analysis of medical practices stemming from two sources: Michel Foucault's conception and the research of Annemarie Mol and John Law, representatives of a trend known as Actor-Network Theory. Both approaches reveal significant theoretical kinship: both can be situated within the framework of non-classical sociology of science. I initially refer to the cited conceptions as a version of non-classical sociology of medicine. The identity of non-classical sociology of medicine hinges on the fact that it undermines the possibility of objective definitions of disease, health, and body. These are instead approached as variable social and historical phenomena, co-constituted by medical practices. For both Foucault and Mol, the main object of interest was not medicine as such, but rather the network of medical practices. Mol and Law sketch a new theoretical perspective for the analysis of medical practices. They attempt to go beyond the dichotomous scheme of thinking about the human body as an object of medical research and the subject of private experience. Research on patients suffering from blood-sugar deficiency provides the empirical background for the thesis of the Actor-Network Theory representatives. Michel Foucault's conceptions are extremely critical of medical practices. The French researcher describes the processes of 'medicalising' Western society as the emergence of a new type of power. He attempts to sensitise the reader to the ethical dimension of the processes of medicalising society.

  4. A simple-source model of military jet aircraft noise

    NASA Astrophysics Data System (ADS)

    Morgan, Jessica; Gee, Kent L.; Neilsen, Tracianne; Wall, Alan T.

    2010-10-01

    The jet plumes produced by military jet aircraft radiate significant amounts of noise. A need to better understand the characteristics of the turbulence-induced aeroacoustic sources has motivated the present study. The purpose of the study is to develop a simple-source model of jet noise that can be compared to the measured data. The study is based on acoustic data collected near a tied-down F-22 Raptor. The simplest model consisted of adjusting the origin of a monopole above a rigid planar reflector until the locations of the predicted and measured interference nulls matched. The model has since been developed into an extended Rayleigh distribution of partially correlated monopoles, which fits the measured data from the F-22 significantly better. The results and basis for the model match the current prevailing theory that jet noise consists of both correlated and uncorrelated sources. In addition, this simple-source model conforms to the theory that the peak source location moves upstream with increasing frequency and at lower engine conditions.
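    The simplest model in the abstract, a monopole over a rigid reflector, predicts null locations directly from geometry: the rigid plane adds an in-phase image source, so destructive interference falls where the path difference is an odd number of half wavelengths. A minimal sketch (the function name and geometry are illustrative, not the study's code):

```python
import math

def null_frequencies(h_src, h_mic, dist, c=343.0, count=3):
    """Frequencies of destructive interference for a monopole at height
    h_src above a rigid plane, observed at horizontal distance `dist`
    and height h_mic.  The image source is in phase, so nulls occur at
    f_n = (n + 1/2) * c / (path difference)."""
    r_direct = math.hypot(dist, h_mic - h_src)
    r_image = math.hypot(dist, h_mic + h_src)  # reflected path
    delta = r_image - r_direct
    return [(n + 0.5) * c / delta for n in range(count)]
```

    Matching these predicted nulls to the measured ones is what fixes the effective source origin in the simplest version of the model.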

  5. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen

    2016-04-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].
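    Among the methods listed, the natural visibility graph has a particularly compact definition: two samples are linked if the straight line between them clears every intermediate sample. A minimal pure-Python sketch of that construction (pyunicorn's own implementation is more general and much faster):

```python
def visibility_edges(y):
    """Edge list of the natural visibility graph of an evenly sampled
    series: samples a < b are linked iff every intermediate sample lies
    strictly below the line of sight between them."""
    edges = []
    n = len(y)
    for a in range(n):
        for b in range(a + 1, n):
            # Line of sight from (a, y[a]) to (b, y[b]), evaluated at c.
            if all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                   for c in range(a + 1, b)):
                edges.append((a, b))
    return edges
```

    The degree distribution of this graph is what visibility-graph analysis then interrogates for signatures of the underlying dynamics.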

  6. Cyclotron maser instability and its applications

    NASA Astrophysics Data System (ADS)

    Wu, C. S.

    The possible application of cyclotron maser theory to a variety of radio sources is considered, with special attention given to the theory of auroral kilometric radiation (AKR) of Wu and Lee (1979). The AKR model assumes a loss-cone distribution function for the reflected electrons, along with the depletion of low-energy electrons by the parallel electric field. Other topics considered include fundamental AKR, second-harmonic AKR, the generation of Z-mode radiation, and the application of maser instability to other sources than AKR.

  7. Predicting materials for sustainable energy sources: The key role of density functional theory

    NASA Astrophysics Data System (ADS)

    Galli, Giulia

    Climate change and the related need for sustainable energy sources replacing fossil fuels are pressing societal problems. The development of advanced materials is widely recognized as one of the key elements for new technologies that are required to achieve a sustainable environment and provide clean and adequate energy for our planet. We discuss the key role played by Density Functional Theory, and its implementations in high performance computer codes, in understanding, predicting and designing materials for energy applications.

  8. Neutron coincidence counting based on time interval analysis with one- and two-dimensional Rossi-alpha distributions: an application for passive neutron waste assay

    NASA Astrophysics Data System (ADS)

    Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R.

    1996-02-01

    Neutron coincidence counting is commonly used for the non-destructive assay of plutonium-bearing waste or for safeguards verification measurements. A major drawback of conventional coincidence counting is that a valid calibration is needed to convert a neutron coincidence count rate to a 240Pu-equivalent mass (240Pu-eq). In waste assay, calibrations are made for representative waste matrices and source distributions. The actual waste, however, may have quite different matrices and source distributions compared to the calibration samples. This often results in a bias of the assay result. This paper presents a new neutron-multiplicity-sensitive coincidence counting technique including an auto-calibration of the neutron detection efficiency. The coincidence counting principle is based on the recording of one- and two-dimensional Rossi-alpha distributions triggered respectively by pulse pairs and by pulse triplets. Rossi-alpha distributions allow an easy discrimination between real and accidental coincidences and are aimed at being measured by a PC-based fast time-interval analyser. The Rossi-alpha distributions can be easily expressed in terms of a limited number of factorial moments of the neutron multiplicity distributions. The presented technique allows an unbiased measurement of the 240Pu-eq mass. The presented theory, referred to as Time Interval Analysis (TIA), is complementary to the Time Correlation Analysis (TCA) theories developed in the past, but is theoretically much simpler and allows a straightforward calculation of dead-time corrections and error propagation. Analytical expressions are derived for the Rossi-alpha distributions as a function of the factorial moments of the efficiency-dependent multiplicity distributions. The validity of the proposed theory is demonstrated and verified via Monte Carlo simulations of pulse trains and the subsequent analysis of the simulated data.
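    The one-dimensional Rossi-alpha distribution described above is, at its core, a histogram of delays following each trigger pulse. A minimal sketch operating on a list of pulse arrival times (illustrative only; a real system uses a hardware time-interval analyser):

```python
def rossi_alpha(times, window, nbins):
    """One-dimensional Rossi-alpha histogram: for every pulse, bin the
    delays to all later pulses arriving within `window`.  `times` must
    be sorted in ascending order."""
    width = window / nbins
    hist = [0] * nbins
    for i, t0 in enumerate(times):
        for t in times[i + 1:]:
            if t - t0 >= window:
                break  # later pulses are outside the window too
            hist[int((t - t0) / width)] += 1
    return hist
```

    Correlated (fission-chain) events pile up at short delays while accidentals contribute a flat floor, which is what lets the two be separated.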

  9. Wilber's Integral Theory and Dossey's Theory of Integral Nursing: An Examination of Two Integral Approaches in Nursing Scholarship.

    PubMed

    Shea, Linda; Frisch, Noreen

    2016-09-01

    The purpose of this article is to examine Dossey's theory of integral nursing in relation to its major theoretical source, Wilber's integral theory. Although several nursing scholars have written about integral theory in relation to nursing scholarship and practice, Dossey's theory of integral nursing may be influencing how nurses take up integral theory in a significant way due to an extensive outreach in the holistic nursing community. Despite this wide circulation, the theory of integral nursing has yet to be reviewed in the nursing literature. This article (a) compares Dossey's theory of integral nursing with Wilber's integral theory and (b) contrasts Dossey's integral approach with another integral approach used by other scholars of integral theory. © The Author(s) 2015.

  10. A molecular Debye-Hückel theory and its applications to electrolyte solutions: The size asymmetric case

    DOE PAGES

    Xiao, Tiejun; Song, Xueyu

    2017-03-28

    We developed a molecular Debye-Hückel theory for electrolyte solutions with size asymmetry, where the dielectric response of an electrolyte solution is described by a linear combination of Debye-Hückel-like response modes. Furthermore, as the size asymmetry of an electrolyte solution leads to a charge-imbalanced border zone around a solute, the dielectric response to the solute is characterized by two types of charge sources, namely, a bare solute charge and a charge distribution due to size asymmetry. These two kinds of charge sources are screened by the solvent differently; our theory presents a method to calculate the mean electric potential as well as the electrostatic contributions to thermodynamic properties. Finally, the theory was successfully applied to binary as well as multi-component primitive models of electrolyte solutions.
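    The Debye-Hückel-like response modes referred to above are governed by a screening length. A minimal sketch of the classical (point-ion) Debye length for a 1:1 electrolyte, with eps_r = 78.5 assumed for water near 25 °C (the paper's molecular theory goes well beyond this limiting law):

```python
import math

def debye_length(ionic_strength_molar, eps_r=78.5, temp=298.15):
    """Debye screening length (m) of a 1:1 electrolyte in the classical
    Debye-Huckel limit: kappa^2 = 2 n e^2 / (eps_r eps0 kB T)."""
    eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
    kB = 1.380649e-23         # Boltzmann constant, J/K
    e = 1.602176634e-19       # elementary charge, C
    NA = 6.02214076e23        # Avogadro constant, 1/mol
    n = ionic_strength_molar * 1000 * NA   # ions of each sign per m^3
    kappa_sq = 2 * n * e ** 2 / (eps_r * eps0 * kB * temp)
    return 1 / math.sqrt(kappa_sq)
```

    For 0.1 M this gives roughly 0.96 nm, the familiar textbook value; size asymmetry modifies the screening of each charge source, which is the point of the molecular theory.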

  11. Infrared and visible image fusion with spectral graph wavelet transform.

    PubMed

    Yan, Xiang; Qin, Hanlin; Li, Jia; Zhou, Huixin; Zong, Jing-guo

    2015-09-01

    Infrared and visible image fusion technique is a popular topic in image analysis because it can integrate complementary information and obtain reliable and accurate description of scenes. Multiscale transform theory as a signal representation method is widely used in image fusion. In this paper, a novel infrared and visible image fusion method is proposed based on spectral graph wavelet transform (SGWT) and bilateral filter. The main novelty of this study is that SGWT is used for image fusion. On the one hand, source images are decomposed by SGWT in its transform domain. The proposed approach not only effectively preserves the details of different source images, but also excellently represents the irregular areas of the source images. On the other hand, a novel weighted average method based on bilateral filter is proposed to fuse low- and high-frequency subbands by taking advantage of spatial consistency of natural images. Experimental results demonstrate that the proposed method outperforms seven recently proposed image fusion methods in terms of both visual effect and objective evaluation metrics.
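    The bilateral filter used in the fusion rule weights neighbours by both spatial distance and intensity difference, which is what preserves edges. A 1-D sketch of that weighting (illustrative only; the paper applies it in 2-D inside the SGWT subbands, and the parameter values here are invented):

```python
import math

def bilateral_1d(signal, sigma_s=2.0, sigma_r=0.1, radius=3):
    """Edge-preserving smoothing: each sample becomes a weighted mean of
    its neighbours, with weights decaying in both spatial distance and
    intensity difference."""
    out = []
    n = len(signal)
    for i in range(n):
        num = den = 0.0
        for j in range(max(0, i - radius), min(n, i + radius + 1)):
            w = (math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2))
                 * math.exp(-((signal[i] - signal[j]) ** 2) / (2 * sigma_r ** 2)))
            num += w * signal[j]
            den += w
        out.append(num / den)
    return out
```

    Smooth regions are averaged while a step edge survives almost untouched, because samples across the edge receive near-zero range weight.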

  12. A semi-empirical analysis of strong-motion peaks in terms of seismic source, propagation path, and local site conditions

    NASA Astrophysics Data System (ADS)

    Kamiyama, M.; Orourke, M. J.; Flores-Berrones, R.

    1992-09-01

    A new type of semi-empirical expression for scaling strong-motion peaks in terms of seismic source, propagation path, and local site conditions is derived. Peak acceleration, peak velocity, and peak displacement are analyzed in a similar fashion because they are interrelated. However, emphasis is placed on the peak velocity which is a key ground motion parameter for lifeline earthquake engineering studies. With the help of seismic source theories, the semi-empirical model is derived using strong motions obtained in Japan. In the derivation, statistical considerations are used in the selection of the model itself and the model parameters. Earthquake magnitude M and hypocentral distance r are selected as independent variables and the dummy variables are introduced to identify the amplification factor due to individual local site conditions. The resulting semi-empirical expressions for the peak acceleration, velocity, and displacement are then compared with strong-motion data observed during three earthquakes in the U.S. and Mexico.
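    The scaling model described above is linear in its unknowns once the peaks are log-transformed, so it can be fit by ordinary least squares. A self-contained sketch with synthetic, noiseless data and hypothetical coefficients (the site-condition dummy variables are omitted for brevity):

```python
import math

def fit_attenuation(records):
    """Least-squares fit of ln(peak) = a + b*M + c*ln(r), solved from the
    3x3 normal equations by Gaussian elimination (no external libraries).
    `records` is a list of (magnitude, distance, peak) tuples."""
    rows = [[1.0, M, math.log(r)] for M, r, _ in records]
    y = [math.log(p) for _, _, p in records]
    # Normal equations: (X^T X) x = X^T y.
    A = [[sum(ri[i] * ri[j] for ri in rows) for j in range(3)] for i in range(3)]
    b = [sum(ri[i] * yi for ri, yi in zip(rows, y)) for i in range(3)]
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c2 in range(col, 3):
                A[r][c2] -= f * A[col][c2]
            b[r] -= f * b[col]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (b[r] - sum(A[r][c2] * x[c2] for c2 in range(r + 1, 3))) / A[r][r]
    return x  # a, b, c

# Synthetic records from assumed coefficients a=-2.0, b=1.5, c=-1.2.
true_a, true_b, true_c = -2.0, 1.5, -1.2
records = [(M, r, math.exp(true_a + true_b * M + true_c * math.log(r)))
           for M in (4.0, 5.0, 6.0, 7.0) for r in (10.0, 50.0, 200.0)]
a, b, c = fit_attenuation(records)
```

    With noiseless data the fit recovers the assumed coefficients exactly; with real strong-motion data the residual scatter is what the site-condition dummies absorb.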

  13. A primer on theory-driven web scraping: Automatic extraction of big data from the Internet for use in psychological research.

    PubMed

    Landers, Richard N; Brusso, Robert C; Cavanaugh, Katelyn J; Collmus, Andrew B

    2016-12-01

    The term big data encompasses a wide range of approaches of collecting and analyzing data in ways that were not possible before the era of modern personal computing. One approach to big data of great potential to psychologists is web scraping, which involves the automated collection of information from webpages. Although web scraping can create massive big datasets with tens of thousands of variables, it can also be used to create modestly sized, more manageable datasets with tens of variables but hundreds of thousands of cases, well within the skillset of most psychologists to analyze, in a matter of hours. In this article, we demystify web scraping methods as currently used to examine research questions of interest to psychologists. First, we introduce an approach called theory-driven web scraping in which the choice to use web-based big data must follow substantive theory. Second, we introduce data source theories, a term used to describe the assumptions a researcher must make about a prospective big data source in order to meaningfully scrape data from it. Critically, researchers must derive specific hypotheses to be tested based upon their data source theory, and if these hypotheses are not empirically supported, plans to use that data source should be changed or eliminated. Third, we provide a case study and sample code in Python demonstrating how web scraping can be conducted to collect big data along with links to a web tutorial designed for psychologists. Fourth, we describe a 4-step process to be followed in web scraping projects. Fifth and finally, we discuss legal, practical and ethical concerns faced when conducting web scraping projects. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
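    The article's own Python tutorial code is not reproduced here; as a stand-in, the toy sketch below shows the mechanical core of a scraper, extracting link/text pairs from HTML using only the standard library (the page content is invented):

```python
from html.parser import HTMLParser

class LinkScraper(HTMLParser):
    """Collect (href, link text) pairs -- the minimal core of a scraper."""
    def __init__(self):
        super().__init__()
        self.links, self._href = [], None

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")

    def handle_data(self, data):
        # Pair the pending href with the first non-empty text node.
        if self._href is not None and data.strip():
            self.links.append((self._href, data.strip()))
            self._href = None

page = ('<ul><li><a href="/p/1">First post</a></li>'
        '<li><a href="/p/2">Second</a></li></ul>')
scraper = LinkScraper()
scraper.feed(page)
```

    In a real project the HTML would be fetched with urllib or a third-party client, subject to the site's terms of service and robots.txt, per the legal and ethical concerns the authors raise.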

  14. Field errors in hybrid insertion devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlueter, R.D.

    1995-02-01

    Hybrid magnet theory as applied to the error analyses used in the design of Advanced Light Source (ALS) insertion devices is reviewed. Sources of field errors in hybrid insertion devices are discussed.

  15. Open-Source Data and the Study of Homicide.

    PubMed

    Parkin, William S; Gruenewald, Jeff

    2015-07-20

    To date, no discussion has taken place in the social sciences as to the appropriateness of using open-source data to augment, or replace, official data sources in homicide research. The purpose of this article is to examine whether open-source data have the potential to be used as a valid and reliable data source in testing theory and studying homicide. Official and open-source homicide data were collected as a case study in a single jurisdiction over a 1-year period. The data sets were compared to determine whether open sources could recreate the population of homicides and variable responses collected in official data. Open-source data were able to replicate the population of homicides identified in the official data. For every variable measured, the open sources also captured as much, or more, of the information presented in the official data, and variables not available in official data, but potentially useful for testing theory, were identified in open sources. The results of the case study show that open-source data are potentially as effective as official data in identifying individual- and situational-level characteristics, provide access to variables not found in official homicide data, and offer geographic data that can be used to link macro-level characteristics to homicide events. © The Author(s) 2015.

  16. Tackling non-linearities with the effective field theory of dark energy and modified gravity

    NASA Astrophysics Data System (ADS)

    Frusciante, Noemi; Papadomanolakis, Georgios

    2017-12-01

    We present the extension of the effective field theory framework to mildly non-linear scales. The effective field theory approach has been successfully applied to the late-time cosmic acceleration phenomenon and has been shown to be a powerful method for obtaining predictions about cosmological observables on linear scales. However, mildly non-linear scales need to be consistently considered when testing gravity theories because a large part of the data comes from those scales. Thus, non-linear corrections to predictions on observables coming from the linear analysis can help in discriminating among different gravity theories. We proceed firstly by identifying the necessary operators which need to be included in the effective field theory Lagrangian in order to go beyond linear order in perturbations, and then we construct the corresponding non-linear action. Moreover, we present the complete recipe to map any single-field dark energy and modified gravity model into the non-linear effective field theory framework by considering a general action in the Arnowitt-Deser-Misner formalism. In order to illustrate this recipe we proceed to map the beyond-Horndeski theory and low-energy Hořava gravity into the effective field theory formalism. As a final step we derive the 4th-order action in terms of the curvature perturbation. This allows us to identify the non-linear contributions coming from the linear-order perturbations, which at the next order act like source terms. Moreover, we confirm that the stability requirements, ensuring the positivity of the kinetic term and the speed of propagation for the scalar mode, are automatically satisfied once the viability of the theory is demanded at the linear level. The approach we present here will allow the construction, in a model-independent way, of all the relevant predictions on observables at mildly non-linear scales.

  17. Intelligent topical sentiment analysis for the classification of e-learners and their topics of interest.

    PubMed

    Ravichandran, M; Kulanthaivel, G; Chellatamilan, T

    2015-01-01

    Every day, huge numbers of instant tweets (messages) are published on Twitter, one of the major social media platforms for e-learner interaction. Opinions regarding various interesting topics to be studied are discussed among learners and teachers through the capture of ideal sources in Twitter, and the common sentiment behavior towards these topics is received through the massive number of instant messages about them. In this paper, rather than using the opinion polarity of each message relevant to the topic, the authors focus on sentence-level opinion classification using the unsupervised algorithm named bigram item response theory (BIRT), which differs from traditional classification and document-level classification algorithms. The investigation illustrated in this paper is threefold: (1) lexicon-based sentiment polarity of tweet messages; (2) the bigram co-occurrence relationship using naïve Bayes; (3) the bigram item response theory (BIRT) on various topics. A model using item response theory is constructed for topical classification inference. The performance improved remarkably using this bigram item response theory when compared with other supervised algorithms. The experiment was conducted on a real-life dataset containing different sets of tweets and topics.
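    The first step listed above, lexicon-based polarity, can be sketched in a few lines; the lexicon below is a tiny invented sample, not the resource used in the paper:

```python
# Hypothetical sentiment lexicon: word -> polarity weight.
LEXICON = {"love": 2, "great": 2, "good": 1, "interesting": 1,
           "boring": -2, "hate": -2, "hard": -1, "confusing": -1}

def sentence_polarity(sentence):
    """Lexicon-based sentence-level polarity: sum the scores of known
    words and map the sign of the total to a label."""
    score = sum(LEXICON.get(w.strip(".,!?").lower(), 0)
                for w in sentence.split())
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"
```

    The BIRT step then treats such per-sentence judgments as item responses when inferring topical classes.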

  18. Identification of nonclassical properties of light with multiplexing layouts

    NASA Astrophysics Data System (ADS)

    Sperling, J.; Eckstein, A.; Clements, W. R.; Moore, M.; Renema, J. J.; Kolthammer, W. S.; Nam, S. W.; Lita, A.; Gerrits, T.; Walmsley, I. A.; Agarwal, G. S.; Vogel, W.

    2017-07-01

    In Sperling et al. [Phys. Rev. Lett. 118, 163602 (2017), 10.1103/PhysRevLett.118.163602], we introduced and applied a detector-independent method to uncover nonclassicality. Here, we extend those techniques and give more details on the performed analysis. We derive a general theory of the positive-operator-valued measure that describes multiplexing layouts with arbitrary detectors. From the resulting quantum version of a multinomial statistics, we infer nonclassicality probes based on a matrix of normally ordered moments. We discuss these criteria and apply the theory to our data which are measured with superconducting transition-edge sensors. Our experiment produces heralded multiphoton states from a parametric down-conversion light source. We show that the known notions of sub-Poisson and sub-binomial light can be deduced from our general approach, and we establish the concept of sub-multinomial light, which is shown to outperform the former two concepts of nonclassicality for our data.
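    The familiar sub-Poisson criterion mentioned above is captured by the Mandel Q parameter: negative Q means the photon-number variance falls below the mean. A minimal sketch computing Q from a frequency table of counts (the paper's sub-binomial and sub-multinomial probes generalize this idea to multiplexed click detectors):

```python
def mandel_q(counts):
    """Mandel Q parameter from a photon-number frequency table
    {n: occurrences}; Q < 0 signals sub-Poisson statistics."""
    total = sum(counts.values())
    mean = sum(n * f for n, f in counts.items()) / total
    mean_sq = sum(n * n * f for n, f in counts.items()) / total
    var = mean_sq - mean ** 2
    return (var - mean) / mean
```

    A perfect two-photon source (every event yields n = 2) gives Q = -1, the strongest possible sub-Poisson signature.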

  19. Analysis of prescription database extracted from standard textbooks of traditional Dai medicine.

    PubMed

    Zhang, Chuang; Chongsuvivatwong, Virasakdi; Keawpradub, Niwat; Lin, Yanfang

    2012-08-29

    Traditional Dai Medicine (TDM) is one of the four major ethnomedicines of China. In 2007 a group of experts produced a set of seven Dai medical textbooks on the subject. The first two were selected as the main data source for analysing well-recognized prescriptions. The aim was to quantify patterns of prescriptions, common ingredients, indications, and usages of TDM. A relational database linking the prescriptions, ingredients, herb names, indications, and usages was set up. Frequencies of combination patterns and common ingredients were tabulated. A total of 200 prescriptions and 402 herbs were compiled. Prescriptions based on "wind" disorders, a detoxification theory that most commonly deals with symptoms of digestive system diseases, accounted for over one third of all prescriptions. The preparations mostly used roots and whole herbs. The information extracted from the relational database may be useful for understanding symptomatic treatments. Antidote and detoxification theory deserves further research.
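    A relational database of the kind described can be sketched with sqlite3; the schema, herb names, and sample rows below are invented for illustration, not taken from the TDM textbooks:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE prescriptions (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE herbs (id INTEGER PRIMARY KEY, name TEXT, part_used TEXT);
    CREATE TABLE ingredients (prescription_id INTEGER, herb_id INTEGER);
""")
con.executemany("INSERT INTO prescriptions VALUES (?, ?)",
                [(1, "wind formula A"), (2, "wind formula B")])
con.executemany("INSERT INTO herbs VALUES (?, ?, ?)",
                [(1, "herb X", "root"), (2, "herb Y", "whole herb")])
con.executemany("INSERT INTO ingredients VALUES (?, ?)",
                [(1, 1), (1, 2), (2, 1)])
# Frequency tabulation of common ingredients across prescriptions.
common = con.execute("""
    SELECT h.name, COUNT(*) AS uses FROM ingredients i
    JOIN herbs h ON h.id = i.herb_id
    GROUP BY h.name ORDER BY uses DESC
""").fetchall()
```

    Linking indication and usage tables in the same way is what lets frequencies of combination patterns be tabulated with a single GROUP BY query.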

  20. Brookhaven highlights, October 1978-September 1979.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-01-01

    These highlights present an overview of the major research and development achievements at Brookhaven National Laboratory from October 1978 to September 1979. Specific areas covered include: accelerator and high energy physics programs; high energy physics research; the AGS and improvements to the AGS; neutral beam development; heavy ion fusion; superconducting power cables; ISABELLE storage rings; the BNL Tandem accelerator; heavy ion experiments at the Tandem; the High Flux Beam Reactor; medium energy physics; nuclear theory; atomic and applied physics; solid state physics; neutron scattering studies; x-ray scattering studies; solid state theory; defects and disorder in solids; surface physics; the National Synchrotron Light Source; Chemistry Department; Biology Department; Medical Department; energy sciences; environmental sciences; energy technology programs; National Center for Analysis of Energy Systems; advanced reactor systems; nuclear safety; National Nuclear Data Center; nuclear materials safeguards; Applied Mathematics Department; and support activities. (GHT)

  1. An overview of a multifactor-system theory of personality and individual differences: III. Life span development and the heredity-environment issue.

    PubMed

    Powell, A; Royce, J R

    1981-12-01

    In Part III of this three-part series on multifactor-system theory, multivariate, life-span development is approached from the standpoint of a quantitative and qualitative analysis of the ontogenesis of factors in each of the six systems. The pattern of quantitative development (described via the Gompertz equation and three developmental parameters) involves growth, stability, and decline, and qualitative development involves changes in the organization of factors (e.g., factor differentiation and convergence). Hereditary and environmental sources of variation are analyzed via the factor gene model and the concept of heredity-dominant factors, and the factor-learning model and environment-dominant factors. It is hypothesized that the sensory and motor systems are heredity dominant, that the style and value systems are environment dominant, and that the cognitive and affective systems are partially heredity dominant.
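    The record above describes quantitative development via the Gompertz equation with three developmental parameters covering growth, stability, and decline. As a minimal sketch (the parameter names here are generic illustrations, not Powell and Royce's own notation), the Gompertz curve y(t) = A exp(-b exp(-c t)) can be evaluated as:

```python
import numpy as np

def gompertz(t, asymptote, displacement, growth_rate):
    """Gompertz growth curve: y(t) = A * exp(-b * exp(-c * t)).

    A sets the stable plateau, b shifts the curve along t,
    and c controls how quickly growth saturates."""
    return asymptote * np.exp(-displacement * np.exp(-growth_rate * t))

t = np.linspace(0.0, 80.0, 81)
y = gompertz(t, asymptote=100.0, displacement=5.0, growth_rate=0.1)
# y rises sigmoidally and then stabilizes near the asymptote of 100
```

The curve is strictly increasing toward its asymptote, which matches the growth-then-stability pattern described in the abstract; decline would require a separate parameterization.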

  2. Local structure analysis on (La,Ba)(Ga,Mg)O3-δ by the pair distribution function method using a neutron source and density functional theory calculations

    NASA Astrophysics Data System (ADS)

    Kitamura, Naoto; Vogel, Sven C.; Idemoto, Yasushi

    2013-06-01

    In this work, we focused on La0.95Ba0.05Ga0.8Mg0.2O3-δ with the perovskite structure, and investigated the local structure around the oxygen vacancy by pair distribution function (PDF) method and density functional theory (DFT) calculation. By comparing the G(r) simulated based on the DFT calculation and the experimentally-observed G(r), it was suggested that the oxygen vacancy was trapped by Ba2+ at the La3+ site at least at room temperature. Such a defect association may be one of the reasons why the La0.95Ba0.05Ga0.8Mg0.2O3-δ showed lower oxide-ion conductivity than (La,Sr)(Ga,Mg)O3-δ which was widely-used as an electrolyte of the solid oxide fuel cell.

  3. Gas Core Reactor Numerical Simulation Using a Coupled MHD-MCNP Model

    NASA Technical Reports Server (NTRS)

    Kazeminezhad, F.; Anghaie, S.

    2008-01-01

    This report analyzes the use of two head-on magnetohydrodynamic (MHD) shocks to achieve supercritical nuclear fission in an axially elongated cylinder filled with UF4 gas, as an energy source for deep space missions. The motivation for each aspect of the design is explained and supported by theory and numerical simulations. A subsequent report will provide detail on relevant experimental work to validate the concept. Here the focus is on the theory of and simulations for the proposed gas core reactor conceptual design, from the onset of shock generation to the supercritical state achieved when the shocks collide. The MHD model is coupled to a standard nuclear code (MCNP) to observe the neutron flux and fission power attributed to the supercritical state brought about by the shock collisions. Throughout the modeling, realistic parameters are used for the initial ambient gaseous state and currents to ensure a resulting supercritical state upon shock collisions.

  4. Identification of nonclassical properties of light with multiplexing layouts

    PubMed Central

    Sperling, J.; Eckstein, A.; Clements, W. R.; Moore, M.; Renema, J. J.; Kolthammer, W. S.; Nam, S. W.; Lita, A.; Gerrits, T.; Walmsley, I. A.; Agarwal, G. S.; Vogel, W.

    2018-01-01

    In Sperling et al. [Phys. Rev. Lett. 118, 163602 (2017)], we introduced and applied a detector-independent method to uncover nonclassicality. Here, we extend those techniques and give more details on the analysis performed. We derive a general theory of the positive-operator-valued measure that describes multiplexing layouts with arbitrary detectors. From the resulting quantum version of a multinomial statistics, we infer nonclassicality probes based on a matrix of normally ordered moments. We discuss these criteria and apply the theory to our data, which were measured with superconducting transition-edge sensors. Our experiment produces heralded multiphoton states from a parametric down-conversion light source. We show that the known notions of sub-Poisson and sub-binomial light can be deduced from our general approach, and we establish the concept of sub-multinomial light, which is shown to outperform the former two concepts of nonclassicality for our data. PMID:29670949
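    The sub-Poisson and sub-binomial notions mentioned in this record have simple operational witnesses: the Mandel Q parameter for photocounts, and a sub-binomial parameter for click statistics from an N-detector multiplexing layout. A minimal sketch (standard textbook formulas, not the authors' sub-multinomial criterion; negative values witness nonclassicality, assuming ideal detection):

```python
import numpy as np

def mandel_q(photocounts):
    """Mandel Q parameter: Q < 0 witnesses sub-Poisson light."""
    m = np.mean(photocounts)
    return (np.var(photocounts) - m) / m

def binomial_q(clicks, n_detectors):
    """Sub-binomial parameter for click counting with n_detectors outputs:
    Q_B = N * Var(c) / (<c> * (N - <c>)) - 1, with Q_B < 0 nonclassical."""
    m = np.mean(clicks)
    return n_detectors * np.var(clicks) / (m * (n_detectors - m)) - 1.0

rng = np.random.default_rng(0)
q_fock = mandel_q(np.full(1000, 5))                    # ideal Fock-like counts: Q = -1
q_binom = binomial_q(rng.binomial(8, 0.3, 100000), 8)  # binomial clicks: Q_B near 0
```

A constant photocount record gives Q = -1 (maximally sub-Poisson), while simulated binomial click statistics sit at the classical boundary Q_B ≈ 0.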

  5. Asymptotic analysis of the Boltzmann equation for dark matter relics in the presence of a running dilaton and space-time defects

    NASA Astrophysics Data System (ADS)

    Bender, Carl M.; Mavromatos, Nick E.; Sarkar, Sarben

    2013-03-01

    The interplay of dilatonic effects in dilaton cosmology and stochastic quantum space-time defects within the framework of string/brane cosmologies is examined. The Boltzmann equation describes the physics of thermal dark-matter-relic abundances in the presence of rolling dilatons. These dilatons affect the coupling of stringy matter to D-particle defects, which are generic in string theory. This coupling leads to an additional source term in the Boltzmann equation. The techniques of asymptotic matching and boundary-layer theory, which were recently applied by two of the authors (Bender and Sarkar) to a Boltzmann equation, are used here to find the detailed asymptotic relic abundances for all ranges of the expectation value of the dilaton field. The phenomenological implications for the search for supersymmetric dark matter in current colliders, such as the LHC, are discussed.

  6. Stably Stratified Atmospheric Boundary Layers

    NASA Astrophysics Data System (ADS)

    Mahrt, L.

    2014-01-01

    Atmospheric boundary layers with weak stratification are relatively well described by similarity theory and numerical models for stationary horizontally homogeneous conditions. With common strong stratification, similarity theory becomes unreliable. The turbulence structure and interactions with the mean flow and small-scale nonturbulent motions assume a variety of scenarios. The turbulence is intermittent and may no longer fully satisfy the usual conditions for the definition of turbulence. Nonturbulent motions include wave-like motions and solitary modes, two-dimensional vortical modes, microfronts, intermittent drainage flows, and a host of more complex structures. The main source of turbulence may not be at the surface, but rather may result from shear above the surface inversion. The turbulence is typically not in equilibrium with the nonturbulent motions, sometimes preventing the formation of an inertial subrange. New observational and analysis techniques are expected to advance our understanding of the very stable boundary layer.

  7. Transmission from theory to practice: Experiences using open-source code development and a virtual short course to increase the adoption of new theoretical approaches

    NASA Astrophysics Data System (ADS)

    Harman, C. J.

    2015-12-01

    Even amongst the academic community, new theoretical tools can remain underutilized due to the investment of time and resources required to understand and implement them. This surely limits the frequency with which new theory is rigorously tested against data by scientists outside the group that developed it, and limits the impact that new tools could have on the advancement of science. Reducing the barriers to adoption through online education and open-source code can bridge the gap between theory and data, forge new collaborations, and advance science. A pilot venture aimed at increasing the adoption of a new theory of time-variable transit time distributions was begun in July 2015 as a collaboration between Johns Hopkins University and the Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI). There were four main components to the venture: a public online seminar covering the theory, an open-source code repository, a virtual short course designed to help participants apply the theory to their data, and an online forum to maintain discussion and build a community of users. Eighteen participants were selected for the non-public components based on their responses in an application, and were asked to fill out a course evaluation at the end of the short course, and again several months later. These evaluations, along with participation in the forum and on-going contact with the organizer, suggest strengths and weaknesses in this combination of components for assisting participants in adopting new tools.

  8. Geoacoustic inversion with two source-receiver arrays in shallow water.

    PubMed

    Sukhovich, Alexey; Roux, Philippe; Wathelet, Marc

    2010-08-01

    A geoacoustic inversion scheme based on a double beamforming algorithm in shallow water is proposed and tested. Double beamforming allows identification of multi-reverberated eigenrays propagating between two vertical transducer arrays according to their emission and reception angles and arrival times. Analysis of eigenray intensities yields the bottom reflection coefficient as a function of angle of incidence. By fitting the experimental reflection coefficient with a theoretical prediction, values of the acoustic parameters of the waveguide bottom can be extracted. The procedure was initially tested in a small-scale tank experiment for a waveguide with a Plexiglas bottom. Inversion results for the speed of shear waves in Plexiglas are in good agreement with the table values. A similar analysis was applied to data collected during an at-sea experiment in shallow coastal waters of the Mediterranean. The bottom reflection coefficient was fitted with a theory in which the bottom sediments are modeled as a multi-layered system. The retrieved bottom parameters are in quantitative agreement with those determined from a prior inversion scheme performed in the same area. The present study confirms the interest of processing source-receiver array data through the double beamforming algorithm, and indicates the potential for application of eigenray intensity analysis to geoacoustic inversion problems.
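    The core of the beamforming step above is delay-and-sum steering over candidate angles. A simplified single-array sketch follows (the paper's double beamforming nests this over both a source and a receiver array and also sorts eigenrays by arrival time; all numbers here are illustrative, not from the experiment):

```python
import numpy as np

def delay_and_sum(signals, fs, spacing, c, angles):
    """Plane-wave delay-and-sum beamformer for a vertical line array.

    signals: array of shape (n_sensors, n_samples).
    Returns beam output power for each candidate steering angle."""
    n_sensors, n_samples = signals.shape
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    spectra = np.fft.rfft(signals, axis=1)
    power = []
    for theta in angles:
        # phase shifts that undo the inter-sensor propagation delays
        delays = np.arange(n_sensors) * spacing * np.sin(theta) / c
        steered = spectra * np.exp(2j * np.pi * freqs[None, :] * delays[:, None])
        beam = np.fft.irfft(steered.sum(axis=0), n=n_samples)
        power.append(np.sum(beam ** 2))
    return np.array(power)

# synthetic 200 Hz plane wave arriving at +10 degrees on an 8-element array
fs, c, spacing = 8000.0, 1500.0, 0.5
t = np.arange(1024) / fs
true_delays = np.arange(8) * spacing * np.sin(np.deg2rad(10.0)) / c
signals = np.array([np.sin(2 * np.pi * 200.0 * (t - d)) for d in true_delays])
angles = np.deg2rad(np.linspace(-30.0, 30.0, 61))
power = delay_and_sum(signals, fs, spacing, c, angles)
best = np.rad2deg(angles[np.argmax(power)])   # peaks at the arrival angle
```

The beam power peaks when the steering delays match the true arrival, which is how emission and reception angles of individual eigenrays are read off in the double-beamformed data.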

  9. Two competing ionization processes in electrospray mass spectrometry of indolyl benzo[b]carbazoles: formation of M⁺• versus [M + H]⁺.

    PubMed

    Zhang, Xiaoping; Jiang, Kezhi; Zou, Jingfeng; Li, Zuguang

    2015-02-15

    Ionization in electrospray ionization mass spectrometry (ESI-MS) mainly occurs as a result of acid-base reactions or coordination with metal cations. Formation of the radical cation M(+•) in the ESI process prompted us to investigate further. A series of indolyl benzo[b]carbazoles were investigated using a quadrupole ion trap mass spectrometer equipped with an ESI source or an atmospheric pressure chemical ionization (APCI) source in the positive-ion mode. Theoretical calculations were performed using the density functional theory (DFT) method at the B3LYP/6-31G(d) level. Both the radical ion M(+•) and the protonated molecule [M + H](+) were obtained by ESI-MS analysis of indolyl benzo[b]carbazoles, while only [M + H](+) was observed in the APCI-MS analysis. The relative intensities of M(+•) and [M + H](+) were significantly affected by several ESI operating parameters and by the nature of the substituents. Formation of M(+•) and [M + H](+) is rationalized as two competing ionization processes in the ESI-MS analysis of indolyl benzo[b]carbazoles. Copyright © 2014 John Wiley & Sons, Ltd.

  10. Improving the limits of detection of low background alpha emission measurements

    NASA Astrophysics Data System (ADS)

    McNally, Brendan D.; Coleman, Stuart; Harris, Jack T.; Warburton, William K.

    2018-01-01

    Alpha particle emission - even at extremely low levels - is a significant issue in the search for rare events (e.g., double beta decay, dark matter detection). Traditional measurement techniques require long counting times to measure low sample rates in the presence of much larger instrumental backgrounds. To address this, a commercially available instrument developed by XIA uses pulse shape analysis to discriminate alpha emissions produced by the sample from those produced by other surfaces of the instrument itself. Experience with this system has uncovered two residual sources of background: cosmogenics and radon emanation from internal components. An R&D program is underway to enhance the system and extend the pulse shape analysis technique further, so that these residual sources can be identified and rejected as well. In this paper, we review the theory of operation and pulse shape analysis techniques used in XIA's alpha counter, and briefly explore data suggesting the origin of the residual background terms. We will then present our approach to enhance the system's ability to identify and reject these terms. Finally, we will describe a prototype system that incorporates our concepts and demonstrates their feasibility.

  11. Mutual information optimization for mass spectra data alignment.

    PubMed

    Zoppis, Italo; Gianazza, Erica; Borsani, Massimiliano; Chinello, Clizia; Mainini, Veronica; Galbusera, Carmen; Ferrarese, Carlo; Galimberti, Gloria; Sorbi, Sandro; Borroni, Barbara; Magni, Fulvio; Antoniotti, Marco; Mauri, Giancarlo

    2012-01-01

    "Signal" alignments play critical roles in many clinical setting. This is the case of mass spectrometry data, an important component of many types of proteomic analysis. A central problem occurs when one needs to integrate (mass spectrometry) data produced by different sources, e.g., different equipment and/or laboratories. In these cases some form of "data integration'" or "data fusion'" may be necessary in order to discard some source specific aspects and improve the ability to perform a classification task such as inferring the "disease classes'" of patients. The need for new high performance data alignments methods is therefore particularly important in these contexts. In this paper we propose an approach based both on an information theory perspective, generally used in a feature construction problem, and on the application of a mathematical programming task (i.e. the weighted bipartite matching problem). We present the results of a competitive analysis of our method against other approaches. The analysis was conducted on data from plasma/ethylenediaminetetraacetic acid (EDTA) of "control" and Alzheimer patients collected from three different hospitals. The results point to a significant performance advantage of our method with respect to the competing ones tested.

  12. A practical and systematic review of Weibull statistics for reporting strengths of dental materials

    PubMed Central

    Quinn, George D.; Quinn, Janet B.

    2011-01-01

    Objectives To review the history, theory and current applications of Weibull analyses sufficient to make informed decisions regarding practical use of the analysis in dental material strength testing. Data References are made to examples in the engineering and dental literature, but this paper also includes illustrative analyses of Weibull plots, fractographic interpretations, and Weibull distribution parameters obtained for a dense alumina, two feldspathic porcelains, and a zirconia. Sources Informational sources include Weibull's original articles, later articles specific to applications and theoretical foundations of Weibull analysis, texts on statistics and fracture mechanics and the international standards literature. Study Selection The chosen Weibull analyses are used to illustrate technique, the importance of flaw size distributions, physical meaning of Weibull parameters and concepts of “equivalent volumes” to compare measured strengths obtained from different test configurations. Conclusions Weibull analysis has a strong theoretical basis and can be of particular value in dental applications, primarily because of test specimen size limitations and the use of different test configurations. Also endemic to dental materials, however, is increased difficulty in satisfying application requirements, such as confirming fracture origin type and diligence in obtaining quality strength data. PMID:19945745
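    A Weibull strength analysis of the kind reviewed above usually starts from the two-parameter fit of modulus m and characteristic strength sigma0. A minimal sketch using the common linearized Weibull plot (a generic textbook procedure, not Quinn and Quinn's specific analyses; the simulated strengths are illustrative):

```python
import numpy as np

def weibull_fit(strengths):
    """Two-parameter Weibull fit via the linearized form
    ln(-ln(1 - F)) = m * ln(sigma) - m * ln(sigma0).

    Returns (m, sigma0): Weibull modulus and characteristic strength."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = len(s)
    prob_fail = (np.arange(1, n + 1) - 0.5) / n   # a standard rank estimator for F_i
    x = np.log(s)
    y = np.log(-np.log(1.0 - prob_fail))
    m, intercept = np.polyfit(x, y, 1)
    sigma0 = np.exp(-intercept / m)
    return m, sigma0

# simulated strengths: modulus 10, characteristic strength 500 (e.g., MPa)
rng = np.random.default_rng(0)
samples = 500.0 * rng.weibull(10.0, size=2000)
m_hat, sigma0_hat = weibull_fit(samples)
```

A higher fitted modulus means a narrower flaw-size distribution and less strength scatter, which is why m, not just mean strength, matters when comparing dental ceramics.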

  13. Illicit use of prescription stimulants in a college student sample: a theory-guided analysis.

    PubMed

    Bavarian, Niloofar; Flay, Brian R; Ketcham, Patricia L; Smit, Ellen

    2013-10-01

    The illicit use of prescription stimulants (IUPS) has emerged as a high-risk behavior of the 21st century college student. As the study of IUPS is relatively new, we aimed to understand (1) characteristics of IUPS (i.e., initiation, administration routes, drug sources, motives, experiences), and (2) theory-guided intrapersonal, interpersonal, and environmental correlates associated with use. Using one-stage cluster sampling, 520 students (96.3% response rate) at one Pacific Northwest University completed a paper-based, in-classroom survey on IUPS behaviors and expected correlates. Aim 1 was addressed using descriptive statistics and aim 2 was addressed via three nested logistic regression analyses guided by the Theory of Triadic Influence. The prevalence of ever engaging in IUPS during college was 25.6%. The majority (>50.0%) of users reported initiation during college, oral use, friends as the drug source, academic motives, and experiencing desired outcomes. Intrapersonal correlates associated with use included identifying as White, lower grade point average, diagnoses of attention deficit disorder, and lower avoidance self-efficacy. Interpersonal correlates of use included off-campus residence, varsity sports participation, IUPS perceptions by socializing agents, and greater behavioral norms. Exposure to prescription drug print media, greater prescription stimulant knowledge, and positive attitudes towards prescription stimulants were environmental correlates associated with use. In all models, IUPS intentions were strongly associated with use. IUPS was prevalent on the campus under investigation and factors from the intrapersonal, interpersonal and environmental domains were associated with the behavior. Implications for prevention and future research are discussed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  14. Illicit Use of Prescription Stimulants in a College Student Sample: A Theory-Guided Analysis*

    PubMed Central

    Bavarian, Niloofar; Flay, Brian R.; Ketcham, Patricia L.; Smit, Ellen

    2013-01-01

    Background The illicit use of prescription stimulants (IUPS) has emerged as a high-risk behavior of the 21st century college student. As the study of IUPS is relatively new, we aimed to understand 1) characteristics of IUPS (i.e., initiation, administration routes, drug sources, motives, experiences), and 2) theory-guided intrapersonal, interpersonal, and environmental correlates associated with use. Methods Using one-stage cluster sampling, 520 students (96.3% response rate) at one Pacific Northwest University completed a paper-based, in-classroom survey on IUPS behaviors and expected correlates. Aim 1 was addressed using descriptive statistics and aim 2 was addressed via three nested logistic regression analyses guided by the Theory of Triadic Influence. Results The prevalence of ever engaging in IUPS during college was 25.6%. The majority (>50.0%) of users reported initiation during college, oral use, friends as the drug source, academic motives, and experiencing desired outcomes. Intrapersonal correlates associated with use included identifying as White, lower grade point average, diagnoses of attention deficit disorder, and lower avoidance self-efficacy. Interpersonal correlates of use included off-campus residence, varsity sports participation, IUPS perceptions by socializing agents, and greater behavioral norms. Exposure to prescription drug print media, greater prescription stimulant knowledge, and positive attitudes towards prescription stimulants were environmental correlates associated with use. In all models, IUPS intentions were strongly associated with use. Conclusions IUPS was prevalent on the campus under investigation and factors from the intrapersonal, interpersonal and environmental domains were associated with the behavior. Implications for prevention and future research are discussed. PMID:23683794

  15. Ostwald ripening theory

    NASA Technical Reports Server (NTRS)

    Baird, J. K.

    1986-01-01

    The Ostwald-ripening theory is deduced and discussed starting from fundamental principles such as the Ising model concept, the Mayer cluster expansion, Langer condensation point theory, Ginzburg-Landau free energy, the Stillinger cutoff-pair potential, LSW theory and MLSW theory. Mathematical intricacies are reduced to an understandable form. A comparison of selected works, from 1949 to 1984, on the solution of the diffusion equation with and without sink/source term(s) is presented. Kahlweit's 1980 work and Marqusee-Ross' 1984 work are given particular emphasis. Odijk and Lekkerkerker's 1985 work on rodlike macromolecules is introduced in order to stimulate interested investigators.
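    The LSW theory cited above predicts diffusion-limited coarsening in which the cube of the mean particle radius grows linearly in time. A one-line sketch of that growth law (symbols and numbers generic, for illustration only):

```python
def lsw_mean_radius(t, r0, rate_constant):
    """LSW coarsening law for Ostwald ripening: <r>^3 = r0^3 + K * t."""
    return (r0 ** 3 + rate_constant * t) ** (1.0 / 3.0)

# starting from r0 = 1 with K = 7, one time unit gives (1 + 7)^(1/3) = 2
r_final = lsw_mean_radius(1.0, 1.0, 7.0)
```

The t^(1/3) scaling of the mean radius is the signature usually tested against experiment when deciding whether ripening is diffusion-limited.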

  16. Chiral perturbation theory and nucleon-pion-state contaminations in lattice QCD

    NASA Astrophysics Data System (ADS)

    Bär, Oliver

    2017-05-01

    Multiparticle states with additional pions are expected to be a non-negligible source of excited-state contamination in lattice simulations at the physical point. It is shown that baryon chiral perturbation theory can be employed to calculate the contamination due to two-particle nucleon-pion-states in various nucleon observables. Leading order results are presented for the nucleon axial, tensor and scalar charge and three Mellin moments of parton distribution functions (quark momentum fraction, helicity and transversity moment). Taking into account phenomenological results for the charges and moments the impact of the nucleon-pion-states on lattice estimates for these observables can be estimated. The nucleon-pion-state contribution results in an overestimation of all charges and moments obtained with the plateau method. The overestimation is at the 5-10% level for source-sink separations of about 2 fm. The source-sink separations accessible in contemporary lattice simulations are found to be too small for chiral perturbation theory to be directly applicable.

  17. Interior noise prediction methodology: ATDAC theory and validation

    NASA Technical Reports Server (NTRS)

    Mathur, Gopal P.; Gardner, Bryce K.

    1992-01-01

    The Acoustical Theory for Design of Aircraft Cabins (ATDAC) is a computer program developed to predict interior noise levels inside aircraft and to evaluate the effects of different aircraft configurations on the aircraft acoustical environment. The primary motivation for development of this program is the special interior noise problems associated with advanced turboprop (ATP) aircraft where there is a tonal, low frequency noise problem. Prediction of interior noise levels requires knowledge of the energy sources, the transmission paths, and the relationship between the energy variable and the sound pressure level. The energy sources include engine noise, both airborne and structure-borne; turbulent boundary layer noise; and interior noise sources such as air conditioner noise and auxiliary power unit noise. Since propeller and engine noise prediction programs are widely available, they are not included in ATDAC. Airborne engine noise from any prediction or measurement may be input to this program. This report describes the theory and equations implemented in the ATDAC program.

  18. Interior noise prediction methodology: ATDAC theory and validation

    NASA Astrophysics Data System (ADS)

    Mathur, Gopal P.; Gardner, Bryce K.

    1992-04-01

    The Acoustical Theory for Design of Aircraft Cabins (ATDAC) is a computer program developed to predict interior noise levels inside aircraft and to evaluate the effects of different aircraft configurations on the aircraft acoustical environment. The primary motivation for development of this program is the special interior noise problems associated with advanced turboprop (ATP) aircraft where there is a tonal, low frequency noise problem. Prediction of interior noise levels requires knowledge of the energy sources, the transmission paths, and the relationship between the energy variable and the sound pressure level. The energy sources include engine noise, both airborne and structure-borne; turbulent boundary layer noise; and interior noise sources such as air conditioner noise and auxiliary power unit noise. Since propeller and engine noise prediction programs are widely available, they are not included in ATDAC. Airborne engine noise from any prediction or measurement may be input to this program. This report describes the theory and equations implemented in the ATDAC program.

  19. SL(2,R) duality-symmetric action for electromagnetic theory with electric and magnetic sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Choonkyu, E-mail: cklee@phya.snu.ac.kr; School of Physics, Korea Institute for Advanced Study, Seoul 130-722; Min, Hyunsoo, E-mail: hsmin@dirac.uos.ac.kr

    2013-12-15

    For the SL(2,R) duality-invariant generalization of Maxwell electrodynamics in the presence of both electric and magnetic sources, we formulate a local, manifestly duality-symmetric, Zwanziger-type action by introducing a pair of four-potentials A^μ and B^μ in a judicious way. On the two potentials A^μ and B^μ the SL(2,R) duality transformation acts in a simple linear manner. In quantum theory including charged source fields, this action can be recast as an SL(2,Z)-invariant action. Also given is a Zwanziger-type action for SL(2,R) duality-invariant Born–Infeld electrodynamics, which can be important for D-brane dynamics in string theory. Highlights: • We formulate a local, manifestly duality-symmetric, Zwanziger-type action. • Maxwell electrodynamics is generalized to include dilaton and axion fields. • SL(2,R) symmetry is manifest. • We formulate a local, manifestly duality-symmetric, nonlinear Born–Infeld action with SL(2,R) symmetry.

  20. Feeling Interpersonally Controlled While Pursuing Materialistic Goals: A Problematic Combination for Moral Behavior.

    PubMed

    Sheldon, Kennon M; Sommet, Nicolas; Corcoran, Mike; Elliot, Andrew J

    2018-04-01

    We created a life-goal assessment drawing from self-determination theory and achievement goal literature, examining its predictive power regarding immoral behavior and subjective well-being. Our source items assessed direction and energization of motivation, via the distinction between intrinsic and extrinsic aims and between intrinsic and extrinsic reasons for acting, respectively. Fused source items assessed four goal complexes representing a combination of direction and energization. Across three studies (Ns = 109, 121, and 398), the extrinsic aim/extrinsic reason complex was consistently associated with immoral and/or unethical behavior beyond four source and three other goal complex variables. This was consistent with the triangle model of responsibility's claim that immoral behaviors may result when individuals disengage the self from moral prescriptions. The extrinsic/extrinsic complex also predicted lower subjective well-being, albeit less consistently. Our goal complex approach sheds light on how self-determination theory's goal contents and organismic integration mini-theories interact, particularly with respect to unethical behavior.

  1. [1400 hours of analysis with Freud: Viktor von Dirsztay. A biographical sketch].

    PubMed

    May, Ulrike

    2010-01-01

    On the basis of mostly unpublished sources, the author reconstructs the life of the Hungarian writer Viktor von Dirsztay (1884?-1935) who was personally acquainted with many expressionist artists and writers, e. g. with Karl Kraus, Oskar Kokoschka, Herwarth Walden, Walter Hasenclever, Hermann Broch and Arthur Schnitzler. This association puts Freud into closer proximity with the cultural avantgarde of his times than previously realized. Between 1910 and 1920 Dirsztay underwent several phases of analysis with Freud; then he was treated by Theodor Reik. The overall length of his analysis with Freud is almost unparalleled. The article discusses whether and in which way Dirsztay's writings might have been influenced by his analyses and how Freud and Reik might have drawn upon their experiences with this patient. It is argued that likely references can be discovered in both authors' theories of masochism. There is an intriguing late remark of Dirsztay's that he was "ruined by analysis".

  2. Fresnel diffraction by spherical obstacles

    NASA Technical Reports Server (NTRS)

    Hovenac, Edward A.

    1989-01-01

    Lommel functions were used to solve the Fresnel-Kirchhoff diffraction integral for the case of a spherical obstacle. Comparisons were made between Fresnel diffraction theory and Mie scattering theory. Fresnel theory is then compared to experimental data. Experiment and theory typically deviated from one another by less than 10 percent. A unique experimental setup using mercury spheres suspended in a viscous fluid significantly reduced optical noise. The major source of error was due to the Gaussian-shaped laser beam.

  3. System-of-Systems Acquisition: Alignment and Collaboration

    DTIC Science & Technology

    2011-10-11

    motivational theory as well as empirical evidence, such as the Eureka case. Maslow's motivational theory (Maslow, 1943) supports the … externalities a new source of market failure? Research in Law and Economics, 17, 1–22. Maslow, A. H. (1943). A theory of human motivation. Psychological … of satisfied needs are motivated by peer recognition. Lawrence and Nohria (2002) identify a four drives theory of individual motivation:

  4. Magnetic Soliton, Homotopy and Higgs Theory,

    DTIC Science & Technology

    1986-04-24

    AD-A167 366: MAGNETIC SOLITON, HOMOTOPY AND HIGGS THEORY (U), Foreign Technology Div., Wright-Patterson AFB, OH. Y. Li et al. Unclassified, 24 Apr 86, FTD-ID... MAGNETIC SOLITON, HOMOTOPY AND HIGGS THEORY by Li Yuanjie and Lei Shizu. Approved for public release; distribution … By: Li Yuanjie and Lei Shizu. English pages: 9. Source: Huazhong Gongxueyuan Xuebao, Vol. 11, Nr. 6, 1983, pp. 65-70. Country of

  5. Finite gradient elasticity and plasticity: a constitutive thermodynamical framework

    NASA Astrophysics Data System (ADS)

    Bertram, Albrecht

    2016-05-01

    In Bertram (Continuum Mech Thermodyn, doi: 10.1007/s00161-014-0387-0, 2015), a mechanical framework for finite gradient elasticity and plasticity was given. In the present paper, this is extended to thermodynamics. The mechanical theory is only briefly repeated here. A format for a rather general constitutive theory including all thermodynamic fields is given in a Euclidean invariant setting. The plasticity theory is rate-independent and unconstrained. The Clausius-Duhem inequality is exploited to find necessary and sufficient conditions for thermodynamic consistency. The residual dissipation inequality restricts the flow and hardening rules in combination with the yield criterion.

  6. Toward the greening of nuclear energy: A content analysis of nuclear energy frames from 1991 to 2008

    NASA Astrophysics Data System (ADS)

    Miller, Sonya R.

    Framing theory has emerged as one of the predominant theories employed in mass communications research in the 21st century. Frames are interpretive packages for content in which some issue attributes are highlighted over others. While framing-effects studies are plentiful, longitudinal studies assessing trends in an issue's dominant framing packages and story elements are less well understood. Through content analysis, this study examines dominant frame packages, story elements, headline tone, story tone, stereotypes, and source attribution for nuclear energy from 1991 to 2008 in the New York Times, USA Today, the Wall Street Journal, and the Washington Post. Unlike many content analysis studies, this study compares intercoder reliability among three indices: percentage agreement, proportional reduction in loss, and Scott's Pi. The newspapers represented in this study share a commonality in the types of dominant frame packages employed. Significant dominant frame packages among the four newspapers include human/health, proliferation, procedural, and marketplace. While the procedural frame package was more likely to appear before the 1997 Kyoto Protocol, the proliferation frame package was more likely to appear after it. Over time, the sustainable frame package demonstrated increased significance. This study is part of the growing literature on the function of frames over time.
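    Two of the three reliability indices compared in the study are straightforward to compute; as a sketch (our own minimal implementation with invented codes, not the study's data), percentage agreement and Scott's Pi for two coders can be written as:

    ```python
    def percent_agreement(coder1, coder2):
        """Proportion of units on which the two coders assign the same category."""
        return sum(a == b for a, b in zip(coder1, coder2)) / len(coder1)

    def scotts_pi(coder1, coder2):
        """Scott's Pi: agreement corrected for chance, where chance agreement is
        computed from the pooled (joint) category proportions of both coders."""
        p_observed = percent_agreement(coder1, coder2)
        n = len(coder1)
        categories = set(coder1) | set(coder2)
        p_expected = sum(((coder1.count(c) + coder2.count(c)) / (2 * n)) ** 2
                         for c in categories)
        return (p_observed - p_expected) / (1 - p_expected)

    # Hypothetical frame codes assigned by two coders to four stories.
    codes_1 = ["human", "human", "market", "market"]
    codes_2 = ["human", "market", "market", "market"]
    ```

    Unlike raw percentage agreement, Pi discounts the agreement two coders would reach by guessing from the overall category distribution, which is why chance-corrected indices are preferred for frame coding.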

  7. Theory and investigation of acoustic multiple-input multiple-output systems based on spherical arrays in a room.

    PubMed

    Morgenstern, Hai; Rafaely, Boaz; Zotter, Franz

    2015-11-01

    Spatial attributes of room acoustics have been widely studied using microphone and loudspeaker arrays. However, systems that combine both arrays, referred to as multiple-input multiple-output (MIMO) systems, have only been studied to a limited degree in this context. These systems can potentially provide a powerful tool for room acoustics analysis due to the ability to simultaneously control both arrays. This paper offers a theoretical framework for the spatial analysis of enclosed sound fields using a MIMO system comprising spherical loudspeaker and microphone arrays. A system transfer function is formulated in matrix form for free-field conditions, and its properties are studied using tools from linear algebra. The system is shown to have unit rank, regardless of the array types, and its singular vectors are related to the directions of arrival and radiation at the microphone and loudspeaker arrays, respectively. The formulation is then generalized to apply to rooms, using an image source method. In this case, the rank of the system is related to the number of significant reflections. The paper ends with simulation studies, which support the developed theory, and with an extensive reflection analysis of a room impulse response using the platform of a MIMO system.
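    The unit-rank property has a compact linear-algebra illustration. The sketch below is our own construction, with idealized plane-wave steering vectors for uniform linear arrays standing in for the paper's spherical-array vectors: the free-field transfer matrix is an outer product, and its SVD recovers the transmit and receive directions.

    ```python
    import numpy as np

    def steering(n_elements, theta):
        """Idealized unit-norm array steering vector for direction theta (radians)."""
        phases = np.pi * np.arange(n_elements) * np.sin(theta)
        return np.exp(1j * phases) / np.sqrt(n_elements)

    a = steering(8, 0.3)    # receive steering vector: direction of arrival
    b = steering(6, -0.7)   # transmit steering vector: direction of radiation

    # Free-field MIMO transfer matrix: a rank-one outer product H = a b^H.
    H = np.outer(a, b.conj())

    U, s, Vh = np.linalg.svd(H)
    numerical_rank = int(np.sum(s > 1e-10 * s[0]))
    ```

    In a room, each image source contributes one such outer product, so the rank of the summed transfer matrix counts the significant reflections, which is the property the paper exploits.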

  8. Statistical methods and neural network approaches for classification of data from multiple sources

    NASA Technical Reports Server (NTRS)

    Benediktsson, Jon Atli; Swain, Philip H.

    1990-01-01

    Statistical methods for classification of data from multiple data sources are investigated and compared to neural network models. One general problem with conventional multivariate statistical approaches to classifying multitype data is that a single multivariate distribution cannot be assumed for the classes across the data sources. Another common problem is that the data sources are not equally reliable, so they need to be weighted according to their reliability, yet most statistical classification methods have no mechanism for such weighting. This research first focuses on statistical methods that can overcome these problems: a method of statistical multisource analysis and consensus theory. Reliability measures for weighting the data sources in these methods are suggested and investigated. Second, this research focuses on neural network models. Neural networks are distribution-free, since no prior knowledge of the statistical distribution of the data is needed; this is an obvious advantage over most statistical classification methods. Neural networks also automatically take care of how much weight each data source should have. On the other hand, their training process is iterative and can take a very long time. Methods to speed up the training procedure are introduced and investigated. Experimental results of classification using both neural network models and statistical methods are given, and the approaches are compared on the basis of these results.
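    The consensus-theory weighting described above can be sketched as a linear opinion pool (our minimal illustration; the reliability values and class posteriors are hypothetical, not those investigated in the paper):

    ```python
    def linear_opinion_pool(source_posteriors, reliabilities):
        """Combine per-source class posteriors into a consensus posterior by
        weighting each data source with its normalized reliability."""
        total = sum(reliabilities)
        weights = [r / total for r in reliabilities]
        n_classes = len(source_posteriors[0])
        return [sum(w * post[c] for w, post in zip(weights, source_posteriors))
                for c in range(n_classes)]

    # Two sources voting over three land-cover classes; source 1 is deemed
    # three times as reliable as source 2.
    posteriors = [[0.7, 0.2, 0.1], [0.2, 0.5, 0.3]]
    consensus = linear_opinion_pool(posteriors, [3.0, 1.0])
    ```

    A low-reliability source is thereby prevented from dominating the decision, which addresses the weighting problem that most single-distribution statistical classifiers lack a mechanism for.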

  9. Efficient Location of Research Reference Sources in the Field of Dance.

    ERIC Educational Resources Information Center

    Kissinger, Pat; Jay, Danielle

    More than 45 basic dance reference research sources that would be useful to students, scholars, teachers, historians, and therapists are discussed in this bibliographic essay. Aspects of dance covered include choreography, criticism, teaching principles, aesthetic theory, dance therapy, and history. Sources are grouped by type: dictionaries and…

  10. The Stress Sources of Nursing Students

    ERIC Educational Resources Information Center

    Oner Altiok, Hatice; Ustun, Besti

    2013-01-01

    Overall, nursing training is a stressful process. In particular, when second-year nursing students are considered within professional socialization theory, they are reported to be affected more negatively by these sources of stress. This research was carried out in order to determine the stress sources of second-year nursing students. 15 nursing…

  11. Issues and Realities in Early Childhood Education.

    ERIC Educational Resources Information Center

    Spodek, Bernard

    This paper investigates three issues vital to early childhood education: (1) sources of curriculum, (2) sources of financial support, and (3) the relationship between racism and compensatory education. "Natural" childhood and child development theories are discussed, and their use as a source of curriculum for young children is questioned, as is…

  12. Recognition and source memory as multivariate decision processes.

    PubMed

    Banks, W P

    2000-07-01

    Recognition memory, source memory, and exclusion performance are three important domains of study in memory, each with its own findings, its specific theoretical developments, and its separate research literature. It is proposed here that results from all three domains can be treated with a single analytic model. This article shows how to generate a comprehensive memory representation based on multidimensional signal detection theory and how to make predictions for each of these paradigms using decision axes drawn through the space. The detection model is simpler than the comparable multinomial model, it is more easily generalizable, and it does not make threshold assumptions. An experiment using the same memory set for all three tasks demonstrates the analysis and tests the model. The results show that some seemingly complex relations between the paradigms derive from an underlying simplicity of structure.
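    As a minimal sketch of the detection-theoretic machinery (our own example with invented hit and false-alarm rates, not Banks's data), the equal-variance sensitivity along any one decision axis drawn through the multidimensional space is the familiar d':

    ```python
    from statistics import NormalDist

    z = NormalDist().inv_cdf  # inverse of the standard normal CDF

    def d_prime(hit_rate, false_alarm_rate):
        """Equal-variance Gaussian signal-detection sensitivity along one decision axis."""
        return z(hit_rate) - z(false_alarm_rate)

    # Hypothetical rates: one axis separates old from new items (recognition),
    # an orthogonal axis separates source A from source B (source memory).
    recognition_sensitivity = d_prime(0.85, 0.20)
    source_sensitivity = d_prime(0.75, 0.35)
    ```

    In the full model a single multivariate strength distribution is queried with differently oriented decision axes, so recognition, source, and exclusion predictions all follow from one representation rather than three separate threshold models.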

  13. A new dump system design for stray light reduction of Thomson scattering diagnostic system on EAST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, Shumei; Zang, Qing, E-mail: zangq@ipp.ac.cn; Han, Xiaofeng

    Thomson scattering (TS) is an important diagnostic for measuring electron temperature and density during plasma discharges. However, the measurement of the Thomson scattering signal is easily disturbed by stray light. The stray light sources in the Experimental Advanced Superconducting Tokamak (EAST) TS diagnostic system were analyzed using a simulation model of the diagnostic system, and the simulation results show that the dump system is the primary stray light source. Based on optical theory and the simulation analysis, a novel dump system including an improved beam trap was proposed and installed. The measurement results indicate that the new dump system can reduce the stray light of the diagnostic system by more than 60%, and the influence of stray light on the error of the measured density decreases.

  14. The size of coronal hard X-ray sources in solar flares: How big are they?

    NASA Astrophysics Data System (ADS)

    Effenberger, F.; Krucker, S.; Rubio da Costa, F.

    2017-12-01

    Coronal hard X-ray sources are considered to be one of the key signatures of non-thermal particle acceleration and heating during the energy release in solar flares. In some cases, X-ray observations reveal multiple components spatially located near and above the loop top and even further up in the corona. Here, we combine a detailed RHESSI imaging analysis of near-limb solar flares with occulted footpoints and a multi-wavelength study of the flare loop evolution in SDO/AIA. We connect our findings to different current sheet formation and magnetic break-out scenarios and relate them to particle acceleration theory. We find that the upper and usually fainter emission regions can be underestimated in size because the majority of the flux originates from the lower loops.

  15. Quantum Theory of Three-Dimensional Superresolution Using Rotating-PSF Imagery

    NASA Astrophysics Data System (ADS)

    Prasad, S.; Yu, Z.

    The inverse of the quantum Fisher information (QFI) matrix (and extensions thereof) provides the ultimate lower bound on the variance of any unbiased estimate of a parameter from statistical data, whether of intrinsically quantum mechanical or classical character. We calculate the QFI for Poisson-shot-noise-limited imagery using the rotating PSF that can localize and resolve point sources fully in all three dimensions. We also propose an experimental approach based on the use of a computer-generated hologram and projective measurements to realize the QFI-limited variance for the problem of super-resolving a closely spaced pair of point sources at a highly reduced photon cost. The paper presents a preliminary analysis of the quantum-limited three-dimensional (3D) pair optical super-resolution (OSR) problem with potential applications to astronomical imaging and 3D space-debris localization.
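    For reference (these are standard results, not specific to this paper), the bound the abstract invokes is the quantum Cramér-Rao inequality; for a pure-state family it has a closed form:

    ```latex
    % Quantum Cramér-Rao bound for an unbiased estimator \hat{\theta} from M trials:
    \operatorname{Var}(\hat{\theta}) \;\ge\; \frac{1}{M\, F_Q(\theta)},
    \qquad
    % QFI of a pure-state family |\psi_\theta\rangle:
    F_Q(\theta) \;=\; 4\left( \langle \partial_\theta \psi | \partial_\theta \psi \rangle
      \;-\; \left| \langle \psi_\theta | \partial_\theta \psi \rangle \right|^2 \right).
    ```

    For several parameters, such as the three coordinates of each point source, the scalar bound becomes a matrix inequality: the covariance matrix of any unbiased estimate is bounded below, in the matrix sense, by the inverse of the QFI matrix, which is the inverse referenced in the abstract.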

  16. Stochastic theory of photon flow in homogeneous and heterogeneous anisotropic biological and artificial material

    NASA Astrophysics Data System (ADS)

    Miller, Steven D.

    1995-05-01

    Standard Monte Carlo methods used in photon diffusion score absorbed photons or statistical weight deposited within the voxels of a mesh. An alternative stochastic description is considered here for rapid surface-flux calculations and finite media. Matrix elements are assigned to a spatial lattice whose function is to score vector intersections of scattered photons making transitions into either the forward or backward solid-angle half space. These complete matrix elements can be related to the directional fluxes within the lattice space. The model differentiates between ballistic, quasi-ballistic, and highly diffuse photon contributions, and effectively models the subsurface generation of a scattered light flux from a ballistic source. The connection between a path integral and diffusion is illustrated. Flux perturbations can be effectively illustrated for tissue-tumor-tissue and for three-layer systems with strong absorption in one or more layers. Under conditions where diffusion theory has difficulties, such as strong absorption, highly collimated sources, small finite volumes, and subsurface regions, the algorithm is computationally rapid with good accuracy and complements other descriptions of photon diffusion. The model has the potential to perform computations relevant to photodynamic therapy (PDT) and to the analysis of laser beam interaction with tissues.
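    A toy version of such a scoring scheme (ours, heavily simplified to one dimension; the paper's lattice tallies directional transitions in full 3-D) illustrates the bookkeeping: sample free paths, rescatter or absorb, and tally photons leaving each face of a finite slab.

    ```python
    import math
    import random

    def slab_monte_carlo(n_photons, thickness, mu_s, mu_a, seed=1):
        """1-D photon random walk through a slab.
        mu_s, mu_a: scattering and absorption coefficients (1/length).
        Tallies photons exiting backward (reflected), forward (transmitted),
        or terminated inside the slab (absorbed)."""
        rng = random.Random(seed)
        mu_t = mu_s + mu_a                  # total interaction coefficient
        albedo = mu_s / mu_t                # probability a collision is a scatter
        tally = {"reflected": 0, "transmitted": 0, "absorbed": 0}
        for _ in range(n_photons):
            z, mu = 0.0, 1.0                # launch at the surface, ballistic
            while True:
                z += mu * (-math.log(1.0 - rng.random()) / mu_t)  # free path
                if z < 0.0:
                    tally["reflected"] += 1
                    break
                if z > thickness:
                    tally["transmitted"] += 1
                    break
                if rng.random() > albedo:
                    tally["absorbed"] += 1
                    break
                mu = rng.uniform(-1.0, 1.0)  # isotropic rescatter: new direction cosine
        return tally

    counts = slab_monte_carlo(20000, 1.0, 10.0, 1.0)
    ```

    Scoring forward/backward crossings rather than deposited weight is what lets a lattice method of this kind separate ballistic, quasi-ballistic, and diffuse contributions to the surface flux.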

  17. The effectiveness of theory- and model-based lifestyle interventions on HbA1c among patients with type 2 diabetes: a systematic review and meta-analysis.

    PubMed

    Doshmangir, P; Jahangiry, L; Farhangi, M A; Doshmangir, L; Faraji, L

    2018-02-01

    The prevalence of type 2 diabetes is rising rapidly around the world. A number of systematic reviews have provided evidence for the effectiveness of lifestyle interventions in diabetic patients. The effectiveness of theory- and model-based education-lifestyle interventions for diabetic patients is unclear. This systematic review and meta-analysis aimed to evaluate and quantify the impact of theory-based lifestyle interventions on type 2 diabetes. A literature search of electronic resources including PubMed, Scopus, and the Cochrane collaboration was performed to identify papers published between January 2002 and July 2016. The PICOs (participants, intervention, comparison, and outcomes) elements were used to select studies meeting the inclusion and exclusion criteria. Mean differences and standard deviations of hemoglobin A1c (HbA1c [mmol/mol]) levels at baseline and follow-up in the intervention and control groups were used for data synthesis. A random-effects model was used for estimating pooled effect sizes. To investigate the source of heterogeneity, predefined subgroup analyses were performed using trial duration, baseline HbA1c (mmol/mol) level, and the age of participants. Meta-regression was performed to examine the contribution of trial duration, baseline HbA1c (mmol/mol) level, and the age of participants to the mean differences in HbA1c (mmol/mol) level. The significance level was set at P < 0.05. Eighteen studies with 2384 participants met the inclusion criteria. The pooled main outcomes from the random-effects model showed significant improvements in HbA1c (mmol/mol): -5.35% (95% confidence interval = -6.3, -4.40; P < 0.001), with evidence of heterogeneity across studies. The findings of this meta-analysis suggest that theory- and model-based lifestyle interventions have positive effects on HbA1c (mmol/mol) indices in patients with type 2 diabetes. 
Health education theories have been applied as a useful tool for lifestyle change among people with type 2 diabetes. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
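    The random-effects pooling step used above can be sketched with the standard DerSimonian-Laird estimator (a generic illustration with made-up trial numbers, not the review's data):

    ```python
    def dersimonian_laird(effects, variances):
        """Random-effects meta-analysis: estimate between-study variance tau^2
        (DerSimonian-Laird), then pool effects with inverse-variance weights."""
        k = len(effects)
        w = [1.0 / v for v in variances]
        fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
        q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (k - 1)) / c)              # between-study variance
        w_star = [1.0 / (v + tau2) for v in variances]  # random-effects weights
        pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
        return pooled, tau2

    # Three hypothetical trials reporting mean HbA1c change and its variance.
    pooled, tau2 = dersimonian_laird([-6.0, -5.0, -4.3], [0.8, 1.2, 0.5])
    ```

    When the trials disagree by more than their within-study variances explain, tau^2 grows and the pooled confidence interval widens; that heterogeneity is what the subgroup analyses and meta-regression described above probe.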

  18. REPORT OF RESEARCH ACCOMPLISHMENTS AND FUTURE GOALS HIGH ENERGY PHYSICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wise, Mark B.; Kapustin, Anton N.; Schwarz, John Henry

    Caltech High Energy Physics (HEP) has a broad program in both experimental and theoretical physics. We are known for our creativity and leadership. The future is uncertain, and we strive to be involved in all the major areas of experimental and theoretical HEP so that, no matter where the important discoveries occur, we are well positioned to play an important role. An outstanding group of postdoctoral scholars, graduate students, staff scientists, and technical and administrative personnel supports our efforts in experimental and theoretical physics. The PIs on this grant are involved in the following program of experimental and theoretical activities: I) EXPERIMENTAL PHYSICS. Our CMS group, led by Harvey Newman and Maria Spiropulu, has played a key role in the discovery and interpretation of the Higgs boson and in searches for new physics. They have important hardware responsibilities in both ECAL and HCAL and are also involved in the upgrades needed for the High Luminosity LHC. Newman's group also develops and operates Grid-based computing, networking, and collaborative systems for CMS and the US HEP community. The charged lepton (Mu2e) and quark (BaBar) flavor physics group is led by David Hitlin and Frank Porter. On Mu2e they have been instrumental in the design of the calorimeter. Construction responsibilities include one third of the crystals and associated readout as well as the calibration system. They will also have responsibility for a major part of the online system software. Although data taking ceased in 2008, the Caltech BaBar group is active on several new forefront analyses. The neutrino group is led by Ryan Patterson. They are central to NOvA's core oscillation physics program, to calibration, and to detector readiness, being responsible for the production and installation of 12,000 APD arrays. They have key roles in neutrino appearance and disappearance analysis in MINOS and MINOS+. 
Sunil Golwala leads the dark matter direct detection effort. Areas of activity include: CDMS II data analysis, contributions to SuperCDMS Soudan operations and analysis, R&D towards SuperCDMS SNOLAB, development of a novel screener for radiocontamination (the BetaCage), and development of new WIMP detector concepts. Ren-Yuan Zhu leads the HEP crystal laboratory for the advanced detector R&D effort. The crystal lab is involved in development of novel scintillating crystals and has proposed several crystal based detector concepts for future HEP experiments at the energy and intensity frontiers. Its current research effort is concentrated on development of fast crystal scintillators with good radiation hardness and low cost. II) THEORETICAL PHYSICS The main theme of Sergei Gukov's current research is the relation between the geometry of quantum group invariants and their categorification, on the one hand, and the physics of supersymmetric gauge theory and string theory, on the other. Anton Kapustin's research spans a variety of topics in non-perturbative Quantum Field Theory (QFT). His main areas of interest are supersymmetric gauge theories, non-perturbative dualities in QFT, disorder operators, Topological Quantum Field Theory, and non-relativistic QFT. He is also interested in the foundations and possible generalizations of Quantum Mechanics. Hirosi Ooguri's current research has two main components. One is to find exact results in Calabi-Yau compactification of string theory. Another is to explore applications of the AdS/CFT correspondence. He also plans to continue his project with Caltech postdoctoral fellows on BPS spectra of supersymmetric gauge theories in diverse dimensions. John Preskill works on quantum information science. 
    This field may lead to important future technologies and also to new understanding of issues in fundamental physics. John Schwarz has been exploring a number of topics in superstring theory/M-theory, supersymmetric gauge theory, and their AdS/CFT relationships. Much of the motivation for these studies is the desire to gain a deeper understanding of superstring theory and M-theory. The research interests of Mark Wise span particle physics, cosmology, and nuclear physics. His recent work has centered on extensions of the standard model in which baryon number and lepton number are gauged, and extensions of the standard model that have novel sources of baryon number violation and new sources of charged lepton flavor violation.

  19. Analytical and experimental study of high phase order induction motors

    NASA Technical Reports Server (NTRS)

    Klingshirn, Eugene A.

    1989-01-01

    Induction motors having more than three phases were investigated to determine their suitability for electric vehicle applications. The objective was to have a motor with a current rating lower than that of a three-phase motor. The name chosen for these is high phase order (HPO) motors. Motors having six phases and nine phases were given the most attention. It was found that HPO motors are quite suitable for electric vehicles, and for many other applications as well. They have characteristics which are as good as or better than three-phase motors for practically all applications where polyphase induction motors are appropriate. Some of the analysis methods are presented, and several of the equivalent circuits which facilitate the determination of harmonic currents and losses, or currents with unbalanced sources, are included. The sometimes large stator currents due to harmonics in the source voltages are pointed out. Filters which can limit these currents were developed. An analysis and description of these filters is included. Experimental results which confirm and illustrate much of the theory are also included. These include locked rotor test results and full-load performance with an open phase. Also shown are oscillograms which display the reduction in harmonic currents when a filter is used with the experimental motor supplied by a non-sinusoidal source.

  20. The effect of beamwidth on the analysis of electron-beam-induced current line scans

    NASA Astrophysics Data System (ADS)

    Luke, Keung L.

    1995-04-01

    A real electron beam has finite width, a fact that has been almost universally ignored in electron-beam-induced current (EBIC) theories. Obvious examples are point-source-based EBIC analyses, which neglect both the finite volume of electron-hole carriers generated by an energetic electron beam of negligible width and the beamwidth when it is no longer negligible. Gaussian-source-based analyses are more realistic, but the beamwidth has not been included, partly because the generation volume is usually much larger than the beamwidth; this, however, is not always the case. In this article Donolato's Gaussian-source-based EBIC equation is generalized to include the beamwidth of a Gaussian beam. This generalized equation is then used to study three problems: (1) the effect of beamwidth on EBIC line scans and on effective diffusion lengths, with the results applied to the analysis of the EBIC data of Dixon, Williams, Das, and Webb; (2) unresolved questions raised by others concerning the applicability of the Watanabe-Actor-Gatos method to real EBIC data for evaluating surface recombination velocity; and (3) the effect of beamwidth on the methods proposed recently by the author to determine the surface recombination velocity and to decide which of the Everhart-Hoff and Kanaya-Okayama ranges is the correct one to use for analyzing EBIC line scans.
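    The key quantitative effect of folding a Gaussian beam into a Gaussian generation profile is that the widths add in quadrature, since the convolution of two Gaussians is a Gaussian with the summed variance. The numerical check below is our own illustration of that convolution step, not Luke's generalized equation; the widths are hypothetical:

    ```python
    import numpy as np

    sigma_gen, sigma_beam = 2.0, 1.5        # generation-volume and beam widths

    x = np.linspace(-20.0, 20.0, 4001)      # lateral position grid
    dx = x[1] - x[0]

    generation = np.exp(-x**2 / (2 * sigma_gen**2))
    beam = np.exp(-x**2 / (2 * sigma_beam**2))
    beam /= beam.sum() * dx                  # unit-area beam kernel

    # Effective carrier-generation profile seen by the EBIC measurement:
    profile = np.convolve(generation, beam, mode="same") * dx

    # Second moment of the broadened profile recovers sigma_gen^2 + sigma_beam^2.
    variance = np.sum(x**2 * profile) / np.sum(profile)
    ```

    The practical consequence matches the article's point: when the beamwidth is no longer negligible next to the generation volume, line scans broaden, and quantities fitted from them shift unless the beamwidth is included.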

  1. Does Making Something Move Matter? Representations of Goals and Sources in Motion Events with Causal Sources

    ERIC Educational Resources Information Center

    Lakusta, Laura; Muentener, Paul; Petrillo, Lauren; Mullanaphy, Noelle; Muniz, Lauren

    2017-01-01

    Previous studies have shown a robust bias to express the goal path over the source path when describing events ("the bird flew into the pitcher," rather than "… out of the bucket into the pitcher"). Motivated by linguistic theory, this study manipulated the causal structure of events (specifically, making the source cause the…

  2. Issues in Optical Diffraction Theory

    PubMed Central

    Mielenz, Klaus D.

    2009-01-01

    This paper focuses on unresolved or poorly documented issues pertaining to Fresnel’s scalar diffraction theory and its modifications. In Sec. 2 it is pointed out that all thermal sources used in practice are finite in size and errors can result from insufficient coherence of the optical field. A quarter-wave criterion is applied to show how such errors can be avoided by placing the source at a large distance from the aperture plane, and it is found that in many cases it may be necessary to use collimated light as on the source side of a Fraunhofer experiment. If these precautions are not taken the theory of partial coherence may have to be used for the computations. In Sec. 3 it is recalled that for near-zone computations the Kirchhoff or Rayleigh-Sommerfeld integrals are applicable, but fail to correctly describe the energy flux across the aperture plane because they are not continuously differentiable with respect to the assumed geometrical field on the source side. This is remedied by formulating an improved theory in which the field on either side of a semi-reflecting screen is expressed as the superposition of mutually incoherent components which propagate in the opposite directions of the incident and reflected light. These components are defined as linear combinations of the Rayleigh-Sommerfeld integrals, so that they are rigorous solutions of the wave equation as well as continuously differentiable in the aperture plane. Algorithms for using the new theory for computing the diffraction patterns of circular apertures and slits at arbitrary distances z from either side of the aperture (down to z = ± 0.0003 λ) are presented, and numerical examples of the results are given. These results show that the incident geometrical field is modulated by diffraction before it reaches the aperture plane while the reflected field is spilled into the dark space. 
    At distances from the aperture that are large compared to the wavelength λ, these field expressions reduce to the usual ones specified by Fresnel's theory. In the specific case of a diffracting half plane, the numerical results obtained were practically the same as those given by Sommerfeld's rigorous theory. The modified theory developed in this paper is based on the explicit assumption that the scalar theory of light cannot explain polarization effects. This premise is justified in Sec. 4, where it is shown that previous attempts to do so have produced dubious results. PMID:27504215
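    For reference, the first Rayleigh-Sommerfeld solution discussed above, for a field $U_0$ specified in the aperture plane $z=0$, can be written in its standard form (notation ours):

    ```latex
    U_{\mathrm{I}}(x,y,z) \;=\; \frac{z}{2\pi} \iint_{A}
      U_0(\xi,\eta)\, \frac{e^{ikr}}{r^{2}} \left( \frac{1}{r} - ik \right)
      d\xi\, d\eta,
    \qquad
    r = \sqrt{(x-\xi)^2 + (y-\eta)^2 + z^2}.
    ```

    For $kr \gg 1$ the $1/r$ term is negligible and the familiar Fresnel forms are recovered; the paper's modified theory takes linear combinations of this and the second Rayleigh-Sommerfeld solution so that the total field is continuously differentiable across the aperture plane.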

  3. Teaching Methods Utilizing a Field Theory Viewpoint in the Elementary Reading Program.

    ERIC Educational Resources Information Center

    LeChuga, Shirley; Lowry, Heath

    1980-01-01

    Suggests and lists sources of information on reading instruction that discuss the promotion and enrichment of the interactive learning process between children and their environment based on principles underlying the cognitive-field theory of learning. (MKM)

  4. An Adynamical, Graphical Approach to Quantum Gravity and Unification

    NASA Astrophysics Data System (ADS)

    Stuckey, W. M.; Silberstein, Michael; McDevitt, Timothy

    We use graphical field gradients in an adynamical, background-independent fashion to propose a new approach to quantum gravity (QG) and unification. Our proposed reconciliation of general relativity (GR) and quantum field theory (QFT) is based on a modification of their graphical instantiations, i.e. Regge calculus and lattice gauge theory (LGT), respectively, which we assume are fundamental to their continuum counterparts. Accordingly, the fundamental structure is a graphical amalgam of space, time, and sources (in the parlance of QFT) called a "space-time source element". These are fundamental elements of space, time, and sources, not source elements in space and time. The transition amplitude for a space-time source element is computed using a path integral with discrete graphical action. The action for a space-time source element is constructed from a difference matrix K and source vector J on the graph, as in lattice gauge theory. K is constructed from graphical field gradients so that it contains a non-trivial null space, and J is then restricted to the row space of K, so that it is divergence-free and represents a conserved exchange of energy-momentum. This construct of K and J represents an adynamical global constraint (AGC) between sources, the space-time metric, and the energy-momentum content of the element, rather than a dynamical law for time-evolved entities. In this view, one manifestation of quantum gravity becomes evident when, for example, a single space-time source element spans adjoining simplices of the Regge calculus graph. Thus, energy conservation for the space-time source element includes contributions to the deficit angles between simplices. This idea is used to correct proper distance in the Einstein-de Sitter (EdS) cosmology model, yielding a fit of the Union2 Compilation supernova data that matches ΛCDM without having to invoke accelerating expansion or dark energy. 
A similar modification to LGT results in an adynamical account of quantum interference.

  5. Characterizing performance improvement in primary care systems in Mesoamerica: A realist evaluation protocol.

    PubMed

    Munar, Wolfgang; Wahid, Syed S; Curry, Leslie

    2018-01-03

    Background. Improving the performance of primary care systems in low- and middle-income countries (LMICs) may be a necessary condition for achieving universal health coverage in the age of the Sustainable Development Goals. The Salud Mesoamerica Initiative (SMI) is a large-scale, multi-country program that uses supply-side financial incentives directed at the central level of governments, together with continuous external evaluation of public health sector performance, to induce improvements in primary care performance in eight LMICs. This study protocol seeks to explain whether and how these interventions generate program effects in El Salvador and Honduras. Methods. This study presents the protocol for a study that uses a realist evaluation approach to develop a preliminary program theory that hypothesizes the interactions between context, interventions, and the mechanisms that trigger outcomes. The program theory was completed through a scoping review of relevant empirical, peer-reviewed and grey literature; a sense-making workshop with program stakeholders; and content analysis of key SMI documents. The study will use a multiple case-study design with embedded units and contrasting cases. We define as a case the two primary care systems of Honduras and El Salvador, each with different context characteristics. Data will be collected through in-depth interviews with program actors and stakeholders, documentary review, and non-participatory observation. Data analysis will use inductive and deductive approaches to identify causal patterns organized as 'context, mechanism, outcome' configurations. The findings will be triangulated with existing secondary qualitative and quantitative data sources, and contrasted against relevant theoretical literature. The study will end with a refined program theory. Findings will be published following the guidelines generated by the Realist and Meta-narrative Evidence Syntheses study (RAMESES II). 
This study will be performed contemporaneously with SMI's mid-term stage of implementation. Of the methods described, the preliminary program theory has been completed. Data collection, analysis and synthesis remain to be completed.

  6. Characterizing performance improvement in primary care systems in Mesoamerica: A realist evaluation protocol

    PubMed Central

    Munar, Wolfgang; Wahid, Syed S.; Curry, Leslie

    2018-01-01

    Background. Improving the performance of primary care systems in low- and middle-income countries (LMICs) may be a necessary condition for achieving universal health coverage in the age of the Sustainable Development Goals. The Salud Mesoamerica Initiative (SMI) is a large-scale, multi-country program that uses supply-side financial incentives directed at the central level of governments, together with continuous external evaluation of public health sector performance, to induce improvements in primary care performance in eight LMICs. This study protocol seeks to explain whether and how these interventions generate program effects in El Salvador and Honduras. Methods. This study presents the protocol for a study that uses a realist evaluation approach to develop a preliminary program theory that hypothesizes the interactions between context, interventions, and the mechanisms that trigger outcomes. The program theory was completed through a scoping review of relevant empirical, peer-reviewed and grey literature; a sense-making workshop with program stakeholders; and content analysis of key SMI documents. The study will use a multiple case-study design with embedded units and contrasting cases. We define as a case the two primary care systems of Honduras and El Salvador, each with different context characteristics. Data will be collected through in-depth interviews with program actors and stakeholders, documentary review, and non-participatory observation. Data analysis will use inductive and deductive approaches to identify causal patterns organized as ‘context, mechanism, outcome’ configurations. The findings will be triangulated with existing secondary qualitative and quantitative data sources, and contrasted against relevant theoretical literature. The study will end with a refined program theory. Findings will be published following the guidelines generated by the Realist and Meta-narrative Evidence Syntheses study (RAMESES II). 
This study will be performed contemporaneously with SMI’s mid-term stage of implementation. Of the methods described, the preliminary program theory has been completed. Data collection, analysis and synthesis remain to be completed. PMID:29431181

  7. Role of diversity in ICA and IVA: theory and applications

    NASA Astrophysics Data System (ADS)

    Adalı, Tülay

    2016-05-01

    Independent component analysis (ICA) has been the most popular approach for solving the blind source separation problem. Starting from a simple linear mixing model and the assumption of statistical independence, ICA can recover a set of linearly-mixed sources to within a scaling and permutation ambiguity. It has been successfully applied to numerous data analysis problems in areas as diverse as biomedicine, communications, finance, geophysics, and remote sensing. ICA can be achieved using different types of diversity—statistical properties—and can be posed to simultaneously account for multiple types of diversity such as higher-order statistics, sample dependence, non-circularity, and nonstationarity. A recent generalization of ICA, independent vector analysis (IVA), generalizes ICA to multiple data sets and adds the use of one more type of diversity, statistical dependence across the data sets, for jointly achieving independent decomposition of multiple data sets. With the addition of each new diversity type, identification of a broader class of signals becomes possible, and in the case of IVA, this includes sources that are independent and identically distributed Gaussians. We review the fundamentals and properties of ICA and IVA when multiple types of diversity are taken into account, and then ask whether diversity plays an important role in practical applications as well. Examples from various domains are presented to demonstrate that in many scenarios it might be worthwhile to jointly account for multiple statistical properties. This paper is submitted in conjunction with the talk delivered for the "Unsupervised Learning and ICA Pioneer Award" at the 2016 SPIE Conference on Sensing and Analysis Technologies for Biomedical and Cognitive Applications.
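
    The core idea of the abstract—recovering linearly mixed sources up to scaling and permutation by exploiting non-Gaussianity, a higher-order-statistics type of diversity—can be illustrated with a minimal sketch. The fixed-point iteration below is a standard symmetric FastICA update with a tanh nonlinearity, not the specific algorithms reviewed in the paper; the two sources and the mixing matrix are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 8, 2000)
    # Two statistically independent, non-Gaussian sources
    s1 = np.sin(2 * t)                      # sinusoid
    s2 = np.sign(np.sin(3 * t))             # square wave
    S = np.c_[s1, s2].T                     # shape (2, n)

    A = np.array([[1.0, 0.5], [0.5, 1.0]])  # mixing matrix (hypothetical)
    X = A @ S                               # observed mixtures

    # Whiten the mixtures so their covariance is the identity
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    Z = E @ np.diag(d ** -0.5) @ E.T @ X

    # Symmetric FastICA fixed-point iteration with tanh nonlinearity
    W = rng.standard_normal((2, 2))
    for _ in range(200):
        G = np.tanh(W @ Z)
        W_new = G @ Z.T / Z.shape[1] - np.diag((1 - G**2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W_new)     # symmetric decorrelation
        W = U @ Vt
    S_hat = W @ Z  # recovered sources, up to permutation and sign
    ```

    Each recovered row should correlate strongly (in absolute value) with one of the true sources, reflecting the scaling and permutation ambiguity the abstract mentions.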

  8. Bayesian Inference for Source Reconstruction: A Real-World Application

    DTIC Science & Technology

    2014-09-25

    deliberately or accidentally. Two examples of operational monitoring sensor networks are the deployment of biological sensor arrays by the Department of...remarkable paper, Cox [16] demonstrated that probability theory, when interpreted as logic, is the only calculus that conforms to a consistent theory...of inference. This demonstration provides the firm logical basis for asserting that probability calculus is the unique quantitative theory of

  9. Accelerator and Fusion Research Division. Annual report, October 1978-September 1979

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-03-01

    Topics covered include: Super HILAC and Bevalac operations; high intensity uranium beams line item; advanced high charge state ion source; 184-inch synchrocyclotron; VENUS project; positron-electron project; high field superconducting accelerator magnets; beam cooling; accelerator theory; induction linac drivers; RF linacs and storage rings; theory; neutral beam systems development; experimental atomic physics; neutral beam plasma research; plasma theory; and the Tormac project. (GHT)

  10. Antecedents of open source software adoption in health care organizations: a qualitative survey of experts in Canada.

    PubMed

    Marsan, Josianne; Paré, Guy

    2013-08-01

    Open source software (OSS) adoption and use in health care organizations (HCOs) is relatively low in developed countries, but several contextual factors have recently encouraged the consideration of the possible role of OSS in information technology (IT) application portfolios. This article aims at developing a research model for investigating the antecedents of OSS adoption decisions in HCOs. Based on a conceptual framework derived from a synthesis of the literature on IT adoption in organizations, we conducted 18 semi-structured interviews with IT experts from all levels of the Province of Quebec's health and social services sector in Canada. We also interviewed 10 IT suppliers in the province. A qualitative data analysis of the interviews was performed to identify major antecedents of OSS adoption decisions in HCOs. Eight factors associated with three distinct theoretical perspectives influence OSS adoption. More specifically, they are associated with the classical diffusion of innovations theory, the theory of resources, as well as institutional theory and its spin-off, the organizing vision theory. The factors fall under three categories: the characteristics of OSS as an innovation, the characteristics of the HCO with respect to its ability to absorb OSS, and the characteristics of the external environment with respect to institutional pressures and public discourse surrounding OSS. We shed light on two novel factors that closely interact with each other: (1) interest of the health care community in the public discourse surrounding OSS, and (2) clarity, consistency and richness of this discourse, whether found in magazines or other media. OSS still raises many questions and presents several challenges for HCOs. It is crucial that the different factors that explain an HCO's decision on OSS adoption be considered simultaneously. Doing so allows a better understanding of HCOs' rationale when deciding to adopt, or not to adopt, OSS. 
Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  11. The role of theory-driven graphic warning labels in motivation to quit: a qualitative study on perceptions from low-income, urban smokers.

    PubMed

    Mead, Erin L; Cohen, Joanna E; Kennedy, Caitlin E; Gallo, Joseph; Latkin, Carl A

    2015-02-07

    Use of communication theories in the development of pictorial health warning labels (graphic warning labels) for cigarette packaging might enhance labels' impact on motivation to quit, but research has been limited, particularly among low socioeconomic status (SES) populations in the U.S. This qualitative study explored perceptions of theory-based graphic warning labels and their role in motivation to quit among low-income smokers. A cross-sectional qualitative study was conducted with 25 low-income adult smokers in Baltimore, Maryland, who were purposively sampled from a community-based source population. Semi-structured, in-depth interviews were conducted from January to February 2014. Participants were asked about the motivational impact of 12 labels falling into four content categories: negative depictions of the health effects of smoking to smokers and others, and positive depictions of the benefits of quitting to smokers and others. Data were coded using a combined inductive/deductive approach and analyzed thematically through framework analysis. Labels depicting negative health effects to smokers were identified as most motivational, followed by labels depicting negative health effects to others. Reasons included perceived severity of and susceptibility to the effects, negative emotional reactions (such as fear), and concern for children. Labels about the benefits of quitting were described as motivational because of their hopefulness, characters as role models, and desire to improve family health. Reasons why labels were described as not motivational included lack of impact on perceived severity/susceptibility, low credibility, and fatalistic attitudes regarding the inevitability of disease. Labels designed to increase risk perceptions from smoking might be significant sources of motivation for low SES smokers. 
Findings suggest innovative theory-driven approaches for the design of labels, such as using former smokers as role models, contrasting healthy and unhealthy characters, and socially-oriented labels, might motivate low SES smokers to quit.

  12. The research evidence published in high impact nursing journals between 2000 and 2006: a quantitative content analysis.

    PubMed

    Mantzoukas, Stefanos

    2009-04-01

    Evidence-based practice has become an imperative for efficient, effective and safe practice. Furthermore, evidence emerging from published research is considered a valid knowledge source for guiding practice. The aim of this paper is to review all research articles published in the top 10 general nursing journals for the years 2000-2006 to identify the methodologies used, the types of evidence these studies produced and the issues they addressed. Quantitative content analysis was implemented to study all published research papers of the top 10 general nursing journals for the years 2000-2006. The abstracts of all research articles were analysed with regard to the methodologies of enquiry, the types of evidence produced and the issues studied. Percentages were calculated to enable conclusions to be drawn. The results for the category of methodologies used were 7% experimental, 6% quasi-experimental, 39% non-experimental, 2% ethnographic studies, 7% phenomenological, 4% grounded theory, 1% action research, 1% case study, 15% unspecified, 5.5% other, 0.5% meta-synthesis, 2% meta-analysis, 5% literature reviews and 3% secondary analysis. For the category of types of evidence, the results were 4% hypothesis/theory testing, 11% evaluative, 5% comparative, 2% correlational, 46% descriptive, 5% interpretative and 27% exploratory. For the category of issues of study, the results were 45% practice/clinical, 8% educational, 11% professional, 3% spiritual/ethical/metaphysical, 26% health promotion and 7% managerial/policy. Published studies can provide adequate evidence for practice if nursing journals conceptualise evidence emerging from non-experimental and qualitative studies as relevant types of evidence for practice and develop appropriate mechanisms for assessing their validity.
Also, nursing journals need to increase and encourage the publication of studies that implement RCT methodology, systematic reviews, meta-synthesis and meta-analysis methodologies. Finally, nursing journals need to encourage more high-quality research evidence that derives from interpretative, theory-testing and evaluative types of studies that are practice relevant.
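
    Percentage breakdowns like those reported above are straightforward to reproduce from coded records. A minimal sketch, using hypothetical category counts that loosely echo a few of the paper's figures (the actual coding scheme and data are the author's):

    ```python
    from collections import Counter

    # Hypothetical coded records: each abstract tagged with one methodology label
    codes = (["non-experimental"] * 39 + ["experimental"] * 7 +
             ["phenomenological"] * 7 + ["quasi-experimental"] * 6 +
             ["grounded theory"] * 4 + ["unspecified"] * 15 +
             ["other"] * 22)  # remaining categories pooled for brevity

    counts = Counter(codes)
    total = sum(counts.values())
    percentages = {k: round(100 * v / total, 1) for k, v in counts.items()}
    ```

    With a full coding frame, the same two lines at the end yield each category's share directly.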

  13. Multimodal approach to seismic pavement testing

    USGS Publications Warehouse

    Ryden, N.; Park, C.B.; Ulriksen, P.; Miller, R.D.

    2004-01-01

    A multimodal approach to nondestructive seismic pavement testing is described. The presented approach is based on multichannel analysis of all types of seismic waves propagating along the surface of the pavement. The multichannel data acquisition method is replaced by multichannel simulation with one receiver. This method uses only one accelerometer receiver and a light hammer source to generate a synthetic receiver array. This data acquisition technique is made possible through careful triggering of the source, and it simplifies the technique enough to make it generally accessible. Multiple dispersion curves are automatically and objectively extracted using the multichannel analysis of surface waves processing scheme, which is described. The resulting dispersion curves in the high-frequency range match theoretical Lamb waves in a free plate. At lower frequencies there are several branches of dispersion curves corresponding to the lower layers of different stiffness in the pavement system. The observed behavior of the multimodal dispersion curves is in agreement with theory, which has been validated through both numerical modeling and the transfer matrix method, by solving for complex wave numbers. © ASCE / JUNE 2004.
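
    The dispersion-curve extraction described above rests on measuring how phase velocity varies with frequency along the receiver array. A minimal two-offset sketch (much simpler than the paper's multichannel surface-wave processing scheme) estimates phase velocity per frequency from the cross-spectrum phase between two synthetic receiver positions; the sampling rate, offsets, and velocities here are all invented, and offsets are chosen small enough that the phase difference does not wrap.

    ```python
    import numpy as np

    fs = 8192.0                        # sampling rate, Hz (hypothetical)
    n = 8192
    t = np.arange(n) / fs
    x1, x2 = 0.5, 0.75                 # receiver offsets from the source, m

    def synthetic_wave(offset, freqs, c_of_f):
        """Sum of harmonics, each traveling at its own phase velocity."""
        sig = np.zeros_like(t)
        for f, c in zip(freqs, c_of_f):
            sig += np.cos(2 * np.pi * f * (t - offset / c))
        return sig

    freqs = np.array([200.0, 400.0, 800.0])
    c_true = np.array([300.0, 400.0, 500.0])   # dispersive: velocity grows with f
    s1 = synthetic_wave(x1, freqs, c_true)
    s2 = synthetic_wave(x2, freqs, c_true)

    # Cross-spectrum phase between the two offsets gives the travel-time
    # difference per frequency, hence the phase velocity
    S1, S2 = np.fft.rfft(s1), np.fft.rfft(s2)
    f_axis = np.fft.rfftfreq(n, 1 / fs)
    bins = [np.argmin(abs(f_axis - f)) for f in freqs]
    dphi = np.angle(S1[bins] * np.conj(S2[bins]))  # phase of s1 relative to s2
    c_est = 2 * np.pi * freqs * (x2 - x1) / dphi
    ```

    Repeating this over many frequencies and offset pairs traces out the dispersion curves that are then compared against Lamb-wave theory.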

  14. Improving Patient Safety in Hospitals: Contributions of High-Reliability Theory and Normal Accident Theory

    PubMed Central

    Tamuz, Michal; Harrison, Michael I

    2006-01-01

    Objective To identify the distinctive contributions of high-reliability theory (HRT) and normal accident theory (NAT) as frameworks for examining five patient safety practices. Data Sources/Study Setting We reviewed and drew examples from studies of organization theory and health services research. Study Design After highlighting key differences between HRT and NAT, we applied the frames to five popular safety practices: double-checking medications, crew resource management (CRM), computerized physician order entry (CPOE), incident reporting, and root cause analysis (RCA). Principal Findings HRT highlights how double checking, which is designed to prevent errors, can undermine mindfulness of risk. NAT emphasizes that social redundancy can diffuse and reduce responsibility for locating mistakes. CRM promotes high reliability organizations by fostering deference to expertise, rather than rank. However, HRT also suggests that effective CRM depends on fundamental changes in organizational culture. NAT directs attention to an underinvestigated feature of CPOE: it tightens the coupling of the medication ordering process, and tight coupling increases the chances of a rapid and hard-to-contain spread of infrequent, but harmful errors. Conclusions Each frame can make a valuable contribution to improving patient safety. By applying the HRT and NAT frames, health care researchers and administrators can identify health care settings in which new and existing patient safety interventions are likely to be effective. Furthermore, they can learn how to improve patient safety, not only from analyzing mishaps, but also by studying the organizational consequences of implementing safety measures. PMID:16898984

  15. Theoretical Coalescence: A Method to Develop Qualitative Theory: The Example of Enduring.

    PubMed

    Morse, Janice M

    Qualitative research is frequently context bound, lacks generalizability, and is limited in scope. The purpose of this article was to describe a method, theoretical coalescence, that provides a strategy for analyzing complex, high-level concepts and for developing generalizable theory. Theoretical coalescence is a method of theoretical expansion and inductive theory development that uses data (rather than themes, categories, and published extracts of data) as the primary source for analysis. Here, using the development of the lay concept of enduring as an example, I explore the scientific development of the concept in multiple settings over many projects and link it within the Praxis Theory of Suffering. As comprehension emerges when conducting theoretical coalescence, it is essential that raw data from various different situations be available for reinterpretation/reanalysis and comparison to identify the essential features of the concept. The concept is then reconstructed, with additional inquiry that builds description, and evidence is collected and conceptualized to create a more expansive concept and theory. By utilizing apparently diverse data sets from different contexts that are linked by certain characteristics, the essential features of the concept emerge. Such inquiry is divergent and less bound by context yet purposeful, logical, and with significant pragmatic implications for practice in nursing and beyond our discipline. Theoretical coalescence is a means by which qualitative inquiry is broadened to make an impact, to accommodate new theoretical shifts and concepts, and to make qualitative research applied and accessible in new ways.

  16. Consumer lay theories on healthy nutrition: A Q methodology application in Germany.

    PubMed

    Yarar, Nadine; Orth, Ulrich R

    2018-01-01

    Food is an important driver of individual health, and an important subject in public policy and health intervention research. Viewpoints on what constitutes healthy nutrition, however, are manifold and highly subjective in nature, suggesting there is no one-size-fits-all behavioral change intervention. This research explores fundamental lay theories regarding healthy nutrition with consumers in Germany. The study aimed at identifying and characterizing distinct groups of consumers based on similarities and differences in the lay theories individuals hold by means of Q methodology. Thirty German consumers ranked a Q set of 63 statements representing a vast spectrum of individual opinions and beliefs on healthy nutrition into a quasi-normal distribution. Factor analysis identified four major lay theories on healthy nutrition: (1) "Healthy is what tastes good, in moderation", (2) "Healthy nutrition is expensive and inconvenient", (3) "Healthy is everything that makes me slim and pretty", and (4) "Only home-made, organic, and vegetarian food is healthy". Consensus existed among the theories about the question of whom to trust regarding nutritional information and the low relevance of information from official sources. Disagreement existed concerning the overall importance of healthy nutrition in day-to-day lives and whether food healthiness is related to organic or conventional production methods. The findings underscore that specific consumer groups should be engaged separately when intervening in healthy nutrition issues. Implications for public policies and intervention strategies are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
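
    Q methodology's distinctive move—factor-analyzing persons rather than variables to find shared viewpoints—can be sketched numerically. The Q-sorts below are synthetic (two invented "viewpoints" plus noise), not the study's 63-statement German data, and the factor extraction is a plain eigendecomposition of the person-by-person correlation matrix rather than the rotation procedures typically used in Q studies.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_statements, n_people = 20, 8

    # Two hypothetical viewpoints plus noise; people 0-3 share viewpoint A,
    # people 4-7 share viewpoint B
    view_a = rng.standard_normal(n_statements)
    view_b = rng.standard_normal(n_statements)
    sorts = np.empty((n_statements, n_people))
    for p in range(n_people):
        base = view_a if p < 4 else view_b
        sorts[:, p] = base + 0.3 * rng.standard_normal(n_statements)

    # Q methodology correlates *persons*, not variables
    R = np.corrcoef(sorts.T)                 # person-by-person correlations
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]
    loadings = eigvecs[:, order[:2]] * np.sqrt(eigvals[order[:2]])
    ```

    Persons loading on the same factor share a lay theory; interpreting each factor's idealized sort is what yields labels like "Healthy is what tastes good, in moderation".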

  17. Theoretical Coalescence: A Method to Develop Qualitative Theory

    PubMed Central

    Morse, Janice M.

    2018-01-01

    Background Qualitative research is frequently context bound, lacks generalizability, and is limited in scope. Objectives The purpose of this article was to describe a method, theoretical coalescence, that provides a strategy for analyzing complex, high-level concepts and for developing generalizable theory. Theoretical coalescence is a method of theoretical expansion and inductive theory development that uses data (rather than themes, categories, and published extracts of data) as the primary source for analysis. Here, using the development of the lay concept of enduring as an example, I explore the scientific development of the concept in multiple settings over many projects and link it within the Praxis Theory of Suffering. Methods As comprehension emerges when conducting theoretical coalescence, it is essential that raw data from various different situations be available for reinterpretation/reanalysis and comparison to identify the essential features of the concept. The concept is then reconstructed, with additional inquiry that builds description, and evidence is collected and conceptualized to create a more expansive concept and theory. Results By utilizing apparently diverse data sets from different contexts that are linked by certain characteristics, the essential features of the concept emerge. Such inquiry is divergent and less bound by context yet purposeful, logical, and with significant pragmatic implications for practice in nursing and beyond our discipline. Conclusion Theoretical coalescence is a means by which qualitative inquiry is broadened to make an impact, to accommodate new theoretical shifts and concepts, and to make qualitative research applied and accessible in new ways. PMID:29360688

  18. To compute lightness, illumination is not estimated, it is held constant.

    PubMed

    Gilchrist, Alan L

    2018-05-03

    The light reaching the eye from a surface does not indicate the black-gray-white shade of a surface (called lightness) because the effects of illumination level are confounded with the reflectance of the surface. Rotating a gray paper relative to a light source alters its luminance (intensity of light reaching the eye), but the lightness of the paper remains relatively constant. Recent publications have argued, as had Helmholtz (1866/1924), that the visual system unconsciously estimates the direction and intensity of the light source. We report experiments in which this theory was pitted against an alternative theory according to which illumination level and surface reflectance are disentangled by comparing only those surfaces that are equally illuminated, in other words, by holding illumination level constant. A 3-dimensional scene was created within which a rotating target surface would be expected to appear darker gray according to the lighting-estimation theory, but lighter gray according to the equi-illumination comparison theory; the results clearly favored the latter. In a further experiment, cues held to indicate light-source direction (cast shadows, attached shadows, and glossy highlights) were completely eliminated, yet this had no effect on the results. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  19. Inference of emission rates from multiple sources using Bayesian probability theory.

    PubMed

    Yee, Eugene; Flesch, Thomas K

    2010-03-01

    The determination of atmospheric emission rates from multiple sources using inversion (regularized least-squares or best-fit technique) is known to be very susceptible to measurement and model errors in the problem, rendering the solution unusable. In this paper, a new perspective is offered for this problem: namely, it is argued that the problem should be addressed as one of inference rather than inversion. Towards this objective, Bayesian probability theory is used to estimate the emission rates from multiple sources. The posterior probability distribution for the emission rates is derived, accounting fully for the measurement errors in the concentration data and the model errors in the dispersion model used to interpret the data. The Bayesian inferential methodology for emission rate recovery is validated against real dispersion data, obtained from a field experiment involving various source-sensor geometries (scenarios) consisting of four synthetic area sources and eight concentration sensors. The recovery of discrete emission rates from three different scenarios obtained using Bayesian inference and singular value decomposition inversion are compared and contrasted.
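
    The inference-rather-than-inversion argument can be illustrated in the linear-Gaussian special case, where the posterior over emission rates has a closed form. The source-receptor matrix, noise level, and prior below are invented for illustration; the paper's full treatment also accounts for dispersion-model error, which this sketch omits.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical source-receptor matrix: A[i, j] = modeled concentration at
    # sensor i per unit emission rate of source j (from a dispersion model)
    A = rng.uniform(0.1, 1.0, size=(8, 4))      # 8 sensors, 4 area sources
    q_true = np.array([2.0, 0.5, 1.5, 1.0])     # true emission rates

    sigma = 0.01                                # measurement noise std
    y = A @ q_true + sigma * rng.standard_normal(8)

    # Gaussian prior on rates and Gaussian likelihood give a Gaussian posterior:
    #   q ~ N(0, tau^2 I),  y | q ~ N(A q, sigma^2 I)
    tau = 10.0
    prior_prec = np.eye(4) / tau**2
    post_cov = np.linalg.inv(prior_prec + A.T @ A / sigma**2)
    post_mean = post_cov @ (A.T @ y / sigma**2)
    ```

    Unlike a bare least-squares inversion, the posterior covariance quantifies how well each source's rate is constrained by the sensor geometry, which is the practical payoff of the inferential framing.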

  20. Modeling water demand when households have multiple sources of water

    NASA Astrophysics Data System (ADS)

    Coulibaly, Lassina; Jakus, Paul M.; Keith, John E.

    2014-07-01

    A significant portion of the world's population lives in areas where public water delivery systems are unreliable and/or deliver poor quality water. In response, people have developed important alternatives to publicly supplied water. To date, most water demand research has been based on single-equation models for a single source of water, with very few studies that have examined water demand from two sources of water (where all nonpublic system water sources have been aggregated into a single demand). This modeling approach leads to two outcomes. First, the demand models do not capture the full range of alternatives, so the true economic relationship among the alternatives is obscured. Second, and more seriously, economic theory predicts that demand for a good becomes more price-elastic as the number of close substitutes increases. If researchers artificially limit the number of alternatives studied to something less than the true number, the price elasticity estimate may be biased downward. This paper examines water demand in a region with near universal access to piped water, but where system reliability and quality are such that many alternative sources of water exist. In extending the demand analysis to four sources of water, we are able to (i) demonstrate why households choose the water sources they do, (ii) provide a richer description of the demand relationships among sources, and (iii) calculate own-price elasticity estimates that are more elastic than those generally found in the literature.
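
    The elasticity estimates discussed above come from demand regressions; in a log-log specification the coefficients are elasticities directly. A minimal single-source sketch with simulated household data (a known own-price elasticity of -0.8; all numbers hypothetical, and far simpler than the paper's four-source system):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 500

    # Hypothetical household data: log water demand with a known own-price
    # elasticity of -0.8 and an income elasticity of 0.4
    log_price = rng.normal(0.0, 0.5, n)
    log_income = rng.normal(2.0, 0.6, n)
    log_q = 1.0 - 0.8 * log_price + 0.4 * log_income \
        + 0.1 * rng.standard_normal(n)

    # OLS on the log-log model: slope coefficients are elasticities directly
    X = np.column_stack([np.ones(n), log_price, log_income])
    beta, *_ = np.linalg.lstsq(X, log_q, rcond=None)
    own_price_elasticity = beta[1]
    ```

    The paper's point is that omitting substitute sources from such a regression biases this coefficient toward zero; adding cross-price terms for the alternative sources is what recovers the more elastic estimates.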

  1. Developing and testing a positive theory of instrument choice: Renewable energy policy in the fifty American states

    NASA Astrophysics Data System (ADS)

    Ciocirlan, Cristina E.

    The environmental economics literature consistently suggests that properly designed and implemented economic incentives are superior to command-and-control regulation in reducing pollution. Economic incentives, such as green taxes, cap-and-trade programs, and tax incentives, can reduce pollution in a cost-effective manner, provide flexibility to industry, and stimulate innovation in cleaner technologies. In the past few decades, both federal and state governments have shown increased use of economic incentives in environmental policy. Some states have embraced them in an active manner, while others have failed to do so. This research uses a three-step analysis. First, it asks why some states employ more economic incentives than others to stimulate consumption of renewable energy by the residential, commercial and industrial sectors. Second, it asks why some states employ stronger incentives than others. And third, it asks why certain states employ certain instruments, such as electricity surcharges, cap-and-trade programs, tax incentives or grants, while others do not. The first two analyses were conducted using factor analysis and multiple regression analysis, while the third analysis employed logistic regression models to analyze the data. Data for all three analyses were obtained from a combination of primary and secondary sources. To address these questions, a theory of instrument choice at the state level, which includes both internal and external determinants of policy-making, was developed and tested. The state level of analysis was chosen because states have proven to be pioneers in designing policies to address greenhouse gases (see, for instance, the recent cap-and-trade legislation passed in California). The theory was operationalized with the help of four models: needs/responsiveness, interest group influence, professionalism/capacity and innovation-and-diffusion.
The needs/responsiveness model suggests that states tend to choose more and stronger economic incentives when they are more dependent on conventional sources of energy, such as coal, oil and gas, or when they have the potential to produce renewable energy. The interest group influence model suggests that instrument choice is ultimately a political decision, most likely to benefit some groups more than others. The professionalism/capacity model posits that states with more professional legislatures, with legislators who make more use of policy analysis, with more capacity to generate nonpartisan policy research, and with larger agencies tend to employ more and stronger instruments to stimulate renewable energy consumption and production. And last, the innovation-and-diffusion model suggests that states with a proven innovation record in climate change tend to employ more and stronger economic incentives than states without such a record. Also, this model explains states' instrument choice decisions as a function of the choices made by their neighbors.

  2. Archival Theory and the Shaping of Educational History: Utilizing New Sources and Reinterpreting Traditional Ones

    ERIC Educational Resources Information Center

    Glotzer, Richard

    2013-01-01

    Information technology has spawned new evidentiary sources, better retrieval systems for existing ones, and new tools for interpreting traditional source materials. These advances have contributed to a broadening of public participation in civil society (Blouin and Rosenberg 2006). In these culturally unsettled and economically fragile times…

  3. North Alabama Lightning Mapping Array (LMA): VHF Source Retrieval Algorithm and Error Analyses

    NASA Technical Reports Server (NTRS)

    Koshak, W. J.; Solakiewicz, R. J.; Blakeslee, R. J.; Goodman, S. J.; Christian, H. J.; Hall, J.; Bailey, J.; Krider, E. P.; Bateman, M. G.; Boccippio, D.

    2003-01-01

    Two approaches are used to characterize how accurately the North Alabama Lightning Mapping Array (LMA) is able to locate lightning VHF sources in space and in time. The first method uses a Monte Carlo computer simulation to estimate source retrieval errors. The simulation applies a VHF source retrieval algorithm that was recently developed at the NASA Marshall Space Flight Center (MSFC) and that is similar, but not identical to, the standard New Mexico Tech retrieval algorithm. The second method uses a purely theoretical technique (i.e., chi-squared Curvature Matrix Theory) to estimate retrieval errors. Both methods assume that the LMA system has an overall rms timing error of 50 ns, but all other possible errors (e.g., multiple sources per retrieval attempt) are neglected. The detailed spatial distributions of retrieval errors are provided. Given that the two methods are completely independent of one another, it is shown that they provide remarkably similar results. However, for many source locations, the Curvature Matrix Theory produces larger altitude error estimates than the (more realistic) Monte Carlo simulation.
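
    The Monte Carlo approach to characterizing retrieval error can be sketched in two dimensions: perturb simulated arrival times with the assumed 50 ns rms timing error, re-retrieve the source by minimizing a misfit in which the unknown emission time drops out (here a simple grid search), and accumulate the location errors. The station geometry, source position, and grid are invented, and this is far simpler than either the MSFC or the New Mexico Tech retrieval algorithms.

    ```python
    import numpy as np

    C = 2.998e8                            # propagation speed, m/s
    rng = np.random.default_rng(3)

    stations = np.array([[0, 0], [20e3, 0], [0, 20e3],
                         [20e3, 20e3], [10e3, 5e3]], dtype=float)
    source = np.array([12e3, 9e3])         # true VHF source (hypothetical)
    sigma_t = 50e-9                        # assumed 50 ns rms timing error

    def toa(src):
        """Arrival times at all stations for zero emission time."""
        return np.linalg.norm(stations - src, axis=1) / C

    # 50 m search grid around the true source, precomputed once
    xs = np.linspace(10e3, 14e3, 81)
    ys = np.linspace(7e3, 11e3, 81)
    gx, gy = np.meshgrid(xs, ys)
    grid = np.column_stack([gx.ravel(), gy.ravel()])
    d_grid = np.linalg.norm(grid[:, None, :] - stations[None, :, :],
                            axis=2) / C

    true_t = toa(source)
    errors = []
    for _ in range(500):                   # Monte Carlo trials
        t_obs = true_t + rng.normal(0, sigma_t, len(stations))
        r = t_obs[None, :] - d_grid
        cost = r.var(axis=1)               # emission time drops out of var
        best = grid[np.argmin(cost)]
        errors.append(np.linalg.norm(best - source))
    rms_error = np.sqrt(np.mean(np.array(errors) ** 2))
    ```

    Mapping `rms_error` as the trial source is moved over the network's coverage area produces the kind of spatial error distribution the two methods in the paper are compared on.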

  4. The Making of SPINdle

    NASA Astrophysics Data System (ADS)

    Lam, Ho-Pun; Governatori, Guido

    We present the design and implementation of SPINdle - an open source Java based defeasible logic reasoner capable of performing efficient and scalable reasoning on defeasible logic theories (including theories with over 1 million rules). The implementation covers both the standard and modal extensions to defeasible logic. It can be used as a standalone theory prover and can be embedded into any application as a defeasible logic rule engine. It allows users or agents to issue queries on a given knowledge base or on a theory generated on the fly by other applications, and automatically produces the conclusions that follow from it. Theories can also be represented using XML.
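
    The flavor of reasoning a defeasible logic engine performs can be conveyed with a toy propositional sketch: defeasible rules plus a superiority relation, where a conclusion stands only if every applicable attacking rule is beaten by a superior supporting rule. This is illustrative only (the rule names and predicates are invented) and omits strict rules, defeaters, and the modal extensions SPINdle supports.

    ```python
    # Toy defeasible-logic sketch: the classic bird/penguin conflict.
    facts = {"bird", "penguin"}

    # (name, antecedents, consequent); all rules here are defeasible
    rules = [
        ("r1", {"bird"}, "flies"),
        ("r2", {"penguin"}, "~flies"),
    ]
    superiority = {("r2", "r1")}          # r2 beats r1 when both apply

    def negate(lit):
        return lit[1:] if lit.startswith("~") else "~" + lit

    def applicable(rule):
        return rule[1] <= facts           # all antecedents hold

    def defeasibly_provable(q):
        support = [r for r in rules if r[2] == q and applicable(r)]
        attack = [r for r in rules if r[2] == negate(q) and applicable(r)]
        if not support:
            return False
        # every applicable attacking rule must be beaten by a supporting rule
        return all(any((s[0], a[0]) in superiority for s in support)
                   for a in attack)
    ```

    Here `defeasibly_provable("~flies")` holds while `defeasibly_provable("flies")` does not, because r2 is superior to r1; a production engine evaluates the same kind of conflict resolution over millions of rules.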

  5. Caustic Singularities Of High-Gain, Dual-Shaped Reflectors

    NASA Technical Reports Server (NTRS)

    Galindo, Victor; Veruttipong, Thavath W.; Imbriale, William A.; Rengarajan, Sambiam

    1991-01-01

    Report presents study of some sources of error in analysis, by geometric theory of diffraction (GTD), of performance of high-gain, dual-shaped antenna reflector. Study probes into underlying analytic causes of singularity, with view toward devising and testing practical methods to avoid problems caused by singularity. Hybrid physical optics (PO) approach used to study near-field spillover or noise-temperature characteristics of high-gain reflector antenna efficiently and accurately. Report illustrates this approach and underlying principles by presenting numerical results, for both offset and symmetrical reflector systems, computed by GTD, PO, and PO/GO methods.

  6. Analysis of the Capability and Limitations of Relativistic Gravity Measurements Using Radio Astronomy Methods

    NASA Technical Reports Server (NTRS)

    Shapiro, I. I.; Counselman, C. C., III

    1975-01-01

    The uses of radar observations of planets and very-long-baseline radio interferometric observations of extragalactic objects to test theories of gravitation are described in detail with special emphasis on sources of error. The accuracy achievable in these tests with data already obtained can be summarized in terms of: retardation of signal propagation (radar), deflection of radio waves (interferometry), advance of planetary perihelia (radar), gravitational quadrupole moment of sun (radar), and time variation of gravitational constant (radar). The analyses completed to date have yielded no significant disagreement with the predictions of general relativity.

  7. The Origin of Primitive Cells, Nutrient Intake, and Non-Enzymatic Elongation of Encapsulated Nucleotides

    NASA Technical Reports Server (NTRS)

    Meierhenrich, Uwe J.; Filippi, Jean-Jacques; Meinert, Cornelia; Vierling, Pierre; Dworkin, Jason P.

    2009-01-01

    Fatty acids and fatty alcohols are commonly found in experiments simulating the prebiotic 'soup'. These amphiphiles can be synthesized under prebiotic conditions, at least as long as the molecules are chemically relatively simple and do not need to be enantiomerically pure. In the context of topical origin-of-life theories, two distinct formation pathways for amphiphiles have been described; one related to geophysical sites, such as marine hydrothermal systems, and another to extraterrestrial sources, such as the proto-solar nebula, which was fed by interplanetary and interstellar nebulae. The chemical analysis of each provides individual characteristic challenges.

  8. Microwave and hard X-ray emissions during the impulsive phase of solar flares: Nonthermal electron spectrum and time delay

    NASA Technical Reports Server (NTRS)

    Gu, Ye-Ming; Li, Chung-Sheng

    1986-01-01

    Based on a summary and analysis of the observations and theories concerning impulsive microwave and hard X-ray bursts, the correlations between these two kinds of emissions were investigated. It is shown that the optically thin microwave spectrum and its relations with the hard X-ray spectrum can be explained only by means of a nonthermal source model. A simple nonthermal trap model in the mildly relativistic case can consistently explain the main characteristics of the spectrum and the relative time delays.

  9. Health insurance theory: the case of the missing welfare gain.

    PubMed

    Nyman, John A

    2008-11-01

    An important source of value is missing from the conventional welfare analysis of moral hazard, namely, the effect of income transfers (from those who purchase insurance and remain healthy to those who become ill) on purchases of medical care. Income transfers are contained within the price reduction that is associated with standard health insurance. However, in contrast to the income effects contained within an exogenous price decrease, these income transfers act to shift out the demand for medical care. As a result, the consumer's willingness to pay for medical care increases and the resulting additional consumption is welfare increasing.

  10. Effects of rotating flows on combustion and jet noise.

    NASA Technical Reports Server (NTRS)

    Schwartz, I. R.

    1972-01-01

    Experimental investigations of combustion in rotating (swirling) flow have shown that the mixing and combustion processes were accelerated, flame length and noise levels significantly decreased, and flame stability increased relative to that obtained without rotation. The unsteady burning accompanied by a pulsating flame, violent fluctuating jet, and intense noise present in straight-flow burning was not present in rotating-flow burning. Correlations between theory and experiment show good agreement. Such effects of rotating flows could lead to suppressing jet noise, improving combustion, reducing pollution, and decreasing aircraft engine size. Quantitative analyses of the aero-acoustic relationship and noise-source characteristics are needed.

  11. Lake water quality mapping from LANDSAT

    NASA Technical Reports Server (NTRS)

    Scherz, J. P.

    1977-01-01

    The lakes in three LANDSAT scenes were mapped by the Bendix MDAS multispectral analysis system. Field checking of the maps by three separate individuals revealed approximately 90-95% correct classification for the lake categories selected. Variation between observers was about 5%. From the MDAS color-coded maps, the lake with the worst algae problem was easily located. This lake was closely checked, and a pollution source of 100 cows was found at the springs which fed the lake. The theory, lab work, and field work that made this demonstration project a practical lake-classification procedure are presented.

  12. Analysis of dangerous area of single berth oil tanker operations based on CFD

    NASA Astrophysics Data System (ADS)

    Shi, Lina; Zhu, Faxin; Lu, Jinshu; Wu, Wenfeng; Zhang, Min; Zheng, Hailin

    2018-04-01

    Taking a single-berth oil tanker during loading as the research object, we analyzed the theory of VOCs diffusion during single-berth tanker operations, built a mesh model of VOCs diffusion with the Gambit preprocessor, set up the simulation boundary conditions, and used the Fluent software to simulate how the VOCs concentration at five detection point sources changes with time under the influence of specific factors. We then delineated the dangerous area of single-berth oil tanker operations from the simulated diffusion of VOCs, so as to ensure the safe operation of the tanker.
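As a much simplified, illustrative counterpart to the full CFD model described above, a one-dimensional explicit finite-difference scheme shows how a pollutant concentration pulse spreads from a point source over time. The grid, diffusivity, and time step below are invented for the sketch and are not taken from the paper:

```python
def diffuse_1d(conc, D, dx, dt, steps):
    """Explicit FTCS update for 1-D diffusion: dc/dt = D * d2c/dx2.

    Zero-flux boundaries; the scheme is stable when D*dt/dx**2 <= 0.5.
    """
    r = D * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for this dt/dx"
    c = list(conc)
    for _ in range(steps):
        nxt = c[:]
        for i in range(1, len(c) - 1):
            nxt[i] = c[i] + r * (c[i - 1] - 2 * c[i] + c[i + 1])
        # zero-flux (mirror) boundaries keep total mass constant
        nxt[0] = c[0] + r * (c[1] - c[0])
        nxt[-1] = c[-1] + r * (c[-2] - c[-1])
        c = nxt
    return c

# A concentration spike at the center of the domain spreads outward over time.
initial = [0.0] * 51
initial[25] = 1.0
later = diffuse_1d(initial, D=1.0, dx=1.0, dt=0.25, steps=200)
```

The same idea, extended to three dimensions with advection and realistic boundary conditions, is what a solver such as Fluent computes.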

  13. Complex problems require complex solutions: the utility of social quality theory for addressing the Social Determinants of Health.

    PubMed

    Ward, Paul R; Meyer, Samantha B; Verity, Fiona; Gill, Tiffany K; Luong, Tini C N

    2011-08-05

    In order to improve the health of the most vulnerable groups in society, the WHO Commission on Social Determinants of Health (CSDH) called for multi-sectoral action, which requires research and policy on the multiple and inter-linking factors shaping health outcomes. Most conceptual tools available to researchers tend to focus on singular and specific social determinants of health (SDH) (e.g. social capital, empowerment, social inclusion). However, a new and innovative conceptual framework, known as social quality theory, facilitates a more complex and complete understanding of the SDH, with its focus on four domains: social cohesion, social inclusion, social empowerment and socioeconomic security, all within the same conceptual framework. This paper provides both an overview of social quality theory in addition to findings from a national survey of social quality in Australia, as a means of demonstrating the operationalisation of the theory. Data were collected using a national random postal survey of 1044 respondents in September, 2009. Multivariate logistic regression analysis was conducted. Statistical analysis revealed that people on lower incomes (less than $45000) experience worse social quality across all of the four domains: lower socio-economic security, lower levels of membership of organisations (lower social cohesion), higher levels of discrimination and less political action (lower social inclusion) and lower social empowerment. The findings were mixed in terms of age, with people over 65 years experiencing lower socio-economic security, but having higher levels of social cohesion, experiencing lower levels of discrimination (higher social inclusion) and engaging in more political action (higher social empowerment). In terms of gender, women had higher social cohesion than men, although also experienced more discrimination (lower social inclusion). 
Applying social quality theory allows researchers and policy makers to measure and respond to the multiple sources of oppression and advantage experienced by certain population groups, and to monitor the effectiveness of interventions over time.

  14. Complex problems require complex solutions: the utility of social quality theory for addressing the Social Determinants of Health

    PubMed Central

    2011-01-01

    Background In order to improve the health of the most vulnerable groups in society, the WHO Commission on Social Determinants of Health (CSDH) called for multi-sectoral action, which requires research and policy on the multiple and inter-linking factors shaping health outcomes. Most conceptual tools available to researchers tend to focus on singular and specific social determinants of health (SDH) (e.g. social capital, empowerment, social inclusion). However, a new and innovative conceptual framework, known as social quality theory, facilitates a more complex and complete understanding of the SDH, with its focus on four domains: social cohesion, social inclusion, social empowerment and socioeconomic security, all within the same conceptual framework. This paper provides both an overview of social quality theory in addition to findings from a national survey of social quality in Australia, as a means of demonstrating the operationalisation of the theory. Methods Data were collected using a national random postal survey of 1044 respondents in September, 2009. Multivariate logistic regression analysis was conducted. Results Statistical analysis revealed that people on lower incomes (less than $45000) experience worse social quality across all of the four domains: lower socio-economic security, lower levels of membership of organisations (lower social cohesion), higher levels of discrimination and less political action (lower social inclusion) and lower social empowerment. The findings were mixed in terms of age, with people over 65 years experiencing lower socio-economic security, but having higher levels of social cohesion, experiencing lower levels of discrimination (higher social inclusion) and engaging in more political action (higher social empowerment). In terms of gender, women had higher social cohesion than men, although also experienced more discrimination (lower social inclusion). 
Conclusions Applying social quality theory allows researchers and policy makers to measure and respond to the multiple sources of oppression and advantage experienced by certain population groups, and to monitor the effectiveness of interventions over time. PMID:21819576

  15. Advanced capabilities for materials modelling with Quantum ESPRESSO

    NASA Astrophysics Data System (ADS)

    Giannozzi, P.; Andreussi, O.; Brumme, T.; Bunau, O.; Buongiorno Nardelli, M.; Calandra, M.; Car, R.; Cavazzoni, C.; Ceresoli, D.; Cococcioni, M.; Colonna, N.; Carnimeo, I.; Dal Corso, A.; de Gironcoli, S.; Delugas, P.; DiStasio, R. A., Jr.; Ferretti, A.; Floris, A.; Fratesi, G.; Fugallo, G.; Gebauer, R.; Gerstmann, U.; Giustino, F.; Gorni, T.; Jia, J.; Kawamura, M.; Ko, H.-Y.; Kokalj, A.; Küçükbenli, E.; Lazzeri, M.; Marsili, M.; Marzari, N.; Mauri, F.; Nguyen, N. L.; Nguyen, H.-V.; Otero-de-la-Roza, A.; Paulatto, L.; Poncé, S.; Rocca, D.; Sabatini, R.; Santra, B.; Schlipf, M.; Seitsonen, A. P.; Smogunov, A.; Timrov, I.; Thonhauser, T.; Umari, P.; Vast, N.; Wu, X.; Baroni, S.

    2017-11-01

    Quantum ESPRESSO is an integrated suite of open-source computer codes for quantum simulations of materials using state-of-the-art electronic-structure techniques, based on density-functional theory, density-functional perturbation theory, and many-body perturbation theory, within the plane-wave pseudopotential and projector-augmented-wave approaches. Quantum ESPRESSO owes its popularity to the wide variety of properties and processes it allows users to simulate, to its performance on an increasingly broad array of hardware architectures, and to a community of researchers that rely on its capabilities as a core open-source development platform to implement their ideas. In this paper we describe recent extensions and improvements, covering new methodologies and property calculators, improved parallelization, code modularization, and extended interoperability both within the distribution and with external software.

  16. Advanced capabilities for materials modelling with Quantum ESPRESSO.

    PubMed

    Giannozzi, P; Andreussi, O; Brumme, T; Bunau, O; Buongiorno Nardelli, M; Calandra, M; Car, R; Cavazzoni, C; Ceresoli, D; Cococcioni, M; Colonna, N; Carnimeo, I; Dal Corso, A; de Gironcoli, S; Delugas, P; DiStasio, R A; Ferretti, A; Floris, A; Fratesi, G; Fugallo, G; Gebauer, R; Gerstmann, U; Giustino, F; Gorni, T; Jia, J; Kawamura, M; Ko, H-Y; Kokalj, A; Küçükbenli, E; Lazzeri, M; Marsili, M; Marzari, N; Mauri, F; Nguyen, N L; Nguyen, H-V; Otero-de-la-Roza, A; Paulatto, L; Poncé, S; Rocca, D; Sabatini, R; Santra, B; Schlipf, M; Seitsonen, A P; Smogunov, A; Timrov, I; Thonhauser, T; Umari, P; Vast, N; Wu, X; Baroni, S

    2017-10-24

    Quantum ESPRESSO is an integrated suite of open-source computer codes for quantum simulations of materials using state-of-the-art electronic-structure techniques, based on density-functional theory, density-functional perturbation theory, and many-body perturbation theory, within the plane-wave pseudopotential and projector-augmented-wave approaches. Quantum ESPRESSO owes its popularity to the wide variety of properties and processes it allows users to simulate, to its performance on an increasingly broad array of hardware architectures, and to a community of researchers that rely on its capabilities as a core open-source development platform to implement their ideas. In this paper we describe recent extensions and improvements, covering new methodologies and property calculators, improved parallelization, code modularization, and extended interoperability both within the distribution and with external software.

  17. Advanced capabilities for materials modelling with Quantum ESPRESSO.

    PubMed

    Andreussi, Oliviero; Brumme, Thomas; Bunau, Oana; Buongiorno Nardelli, Marco; Calandra, Matteo; Car, Roberto; Cavazzoni, Carlo; Ceresoli, Davide; Cococcioni, Matteo; Colonna, Nicola; Carnimeo, Ivan; Dal Corso, Andrea; de Gironcoli, Stefano; Delugas, Pietro; DiStasio, Robert; Ferretti, Andrea; Floris, Andrea; Fratesi, Guido; Fugallo, Giorgia; Gebauer, Ralph; Gerstmann, Uwe; Giustino, Feliciano; Gorni, Tommaso; Jia, Junteng; Kawamura, Mitsuaki; Ko, Hsin-Yu; Kokalj, Anton; Küçükbenli, Emine; Lazzeri, Michele; Marsili, Margherita; Marzari, Nicola; Mauri, Francesco; Nguyen, Ngoc Linh; Nguyen, Huy-Viet; Otero-de-la-Roza, Alberto; Paulatto, Lorenzo; Poncé, Samuel; Giannozzi, Paolo; Rocca, Dario; Sabatini, Riccardo; Santra, Biswajit; Schlipf, Martin; Seitsonen, Ari Paavo; Smogunov, Alexander; Timrov, Iurii; Thonhauser, Timo; Umari, Paolo; Vast, Nathalie; Wu, Xifan; Baroni, Stefano

    2017-09-27

    Quantum ESPRESSO is an integrated suite of open-source computer codes for quantum simulations of materials using state-of-the-art electronic-structure techniques, based on density-functional theory, density-functional perturbation theory, and many-body perturbation theory, within the plane-wave pseudo-potential and projector-augmented-wave approaches. Quantum ESPRESSO owes its popularity to the wide variety of properties and processes it allows users to simulate, to its performance on an increasingly broad array of hardware architectures, and to a community of researchers that rely on its capabilities as a core open-source development platform to implement their ideas. In this paper we describe recent extensions and improvements, covering new methodologies and property calculators, improved parallelization, code modularization, and extended interoperability both within the distribution and with external software. © 2017 IOP Publishing Ltd.

  18. Theory of acoustic design of opera house and a design proposal

    NASA Astrophysics Data System (ADS)

    Ando, Yoichi

    2004-05-01

    First, the theory of subjective preference for sound fields, based on a model of the auditory-brain system, is briefly described. It consists of temporal factors and spatial factors associated with the left and right cerebral hemispheres, respectively. The temporal criteria are the initial time-delay gap between the direct sound and the first reflection (Δt1) and the subsequent reverberation time (Tsub). Their preferred conditions are related to the minimum value of the effective duration of the running autocorrelation function of the source signals, (τe)min. The spatial criteria are the binaural listening level (LL) and the IACC, which may be extracted from the interaural cross-correlation function. In an opera house there are two different kinds of sound sources: the vocal source on the stage, with relatively short values of (τe)min, and the orchestral music in the pit, with long values of (τe)min. For these sources, a design proposal is made here.

  19. Application of Behavioral Theories to Disaster and Emergency Health Preparedness: A Systematic Review

    PubMed Central

    Ejeta, Luche Tadesse; Ardalan, Ali; Paton, Douglas

    2015-01-01

    Background: Preparedness for disasters and emergencies at individual, community and organizational levels could be a more effective tool for mitigating the growing incidence of disaster risk and ameliorating its impacts; that is, it could play a more significant role in disaster risk reduction (DRR). Preparedness efforts focus on changing human behaviors in ways that reduce people’s risk and increase their ability to cope with hazard consequences. While preparedness initiatives have used behavioral theories to facilitate DRR, many theories have been used, and little is known about which behavioral theories are more commonly used, where they have been used, and why they have been preferred over alternative behavioral theories. Given that theories differ with respect to the variables used and the relationships between them, a systematic analysis is an essential first step to answering questions about the relative utility of theories and providing a more robust evidence base for the preparedness components of DRR strategies. The goal of this systematic review was to search and summarize evidence by assessing the application of behavioral theories to disaster and emergency health preparedness across the world. Methods: A protocol was prepared in which the study objectives, questions, inclusion and exclusion criteria, and sensitive search strategies were developed and pilot-tested at the beginning of the study. Using selected keywords, articles were searched mainly in the PubMed, Scopus, Mosby’s Index (Nursing Index) and Safetylit databases. Articles were assessed based on their titles, abstracts, and full texts. The data were extracted from the selected articles, and results were presented using qualitative and quantitative methods. Results: In total, 2040 titles, 450 abstracts and 62 full texts of articles were assessed against the eligibility criteria, while five articles were identified from other sources; finally, 33 articles were selected.
The Health Belief Model (HBM), Extended Parallel Process Model (EPPM), Theory of Planned Behavior (TPB) and Social Cognitive Theories were most commonly applied to influenza (H1N1 and H5N1), floods, and earthquake hazards. Studies were predominantly conducted in the USA (13 studies). In Asia, where the annual number of disasters and victims exceeds those in other continents, only three studies were identified. Overall, the main constructs of the HBM (perceived susceptibility, severity, benefits, and barriers), the EPPM (higher threat and higher efficacy), the TPB (attitude and subjective norm), and the majority of the constructs utilized in Social Cognitive Theories were associated with preparedness for diverse hazards. However, while all the theories described above describe the relationships between their constituent variables, with the exception of research on Social Cognitive Theories, few studies of the other theories and models used path analysis to identify the interdependence relationships between the constructs described in the respective theories/models. Similarly, few identified how other mediating variables could influence disaster and emergency preparedness. Conclusions: The existing evidence on the application of behavioral theories and models to disaster and emergency preparedness comes chiefly from developed countries. This raises issues regarding their utility in countries, particularly in Asia and the Middle East, where cultural characteristics are very different from those prevailing in the Western countries in which the theories have been developed and tested. The theories and models discussed here have been applied predominantly to disease outbreaks and natural hazards, and information on their utility as guides to preparedness for man-made hazards is lacking. Hence, future studies of behavioral theories and models addressing preparedness need to target developing countries, where disaster risk and the consequent need for preparedness are high. A need for additional work on demonstrating the relationships between variables and constructs, including more clearly articulating roles for mediating effects, was also identified in this analysis. PMID:26203400

  20. Microscopic Sources of Paramagnetic Noise on α-Al2O3 Substrates for Superconducting Qubits

    NASA Astrophysics Data System (ADS)

    Dubois, Jonathan; Lee, Donghwa; Lordi, Vince

    2014-03-01

    Superconducting qubits (SQs) represent a promising route to achieving a scalable quantum computer. However, the coupling between electro-dynamic qubits and (as yet largely unidentified) ambient parasitic noise sources has so far limited the functionality of current SQs by keeping coherence times of the quantum states below a practical threshold for measurement and manipulation. Further improvement can be enabled by a detailed understanding of the various noise sources afflicting SQs. In this work, first-principles density functional theory (DFT) calculations are employed to identify the microscopic origins of magnetic noise sources in SQs on an α-Al2O3 substrate. The results indicate that intrinsic point defects and defect complexes in the substrate are unlikely to be responsible for low-frequency noise in these systems. Rather, a comprehensive analysis of extrinsic defects shows that surface aluminum ions interacting with ambient molecules will form a bath of magnetic moments that can couple paramagnetically to the SQ. The microscopic origin of this magnetic noise source is discussed, and strategies for ameliorating the effects of these magnetic defects are proposed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  1. Transient pressure analysis of fractured well in bi-zonal gas reservoirs

    NASA Astrophysics Data System (ADS)

    Zhao, Yu-Long; Zhang, Lie-Hui; Liu, Yong-hui; Hu, Shu-Yong; Liu, Qi-Guo

    2015-05-01

    For a hydraulically fractured well, evaluating the properties of the fracture and the formation is always a tough job, and it is very complex to do so with conventional methods, especially for a partially penetrating fractured well. Although source functions are a very powerful tool for analyzing the transient pressure of complex-structure wells, corresponding reports on gas reservoirs are rare. In this paper, the continuous point-source functions in anisotropic reservoirs are derived on the basis of source-function theory, the Laplace transform method, and Duhamel's principle. By applying the construction method, the continuous point-source functions in a bi-zonal gas reservoir with closed upper and lower boundaries are obtained. Subsequently, physical models and transient pressure solutions are developed for fully and partially penetrating fractured vertical wells in this reservoir. Type curves of dimensionless pseudo-pressure and its derivative as functions of dimensionless time are plotted by a numerical inversion algorithm, and the flow periods and sensitive factors are also analyzed. The source functions and fractured-well solutions have both theoretical and practical application in well-test interpretation for such gas reservoirs, especially for wells with a stimulated reservoir volume created by massive hydraulic fracturing in unconventional gas reservoirs, which can usually be described with the composite model.
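The numerical Laplace inversion step mentioned above is commonly carried out in well-test analysis with the Gauss-Stehfest algorithm. The paper does not name its inversion scheme, so the following is a generic sketch of that standard algorithm, checked against a transform pair with a known inverse:

```python
import math

def stehfest_coefficients(N):
    """Gauss-Stehfest weights V_k for an even number of terms N."""
    V = []
    for k in range(1, N + 1):
        acc = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            acc += (j ** (N // 2) * math.factorial(2 * j)) / (
                math.factorial(N // 2 - j) * math.factorial(j)
                * math.factorial(j - 1) * math.factorial(k - j)
                * math.factorial(2 * j - k))
        V.append((-1) ** (k + N // 2) * acc)
    return V

def stehfest_invert(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s) at a single time t > 0."""
    ln2 = math.log(2.0)
    V = stehfest_coefficients(N)
    return (ln2 / t) * sum(V[k] * F((k + 1) * ln2 / t) for k in range(N))

# Sanity check: F(s) = 1/(s + 1) has the inverse f(t) = exp(-t).
approx = stehfest_invert(lambda s: 1.0 / (s + 1.0), 1.0)
```

In practice F would be the dimensionless pseudo-pressure solution in Laplace space, and the inversion is repeated over a grid of dimensionless times to draw the type curves.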

  2. Two Wrongs Make a Right: Addressing Underreporting in Binary Data from Multiple Sources.

    PubMed

    Cook, Scott J; Blas, Betsabe; Carroll, Raymond J; Sinha, Samiran

    2017-04-01

    Media-based event data, i.e., data compiled from reporting by media outlets, are widely used in political science research. However, events of interest (e.g., strikes, protests, conflict) are often underreported by these primary and secondary sources, producing incomplete data that risks inconsistency and bias in subsequent analysis. While general strategies exist to help ameliorate this bias, these methods do not make full use of the information often available to researchers. Specifically, much of the event data used in the social sciences is drawn from multiple, overlapping news sources (e.g., Agence France-Presse, Reuters). Therefore, we propose a novel maximum likelihood estimator that corrects for misclassification in data arising from multiple sources. In the most general formulation of our estimator, researchers can specify separate sets of predictors for the true-event model and for each of the misclassification models characterizing whether a source fails to report on an event. As such, researchers are able to accurately test theories on both the causes of and the reporting on an event of interest. Simulations show that our technique regularly outperforms current strategies that neglect misclassification, the unique features of the data-generating process, or both. We also illustrate the utility of this method with a model of repression using the Social Conflict in Africa Database.
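The intuition behind correcting for underreporting with overlapping sources can be shown in a toy version: if two sources independently miss true events at known rates, the probability that an event leaves no trace at all is identifiable, and the true event rate can be recovered from the observed "any report" rate. The estimator below is a simplified method-of-moments illustration with invented reporting rates, not the authors' full covariate-based MLE:

```python
import random

def simulate_reports(n, p_event, r1, r2, seed=0):
    """Simulate n units: an event occurs with prob p_event; sources 1 and 2
    each independently report an occurred event with probs r1 and r2."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        event = rng.random() < p_event
        s1 = event and rng.random() < r1
        s2 = event and rng.random() < r2
        data.append((s1, s2))
    return data

def corrected_event_rate(data, r1, r2):
    """Recover the true event rate from the observed 'any report' rate:
    P(any report) = p * (1 - (1 - r1) * (1 - r2))."""
    any_report = sum(1 for s1, s2 in data if s1 or s2) / len(data)
    return any_report / (1.0 - (1.0 - r1) * (1.0 - r2))

data = simulate_reports(50_000, p_event=0.30, r1=0.6, r2=0.5, seed=42)
naive = sum(1 for s1, s2 in data if s1 or s2) / len(data)  # biased downward
corrected = corrected_event_rate(data, r1=0.6, r2=0.5)
```

The full estimator in the paper replaces the known rates with per-source misclassification models estimated jointly with the true-event model by maximum likelihood.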

  3. Quantitative EEG and low resolution electromagnetic tomography (LORETA) imaging of patients with persistent auditory hallucinations.

    PubMed

    Lee, Seung-Hwan; Wynn, Jonathan K; Green, Michael F; Kim, Hyun; Lee, Kang-Joon; Nam, Min; Park, Joong-Kyu; Chung, Young-Cho

    2006-04-01

    Electrophysiological studies have demonstrated gamma- and beta-frequency oscillations in response to auditory stimuli. The purpose of this study was to test whether auditory hallucinations (AH) in schizophrenia patients reflect abnormalities in gamma- and beta-frequency oscillations and to investigate the source generators of these abnormalities. This theory was tested using quantitative electroencephalography (qEEG) and low-resolution electromagnetic tomography (LORETA) source imaging. Twenty-five schizophrenia patients with treatment-refractory AH, lasting for at least 2 years, and 23 schizophrenia patients without AH (N-AH) in the past 2 years were recruited for the study. Spectral analysis of the qEEG and source imaging of the frequency bands were performed on artifact-free 30-s epochs recorded during rest. AH patients showed significantly increased beta 1 and beta 2 frequency amplitude compared with N-AH patients. Gamma and beta (2 and 3) frequencies were significantly correlated in AH but not in N-AH patients. Source imaging revealed significantly increased beta (1 and 2) activity in the left inferior parietal lobule and the left medial frontal gyrus in AH versus N-AH patients. These results imply that AH reflects increased beta-frequency oscillations with neural generators localized in speech-related areas.
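The band-amplitude comparison in the study can be illustrated generically: the mean DFT magnitude over a frequency band quantifies how much oscillatory activity falls in that band. This is a toy computation (naive DFT, invented sampling rate and band edges), not the authors' qEEG pipeline:

```python
import cmath
import math

def band_amplitude(signal, fs, f_lo, f_hi):
    """Mean DFT magnitude of `signal` over the band [f_lo, f_hi] Hz.
    Naive O(N^2) DFT; fine for short illustrative epochs."""
    n = len(signal)
    amps = []
    for k in range(n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            amps.append(abs(coeff) / n)
    return sum(amps) / len(amps)

# Synthetic 1-second 'epoch': a 20 Hz (beta-band) oscillation sampled at 128 Hz.
fs = 128
epoch = [math.sin(2 * math.pi * 20 * t / fs) for t in range(fs)]
beta = band_amplitude(epoch, fs, 13, 30)   # dominated by the 20 Hz component
alpha = band_amplitude(epoch, fs, 8, 12)   # near zero for this signal
```

A real analysis would average such band amplitudes over many artifact-free epochs per electrode before comparing groups.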

  4. Porous elastic system with nonlinear damping and sources terms

    NASA Astrophysics Data System (ADS)

    Freitas, Mirelson M.; Santos, M. L.; Langa, José A.

    2018-02-01

    We study the long-time behavior of a porous-elastic system, focusing on the interplay between nonlinear damping and source terms. The sources may represent restoring forces, but may also be focusing, thus potentially amplifying the total energy, which is the primary scenario of interest. By employing nonlinear semigroups and the theory of monotone operators, we obtain several results on the existence of local and global weak solutions, and on the uniqueness of weak solutions. Moreover, we prove that such unique solutions depend continuously on the initial data. Under some restrictions on the parameters, we also prove that every weak solution to our system blows up in finite time, provided the initial energy is negative and the sources are more dominant than the damping in the system. Additional results are obtained via careful analysis involving the Nehari manifold. Specifically, we prove the existence of a unique global weak solution with initial data coming from the "good" part of the potential well. For such a global solution, we prove that the total energy of the system decays exponentially or algebraically, depending on the behavior of the dissipation in the system near the origin. We also prove the existence of a global attractor.

  5. MAIL LOG, program theory, volume 2

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.

  6. Situated Learning in Computer Science Education

    ERIC Educational Resources Information Center

    Ben-Ari, Mordechai

    2004-01-01

    Sociocultural theories of learning such as Wenger and Lave's situated learning have been suggested as alternatives to cognitive theories of learning like constructivism. This article examines situated learning within the context of computer science (CS) education. Situated learning accurately describes some CS communities like open-source software…

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quennet, Marcel, E-mail: marcel.quennet@fu-berlin.de; Institut für Chemie und Biochemie, Freie Universität Berlin, Takustraße 3, 14195 Berlin; Ritscher, Anna

    In this work the Cu/Zn order-disorder transition in Cu2ZnSnS4 kesterites on Wyckoff positions 2c and 2d was investigated by a structural and electronic analysis in theory and experiment. For the experimental investigations, stoichiometric samples with different Cu/Zn order, annealed in the temperature range of 473–623 K and afterwards quenched, were used. The optical gaps were determined using the Derivation of Absorption Spectrum Fitting (DASF) method. Furthermore, the order-disorder transition was examined by DFT calculations for a closer analysis of the origins of the reduced band gap, showing good agreement with experimental data with respect to structural and electronic properties. Our studies show a slight increase of lattice parameter c in the kesterite lattice with increasing disorder. Additionally, a reduced band gap was observed with increasing disorder, which is an effect of newly occurring binding motifs in the disordered kesterite structure. - Highlights: • Experimental and theoretical investigation on the order-disorder transition in kesterites. • Slight enlargements of lattice constants due to disorder in experiment and theory. • Strong band gap fluctuations with decreasing order. • Electronic structure deviations due to changing binding motifs. • Disorder as possible main source of low open-circuit voltages.

  8. Social Physique Anxiety and Intention to Be Physically Active: A Self-Determination Theory Approach.

    PubMed

    Sicilia, Álvaro; Sáenz-Alvarez, Piedad; González-Cutre, David; Ferriz, Roberto

    2016-12-01

    Based on self-determination theory, the purpose of this study was to analyze the relationship between social physique anxiety and intention to be physically active, while taking into account the mediating effects of the basic psychological needs and behavioral regulations in exercise. Having obtained parents' prior consent, 390 students in secondary school (218 boys, 172 girls; M age  = 15.10 years, SD = 1.94 years) completed a self-administered questionnaire during physical education class that assessed the target variables. Preliminary analyses included means, standard deviations, and bivariate correlations among the target variables. Next, a path analysis was performed using the maximum likelihood estimation method with the bootstrapping procedure in the statistical package AMOS 19. Analysis revealed that social physique anxiety negatively predicted intention to be physically active through mediation of the basic psychological needs and the 3 autonomous forms of motivation (i.e., intrinsic motivation, integrated regulation, and identified regulation). The results suggest that social physique anxiety is an internal source of controlling influence that hinders basic psychological need satisfaction and autonomous motivation in exercise, and interventions aimed at reducing social physique anxiety could promote future exercise.

  9. On the role of the frozen surface approximation in small wave-height perturbation theory for moving surfaces

    NASA Astrophysics Data System (ADS)

    Keiffer, Richard; Novarini, Jorge; Scharstein, Robert

    2002-11-01

    In the standard development of the small wave-height approximation (SWHA) perturbation theory for scattering from moving rough surfaces [e.g., E. Y. Harper and F. M. Labianca, J. Acoust. Soc. Am. 58, 349-364 (1975)], the necessity for any sort of frozen-surface approximation is avoided by the replacement of the rough boundary by a flat (and static) boundary. In this paper, this seemingly fortuitous byproduct of the small wave-height approximation is examined and found not to agree fully with an analysis based on the kinematics of the problem. Specifically, the first-order correction term from the standard perturbation approach predicts a scattered amplitude that depends on the source frequency, whereas the kinematics of the problem point to a scattered amplitude that depends on the scattered frequency. It is shown that a perturbation approach in which an explicit frozen-surface approximation is made before the SWHA is invoked predicts (first-order) scattered amplitudes that are in agreement with the kinematic analysis. [Work supported by ONR/NRL (PE 61153N-32) and by grants of computer time from the DoD HPC Shared Resource Center at Stennis Space Center, MS.]

  10. NEFI: Network Extraction From Images

    PubMed Central

    Dirnberger, M.; Kehl, T.; Neumann, A.

    2015-01-01

    Networks are amongst the central building blocks of many systems. Given a graph of a network, methods from graph theory enable a precise investigation of its properties. Software for the analysis of graphs is widely available and has been applied to study various types of networks. In some applications, graph acquisition is relatively simple. However, for many networks data collection relies on images where graph extraction requires domain-specific solutions. Here we introduce NEFI, a tool that extracts graphs from images of networks originating in various domains. Regarding previous work on graph extraction, theoretical results are fully accessible only to an expert audience and ready-to-use implementations for non-experts are rarely available or insufficiently documented. NEFI provides a novel platform allowing practitioners to easily extract graphs from images by combining basic tools from image processing, computer vision and graph theory. Thus, NEFI constitutes an alternative to tedious manual graph extraction and special purpose tools. We anticipate NEFI to enable time-efficient collection of large datasets. The analysis of these novel datasets may open up the possibility to gain new insights into the structure and function of various networks. NEFI is open source and available at http://nefi.mpi-inf.mpg.de. PMID:26521675
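    NEFI's actual pipeline combines segmentation, skeletonization, and graph pruning; as a toy illustration of the underlying idea the abstract describes (pixels of an imaged network becoming nodes, adjacency becoming edges), here is a minimal hypothetical sketch that is in no way NEFI's implementation:

```python
def image_to_graph(img):
    """Toy graph extraction: foreground pixels (value 1) become nodes,
    and 4-connected foreground neighbours become undirected edges."""
    nodes = {(r, c) for r, row in enumerate(img)
             for c, v in enumerate(row) if v}
    edges = set()
    for (r, c) in nodes:
        for dr, dc in ((0, 1), (1, 0)):  # right and down neighbours only,
            if (r + dr, c + dc) in nodes:  # so each edge is counted once
                edges.add(((r, c), (r + dr, c + dc)))
    return nodes, edges

# A plus-shaped blob: 5 foreground pixels joined by 4 adjacencies.
img = [[0, 1, 0],
       [1, 1, 1],
       [0, 1, 0]]
nodes, edges = image_to_graph(img)
print(len(nodes), len(edges))  # 5 4
```

    Once a graph like this is in hand, standard graph-theoretic software can take over the analysis, which is the hand-off point NEFI automates for real images.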

  11. The Effects of Global Warming on Temperature and Precipitation Trends in Northeast America

    NASA Astrophysics Data System (ADS)

    Francis, F.

    2013-12-01

    The objective of this paper is to analyze temperature and precipitation (rainfall) data for Northeast America and to discuss how the observed trends relate to the theory of global warming. The topic was chosen because it shows the trends in temperature and precipitation and their relation to global warming. Data were collected from the Global Historical Climatology Network (GHCN) and span the years 1973 to 2012. Yearly and monthly regressions were calculated to estimate the relationships among the variables found in the individual sources. Using specially designed software, statistical analysis, and manual calculations, we visualize these trends in precipitation and temperature and ask whether they are attributable to global warming. From the calculated trend slopes we interpret the changes in minimum temperature, maximum temperature, and precipitation. Precipitation increased 9.5% over the past forty years, maximum temperature increased 1.9%, and a greater increase of 3.3% was found in minimum temperature. The trends in precipitation and in maximum and minimum temperature are statistically significant at the 95% level.
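    The trend calculation described above (the slope of a least-squares fit of a climate variable against year) can be sketched as follows. The series here is synthetic, not the GHCN records used in the study.

```python
def linear_trend(years, values):
    """Least-squares slope of values against years (units per year)."""
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

# Synthetic minimum-temperature series with a built-in 0.02 deg/year rise
# over the same 1973-2012 window the study covers.
years = list(range(1973, 2013))
tmin = [5.0 + 0.02 * (y - 1973) for y in years]

trend = linear_trend(years, tmin)  # recovers ~0.02 deg/year
```

    On real station data the series is noisy, so the fitted slope comes with a standard error, which is what the abstract's 95% significance statement refers to.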

  12. Solvation Structure and Thermodynamic Mapping (SSTMap): An Open-Source, Flexible Package for the Analysis of Water in Molecular Dynamics Trajectories.

    PubMed

    Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom

    2018-01-09

    We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.

  13. NFFA-Europe: enhancing European competitiveness in nanoscience research and innovation (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Carsughi, Flavio; Fonseca, Luis

    2017-06-01

    NFFA-EUROPE is a European open-access resource for experimental and theoretical nanoscience that sets out a platform for comprehensive, multidisciplinary research projects at the nanoscale, extending from synthesis through nanocharacterization to theory and numerical simulation. Advanced infrastructures specialized in growth, nano-lithography, nano-characterization, theory and simulation, and fine analysis with synchrotron, FEL, and neutron radiation sources are integrated into a multi-site combination to develop frontier research on methods for reproducible nanoscience and to enable European and international researchers from diverse disciplines to carry out advanced proposals impacting science and innovation. NFFA-EUROPE will enable coordinated access to infrastructures covering different aspects of nanoscience research that no single specialized facility currently offers, without duplicating their specific scopes. Approved user projects will have access to the instruments and support competences best suited to the research, including access to analytical large-scale facilities, theory and simulation, and high-performance computing facilities. Access is offered free of charge to European users, who also receive a financial contribution toward their travel, accommodation, and subsistence costs. User access will span several "installations" and will be coordinated through a single-entry-point portal that activates an advanced user-infrastructure dialogue to build a personalized access programme with an increasing return on science and innovation. NFFA-EUROPE's own research activity will address key bottlenecks of nanoscience research: nanostructure traceability, protocol reproducibility, in operando nano-manipulation and analysis, and open data.

  14. The finite ground plane effect on the microstrip antenna radiation patterns

    NASA Technical Reports Server (NTRS)

    Huang, J.

    1983-01-01

    The uniform geometrical theory of diffraction (GTD) is employed for calculating the edge diffracted fields from the finite ground plane of a microstrip antenna. The source field from the radiating patch is calculated by two different methods: the slot theory and the modal expansion theory. Many numerical and measured results are presented to demonstrate the accuracy of the calculations and the finite ground plane edge effect.

  15. Polarized optical scattering by inhomogeneities and surface roughness in an anisotropic thin film.

    PubMed

    Germer, Thomas A; Sharma, Katelynn A; Brown, Thomas G; Oliver, James B

    2017-11-01

    We extend the theory of Kassam et al. [J. Opt. Soc. Am. A 12, 2009 (1995)] for scattering by oblique columnar structure thin films to include the induced form birefringence and the propagation of radiation in those films. We generalize the 4×4 matrix theory of Berreman [J. Opt. Soc. Am. 62, 502 (1972)] to include arbitrary sources in the layer, which are necessary to determine the Green function for the inhomogeneous wave equation. We further extend first-order vector perturbation theory for scattering by roughness in the smooth surface limit, when the layer is anisotropic. Scattering by an inhomogeneous medium is approximated by a distorted Born approximation, where effective medium theory is used to determine the effective properties of the medium, and strong fluctuation theory is used to determine the inhomogeneous sources. In this manner, we develop a model for scattering by inhomogeneous films, with anisotropic correlation functions. The results are compared with Mueller matrix bidirectional scattering distribution function measurements for a glancing-angle deposition (GLAD) film. While the results are applied to the GLAD film example, the development of the theory is general enough that it can guide simulations for scattering in other anisotropic thin films.

  16. Spinning particles, axion radiation, and the classical double copy

    NASA Astrophysics Data System (ADS)

    Goldberger, Walter D.; Li, Jingping; Prabhu, Siddharth G.

    2018-05-01

    We extend the perturbative double copy between radiating classical sources in gauge theory and gravity to the case of spinning particles. We construct, to linear order in spins, perturbative radiating solutions to the classical Yang-Mills equations sourced by a set of interacting color charges with chromomagnetic dipole spin couplings. Using a color-to-kinematics replacement rule proposed earlier by one of the authors, these solutions map onto radiation in a theory of interacting particles coupled to massless fields that include the graviton, a scalar (dilaton) ϕ and the Kalb-Ramond axion field Bμν. Consistency of the double copy imposes constraints on the parameters of the theory on both the gauge and gravity sides of the correspondence. In particular, the color charges carry a chromomagnetic interaction which, in d = 4, corresponds to a gyromagnetic ratio equal to Dirac's value g = 2. The color-to-kinematics map implies that on the gravity side, the bulk theory of the fields (ϕ, gμν, Bμν) has interactions which match those of d-dimensional "string gravity," as is the case both in the BCJ double copy of pure gauge theory scattering amplitudes and the KLT relations between the tree-level S-matrix elements of open and closed string theory.

  17. A Comparative Analysis of Three Unique Theories of Organizational Learning

    ERIC Educational Resources Information Center

    Leavitt, Carol C.

    2011-01-01

    The purpose of this paper is to present three classical theories on organizational learning and conduct a comparative analysis that highlights their strengths, similarities, and differences. Two of the theories -- experiential learning theory and adaptive-generative learning theory -- represent the thinking of the cognitive perspective, while…

  18. Situation-specific theories from the middle-range transitions theory.

    PubMed

    Im, Eun-Ok

    2014-01-01

    The purpose of this article was to analyze the theory development process of the situation-specific theories that were derived from the middle-range transitions theory. This analysis aims to provide directions for future development of situation-specific theories. First, transitions theory is concisely described with its history, goal, and major concepts. Then, the approach that was used to retrieve the situation-specific theories derived from transitions theory is described. Next, an analysis of 6 situation-specific theories is presented. Finally, 4 themes reflecting commonalities and variances in the theory development process are discussed with implications for future theoretical development.

  19. Observation and analysis of in vivo vocal fold tissue instabilities produced by nonlinear source-filter coupling: A case study

    PubMed Central

    Zañartu, Matías; Mehta, Daryush D.; Ho, Julio C.; Wodicka, George R.; Hillman, Robert E.

    2011-01-01

    Different source-related factors can lead to vocal fold instabilities and bifurcations referred to as voice breaks. Nonlinear coupling in phonation suggests that changes in acoustic loading can also be responsible for this unstable behavior. However, no in vivo visualization of tissue motion during these acoustically induced instabilities has been reported. Simultaneous recordings of laryngeal high-speed videoendoscopy, acoustics, aerodynamics, electroglottography, and neck skin acceleration are obtained from a participant consistently exhibiting voice breaks during pitch glide maneuvers. Results suggest that acoustically induced and source-induced instabilities can be distinguished at the tissue level. Differences in vibratory patterns are described through kymography and phonovibrography; measures of glottal area, open/speed quotient, and amplitude/phase asymmetry; and empirical orthogonal function decomposition. Acoustically induced tissue instabilities appear abruptly and exhibit irregular vocal fold motion after the bifurcation point, whereas source-induced ones show a smoother transition. These observations are also reflected in the acoustic and acceleration signals. Added aperiodicity is observed after the acoustically induced break, and harmonic changes appear prior to the bifurcation for the source-induced break. Both types of breaks appear to be subcritical bifurcations due to the presence of hysteresis and amplitude changes after the frequency jumps. These results are consistent with previous studies and the nonlinear source-filter coupling theory. PMID:21303014

  20. Theory and analysis of a large field polarization imaging system with obliquely incident light.

    PubMed

    Lu, Xiaotian; Jin, Weiqi; Li, Li; Wang, Xia; Qiu, Su; Liu, Jing

    2018-02-05

    Polarization imaging technology provides information not only about the irradiance of a target but also about its degree of polarization and angle of polarization, which suggests broad application potential. However, polarization imaging theory is based on paraxial optics: when a beam of obliquely incident light passes through an analyser, the direction of propagation is not perpendicular to the analyser's surface, and the applicability of traditional paraxial polarization imaging theory is challenged. This paper investigates a theoretical model of a polarization imaging system with obliquely incident light and establishes a polarization imaging transmission model for a large field of obliquely incident light. In an imaging experiment with an integrating-sphere light source and a rotatable polarizer, the transmission model is verified and analysed for two cases: natural light and linearly polarized light incidence. The theoretical model is consistent with the experimental results but differs distinctly from the traditional paraxial approximation model. These results demonstrate the accuracy and necessity of the proposed model and its value in guiding theoretical and systematic research on large-field polarization imaging.
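    The paraxial baseline that this paper generalizes can be sketched with the standard Mueller-matrix treatment of an ideal linear polarizer at normal incidence; the oblique-incidence corrections the paper derives are not captured here, and the angles and Stokes vectors below are illustrative choices.

```python
import math

def polarizer_mueller(theta):
    """Mueller matrix of an ideal linear polarizer at angle theta (radians),
    valid under the paraxial (normal-incidence) approximation."""
    c, s = math.cos(2 * theta), math.sin(2 * theta)
    return [[0.5,     0.5 * c,     0.5 * s,     0.0],
            [0.5 * c, 0.5 * c * c, 0.5 * c * s, 0.0],
            [0.5 * s, 0.5 * c * s, 0.5 * s * s, 0.0],
            [0.0,     0.0,         0.0,         0.0]]

def apply_mueller(M, S):
    """Transform a Stokes vector S = [I, Q, U, V] by Mueller matrix M."""
    return [sum(M[i][j] * S[j] for j in range(4)) for i in range(4)]

unpolarized = [1.0, 0.0, 0.0, 0.0]
horizontal = [1.0, 1.0, 0.0, 0.0]  # fully linearly polarized along x

# Unpolarized light through any ideal polarizer: half the intensity survives.
I_unpol = apply_mueller(polarizer_mueller(0.3), unpolarized)[0]

# Malus's law: transmitted intensity cos^2(theta) for aligned linear input.
I_60 = apply_mueller(polarizer_mueller(math.radians(60)), horizontal)[0]
print(I_unpol, I_60)  # 0.5 and cos^2(60 deg) = 0.25
```

    The paper's contribution is precisely that this simple picture changes when the light crosses the analyser obliquely, so the rotating-polarizer measurements it reports deviate from these paraxial values at large field angles.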
