Sample records for quantitative rescattering theory

  1. Ratios of double to single ionization of He and Ne by strong 400-nm laser pulses using the quantitative rescattering theory

    NASA Astrophysics Data System (ADS)

    Chen, Zhangjin; Li, Xiaojin; Zatsarinny, Oleg; Bartschat, Klaus; Lin, C. D.

    2018-01-01

    We present numerical simulations of the ratio between double and single ionization of He and Ne by intense laser pulses at wavelengths of 390 and 400 nm, respectively. The yields of doubly charged ions due to nonsequential double ionization (NSDI) are obtained by employing the quantitative rescattering (QRS) model. In this model, the NSDI probability is expressed as a product of the returning-electron wave packet (RWP) and the total cross sections for laser-free electron-impact excitation and electron-impact ionization of the parent ion. According to the QRS theory, the same RWP is also responsible for the emission of high-energy above-threshold-ionization photoelectrons. To obtain absolute double-ionization yields, the RWP is generated by solving the time-dependent Schrödinger equation (TDSE) within a one-electron model. The same TDSE results can also be used to obtain single-ionization yields. By using the TDSE results to calibrate single ionization and the RWP obtained from the strong-field approximation, we further simplify the calculation such that the nonuniform intensity distribution in the focused laser beam can be accounted for. In addition, the laser-free electron-impact excitation and ionization cross sections are calculated using state-of-the-art many-electron R-matrix theory. The simulated double-to-single-ionization ratios compare well with experimental data and support the validity of the nonsequential double-ionization mechanism over the intensity range covered.
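
    The factorization at the core of the QRS model can be written as a one-line formula. The following is a schematic rendering consistent with the abstract (the overall normalization and the integration variable are assumptions, not the authors' exact notation):

    ```latex
    % Schematic QRS factorization for NSDI yields (normalization assumed)
    \[
      Y^{\mathrm{NSDI}} \;\propto\; \int \mathrm{d}E_r \,
      W(E_r)\,\bigl[\sigma^{\mathrm{exc}}(E_r) + \sigma^{\mathrm{ion}}(E_r)\bigr],
    \]
    % W(E_r): returning-electron wave packet at return energy E_r;
    % sigma^exc, sigma^ion: laser-free cross sections for electron-impact
    % excitation and ionization of the parent ion.
    ```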

  2. Quantum trajectories for high-order-harmonic generation from multiple rescattering events in the long-wavelength regime

    NASA Astrophysics Data System (ADS)

    He, Lixin; Li, Yang; Wang, Zhe; Zhang, Qingbin; Lan, Pengfei; Lu, Peixiang

    2014-05-01

    We have performed a quantum trajectory analysis of high-order-harmonic generation (HHG) for different driving laser wavelengths. By defining the ratio of the HHG yields of the Nth and first rescattering events (YN/Y1), we quantitatively evaluate the HHG contributions from multiple rescatterings. The results show that the HHG yield ratio increases gradually with the laser wavelength, which demonstrates that high-order rescatterings contribute increasingly to HHG at longer wavelengths. By calculating the classical electron trajectories, we find that significant differences exist in the electron behavior between the first and high-order rescatterings. Further investigations demonstrate that the increasing HHG yield ratio is mainly attributed to the relatively smaller contributions from the short path of the first electron rescattering at longer laser wavelengths.
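
    The classical-trajectory part of such an analysis fits in a short script. The sketch below uses the standard "simple-man" model (illustrative field parameters, not the authors' computation): an electron born at rest at a given ionization phase is driven back to the ion, and each sign change of its position is one rescattering event.

    ```python
    import numpy as np

    # "Simple-man" classical trajectories for rescattering (atomic units):
    # an electron born at rest at time t0 in the field E(t) = E0*cos(w*t)
    # returns to the ion; each zero crossing of x(t) is a rescattering
    # event, and we record the return kinetic energy in units of Up.
    # Parameter values are illustrative, not those of the paper.

    def return_energies(t0, E0, w, n_cycles=8, dt=0.05):
        t = np.arange(t0, t0 + n_cycles * 2 * np.pi / w, dt)
        # analytic solution of x'' = -E0*cos(w*t), x(t0) = 0, v(t0) = 0
        v = -(E0 / w) * (np.sin(w * t) - np.sin(w * t0))
        x = (E0 / w**2) * (np.cos(w * t) - np.cos(w * t0)) \
            + (E0 / w) * np.sin(w * t0) * (t - t0)
        s = np.sign(x)
        hits = np.where(s[:-1] * s[1:] < 0)[0] + 1   # crossings of x = 0
        return 0.5 * v[hits] ** 2

    E0, w = 0.05, 0.057              # ~800 nm driving field (illustrative)
    Up = E0**2 / (4 * w**2)          # ponderomotive energy
    for phase in (0.30, 0.05):       # birth phases after the field crest (rad)
        ke = return_energies(phase / w, E0, w)
        print(f"phase {phase:.2f}: return energies (in Up) =",
              np.round(ke[:4] / Up, 2))
    ```

    Scanning the birth phase and repeating this at longer wavelengths (smaller w) shows how the later returns gain weight, which is the classical counterpart of the YN/Y1 trend discussed above.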

  3. Hard QCD rescattering in few nucleon systems

    NASA Astrophysics Data System (ADS)

    Maheswari, Dhiraj; Sargsian, Misak

    2017-01-01

    The theoretical framework of the hard QCD rescattering mechanism (HRM) is extended to calculate the high-energy γ³He → pd reaction at 90° center-of-mass angle. In the HRM model, the incoming high-energy photon strikes a quark in one of the nucleons of the target, which subsequently undergoes hard rescattering with quarks from the other nucleons, generating a hard two-body baryonic system in the final state of the reaction. Based on the HRM, a parameter-free expression for the differential cross section of the reaction is derived, expressed through the ³He → pd transition spectral function, the hard pd → pd elastic scattering cross section, and the effective charge of the quarks interchanged in the hard rescattering process. The numerical estimates obtained from this expression for the differential cross section are in good agreement with the data recently obtained in the Jefferson Lab experiment, showing that the cross section scales in energy as s⁻¹⁷, consistent with the quark counting rule. The angular and energy dependences of the cross section are also predicted within the HRM and are in good agreement with the preliminary data for these distributions. Research is supported by the US Department of Energy.
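
    The s⁻¹⁷ scaling quoted above follows directly from the constituent (quark) counting rule; the counting for this specific reaction is worth spelling out:

    ```latex
    % Constituent counting rule: d(sigma)/dt ~ s^{2-n} at fixed theta_cm, where
    % n is the total number of elementary fields in the initial and final states.
    % For gamma + 3He -> p + d:  n = 1 (photon) + 9 (3He) + 3 (p) + 6 (d) = 19.
    \[
      \frac{\mathrm{d}\sigma}{\mathrm{d}t}\bigg|_{\theta_{\mathrm{cm}}}
      \;\sim\; \frac{f(\theta_{\mathrm{cm}})}{s^{\,n-2}}
      \;=\; \frac{f(\theta_{\mathrm{cm}})}{s^{17}} .
    \]
    ```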

  4. Polarization observables in hard rescattering mechanism of deuteron photodisintegration

    NASA Astrophysics Data System (ADS)

    Sargsian, Misak M.

    2004-05-01

    Polarization properties of high-energy photodisintegration of the deuteron are studied within the framework of the hard rescattering mechanism (HRM). In the HRM, a quark of one nucleon, knocked out by the incoming photon, rescatters with a quark of the other nucleon, leading to the production of two nucleons with high relative momentum. Summation of all relevant quark-rescattering amplitudes allows us to express the scattering amplitude of the reaction through the convolution of a hard photon-quark interaction vertex, the large-angle p-n scattering amplitude, and the low-momentum deuteron wave function. Within the HRM, it is demonstrated that the polarization observables in hard photodisintegration of the deuteron can be expressed through the five helicity amplitudes of NN scattering at high momentum transfer. At 90° CM scattering, the HRM predicts the dominance of the isovector channel of hard pn rescattering, and it explains the observed smallness of the induced (Py) and transferred (Cx) polarizations without invoking the argument of helicity conservation. Namely, the HRM predicts that Py and Cx are proportional to the φ5 helicity amplitude, which vanishes at θcm=90° for symmetry reasons. The HRM also predicts a nonzero value for Cz in the helicity-conserving regime and a positive Σ asymmetry, which is related to the dominance of the isovector channel in the hard reinteraction. We extend our calculations to the region where large polarization effects are observed in pp scattering and give predictions for angular dependences.

  5. Rescattering contributions to rare B-meson decays

    NASA Astrophysics Data System (ADS)

    Gronau, Michael; London, David; Rosner, Jonathan L.

    2013-02-01

    Several B and Bs decays have been observed that have been cited as evidence for exchange (E), penguin-annihilation (PA), and annihilation (A) processes, such as b̄d → ūu, b̄s → ūu, and b̄u → W* → c̄s, respectively. These amplitudes are normally thought to be suppressed, as they involve the spectator quark in the weak interaction and thus should be proportional to the B-meson decay constant fB. However, as pointed out a number of years ago, they can also be generated by rescattering from processes whose amplitudes do not involve fB, such as color-favored tree amplitudes. In this paper we investigate a number of processes, such as B0 → K+K−, Bs → π+π−, and B+ → Ds+ϕ, and identify promising states from which they can be generated by rescattering. We find that E- and PA-type processes are characterized, respectively, by amplitudes ranging from 5% to 10% and from 15% to 20% with respect to the largest amplitude from which they can rescatter. Based on this regularity, using approximate flavor SU(3) symmetry in some cases and time-reversal invariance in others, we predict the branching fractions for a large number of as-yet-unseen B and Bs decays in an extensive range from order 10⁻⁹ to 10⁻⁴.

  6. Electron rescattering in above-threshold photodetachment of negative ions.

    PubMed

    Gazibegović-Busuladzić, A; Milosević, D B; Becker, W; Bergues, B; Hultgren, H; Kiyan, I Yu

    2010-03-12

    We present experimental and theoretical results on the photodetachment of Br⁻ and F⁻ in a strong infrared laser field. The observed photoelectron spectra of Br⁻ exhibit a high-energy plateau along the laser polarization direction, which is identified as being due to the rescattering effect. The shape and extension of the plateau are found to be influenced by the depletion of negative ions during the interaction with the laser pulse. Our findings represent the first observation of electron rescattering in above-threshold photodetachment of an atomic system with a short-range potential.

  7. PRIMORDIAL GRAVITATIONAL WAVES AND RESCATTERED ELECTROMAGNETIC RADIATION IN THE COSMIC MICROWAVE BACKGROUND

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Dong-Hoon; Trippe, Sascha, E-mail: ki13130@gmail.com, E-mail: trippe@astro.snu.ac.kr

    Understanding the interaction of primordial gravitational waves (GWs) with the Cosmic Microwave Background (CMB) plasma is important for observational cosmology. In this article, we provide an analysis of an apparently as-yet-overlooked effect. We consider a single free electric charge and suppose that it can be agitated by primordial GWs propagating through the CMB plasma, resulting in periodic, regular motion along particular directions. Light reflected by the charge will be partially polarized, and this will imprint a characteristic pattern on the CMB. We study this effect by considering a simple model in which anisotropic incident electromagnetic (EM) radiation is rescattered by a charge sitting in spacetime perturbed by GWs, and becomes polarized. As the charge is driven to move along particular directions, we calculate its dipole moment to determine the leading-order rescattered EM radiation. The Stokes parameters of the rescattered radiation exhibit a net linear polarization. We investigate how this polarization effect can be schematically represented out of the Stokes parameters. We work out the representations of gradient modes (E-modes) and curl modes (B-modes) to produce polarization maps. Although the polarization effect results from GWs, we find that its representations, the E- and B-modes, do not practically reflect GW properties such as strain amplitude, frequency, and polarization states.

  8. Rescattering effects on intensity interferometry and initial conditions in relativistic heavy ion collisions

    NASA Astrophysics Data System (ADS)

    Li, Yang

    The properties of the quark-gluon plasma are being thoroughly studied by utilizing relativistic heavy-ion collisions. After its invention in astronomy in the 1950s, intensity interferometry was found to be a robust method to probe the spatial and temporal information of nuclear collisions as well. Although rescattering effects are negligible in elementary-particle collisions, they may be very important for heavy-ion collisions at RHIC and at the future LHC. Rescattering after production will modify the measured correlation function and make it harder to extract the dynamical information from data. To better understand data that are blurred by this final-state process, we derive a general formula for intensity interferometry from which rescattering effects can be calculated easily. The formula can be used both non-relativistically and relativistically. Numerically, we found that rescattering effects on kaon interferometry for RHIC experiments can modify the measured ratio of the outward radius to the sideward radius, which is a sensitive probe of the equation of state, by as much as 15%. This is a nontrivial contribution which should be included to understand the data more accurately. The second part of this thesis concerns the initial conditions in relativistic heavy-ion collisions. Although relativistic hydrodynamics is successful in explaining many aspects of the data, it is only valid after some finite time after nuclear contact. The results depend on the choice of initial conditions which, so far, have been very uncertain. I describe a formula based on the McLerran-Venugopalan model to compute the initial energy density. The soft gluon fields produced immediately after the overlap of the nuclei can be expanded as a power series in the proper time t. Solving the Yang-Mills equations with color-current conservation gives analytical formulas for the fields. The local color charges on the transverse plane are stochastic variables and have to be taken care of by random

  9. Quantitative cell biology: the essential role of theory.

    PubMed

    Howard, Jonathon

    2014-11-05

    Quantitative biology is a hot area, as evidenced by the recent establishment of institutes, graduate programs, and conferences with that name. But what is quantitative biology? What should it be? And how can it contribute to solving the big questions in biology? The past decade has seen very rapid development of quantitative experimental techniques, especially at the single-molecule and single-cell levels. In this essay, I argue that quantitative biology is much more than just the quantitation of these experimental results. Instead, it should be the application of the scientific method by which measurement is directed toward testing theories. In this view, quantitative biology is the recognition that theory and models play critical roles in biology, as they do in physics and engineering. By tying together experiment and theory, quantitative biology promises a deeper understanding of underlying mechanisms, when the theory works, or new discoveries, when it does not. © 2014 Howard. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  10. Strong-field approximation for ionization of a diatomic molecule by a strong laser field. II. The role of electron rescattering off the molecular centers

    NASA Astrophysics Data System (ADS)

    Busuladžić, M.; Gazibegović-Busuladžić, A.; Milošević, D. B.; Becker, W.

    2008-09-01

    The strong-field approximation for ionization of diatomic molecules by a strong laser field [D. B. Milošević, Phys. Rev. A 74, 063404 (2006)] is generalized to include rescattering of the ionized electron wave packet off the molecular centers (the electron’s parent ion or the second atom). There are four rescattering contributions to the ionization rate, which are responsible for the high-energy plateau in the electron spectra and which interfere in a complicated manner. The spectra are even more complicated due to the different symmetry properties of the atomic orbitals of which a particular molecular orbital consists. Nevertheless, a comparatively simple condition emerges for the destructive interference of all these contributions, which yields a curve in the (E_pf, θ) plane. Here θ is the electron emission angle and E_pf is the electron kinetic energy. The resulting suppression of the rescattering plateau can be strong and affect a large area of the (E_pf, θ) plane, depending on the orientation of the molecule. We illustrate this using the examples of the 3σg molecular orbital of N2 and the 1πg molecular orbital of O2 for various orientations of these molecules with respect to the laser polarization axis. For N2, for perpendicular orientation and the equilibrium internuclear distance R0, we find that the minima of the ionization rate form the curve E_pf cos²θ = π²/(2R0²) in the (E_pf, θ) plane. For O2 the rescattering plateau is absent for perpendicular orientation.
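
    The minimum curve quoted for N2 has a transparent two-center-interference reading. The following schematic derivation (atomic units; a textbook-style sketch, not the paper's full four-amplitude analysis) recovers it:

    ```latex
    % For a sigma_g orbital the rescattering amplitudes off the two centres add
    % with relative phase p_f . R, giving an overall factor ~ cos(p_f . R / 2).
    % First destructive interference:  p_f . R = pi.  With internuclear
    % distance R_0 and emission angle theta,
    \[
      p_f R_0 \cos\theta = \pi
      \quad\Longrightarrow\quad
      E_{p_f}\cos^2\theta = \frac{p_f^2}{2}\,\cos^2\theta = \frac{\pi^2}{2R_0^2},
    \]
    % which is exactly the curve of ionization-rate minima quoted above.
    ```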

  11. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  12. QTest: Quantitative Testing of Theories of Binary Choice

    PubMed Central

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  13. Experimental Retrieval of Target Structure Information from Laser-Induced Rescattered Photoelectron Momentum Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okunishi, M.; Pruemper, G.; Shimada, K.

    We have measured two-dimensional photoelectron momentum spectra of Ne, Ar, and Xe generated by 800-nm, 100-fs laser pulses and succeeded in identifying the spectral ridge region (back-rescattered ridges) which marks the location of the returning electrons that have been backscattered at their maximum kinetic energies. We demonstrate that structural information, in particular the differential cross sections for elastic scattering of free electrons off the target ion, can be accurately extracted from the intensity distributions of photoelectrons on the ridges, thus effecting a first step toward laser-induced self-imaging of the target with unprecedented spatial and temporal resolutions.
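
    The geometry of the back-rescattered ridge is simple to write down. The sketch below uses the standard QRS values for the most energetic return (return momentum p_r ≈ 1.26 A0, vector potential A_r ≈ 0.95 A0 at the return time); the laser parameters are illustrative assumptions:

    ```python
    import numpy as np

    # Back-rescattered ridge in the photoelectron momentum plane (atomic
    # units).  In QRS, the detected momentum is the return momentum p_r,
    # rotated by the backscattering angle theta_r, shifted along the
    # polarization axis by -A_r.  Dividing the measured yield along this
    # ridge by the smooth wave-packet factor leaves the field-free DCS.

    E0, w = 0.0755, 0.057                  # ~1e14 W/cm^2, 800 nm (illustrative)
    A0 = E0 / w                            # peak vector potential
    Up = E0**2 / (4 * w**2)                # ponderomotive energy
    p_r, A_r = 1.26 * A0, 0.95 * A0        # standard QRS ridge values

    for deg in (180, 170, 150, 120):       # backscattering angles
        th = np.radians(deg)
        p_par = p_r * np.cos(th) - A_r     # momentum along the polarization
        p_perp = p_r * np.sin(th)          # transverse momentum
        E = 0.5 * (p_par**2 + p_perp**2)
        print(f"theta_r={deg:3d}  p=({p_par:+.2f},{p_perp:.2f})  E={E/Up:.1f} Up")
    ```

    At theta_r = 180° this reproduces the familiar ~10 Up cutoff of the back-rescattered electrons.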

  14. Thermal decay of a metastable state: Influence of rescattering on the quasistationary dynamical rate

    NASA Astrophysics Data System (ADS)

    Chushnyakova, M. V.; Gontchar, I. I.

    2018-03-01

    We study the effect of backscattering of Brownian particles as they escape from a metastable state over a potential barrier. To this end, we model the process numerically using the Langevin equations. The modeling is performed for a wide range of the friction constant, covering both the energy-diffusion and spatial-diffusion regimes. It is shown how the influence of the descent stage on the quasistationary decay rate gradually disappears as the friction constant decreases. It is found that, in the energy-diffusion regime, rescattering is absent and the descent stage does not influence the decay rate. As the value of the friction increases, the descent alters the value of the rate by more than 50% for different values of the thermal energy and different shapes of the potential. To study the influence of backscattering on the decay rate, four potentials have been considered which coincide near the potential well and the barrier but differ beyond the barrier. It is shown that the potential for which the well and the barrier are described by two smoothly joined parabolas ("the parabolic potential") acts as a dividing case for the relative ordering of the quasistationary dynamical rate and the Kramers rate widely used in the literature. Namely, for potentials with steeper tails, the Kramers rate RKM underestimates the true quasistationary dynamical rate RD, whereas for less steep tails the opposite holds (inversion of RD/RKM). It is demonstrated that the relative ordering of the values of RD for different potentials is explained by the rescattering of the particles from the potential tail.
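
    A minimal version of this kind of calculation fits in a short script. The sketch below is an illustrative overdamped-limit toy, not the authors' code: it builds a smoothly joined two-parabola potential, simulates Langevin escape, and compares the simulated rate with the high-friction Kramers estimate.

    ```python
    import numpy as np

    # Overdamped Langevin escape over a barrier.  Potential: two smoothly
    # joined parabolas, U = x^2/2 for x <= 1 and U = 1 - (x-2)^2/2 for x > 1,
    # so the barrier height is Eb = 1 at x = 2 (omega_0 = omega_b = 1).

    rng = np.random.default_rng(1)

    def force(x):
        return np.where(x <= 1.0, -x, x - 2.0)    # F = -dU/dx

    T, gamma, dt, n = 0.3, 1.0, 1e-2, 300
    x = np.zeros(n)                    # all walkers start at the well bottom
    t_escape = np.full(n, np.nan)
    for step in range(200_000):
        alive = np.isnan(t_escape)
        if not alive.any():
            break
        x[alive] += force(x[alive]) / gamma * dt \
            + np.sqrt(2 * T * dt / gamma) * rng.standard_normal(alive.sum())
        newly = alive & (x > 4.0)                 # absorbed well past the barrier
        t_escape[newly] = step * dt

    rate_sim = 1.0 / np.nanmean(t_escape)
    rate_kramers = np.exp(-1.0 / T) / (2 * np.pi * gamma)   # high-friction limit
    print(f"simulated {rate_sim:.4f} vs Kramers {rate_kramers:.4f}")
    ```

    Moving the absorbing boundary and reshaping the tail beyond x = 2 is the toy analogue of the four-potential comparison described in the abstract.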

  15. Penguin topologies, rescattering effects and penguin hunting with Bu,d → KK̄ and B± → π±K

    NASA Astrophysics Data System (ADS)

    Buras, Andrzej J.; Fleischer, Robert; Mannel, Thomas

    1998-11-01

    In the recent literature, constraints on the CKM angle γ arising from the branching ratios for B± → π±K and Bd → π∓K± decays received a lot of attention. An important theoretical limitation of the accuracy of these bounds is due to rescattering effects, such as B+ → {π0K+} → π+K0. We point out that these processes are related to penguin topologies with internal up-quark exchanges and derive SU(2) isospin relations among the B+ → π+K0 and Bd0 → π−K+ decay amplitudes by defining "tree" and "penguin" amplitudes in a proper way, allowing the derivation of generalized bounds on the CKM angle γ. We propose strategies to obtain insights into the dynamics of penguin processes with the help of the decays Bu,d → KK̄ and B± → π±K, derive a relation among the direct CP-violating asymmetries arising in these modes, and emphasize that rescattering effects can be included completely in the generalized bounds on γ this way. Moreover, we take a brief look at the impact of new physics.

  16. Inelastic Rescattering in B Decays to ππ, πK, and KK̄, and Extraction of γ

    NASA Astrophysics Data System (ADS)

    Zenczykowski, P.

    2002-07-01

    We discuss all contributions from inelastic SU(3)-symmetric rescattering in B decays into a final pair of pseudoscalar mesons PP = ππ, KK̄, πK. FSI-induced modifications of the amplitudes obtained from the quark-line approach are described in terms of a few parameters which take care of all possible SU(3)-symmetric forms relevant for final-state interactions. Although in general it appears impossible to uniquely determine FSI effects from the combined set of all ππ, KK̄, and πK data, drawing some conclusions is feasible. In particular, it is shown that in leading order the amplitudes of strangeness-changing B decays depend on only one additional complex FSI-related parameter apart from those present in the definitions of penguin and tree amplitudes. It is also shown that joint considerations of U-spin-related ΔS=0 and |ΔS|=1 decay amplitudes are modified when non-negligible SU(3)-symmetric FSI are present. In particular, if rescattering in B+ → K+K̄0 is substantial, the determination of the CP-violating weak angle γ from B+ → π+K0, B0d → π−K+, B0s → π+K−, and their CP counterparts might be susceptible to important FSI-induced corrections.

  17. Quantitative theory of diffraction by cylindrical scroll nanotubes.

    PubMed

    Khadiev, Azat; Khalitov, Zufar

    2018-05-01

    A quantitative theory of Fraunhofer diffraction by right- and left-handed multiwalled cylindrical scroll nanotubes is developed on the basis of the kinematical approach. The proposed theory is mainly dedicated to structural studies of individual nanotubes by the selected-area electron diffraction technique. Strong and diffuse reflections of the scroll nanotube are studied, and explicit formulas governing the relations between the direct and reciprocal lattices of the scroll nanotube are derived.

  18. Probing periodic potential of crystals via strong-field re-scattering

    NASA Astrophysics Data System (ADS)

    You, Yong Sing; Cunningham, Eric; Reis, David A.; Ghimire, Shambhu

    2018-06-01

    Strong-field ionization and re-scattering phenomena have been used to image angstrom-scale structures of isolated molecules in the gas phase. These methods typically make use of the anisotropic response of the participating molecular orbital. Recently, an anisotropic strong-field response has also been observed in high-order harmonic generation (HHG) from bulk crystals (2016 Nat. Phys. 13 345). In a (100)-cut magnesium oxide crystal, extreme-ultraviolet high harmonics are found to depend strongly on the crystal structure and interatomic bonding. Here, we extend these measurements to two other important crystal orientations: (111) and (110). We find that HHG from these orientations is also strongly anisotropic. The underlying dynamics is understood using a real-space picture, where high harmonics are produced via coherent collision of strong-field-driven electrons with the atomic sites, including the nearest-neighbor atoms. We find that the harmonic efficiency is enhanced when semiclassical electron trajectories connect to the regions of concentrated valence charge distribution around the atomic cores. Similarly, the efficiency is suppressed when the trajectories miss the atomic cores. These results further support the real-space picture of HHG, with implications for retrieving the periodic potential of the crystal, if not the wavefunctions in three dimensions.

  19. The quantitative theory of within-host viral evolution

    NASA Astrophysics Data System (ADS)

    Rouzine, Igor M.; Weinberger, Leor S.

    2013-01-01

    During the 1990s, a group of virologists and physicists began development of a quantitative theory to explain the rapid evolution of human immunodeficiency virus type 1 (HIV-1). This theory also proved to be instrumental in understanding the rapid emergence of drug resistance in patients. Over the past two decades, this theory expanded to account for a broad array of factors important to viral evolution and propelled development of a generalized theory applicable to a broad range of asexual and partly sexual populations with many evolving sites. Here, we discuss the conceptual and theoretical tools developed to calculate the speed and other parameters of evolution, with a particular focus on the concept of ‘clonal interference’ and its applications to untreated patients.

  20. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.

  1. Extensions and evaluations of a general quantitative theory of forest structure and dynamics

    PubMed Central

    Enquist, Brian J.; West, Geoffrey B.; Brown, James H.

    2009-01-01

    Here, we present the second part of a quantitative theory for the structure and dynamics of forests under demographic and resource steady state. The theory is based on individual-level allometric scaling relations for how trees use resources, fill space, and grow. These scale up to determine emergent properties of diverse forests, including size–frequency distributions, spacing relations, canopy configurations, mortality rates, population dynamics, successional dynamics, and resource flux rates. The theory uniquely makes quantitative predictions for both stand-level scaling exponents and normalizations. We evaluate these predictions by compiling and analyzing macroecological datasets from several tropical forests. The close match between theoretical predictions and data suggests that forests are organized by a set of very general scaling rules. Our mechanistic theory is based on allometric scaling relations, is complementary to “demographic theory,” but is fundamentally different in approach. It provides a quantitative baseline for understanding deviations from predictions due to other factors, including disturbance, variation in branching architecture, asymmetric competition, resource limitation, and other sources of mortality, which are not included in the deliberately simplified theory. The theory should apply to a wide range of forests despite large differences in abiotic environment, species diversity, and taxonomic and functional composition. PMID:19363161

  2. Quantitative confirmation of diffusion-limited oxidation theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, K.T.; Clough, R.L.

    1990-01-01

    Diffusion-limited (heterogeneous) oxidation effects are often important for studies of polymer degradation. Such effects are common in polymers subjected to ionizing radiation at relatively high dose rate. To better understand the underlying oxidation processes and to aid in the planning of accelerated aging studies, it would be desirable to be able to monitor and quantitatively understand these effects. In this paper, we briefly review a theoretical diffusion approach which derives model profiles for oxygen-surrounded sheets of material by combining oxygen permeation rates with kinetically based oxygen-consumption expressions. The theory leads to a simple governing expression involving the oxygen consumption and permeation rates together with two model parameters α and β. To test the theory, gamma-initiated oxidation of a sheet of commercially formulated EPDM rubber was performed under conditions which led to diffusion-limited oxidation. Profile shapes from the theoretical treatments are shown to accurately fit experimentally derived oxidation profiles. In addition, direct measurements on the same EPDM material of the oxygen consumption and permeation rates, together with values of α and β derived from the fitting procedure, allow us to quantitatively confirm for the first time the governing theoretical relationship. 17 refs., 3 figs.
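
    The governing balance referred to above has the generic steady-state reaction-diffusion form. The two-parameter rate law below is the standard kinetic expression used in this line of work; writing it this way, with these boundary conditions, is an assumption for illustration:

    ```latex
    % Steady-state oxygen balance across a sheet of thickness L (schematic):
    % diffusion supplies oxygen, the kinetic term alpha*C/(1 + beta*C) consumes it.
    \[
      D\,\frac{\mathrm{d}^2 C}{\mathrm{d}x^2} \;=\; \frac{\alpha C}{1+\beta C},
      \qquad C(\pm L/2) = C_s ,
    \]
    % C(x): dissolved-oxygen concentration; C_s: surface value set by the
    % external oxygen pressure; D: diffusivity, tied to the permeation rate.
    ```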

  3. From information theory to quantitative description of steric effects.

    PubMed

    Alipour, Mojtaba; Safari, Zahra

    2016-07-21

    Immense efforts have been made in the literature to apply information-theory descriptors to the electronic structure of various systems. In the present study, information-theoretic quantities, such as the Fisher information, Shannon entropy, Onicescu information energy, and Ghosh-Berkowitz-Parr entropy, are used to give a quantitative description of one of the most widely used concepts in chemistry, namely steric effects. Taking experimental steric scales for different compounds as benchmark sets, reasonable linear relationships are found between the experimental scales of the steric effects and the theoretical steric energies calculated from information-theory functionals. Among the information-theoretic quantities examined with the two representations, electron density and shape function, the Shannon entropy performs best for this purpose. The usefulness of considering the contributions of functional-group steric energies and geometries, on the one hand, and of dissecting the effects of global and local information measures simultaneously, on the other, is also explored. Furthermore, the utility of the information functionals for describing steric effects in several chemical transformations, such as electrophilic and nucleophilic reactions and host-guest chemistry, is analyzed. The functionals of information theory correlate remarkably well with the stability of systems and with experimental scales. Overall, these findings show that information-theoretic quantities can be introduced as quantitative measures of steric effects and provide further evidence of the quality of information theory in helping theoreticians and experimentalists interpret different problems in real systems.
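
    As a concrete anchor for one of the quantities above, the sketch below evaluates the Shannon entropy S = −∫ ρ ln ρ d³r on a grid for normalized Gaussian toy densities (a stand-in for molecular densities; the numbers are purely illustrative). A more diffuse, "bulkier" density gives a larger entropy.

    ```python
    import numpy as np

    # Shannon entropy S = -∫ rho(r) ln rho(r) d^3r of a normalized density,
    # evaluated on a grid.  Gaussians stand in for molecular electron
    # densities; a broader density has higher entropy.

    def shannon_entropy(sigma, L=20.0, n=121):
        x = np.linspace(-L / 2, L / 2, n)
        dv = (x[1] - x[0]) ** 3
        X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
        rho = np.exp(-(X**2 + Y**2 + Z**2) / (2 * sigma**2))
        rho /= rho.sum() * dv                    # normalize: ∫ rho d^3r = 1
        return float(-(rho * np.log(rho + 1e-300)).sum() * dv)

    # analytic value for a 3D Gaussian: S = 3/2 * (1 + ln(2*pi*sigma^2))
    for s in (0.5, 1.0, 2.0):
        exact = 1.5 * (1 + np.log(2 * np.pi * s**2))
        print(f"sigma={s}: grid {shannon_entropy(s):.3f} vs exact {exact:.3f}")
    ```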

  4. Quantitative verification of ab initio self-consistent laser theory.

    PubMed

    Ge, Li; Tandy, Robert J; Stone, A D; Türeci, Hakan E

    2008-10-13

    We generalize and test the recent "ab initio" self-consistent (AISC) time-independent semiclassical laser theory. This self-consistent formalism generates all the stationary lasing properties in the multimode regime (frequencies, thresholds, internal and external fields, output power, and emission pattern) from simple inputs: the dielectric function of the passive cavity, the atomic transition frequency, and the transverse relaxation time of the lasing transition. We find that the theory gives excellent quantitative agreement with full time-dependent simulations of the Maxwell-Bloch equations after it has been generalized to drop the slowly varying envelope approximation. The theory is infinite order in the nonlinear hole-burning interaction; the widely used third-order approximation is shown to fail badly.

  5. The Mapping Model: A Cognitive Theory of Quantitative Estimation

    ERIC Educational Resources Information Center

    von Helversen, Bettina; Rieskamp, Jorg

    2008-01-01

    How do people make quantitative estimations, such as estimating a car's selling price? Traditionally, linear-regression-type models have been used to answer this question. These models assume that people weight and integrate all information available to estimate a criterion. The authors propose an alternative cognitive theory for quantitative…

  6. Measurement Invariance: A Foundational Principle for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    This article describes why measurement invariance is a critical issue to quantitative theory building within the field of human resource development. Readers will learn what measurement invariance is and how to test for its presence using techniques that are accessible to applied researchers. Using data from a LibQUAL+[TM] study of user…

  7. Quantitative kinetic theory of active matter

    NASA Astrophysics Data System (ADS)

    Ihle, Thomas; Chou, Yen-Liang

    2014-03-01

    Models of self-driven agents similar to the Vicsek model [Phys. Rev. Lett. 75 (1995) 1226] are studied by means of kinetic theory. In these models, particles try to align their travel directions with the average direction of their neighbours. At strong alignment a globally ordered state of collective motion forms. An Enskog-like kinetic theory is derived from the exact Chapman-Kolmogorov equation in phase space using Boltzmann's mean-field approximation of molecular chaos. The kinetic equation is solved numerically by a nonlocal lattice-Boltzmann-like algorithm. Steep soliton-like waves are observed that lead to an abrupt jump of the global order parameter if the noise level is changed. The shape of the wave is shown to follow a novel scaling law and to agree quantitatively, within 3%, with agent-based simulations at large particle speeds. This provides a mean-field mechanism that changes the second-order character of the flocking transition to first order. Diagrammatic techniques are used to investigate small particle speeds, where the mean-field assumption of molecular chaos is invalid and correlation effects need to be included.
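
    For reference, the microscopic model that this kinetic theory coarse-grains can be stated in a dozen lines. The sketch below is a standard Vicsek-style update (parameter values are illustrative; the noise convention, uniform in [−ηπ, ηπ], is one common variant):

    ```python
    import numpy as np

    # Minimal Vicsek-style agent-based model: particles move at fixed speed
    # and align with the mean direction of neighbours within radius R, plus
    # angular noise of amplitude eta*pi.  O(N^2) neighbour search for clarity.

    rng = np.random.default_rng(0)
    N, L, R, v, eta, steps = 400, 10.0, 1.0, 0.5, 0.3, 200
    pos = rng.uniform(0, L, (N, 2))
    theta = rng.uniform(-np.pi, np.pi, N)

    for _ in range(steps):
        d = pos[:, None, :] - pos[None, :, :]
        d -= L * np.round(d / L)                  # minimum-image convention
        near = (d**2).sum(-1) < R**2              # neighbour mask (incl. self)
        mean_sin = (near * np.sin(theta)[None, :]).sum(1)
        mean_cos = (near * np.cos(theta)[None, :]).sum(1)
        theta = np.arctan2(mean_sin, mean_cos) \
            + eta * rng.uniform(-np.pi, np.pi, N)
        pos = (pos + v * np.column_stack((np.cos(theta), np.sin(theta)))) % L

    # global order parameter: 1 = perfect alignment, ~0 = disordered
    phi = np.hypot(np.cos(theta).mean(), np.sin(theta).mean())
    print(f"order parameter phi = {phi:.2f}")
    ```

    Sweeping eta in such a simulation shows the order-disorder transition whose character (first vs second order) the kinetic theory above addresses.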

  8. Imaging ultrafast dynamics of molecules with laser-induced electron diffraction.

    PubMed

    Lin, C D; Xu, Junliang

    2012-10-14

    We introduce a laser-induced electron diffraction method (LIED) for imaging ultrafast dynamics of small molecules with femtosecond mid-infrared lasers. When molecules are placed in an intense laser field, both low- and high-energy photoelectrons are generated. According to quantitative rescattering (QRS) theory, high-energy electrons are produced by a rescattering process where electrons born at the early phase of the laser pulse are driven back to rescatter with the parent ion. From the high-energy electron momentum spectra, field-free elastic electron-ion scattering differential cross sections (DCS), or diffraction images, can be extracted. With mid-infrared lasers as the driving pulses, it is further shown that the DCS can be used to extract atomic positions in a molecule with sub-angstrom spatial resolution, in close analogy to the standard electron diffraction method. Since infrared lasers with pulse duration of a few to several tens of femtoseconds are already available, LIED can be used for imaging dynamics of molecules with sub-angstrom spatial and a few-femtosecond temporal resolution. The first experiment with LIED has shown that the bond length of oxygen molecules shortens by 0.1 Å in five femtoseconds after single ionization. The principle behind LIED and its future outlook as a tool for dynamic imaging of molecules are presented.

  9. Growth of wormlike micelles in nonionic surfactant solutions: Quantitative theory vs. experiment.

    PubMed

    Danov, Krassimir D; Kralchevsky, Peter A; Stoyanov, Simeon D; Cook, Joanne L; Stott, Ian P; Pelan, Eddie G

    2018-06-01

    Despite the considerable advances of the molecular-thermodynamic theory of micelle growth, agreement between theory and experiment has been achieved only in isolated cases. A general theory that can provide a self-consistent quantitative description of the growth of wormlike micelles in mixed surfactant solutions, including the experimentally observed high peaks in viscosity and aggregation number, is still missing. As a step toward the creation of such a theory, here we consider the simplest system: nonionic wormlike surfactant micelles from polyoxyethylene alkyl ethers, CiEj. Our goal is to construct a molecular-thermodynamic model that is in agreement with the available experimental data. For this goal, we systematized data for the micelle mean mass aggregation number, from which the micelle growth parameter was determined at various temperatures. None of the available models can give a quantitative description of these data. We constructed a new model, which is based on theoretical expressions for the interfacial-tension, headgroup-steric, and chain-conformation components of the micelle free energy, along with appropriate expressions for the parameters of the model, including their temperature and curvature dependencies. Special attention was paid to the surfactant chain-conformation free energy, for which a new, more general formula was derived. As a result, relatively simple theoretical expressions are obtained. All parameters that enter these expressions are known, which facilitates the theoretical modeling of micelle growth for various nonionic surfactants in excellent agreement with the experiment. The constructed model can serve as a basis that can be further upgraded to obtain a quantitative description of micelle growth in more complicated systems, including binary and ternary mixtures of nonionic, ionic, and zwitterionic surfactants, which determine the viscosity and stability of various formulations in personal-care and household detergency.

  10. Extension of nanoconfined DNA: Quantitative comparison between experiment and theory

    NASA Astrophysics Data System (ADS)

    Iarko, V.; Werner, E.; Nyberg, L. K.; Müller, V.; Fritzsche, J.; Ambjörnsson, T.; Beech, J. P.; Tegenfeldt, J. O.; Mehlig, K.; Westerlund, F.; Mehlig, B.

    2015-12-01

    The extension of DNA confined to nanochannels has been studied intensively and in detail. However, quantitative comparisons between experiments and model calculations are difficult because most theoretical predictions involve undetermined prefactors, and because the model parameters (contour length, Kuhn length, effective width) are difficult to compute reliably, leading to substantial uncertainties. Here we use a recent asymptotically exact theory for the DNA extension in the "extended de Gennes regime" that allows us to compare experimental results with theory. For this purpose, we performed experiments measuring the mean DNA extension and its standard deviation while varying the channel geometry, dye intercalation ratio, and ionic strength of the buffer. The experimental results agree very well with theory at high ionic strengths, indicating that the model parameters are reliable. At low ionic strengths, the agreement is less good. We discuss possible reasons. In principle, our approach allows us to measure the Kuhn length and the effective width of a single DNA molecule and more generally of semiflexible polymers in solution.

  11. Overview of Classical Test Theory and Item Response Theory for Quantitative Assessment of Items in Developing Patient-Reported Outcome Measures

    PubMed Central

    Cappelleri, Joseph C.; Lundy, J. Jason; Hays, Ron D.

    2014-01-01

    Introduction: The U.S. Food and Drug Administration’s patient-reported outcome (PRO) guidance document defines content validity as “the extent to which the instrument measures the concept of interest” (FDA, 2009, p. 12). “Construct validity is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity” (Strauss & Smith, 2009, p. 7). Hence both qualitative and quantitative information are essential in evaluating the validity of measures. Methods: We review classical test theory and item response theory approaches to evaluating PRO measures, including the frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which the hypothesized “difficulty” (severity) order of items is represented by observed responses. Conclusion: Classical test theory and item response theory can be useful in providing a quantitative assessment of items and scales during the content validity phase of patient-reported outcome measures. Depending on the particular type of measure and the specific circumstances, either one or both approaches should be considered to help maximize the content validity of PRO measures. PMID:24811753

  12. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    NASA Technical Reports Server (NTRS)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    This paper describes the quantitative application of the theory of System Health Management and its operational subset, Fault Management, to the selection of abort triggers for a human-rated launch vehicle, the United States' National Aeronautics and Space Administration's (NASA) Space Launch System (SLS). The results demonstrate the efficacy of the theory to assess the effectiveness of candidate failure detection and response mechanisms to protect humans from time-critical and severe hazards. The quantitative method was successfully used on the SLS to aid selection of its suite of abort triggers.

  13. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    NASA Technical Reports Server (NTRS)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    This paper describes the quantitative application of the theory of System Health Management and its operational subset, Fault Management, to the selection of Abort Triggers for a human-rated launch vehicle, the United States' National Aeronautics and Space Administration's (NASA) Space Launch System (SLS). The results demonstrate the efficacy of the theory to assess the effectiveness of candidate failure detection and response mechanisms to protect humans from time-critical and severe hazards. The quantitative method was successfully used on the SLS to aid selection of its suite of Abort Triggers.

  14. Extending Theory-Based Quantitative Predictions to New Health Behaviors.

    PubMed

    Brick, Leslie Ann D; Velicer, Wayne F; Redding, Colleen A; Rossi, Joseph S; Prochaska, James O

    2016-04-01

    Traditional null hypothesis significance testing suffers from many limitations and is poorly adapted to theory testing. A proposed alternative approach, called Testing Theory-based Quantitative Predictions, uses effect size estimates and confidence intervals to directly test predictions based on theory. This paper replicates findings from previous smoking studies and extends the approach to diet and sun-protection behaviors using baseline data from a Transtheoretical Model behavioral intervention (N = 5407). Effect size predictions were developed using two methods: (1) applying refined effect size estimates from previous smoking research or (2) using predictions developed by an expert panel. Thirteen of 15 predictions were confirmed for smoking. For diet, 7 of 14 predictions were confirmed using smoking predictions and 6 of 16 using expert panel predictions. For sun protection, 3 of 11 predictions were confirmed using smoking predictions and 5 of 19 using expert panel predictions. Expert panel predictions and smoking-based predictions predicted effect sizes poorly for diet and sun-protection constructs. Future studies should aim to use previous empirical data to generate predictions whenever possible. The best results occur when there have been several iterations of predictions for a behavior, such as with smoking, demonstrating that expected values begin to converge on the population effect size. Overall, the study supports the necessity of strengthening and revising theory with empirical data.

  15. Sender–receiver systems and applying information theory for quantitative synthetic biology

    PubMed Central

    Barcena Menendez, Diego; Senthivel, Vivek Raj; Isalan, Mark

    2015-01-01

    Sender–receiver (S–R) systems abound in biology, with communication systems sending information in various forms. Information theory provides a quantitative basis for analysing these processes and is being applied to study natural genetic, enzymatic and neural networks. Recent advances in synthetic biology are providing us with a wealth of artificial S–R systems, giving us quantitative control over networks with a finite number of well-characterised components. Combining the two approaches can help to predict how to maximise signalling robustness, and will allow us to make increasingly complex biological computers. Ultimately, pushing the boundaries of synthetic biology will require moving beyond engineering the flow of information and towards building more sophisticated circuits that interpret biological meaning. PMID:25282688
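
    The core quantity in this programme is the mutual information between sender state and receiver readout. The sketch below computes I(S;R) for a toy two-state genetic sender-receiver channel; the conditional probabilities are invented for illustration:

    ```python
    import numpy as np

    # Mutual information I(S;R) of a discrete sender-receiver channel:
    # a binary signal S (e.g. inducer OFF/ON) is read out through a noisy
    # reporter R.  The toy conditional probabilities below are assumptions.

    def mutual_information(p_s, p_r_given_s):
        """I(S;R) in bits, given P(s) and the channel matrix P(r|s)."""
        p_joint = p_s[:, None] * p_r_given_s        # P(s, r)
        p_r = p_joint.sum(0)                        # marginal P(r)
        ratio = np.where(p_joint > 0,
                         p_joint / (p_s[:, None] * p_r[None, :]), 1.0)
        return float((p_joint * np.log2(ratio)).sum())

    p_s = np.array([0.5, 0.5])                      # inducer OFF / ON
    p_r_given_s = np.array([[0.9, 0.1],             # P(reporter | OFF)
                            [0.2, 0.8]])            # P(reporter | ON)
    print(f"I(S;R) = {mutual_information(p_s, p_r_given_s):.3f} bits")
    ```

    Maximising such a quantity over circuit parameters is one way to phrase the "signalling robustness" goal mentioned above.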

  16. Overview of classical test theory and item response theory for the quantitative assessment of items in developing patient-reported outcomes measures.

    PubMed

    Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D

    2014-05-01

    The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has few qualitative data and wants to get preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) would also grow. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, the classical test theory and/or the IRT should be considered to help maximize the content validity of PRO measures. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.

  17. A quantitative theory of the functions of the hippocampal CA3 network in memory

    PubMed Central

    Rolls, Edmund T.

    2013-01-01

    A quantitative computational theory of the operation of the hippocampal CA3 system as an autoassociation or attractor network in the episodic memory system is described. In this theory, the CA3 system operates as a single attractor or autoassociation network to enable rapid, one-trial associations between any spatial location (place in rodents, or spatial view in primates) and an object or reward, and to provide for completion of the whole memory during recall from any part. The theory is extended to associations between time and object or reward to implement temporal-order memory, also important in episodic memory. The dentate gyrus (DG) performs pattern separation by competitive learning to produce sparse representations suitable for setting up new representations in CA3 during learning, producing, for example, neurons with place-like fields from entorhinal cortex grid cells. Via the very small number of mossy fiber (MF) connections to CA3, the dentate granule cells produce a randomizing pattern-separation effect, important during learning but not recall, that makes the patterns represented by CA3 firing very different from each other; this is optimal for an unstructured episodic memory system in which each memory must be kept distinct from other memories. The direct perforant path (pp) input to CA3 is quantitatively appropriate to provide the cue for recall in CA3, but not for learning. Tests of the theory, including hippocampal subregion analyses and hippocampal NMDA receptor knockouts, are described and support the theory. PMID:23805074
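
    The completion-from-a-partial-cue property at the heart of this theory is easy to demonstrate with a generic attractor network. The sketch below is a textbook Hopfield-style toy, not Rolls' detailed CA3 model: patterns are stored with a Hebbian rule, and a half-corrupted cue is driven back to the stored memory.

    ```python
    import numpy as np

    # Toy autoassociative (Hopfield-style) network: store P random patterns
    # with a Hebbian rule, then recall pattern 0 from a half-corrupted cue.

    rng = np.random.default_rng(3)
    N, P = 200, 10
    patterns = rng.choice([-1, 1], size=(P, N))
    W = (patterns.T @ patterns) / N               # Hebbian weight matrix
    np.fill_diagonal(W, 0.0)                      # no self-connections

    cue = patterns[0].copy()
    cue[: N // 2] = rng.choice([-1, 1], N // 2)   # corrupt half the pattern

    state = cue
    for _ in range(10):                           # synchronous recall dynamics
        state = np.sign(W @ state)
        state[state == 0] = 1

    overlap = (state @ patterns[0]) / N           # 1.0 = perfect completion
    print(f"overlap with stored pattern: {overlap:.2f}")
    ```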

  18. Scaling of the low-energy structure in above-threshold ionization in the tunneling regime: theory and experiment.

    PubMed

    Guo, L; Han, S S; Liu, X; Cheng, Y; Xu, Z Z; Fan, J; Chen, J; Chen, S G; Becker, W; Blaga, C I; DiChiara, A D; Sistrunk, E; Agostini, P; DiMauro, L F

    2013-01-04

    A calculation of the second-order (rescattering) term in the S-matrix expansion of above-threshold ionization is presented for the case when the binding potential is the unscreened Coulomb potential. Technical problems related to the divergence of the Coulomb scattering amplitude are avoided in the theory by considering the depletion of the atomic ground state due to the applied laser field, which is well defined and does not require the introduction of a screening constant. We focus on the low-energy structure, which was observed in recent experiments with a midinfrared wavelength laser field. Both the spectra and, in particular, the observed scaling versus the Keldysh parameter and the ponderomotive energy are reproduced. The theory provides evidence that the origin of the structure lies in the long-range Coulomb interaction.
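
    For orientation, the scaling variables referred to above are (in atomic units):

    ```latex
    % Ponderomotive energy and Keldysh parameter for a field of amplitude E_0
    % and frequency omega, acting on an atom with ionization potential I_p:
    \[
      U_p = \frac{E_0^2}{4\omega^2},
      \qquad
      \gamma = \sqrt{\frac{I_p}{2U_p}} = \frac{\omega\sqrt{2I_p}}{E_0},
    \]
    % so midinfrared wavelengths (small omega) give large U_p and gamma < 1,
    % i.e. the tunneling regime in which the low-energy structure is observed.
    ```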

  19. Rescattering Effects in the Hadronic-Light-by-Light Contribution to the Anomalous Magnetic Moment of the Muon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colangelo, Gilberto; Hoferichter, Martin; Procura, Massimiliano

    We present a first model-independent calculation of ππ intermediate states in the hadronic-light-by-light (HLBL) contribution to the anomalous magnetic moment of the muon (g−2)μ that goes beyond the scalar QED pion loop. To this end, we combine a recently developed dispersive description of the HLBL tensor with a partial-wave expansion and demonstrate that the known scalar-QED result is recovered after partial-wave resummation. Using dispersive fits to high-statistics data for the pion vector form factor, we provide an evaluation of the full pion box, a_μ(π box) = −15.9(2) × 10⁻¹¹. We then construct a suitable input for the γ*γ* → ππ helicity partial waves based on a pion-pole left-hand cut and show that, for the dominant charged-pion contribution, this representation is consistent with the two-loop chiral prediction and the COMPASS measurement of the pion polarizability. This allows us to reliably estimate S-wave rescattering effects to the full pion box and leads to our final estimate for the sum of these two contributions: a_μ(π box) + a_μ,J=0(ππ, π-pole LHC) = −24(1) × 10⁻¹¹.

  20. Rescattering Effects in the Hadronic-Light-by-Light Contribution to the Anomalous Magnetic Moment of the Muon

    DOE PAGES

    Colangelo, Gilberto; Hoferichter, Martin; Procura, Massimiliano; ...

    2017-06-09

    We present a first model-independent calculation of ππ intermediate states in the hadronic-light-by-light (HLBL) contribution to the anomalous magnetic moment of the muon (g−2)μ that goes beyond the scalar QED pion loop. To this end, we combine a recently developed dispersive description of the HLBL tensor with a partial-wave expansion and demonstrate that the known scalar-QED result is recovered after partial-wave resummation. Using dispersive fits to high-statistics data for the pion vector form factor, we provide an evaluation of the full pion box, a_μ(π box) = −15.9(2) × 10⁻¹¹. We then construct a suitable input for the γ*γ* → ππ helicity partial waves based on a pion-pole left-hand cut and show that, for the dominant charged-pion contribution, this representation is consistent with the two-loop chiral prediction and the COMPASS measurement of the pion polarizability. This allows us to reliably estimate S-wave rescattering effects to the full pion box and leads to our final estimate for the sum of these two contributions: a_μ(π box) + a_μ,J=0(ππ, π-pole LHC) = −24(1) × 10⁻¹¹.

  1. Rings in above-threshold ionization: A quasiclassical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewenstein, M.; Kulander, K.C.; Schafer, K.J.

    1995-02-01

    A generalized strong-field approximation is formulated to describe atoms interacting with intense laser fields. We apply it to determine angular distributions of electrons in above-threshold ionization (ATI). The theory treats the effects of an electron rescattering from its parent ion core in a systematic perturbation series. Probability amplitudes for ionization are interpreted in terms of quasiclassical electron trajectories. We demonstrate that contributions from the direct tunneling processes in the absence of rescattering are not sufficient to describe the observed ATI spectra. We show that the high-energy portion of the spectrum, including the recently discovered rings (i.e., complex features in the angular distributions of outgoing electrons), is due to rescattering processes. We compare our quasiclassical results with exact numerical solutions.

  2. Rotorcraft control system design for uncertain vehicle dynamics using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1994-01-01

    Quantitative Feedback Theory (QFT) is a frequency-domain technique for the design of multi-input, multi-output control systems which must meet time- or frequency-domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. This theory is applied to the design of the longitudinal flight control system for a linear model of the BO-105C rotorcraft. Uncertainty in the vehicle model is due to the variation in the vehicle dynamics over a range of airspeeds from 0 to 100 kts. For purposes of exposition, the vehicle description contains no rotor or actuator dynamics. The design example indicates the manner in which significant uncertainty exists in the vehicle model. The advantage of using a sequential loop closure technique to reduce the cost of feedback is demonstrated by example.

  3. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    NASA Astrophysics Data System (ADS)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

    In this paper, the various methods of fuzzy set theory, called fuzzy mathematics, have been applied to the quantitative estimation of the time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using some methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively, highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, similarities in the temporal variation of seismic activity and seismic gaps can be examined and, on the other hand, the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and the other is based on the fuzzy equivalent relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards on different time scales can be estimated. This paper mainly deals with medium-short-term precursors observed in Japan and China.
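
    As a concrete illustration of the fuzzy-equivalent-relation method mentioned above (a minimal sketch; the similarity matrix is invented, not seismicity data), clusters are obtained from the max-min transitive closure of a fuzzy similarity relation followed by a λ-cut:

        import numpy as np

        # Hypothetical fuzzy similarity matrix for five regions/periods
        # of seismic activity (values are made up for illustration).
        R = np.array([[1.0, 0.8, 0.4, 0.5, 0.1],
                      [0.8, 1.0, 0.4, 0.5, 0.1],
                      [0.4, 0.4, 1.0, 0.4, 0.1],
                      [0.5, 0.5, 0.4, 1.0, 0.1],
                      [0.1, 0.1, 0.1, 0.1, 1.0]])

        def maxmin(A, B):
            """Max-min composition of two fuzzy relations."""
            return np.max(np.minimum(A[:, :, None], B[None, :, :]), axis=1)

        # Transitive closure: square repeatedly until R o R == R,
        # turning the similarity relation into a fuzzy equivalence relation.
        T = R.copy()
        while True:
            T2 = maxmin(T, T)
            if np.allclose(T2, T):
                break
            T = T2

        lam = 0.5                       # lambda-cut level
        print((T >= lam).astype(int))   # blocks of 1s are the clusters at level 0.5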

  4. Nucleon form factors in dispersively improved chiral effective field theory. II. Electromagnetic form factors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alarcon, J. M.; Weiss, C.

    We study the nucleon electromagnetic form factors (EM FFs) using a recently developed method combining chiral effective field theory ($\chi$EFT) and dispersion analysis. The spectral functions on the two-pion cut at $t > 4M_\pi^2$ are constructed using the elastic unitarity relation and an $N/D$ representation. $\chi$EFT is used to calculate the real functions $J_\pm^1(t) = f_\pm^1(t)/F_\pi(t)$ (ratios of the complex $\pi\pi \rightarrow N\bar{N}$ partial-wave amplitudes and the timelike pion FF), which are free of $\pi\pi$ rescattering. Rescattering effects are included through the empirical timelike pion FF $|F_\pi(t)|^2$. The method allows us to compute the isovector EM spectral functions up to $t \sim 1$ GeV$^2$ with controlled accuracy (LO, NLO, and partial N2LO). With the spectral functions we calculate the isovector nucleon EM FFs and their derivatives at $t = 0$ (EM radii, moments) using subtracted dispersion relations. We predict the values of higher FF derivatives with minimal uncertainties and explain their collective behavior. Finally, we estimate the individual proton and neutron FFs by adding an empirical parametrization of the isoscalar sector. Excellent agreement with the present low-$Q^2$ FF data is achieved up to $\sim 0.5$ GeV$^2$ for $G_E$, and up to $\sim 0.2$ GeV$^2$ for $G_M$. Our results can be used to guide the analysis of low-$Q^2$ elastic scattering data and the extraction of the proton charge radius.

  5. Nucleon form factors in dispersively improved chiral effective field theory. II. Electromagnetic form factors

    DOE PAGES

    Alarcon, J. M.; Weiss, C.

    2018-05-08

    We study the nucleon electromagnetic form factors (EM FFs) using a recently developed method combining chiral effective field theory ($\chi$EFT) and dispersion analysis. The spectral functions on the two-pion cut at $t > 4M_\pi^2$ are constructed using the elastic unitarity relation and an $N/D$ representation. $\chi$EFT is used to calculate the real functions $J_\pm^1(t) = f_\pm^1(t)/F_\pi(t)$ (ratios of the complex $\pi\pi \rightarrow N\bar{N}$ partial-wave amplitudes and the timelike pion FF), which are free of $\pi\pi$ rescattering. Rescattering effects are included through the empirical timelike pion FF $|F_\pi(t)|^2$. The method allows us to compute the isovector EM spectral functions up to $t \sim 1$ GeV$^2$ with controlled accuracy (LO, NLO, and partial N2LO). With the spectral functions we calculate the isovector nucleon EM FFs and their derivatives at $t = 0$ (EM radii, moments) using subtracted dispersion relations. We predict the values of higher FF derivatives with minimal uncertainties and explain their collective behavior. Finally, we estimate the individual proton and neutron FFs by adding an empirical parametrization of the isoscalar sector. Excellent agreement with the present low-$Q^2$ FF data is achieved up to $\sim 0.5$ GeV$^2$ for $G_E$, and up to $\sim 0.2$ GeV$^2$ for $G_M$. Our results can be used to guide the analysis of low-$Q^2$ elastic scattering data and the extraction of the proton charge radius.

  6. Quantitative theory of driven nonlinear brain dynamics.

    PubMed

    Roberts, J A; Robinson, P A

    2012-09-01

    Strong periodic stimuli such as bright flashing lights evoke nonlinear responses in the brain and interact nonlinearly with ongoing cortical activity, but the underlying mechanisms for these phenomena are poorly understood at present. The dominant features of these experimentally observed dynamics are reproduced by the dynamics of a quantitative neural field model subject to periodic drive. Model power spectra over a range of drive frequencies show agreement with multiple features of experimental measurements, exhibiting nonlinear effects including entrainment over a range of frequencies around the natural alpha frequency f(α), subharmonic entrainment near 2f(α), and harmonic generation. Further analysis of the driven dynamics as a function of the drive parameters reveals rich nonlinear dynamics that is predicted to be observable in future experiments at high drive amplitude, including period doubling, bistable phase-locking, hysteresis, wave mixing, and chaos indicated by positive Lyapunov exponents. Moreover, photosensitive seizures are predicted for physiologically realistic model parameters yielding bistability between healthy and seizure dynamics. These results demonstrate the applicability of neural field models to the new regime of periodically driven nonlinear dynamics, enabling interpretation of experimental data in terms of specific generating mechanisms and providing new tests of the theory. Copyright © 2012 Elsevier Inc. All rights reserved.
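
    The driven-nonlinear-dynamics phenomenology described here (entrainment, period doubling, chaos) can be illustrated with a much simpler stand-in than the neural field model. The sketch below uses a periodically driven Duffing oscillator with textbook parameters (our choice, not the paper's model) and counts distinct stroboscopic samples to classify the response; exact regime boundaries depend on the parameters.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Driven Duffing oscillator: x'' + delta*x' + alpha*x + beta*x^3 = gamma*cos(omega*t)
        delta, alpha, beta, omega = 0.3, -1.0, 1.0, 1.2   # standard double-well values

        def rhs(t, y, gamma):
            x, v = y
            return [v, -delta*v - alpha*x - beta*x**3 + gamma*np.cos(omega*t)]

        T = 2 * np.pi / omega
        for gamma in (0.20, 0.28, 0.50):        # weak -> strong drive (illustrative)
            sol = solve_ivp(rhs, (0, 400*T), [0.1, 0.0], args=(gamma,),
                            t_eval=np.arange(300, 400)*T, rtol=1e-8, atol=1e-10)
            strobe = np.round(sol.y[0], 3)      # stroboscopic (Poincare) samples
            n = len(set(strobe))                # 1: entrained, 2: period-doubled,
            print(f"gamma={gamma:.2f}: {n} distinct stroboscopic point(s)")  # many: chaotic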

  7. Quantitative theory of hydrophobic effect as a driving force of protein structure

    PubMed Central

    Perunov, Nikolay; England, Jeremy L

    2014-01-01

    Various studies suggest that the hydrophobic effect plays a major role in driving the folding of proteins. In the past, however, it has been challenging to translate this understanding into a predictive, quantitative theory of how the full pattern of sequence hydrophobicity in a protein shapes functionally important features of its tertiary structure. Here, we extend and apply such a phenomenological theory of the sequence-structure relationship in globular protein domains, which had previously been applied to the study of allosteric motion. In an effort to optimize parameters for the model, we first analyze the patterns of backbone burial found in single-domain crystal structures, and discover that classic hydrophobicity scales derived from bulk physicochemical properties of amino acids are already nearly optimal for prediction of burial using the model. Subsequently, we apply the model to studying structural fluctuations in proteins and establish a means of identifying ligand-binding and protein–protein interaction sites using this approach. PMID:24408023

  8. A robust rotorcraft flight control system design methodology utilizing quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Gorder, Peter James

    1993-01-01

    Rotorcraft flight control systems present design challenges which often exceed those associated with fixed-wing aircraft. First, large variations in the response characteristics of the rotorcraft result from the wide range of airspeeds of typical operation (hover to over 100 kts). Second, the assumption of vehicle rigidity often employed in the design of fixed-wing flight control systems is rarely justified in rotorcraft, where rotor degrees of freedom can have a significant impact on the system performance and stability. This research was intended to develop a methodology for the design of robust rotorcraft flight control systems. Quantitative Feedback Theory (QFT) was chosen as the basis for the investigation. Quantitative Feedback Theory is a technique which accounts for variability in the dynamic response of the controlled element in the design of robust control systems. It was developed to address a Multiple-Input Single-Output (MISO) design problem, and utilizes two degrees of freedom to satisfy the design criteria. Two techniques were examined for extending the QFT MISO technique to the design of a Multiple-Input Multiple-Output (MIMO) flight control system (FCS) for a UH-60 Black Hawk helicopter. In the first, a set of MISO systems, mathematically equivalent to the MIMO system, was determined. QFT was applied to each member of the set simultaneously. In the second, the same set of equivalent MISO systems was analyzed sequentially, with closed-loop response information from each loop utilized in subsequent MISO designs. The results of each technique were compared, and the advantages of the second, termed Sequential Loop Closure, were clearly evident.
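
    For readers unfamiliar with the MIMO-to-MISO reduction used in QFT, the equivalent single-loop plants are obtained from the inverse of the plant matrix, q_ij = 1/[P^{-1}]_ij. A minimal sketch follows; the 2x2 frequency response is made up for illustration and is not the UH-60 model.

        import numpy as np

        # Equivalent-MISO decomposition used in MIMO QFT: with P the plant
        # transfer matrix at a given frequency, the MISO channels are
        # q_ij = 1 / [P^{-1}]_ij (elementwise reciprocal of the inverse).
        def equivalent_miso(P):
            return 1.0 / np.linalg.inv(P)

        s = 1j * 2.0                          # evaluate at omega = 2 rad/s
        P = np.array([[1/(s + 1),   0.2/(s + 2)],
                      [0.1/(s + 3), 1/(s + 0.5)]])
        print(np.round(equivalent_miso(P), 3))
        # Repeating this over the plant template (e.g., a set of airspeeds)
        # generates the uncertainty set for each MISO design.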

  9. Demand theory of gene regulation. II. Quantitative application to the lactose and maltose operons of Escherichia coli.

    PubMed Central

    Savageau, M A

    1998-01-01

    Induction of gene expression can be accomplished either by removing a restraining element (negative mode of control) or by providing a stimulatory element (positive mode of control). According to the demand theory of gene regulation, which was first presented in qualitative form in the 1970s, the negative mode will be selected for the control of a gene whose function is in low demand in the organism's natural environment, whereas the positive mode will be selected for the control of a gene whose function is in high demand. This theory has now been further developed in a quantitative form that reveals the importance of two key parameters: cycle time C, which is the average time for a gene to complete an ON/OFF cycle, and demand D, which is the fraction of the cycle time that the gene is ON. Here we estimate nominal values for the relevant mutation rates and growth rates and apply the quantitative demand theory to the lactose and maltose operons of Escherichia coli. The results define regions of the C vs. D plot within which selection for the wild-type regulatory mechanisms is realizable, and these in turn provide the first estimates for the minimum and maximum values of demand that are required for selection of the positive and negative modes of gene control found in these systems. The ratio of mutation rate to selection coefficient is the most relevant determinant of the realizable region for selection, and the most influential parameter is the selection coefficient that reflects the reduction in growth rate when there is superfluous expression of a gene. The quantitative theory predicts the rate and extent of selection for each mode of control. It also predicts three critical values for the cycle time. The predicted maximum value for the cycle time C is consistent with the lifetime of the host. The predicted minimum value for C is consistent with the time for transit through the intestinal tract without colonization. Finally, the theory predicts an optimum value

  10. Nucleon form factors in dispersively improved chiral effective field theory. II. Electromagnetic form factors

    NASA Astrophysics Data System (ADS)

    Alarcón, J. M.; Weiss, C.

    2018-05-01

    We study the nucleon electromagnetic form factors (EM FFs) using a recently developed method combining chiral effective field theory ($\chi$EFT) and dispersion analysis. The spectral functions on the two-pion cut at $t > 4M_\pi^2$ are constructed using the elastic unitarity relation and an $N/D$ representation. $\chi$EFT is used to calculate the real functions $J_\pm^1(t) = f_\pm^1(t)/F_\pi(t)$ (ratios of the complex $\pi\pi \to N\bar{N}$ partial-wave amplitudes and the timelike pion FF), which are free of $\pi\pi$ rescattering. Rescattering effects are included through the empirical timelike pion FF $|F_\pi(t)|^2$. The method allows us to compute the isovector EM spectral functions up to $t \sim 1$ GeV$^2$ with controlled accuracy (leading order, next-to-leading order, and partial next-to-next-to-leading order). With the spectral functions we calculate the isovector nucleon EM FFs and their derivatives at $t = 0$ (EM radii, moments) using subtracted dispersion relations. We predict the values of higher FF derivatives, which are not affected by higher-order chiral corrections and are obtained almost parameter-free in our approach, and explain their collective behavior. We estimate the individual proton and neutron FFs by adding an empirical parametrization of the isoscalar sector. Excellent agreement with the present low-$Q^2$ FF data is achieved up to $\sim 0.5$ GeV$^2$ for $G_E$, and up to $\sim 0.2$ GeV$^2$ for $G_M$. Our results can be used to guide the analysis of low-$Q^2$ elastic scattering data and the extraction of the proton charge radius.
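
    The subtracted dispersion relations used in these papers have the generic once-subtracted form F(t) = F(0) + (t/π) ∫ dt' Im F(t')/[t'(t' − t)]. The sketch below evaluates such an integral numerically for a toy ρ-meson-like spectral function; it illustrates the machinery only and is not the paper's chiral/dispersive input.

        import numpy as np

        # Once-subtracted dispersion relation with a toy Breit-Wigner
        # spectral function on the two-pion cut (illustrative only).
        Mpi, Mrho, Grho = 0.13957, 0.775, 0.149          # GeV

        tp = np.linspace(4 * Mpi**2 + 1e-6, 40.0, 200001)
        dt = tp[1] - tp[0]

        def imF(t):
            return Mrho**3 * Grho / ((t - Mrho**2)**2 + Mrho**2 * Grho**2)

        def F(t):   # valid below threshold, where no principal value is needed
            return 1.0 + (t / np.pi) * np.sum(imF(tp) / (tp * (tp - t))) * dt

        # Slope at t = 0 gives a "radius" via <r^2> = 6 F'(0) for F(0) = 1
        h = 1e-4
        r2 = 6 * (F(h) - F(-h)) / (2 * h)                # GeV^-2
        print("<r^2> = %.3f GeV^-2 = %.3f fm^2" % (r2, r2 * 0.0389379))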

  11. Hard Break-Up of Two-Nucleons and QCD Dynamics of NN Interaction

    NASA Astrophysics Data System (ADS)

    Sargsian, Misak

    2008-10-01

    We discuss recent developments in the theory of high-energy two-body break-up of few-nucleon systems. The characteristics of these reactions are such that the hard two-body quasielastic subprocess can be clearly separated from the accompanying soft subprocesses. We discuss in detail the hard rescattering model (HRM), in which hard photodisintegration develops in two stages. At first, the photon knocks out an energetic quark, which rescatters subsequently with a quark of the other nucleon. The latter provides a mechanism for sharing the initial high momentum of the photon between the two outgoing nucleons. This final-state hard rescattering can be expressed through the hard NN scattering amplitude. Within the HRM we discuss hard break-up reactions involving D and 3He targets and demonstrate how these reactions are sensitive to the dynamics of hard pn and pp interactions. Another development of the HRM is the prediction of a new helicity selection mechanism for hard two-body reactions, which was apparently confirmed in a recent JLab experiment.

  12. Demonstration of a viable quantitative theory for interplanetary type II radio bursts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, J. M., E-mail: jschmidt@physics.usyd.edu.au; Cairns, Iver H.

    Between 29 November and 1 December 2013 the two widely separated spacecraft STEREO A and B observed a long-lasting, intermittent, type II radio burst over the extended frequency range ≈ 4 MHz to 30 kHz, including an intensification when the shock wave of the associated coronal mass ejection (CME) reached STEREO A. We demonstrate for the first time our ability to quantitatively and accurately simulate the fundamental (F) and harmonic (H) emission of type II bursts from the higher corona (near 11 solar radii) to 1 AU. Our modeling requires the combination of data-driven three-dimensional magnetohydrodynamic simulations for the CME and plasma background, carried out with the BATS-R-US code, with an analytic quantitative kinetic model for both F and H radio emission, including the electron reflection at the shock, the growth of Langmuir waves and radio waves, and the radiation's propagation to an arbitrary observer. The intensities and frequencies of the observed radio emissions vary hugely, by factors of ≈ 10^6 and ≈ 10^3, respectively; the theoretical predictions are impressively accurate, being typically in error by less than a factor of 10 (intensities) and 20% (frequencies), for both STEREO A and B. We also obtain accurate predictions for the timing and characteristics of the shock and local radio onsets at STEREO A, the lack of such onsets at STEREO B, and the z-component of the magnetic field at STEREO A ahead of the shock and in the sheath. Very strong support is provided by these multiple agreements for the theory, the efficacy of the BATS-R-US code, and the vision of using type IIs and associated data-theory iterations to predict whether a CME will impact Earth's magnetosphere and drive space weather events.

  13. Demonstration of a viable quantitative theory for interplanetary type II radio bursts

    NASA Astrophysics Data System (ADS)

    Schmidt, J. M.; Cairns, Iver H.

    2016-03-01

    Between 29 November and 1 December 2013 the two widely separated spacecraft STEREO A and B observed a long-lasting, intermittent, type II radio burst over the extended frequency range ≈ 4 MHz to 30 kHz, including an intensification when the shock wave of the associated coronal mass ejection (CME) reached STEREO A. We demonstrate for the first time our ability to quantitatively and accurately simulate the fundamental (F) and harmonic (H) emission of type II bursts from the higher corona (near 11 solar radii) to 1 AU. Our modeling requires the combination of data-driven three-dimensional magnetohydrodynamic simulations for the CME and plasma background, carried out with the BATS-R-US code, with an analytic quantitative kinetic model for both F and H radio emission, including the electron reflection at the shock, the growth of Langmuir waves and radio waves, and the radiation's propagation to an arbitrary observer. The intensities and frequencies of the observed radio emissions vary hugely, by factors of ≈ 10^6 and ≈ 10^3, respectively; the theoretical predictions are impressively accurate, being typically in error by less than a factor of 10 (intensities) and 20% (frequencies), for both STEREO A and B. We also obtain accurate predictions for the timing and characteristics of the shock and local radio onsets at STEREO A, the lack of such onsets at STEREO B, and the z-component of the magnetic field at STEREO A ahead of the shock and in the sheath. Very strong support is provided by these multiple agreements for the theory, the efficacy of the BATS-R-US code, and the vision of using type IIs and associated data-theory iterations to predict whether a CME will impact Earth's magnetosphere and drive space weather events.
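
    The frequency range quoted above follows from the plasma-emission mechanism: F and H emission occur near f_p and 2f_p, with f_p ≈ 8980√n_e Hz for n_e in cm^-3. A minimal sketch with a Leblanc-type ambient density model (the coefficients are an assumption; shock-sheath densities can be substantially higher, which pushes the near-Sun frequencies toward the observed ~4 MHz):

        import numpy as np

        # Toy coronal/solar-wind electron density model, n_e in cm^-3,
        # r in solar radii (Leblanc-style falloff; coefficients assumed).
        def n_e(r):
            return 3.3e5 * r**-2 + 4.1e6 * r**-4 + 8.0e7 * r**-6

        for r in (11, 50, 215):                      # 215 Rs ~ 1 AU
            fp = 8980.0 * np.sqrt(n_e(r))            # plasma frequency, Hz
            print(f"r = {r:4d} Rs: F ~ {fp/1e3:8.1f} kHz, H ~ {2*fp/1e3:8.1f} kHz")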

  14. Nucleon form factors in dispersively improved chiral effective field theory: Scalar form factor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alarcon Soriano, Jose Manuel; Weiss, Christian

    We propose a method for calculating the nucleon form factors (FFs) of $G$-parity-even operators by combining chiral effective field theory ($\chi$EFT) and dispersion analysis. The FFs are expressed as dispersive integrals over the two-pion cut at $t > 4M_\pi^2$. The spectral functions are obtained from the elastic unitarity condition and expressed as products of the complex $\pi\pi \rightarrow N\bar{N}$ partial-wave amplitudes and the timelike pion FF. $\chi$EFT is used to calculate the ratio of the partial-wave amplitudes and the pion FF, which is real and free of $\pi\pi$ rescattering in the $t$-channel ($N/D$ method). The rescattering effects are then incorporated by multiplying with the squared modulus of the empirical pion FF. The procedure results in a marked improvement compared to conventional $\chi$EFT calculations of the spectral functions. We apply the method to the nucleon scalar FF and compute the scalar spectral function, the scalar radius, the $t$-dependent FF, and the Cheng-Dashen discrepancy. Higher-order chiral corrections are estimated through the $\pi N$ low-energy constants. Results are in excellent agreement with dispersion-theoretical calculations. We elaborate on several other interesting aspects of our method. The results show proper scaling behavior in the large-$N_c$ limit of QCD because the $\chi$EFT includes $N$ and $\Delta$ intermediate states. The squared modulus of the timelike pion FF required by our method can be extracted from lattice QCD calculations of vacuum correlation functions of the operator at large Euclidean distances. Our method can be applied to the nucleon FFs of other operators of interest, such as the isovector-vector current, the energy-momentum tensor, and twist-2 QCD operators (moments of generalized parton distributions).

  15. Nucleon form factors in dispersively improved chiral effective field theory: Scalar form factor

    DOE PAGES

    Alarcon Soriano, Jose Manuel; Weiss, Christian

    2017-11-20

    We propose a method for calculating the nucleon form factors (FFs) of $G$-parity-even operators by combining chiral effective field theory ($\chi$EFT) and dispersion analysis. The FFs are expressed as dispersive integrals over the two-pion cut at $t > 4M_\pi^2$. The spectral functions are obtained from the elastic unitarity condition and expressed as products of the complex $\pi\pi \rightarrow N\bar{N}$ partial-wave amplitudes and the timelike pion FF. $\chi$EFT is used to calculate the ratio of the partial-wave amplitudes and the pion FF, which is real and free of $\pi\pi$ rescattering in the $t$-channel ($N/D$ method). The rescattering effects are then incorporated by multiplying with the squared modulus of the empirical pion FF. The procedure results in a marked improvement compared to conventional $\chi$EFT calculations of the spectral functions. We apply the method to the nucleon scalar FF and compute the scalar spectral function, the scalar radius, the $t$-dependent FF, and the Cheng-Dashen discrepancy. Higher-order chiral corrections are estimated through the $\pi N$ low-energy constants. Results are in excellent agreement with dispersion-theoretical calculations. We elaborate on several other interesting aspects of our method. The results show proper scaling behavior in the large-$N_c$ limit of QCD because the $\chi$EFT includes $N$ and $\Delta$ intermediate states. The squared modulus of the timelike pion FF required by our method can be extracted from lattice QCD calculations of vacuum correlation functions of the operator at large Euclidean distances. Our method can be applied to the nucleon FFs of other operators of interest, such as the isovector-vector current, the energy-momentum tensor, and twist-2 QCD operators (moments of generalized parton distributions).

  16. Time-dependent observables in heavy ion collisions. Part I. Setting up the formalism

    NASA Astrophysics Data System (ADS)

    Wu, Bin; Kovchegov, Yuri V.

    2018-03-01

    We adapt the Schwinger-Keldysh formalism to study heavy-ion collisions in perturbative QCD. Employing the formalism, we calculate the two-point gluon correlation function $G_{22}^{a\mu,b\nu}$ due to the lowest-order classical gluon fields in the McLerran-Venugopalan model of heavy ion collisions and observe an interesting transition from the classical fields to the quasi-particle picture at later times. Motivated by this observation, we push the formalism to higher orders in the coupling and calculate the contribution to $G_{22}^{a\mu,b\nu}$ coming from the diagrams representing a single rescattering between two of the produced gluons. We assume that the two gluons go on mass shell both before and after the rescattering. The result of our calculation depends on which region of integration over the proper time $\tau_Z$ of the rescattering gives the correct correlation function at late proper time $\tau$ when the gluon distribution is measured. For (i) $\tau_Z \gg 1/Q_s$ and $\tau - \tau_Z \gg 1/Q_s$ (with $Q_s$ the saturation scale) we obtain the same results as from the Boltzmann equation. For (ii) $\tau - \tau_Z \gg \tau_Z \gg 1/Q_s$ we end up with a result very different from kinetic theory and consistent with a picture of "free-streaming" particles. Due to the approximations made, our calculation is too coarse to indicate whether region (i) or (ii) is the correct one: to resolve this controversy, we shall present a detailed diagrammatic calculation of the rescattering correction in the $\phi^4$ theory in the second paper of this duplex.

  17. Single-shot carrier-envelope-phase-tagged ion-momentum imaging of nonsequential double ionization of argon in intense 4-fs laser fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Nora G.; Herrwerth, O.; Wirth, A.

    2011-01-15

    Single-shot carrier-envelope-phase (CEP) tagging is combined with a reaction microscope (REMI) to investigate CEP-dependent processes in atoms. Excellent experimental stability and data-acquisition longevity are achieved. Using this approach, we study the CEP effects for nonsequential double ionization of argon in 4-fs laser fields at 750 nm and an intensity of 1.6×10^14 W/cm^2. The Ar^2+ ionization yield shows a pronounced CEP dependence which compares well with recent theoretical predictions employing quantitative rescattering theory [S. Micheau et al., Phys. Rev. A 79, 013417 (2009)]. Furthermore, we find strong CEP influences on the Ar^2+ momentum spectra along the laser polarization axis.

  18. Theory and preliminary experimental verification of quantitative edge illumination x-ray phase contrast tomography.

    PubMed

    Hagen, C K; Diemoz, P C; Endrizzi, M; Rigon, L; Dreossi, D; Arfelli, F; Lopez, F C M; Longo, R; Olivo, A

    2014-04-07

    X-ray phase contrast imaging (XPCi) methods are sensitive to phase in addition to attenuation effects and, therefore, can achieve improved image contrast for weakly attenuating materials, such as often encountered in biomedical applications. Several XPCi methods exist, most of which have already been implemented in computed tomographic (CT) modality, thus allowing volumetric imaging. The Edge Illumination (EI) XPCi method had, until now, not been implemented as a CT modality. This article provides indications that quantitative 3D maps of an object's phase and attenuation can be reconstructed from EI XPCi measurements. Moreover, a theory for the reconstruction of combined phase and attenuation maps is presented. Both reconstruction strategies find applications in tissue characterisation and the identification of faint, weakly attenuating details. Experimental results for wires of known materials and for a biological object validate the theory and confirm the superiority of the phase over conventional, attenuation-based image contrast.

  19. Quantitative Reappraisal of the Helmholtz-Guyton Resonance Theory of Frequency Tuning in the Cochlea

    PubMed Central

    Babbs, Charles F.

    2011-01-01

    To explore the fundamental biomechanics of sound frequency transduction in the cochlea, a two-dimensional analytical model of the basilar membrane was constructed from first principles. Quantitative analysis showed that axial forces along the membrane are negligible, condensing the problem to a set of ordered one-dimensional models in the radial dimension, for which all parameters can be specified from experimental data. Solutions of the radial models for asymmetrical boundary conditions produce realistic deformation patterns. The resulting second-order differential equations, based on the original concepts of Helmholtz and Guyton, and including viscoelastic restoring forces, predict a frequency map and amplitudes of deflections that are consistent with classical observations. They also predict the effects of an observation hole drilled in the surrounding bone, the effects of curvature of the cochlear spiral, as well as apparent traveling waves under a variety of experimental conditions. A quantitative rendition of the classical Helmholtz-Guyton model captures the essence of cochlear mechanics and unifies the competing resonance and traveling wave theories. PMID:22028708
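
    The ordered one-dimensional radial models described above amount to a place-frequency map of local resonators, f(x) = (1/2π)√(k(x)/m(x)). The sketch below uses an exponentially graded stiffness; the parameter values are illustrative assumptions, not those fitted in the paper.

        import numpy as np

        # Each radial slice of the basilar membrane as an independent damped
        # oscillator (the Helmholtz-Guyton picture): f(x) = sqrt(k(x)/m)/(2*pi).
        # Stiffness grading is tuned to span roughly 20 kHz (base) to 20 Hz (apex).
        L = 0.035                                  # basilar membrane length, m
        x = np.linspace(0.0, L, 8)                 # positions from base to apex
        k = 1.6e10 * np.exp(-395.0 * x)            # effective stiffness (assumed)
        m = 1.0                                    # effective mass (constant)
        f = np.sqrt(k / m) / (2 * np.pi)
        for xi, fi in zip(x, f):
            print("x = %4.1f mm: f ~ %8.0f Hz" % (1000 * xi, fi))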

  20. A Network Neuroscience of Human Learning: Potential To Inform Quantitative Theories of Brain and Behavior

    PubMed Central

    Bassett, Danielle S.; Mattar, Marcelo G.

    2017-01-01

    Humans adapt their behavior to their external environment in a process often facilitated by learning. Efforts to describe learning empirically can be complemented by quantitative theories that map changes in neurophysiology to changes in behavior. In this review we highlight recent advances in network science that offer a set of tools and a general perspective that may be particularly useful in understanding types of learning that are supported by distributed neural circuits. We describe recent applications of these tools to neuroimaging data that provide unique insights into adaptive neural processes, the attainment of knowledge, and the acquisition of new skills, forming a network neuroscience of human learning. While promising, the tools have yet to be linked to the well-formulated models of behavior that are commonly utilized in cognitive psychology. We argue that continued progress will require the explicit marriage of network approaches to neuroimaging data and quantitative models of behavior. PMID:28259554

  1. A Network Neuroscience of Human Learning: Potential to Inform Quantitative Theories of Brain and Behavior.

    PubMed

    Bassett, Danielle S; Mattar, Marcelo G

    2017-04-01

    Humans adapt their behavior to their external environment in a process often facilitated by learning. Efforts to describe learning empirically can be complemented by quantitative theories that map changes in neurophysiology to changes in behavior. In this review we highlight recent advances in network science that offer a set of tools and a general perspective that may be particularly useful in understanding types of learning that are supported by distributed neural circuits. We describe recent applications of these tools to neuroimaging data that provide unique insights into adaptive neural processes, the attainment of knowledge, and the acquisition of new skills, forming a network neuroscience of human learning. While promising, the tools have yet to be linked to the well-formulated models of behavior that are commonly utilized in cognitive psychology. We argue that continued progress will require the explicit marriage of network approaches to neuroimaging data and quantitative models of behavior. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Rotorcraft flight control design using quantitative feedback theory and dynamic crossfeeds

    NASA Technical Reports Server (NTRS)

    Cheng, Rendy P.

    1995-01-01

    A multi-input, multi-output controls design with robust crossfeeds is presented for a rotorcraft in near-hovering flight using quantitative feedback theory (QFT). Decoupling criteria are developed for dynamic crossfeed design and implementation. Frequency dependent performance metrics focusing on piloted flight are developed and tested on 23 flight configurations. The metrics show that the resulting design is superior to alternative control system designs using conventional fixed-gain crossfeeds and to feedback-only designs which rely on high gains to suppress undesired off-axis responses. The use of dynamic, robust crossfeeds prior to the QFT design reduces the magnitude of required feedback gain and results in performance that meets current handling qualities specifications relative to the decoupling of off-axis responses. The combined effect of the QFT feedback design following the implementation of low-order, dynamic crossfeed compensator successfully decouples ten of twelve off-axis channels. For the other two channels it was not possible to find a single, low-order crossfeed that was effective.

  3. Towards a Quantitative Endogenous Network Theory of Cancer Genesis and Progression: beyond ``cancer as diseases of genome''

    NASA Astrophysics Data System (ADS)

    Ao, Ping

    2011-03-01

    There has been tremendous progress in cancer research. However, it appears that the current dominant cancer research framework of regarding cancer as a disease of the genome leads to an impasse. Naturally, questions have been asked as to whether it is possible to develop alternative frameworks that can connect both to mutations and other genetic/genomic effects and to environmental factors. Furthermore, such a framework could be made quantitative, with predictions that are experimentally testable. In this talk, I will present a positive answer to this calling. I will explain our construction of an endogenous network theory based on molecular-cellular agencies as dynamical variables. Such a cancer theory explicitly demonstrates a profound connection to many fundamental concepts in physics, such as stochastic non-equilibrium processes, "energy" landscapes, metastability, etc. It suggests that beneath cancer's daunting complexity may lie a simplicity that gives grounds for hope. The rationales behind this theory, its predictions, and its initial experimental verifications will be presented. Supported by USA NIH and China NSF.

  4. Quantitative Theoretical and Conceptual Framework Use in Agricultural Education Research

    ERIC Educational Resources Information Center

    Kitchel, Tracy; Ball, Anna L.

    2014-01-01

    The purpose of this philosophical paper was to articulate the disciplinary tenets for consideration when using theory in agricultural education quantitative research. The paper clarified terminology around the concept of theory in social sciences and introduced inaccuracies of theory use in agricultural education quantitative research. Finally,…

  5. Using the realist perspective to link theory from qualitative evidence synthesis to quantitative studies: Broadening the matrix approach.

    PubMed

    van Grootel, Leonie; van Wesel, Floryt; O'Mara-Eves, Alison; Thomas, James; Hox, Joop; Boeije, Hennie

    2017-09-01

    This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed studies reviewing method. The matrix approach compares quantitative and qualitative data on the review level by juxtaposing concrete recommendations from the qualitative evidence synthesis against interventions in primary quantitative studies. However, types of qualitative evidence syntheses that are associated with theory building generate theoretical models instead of recommendations. Therefore, the output from these types of qualitative evidence syntheses cannot directly be used for the matrix approach but requires transformation. This approach allows for the transformation of these types of output. The approach enables the inference of moderation effects instead of direct effects from the theoretical model developed in a qualitative evidence synthesis. Recommendations for practice are formulated on the basis of interactional relations inferred from the qualitative evidence synthesis. In doing so, we apply the realist perspective to model variables from the qualitative evidence synthesis according to the context-mechanism-outcome configuration. A worked example shows that it is possible to identify recommendations from a theory-building qualitative evidence synthesis using the realist perspective. We created subsets of the interventions from primary quantitative studies based on whether they matched the recommendations or not and compared the weighted mean effect sizes of the subsets. The comparison shows a slight difference in effect sizes between the groups of studies. The study concludes that the approach enhances the applicability of the matrix approach. Copyright © 2017 John Wiley & Sons, Ltd.
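
    The final comparison step described above reduces to inverse-variance weighted mean effect sizes for the matched and unmatched subsets of primary studies. A minimal sketch; all numbers are invented for illustration.

        import numpy as np

        # Inverse-variance weighted mean effect size for a subset of studies.
        def weighted_mean(d, var):
            w = 1.0 / np.asarray(var)
            return np.sum(w * np.asarray(d)) / np.sum(w)

        # (effect size, variance) pairs for interventions that matched the
        # qualitative recommendations vs. those that did not (made-up data).
        matching     = ([0.42, 0.35, 0.51], [0.02, 0.03, 0.04])
        not_matching = ([0.30, 0.22, 0.38], [0.02, 0.05, 0.03])
        print("matched recommendations:", round(weighted_mean(*matching), 3))
        print("unmatched              :", round(weighted_mean(*not_matching), 3))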

  6. Exploration of laser-driven electron-multirescattering dynamics in high-order harmonic generation

    DOE PAGES

    Li, Peng -Cheng; Sheu, Yae -Lin; Jooya, Hossein Z.; ...

    2016-09-06

    Multiple rescattering processes play an important role in high-order harmonic generation (HHG) in an intense laser field. However, the underlying multi-rescattering dynamics are still largely unexplored. Here we investigate the dynamical origin of multiple rescattering processes in HHG associated with the odd and even number of returning times of the electron to the parent ion. We perform fully ab initio quantum calculations and extend the empirical mode decomposition method to extract the individual multiple scattering contributions in HHG. We find that the tunneling ionization regime is responsible for the odd number times of rescattering and the corresponding short trajectories are dominant. On the other hand, the multiphoton ionization regime is responsible for the even number times of rescattering and the corresponding long trajectories are dominant. Moreover, we discover that the multiphoton- and tunneling-ionization regimes in multiple rescattering processes occur alternately. Our results uncover the dynamical origin of multiple rescattering processes in HHG for the first time. As a result, it also provides new insight regarding the control of the multiple rescattering processes for the optimal generation of ultrabroad-band supercontinuum spectra and the production of a single ultrashort attosecond laser pulse.
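
    The odd/even return structure at the heart of this paper can already be seen at the classical level by counting the x = 0 crossings of a tunneled electron's trajectory. A sketch with illustrative field parameters (our assumption, not the paper's ab initio calculation):

        import numpy as np

        # Count multiple returns of the tunneled electron to the parent ion
        # (x = 0 crossings) in E(t) = E0*cos(w*t); atomic units, values
        # roughly 1e14 W/cm^2 at 800 nm (illustrative).
        E0, w = 0.0534, 0.057
        Up = E0**2 / (4 * w**2)
        t0 = 0.1 / w                                # birth phase 0.1 rad past the crest
        t = np.linspace(t0 + 1e-4, t0 + 12 * np.pi / w, 200000)   # ~6 cycles
        v = -(E0 / w) * (np.sin(w * t) - np.sin(w * t0))
        x = (E0 / w**2) * (np.cos(w * t) - np.cos(w * t0)) \
            + (E0 / w) * np.sin(w * t0) * (t - t0)
        crossings = np.where(np.diff(np.sign(x)) != 0)[0]
        for n, i in enumerate(crossings, start=1):
            print(f"return {n}: phase {w*t[i]:6.2f} rad, "
                  f"energy {0.5*v[i]**2/Up:5.2f} Up")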

  7. Exploration of laser-driven electron-multirescattering dynamics in high-order harmonic generation

    PubMed Central

    Li, Peng-Cheng; Sheu, Yae-Lin; Jooya, Hossein Z.; Zhou, Xiao-Xin; Chu, Shih-I

    2016-01-01

    Multiple rescattering processes play an important role in high-order harmonic generation (HHG) in an intense laser field. However, the underlying multi-rescattering dynamics are still largely unexplored. Here we investigate the dynamical origin of multiple rescattering processes in HHG associated with the odd and even number of returning times of the electron to the parent ion. We perform fully ab initio quantum calculations and extend the empirical mode decomposition method to extract the individual multiple scattering contributions in HHG. We find that the tunneling ionization regime is responsible for the odd number times of rescattering and the corresponding short trajectories are dominant. On the other hand, the multiphoton ionization regime is responsible for the even number times of rescattering and the corresponding long trajectories are dominant. Moreover, we discover that the multiphoton- and tunneling-ionization regimes in multiple rescattering processes occur alternately. Our results uncover the dynamical origin of multiple rescattering processes in HHG for the first time. It also provides new insight regarding the control of the multiple rescattering processes for the optimal generation of ultrabroad-band supercontinuum spectra and the production of a single ultrashort attosecond laser pulse. PMID:27596056

  8. Exploration of laser-driven electron-multirescattering dynamics in high-order harmonic generation.

    PubMed

    Li, Peng-Cheng; Sheu, Yae-Lin; Jooya, Hossein Z; Zhou, Xiao-Xin; Chu, Shih-I

    2016-09-06

    Multiple rescattering processes play an important role in high-order harmonic generation (HHG) in an intense laser field. However, the underlying multi-rescattering dynamics are still largely unexplored. Here we investigate the dynamical origin of multiple rescattering processes in HHG associated with the odd and even number of returning times of the electron to the parent ion. We perform fully ab initio quantum calculations and extend the empirical mode decomposition method to extract the individual multiple scattering contributions in HHG. We find that the tunneling ionization regime is responsible for the odd number times of rescattering and the corresponding short trajectories are dominant. On the other hand, the multiphoton ionization regime is responsible for the even number times of rescattering and the corresponding long trajectories are dominant. Moreover, we discover that the multiphoton- and tunneling-ionization regimes in multiple rescattering processes occur alternately. Our results uncover the dynamical origin of multiple rescattering processes in HHG for the first time. It also provides new insight regarding the control of the multiple rescattering processes for the optimal generation of ultrabroad-band supercontinuum spectra and the production of a single ultrashort attosecond laser pulse.

  9. Time-dependent observables in heavy ion collisions. Part II. In search of pressure isotropization in the φ^4 theory

    NASA Astrophysics Data System (ADS)

    Kovchegov, Yuri V.; Wu, Bin

    2018-03-01

    To understand the dynamics of thermalization in heavy ion collisions in the perturbative framework it is essential to first find corrections to the free-streaming classical gluon fields of the McLerran-Venugopalan model. The corrections that lead to deviations from free streaming (and that dominate at late proper time) would provide evidence for the onset of isotropization (and, possibly, thermalization) of the produced medium. To find such corrections we calculate the late-time two-point Green function and the energy-momentum tensor due to a single 2 → 2 scattering process involving two classical fields. To make the calculation tractable we employ the scalar φ^4 theory instead of QCD. We compare our exact diagrammatic results for these quantities to those in kinetic theory and find disagreement between the two. The disagreement is in the dependence on the proper time τ and, for the case of the two-point function, is also in the dependence on the space-time rapidity η: the exact diagrammatic calculation is, in fact, consistent with the free-streaming scenario. Kinetic theory predicts a build-up of longitudinal pressure, which, however, is not observed in the exact calculation. We conclude that we find no evidence for the beginning of the transition from the free-streaming classical fields to the kinetic theory description of the produced matter after a single 2 → 2 rescattering.

  10. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  11. Quantitative collision induced mass spectrometry of substituted piperazines - A correlative analysis between theory and experiment

    NASA Astrophysics Data System (ADS)

    Ivanova, Bojidarka; Spiteller, Michael

    2017-12-01

    The present paper deals with the quantitative kinetics and thermodynamics of collision-induced dissociation (CID) reactions of piperazines under different experimental conditions, together with a systematic description of the effect of counter-ions on common MS fragmentation reactions of piperazines and of the intra-molecular effect of quaternary cyclization of substituted piperazines yielding quaternary salts. Quantitative model equations are discussed for the rate constants and the free Gibbs energies of a series of m independent CID fragmentation processes in the gas phase (GP), which have been evidenced experimentally. Both kinetic and thermodynamic parameters are also predicted by computational density functional theory (DFT) and by ab initio static and dynamic methods. The paper also examines, quantitatively, the validity of the Maxwell-Boltzmann distribution for non-Boltzmann CID processes. The experiments conducted within this framework show excellent correspondence with theoretical quantum-chemical modeling. An important property of the presented model equations of reaction kinetics is their applicability to predicting unknown and assigning known mass spectrometric (MS) patterns. The nature of the "GP" continuum in the CID-MS measurement scheme with an electrospray ionization (ESI) source is discussed by performing parallel computations in the gas phase and in a polar continuum at different temperatures and ionic strengths. The effect of pressure is presented. The study contributes significantly to the methodological and phenomenological development of CID-MS and its analytical implementation for quantitative and structural analyses. It also demonstrates the great promise of the complementary application of experimental CID-MS and computational quantum chemistry for studying chemical reactivity, and underlines the place of computational quantum chemistry within experimental analytical chemistry, in particular for structural analysis.
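
    The rate-constant side of such CID kinetics is often summarized by a transition-state (Eyring) expression, k(T) = (k_B T/h) exp(−ΔG‡/RT). A minimal sketch with an invented activation free energy (the 120 kJ/mol barrier is made up, not a value from the paper):

        import numpy as np

        # Eyring estimate of a unimolecular fragmentation rate constant
        # from an activation free energy (SI units throughout).
        kB, h, R = 1.380649e-23, 6.62607015e-34, 8.314462618

        def eyring_k(dG_act_kJmol, T):
            """k(T) = (kB*T/h) * exp(-dG_act / (R*T)), dG_act in kJ/mol."""
            return (kB * T / h) * np.exp(-dG_act_kJmol * 1e3 / (R * T))

        for T in (298.0, 400.0, 600.0):
            print(f"T = {T:5.1f} K: k ~ {eyring_k(120.0, T):.3e} s^-1")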

  12. Controls design with crossfeeds for hovering rotorcraft using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Biezad, Daniel J.; Cheng, Rendy

    1996-01-01

    A multi-input, multi-output controls design with dynamic crossfeed pre-compensation is presented for rotorcraft in near-hovering flight using Quantitative Feedback Theory (QFT). The resulting closed-loop control system bandwidth allows the rotorcraft to be considered for use as an inflight simulator. The use of dynamic, robust crossfeeds prior to the QFT design reduces the magnitude of required feedback gain and results in performance that meets most handling qualities specifications relative to the decoupling of off-axis responses. Handling qualities are Level 1 for both low-gain tasks and high-gain tasks in the roll, pitch, and yaw axes except for the 10 deg/sec moderate-amplitude yaw command where the rotorcraft exhibits Level 2 handling qualities in the yaw axis caused by phase lag. The combined effect of the QFT feedback design following the implementation of low-order, dynamic crossfeed compensators successfully decouples ten of twelve off-axis channels. For the other two channels it was not possible to find a single, low-order crossfeed that was effective. This is an area to be investigated in future research.

  13. Synthesis strategy: building a culturally sensitive mid-range theory of risk perception using literary, quantitative, and qualitative methods.

    PubMed

    Siaki, Leilani A; Loescher, Lois J; Trego, Lori L

    2013-03-01

    This article presents a discussion of the development of a mid-range theory of risk perception. Unhealthy behaviours contribute to the development of health inequalities worldwide. The link between perceived risk and successful health behaviour change is inconclusive, particularly in vulnerable populations. This may be attributed to inattention to culture. The synthesis strategy of theory building guided the process using three methods: (1) a systematic review of literature published between 2000 and 2011 targeting perceived risk in vulnerable populations; (2) qualitative and (3) quantitative data from a study of Samoan Pacific Islanders at high risk of cardiovascular disease and diabetes. The main concepts of this theory include risk attention, appraisal processes, cognition, and affect. Overarching these concepts is health-world view: cultural ways of knowing, beliefs, values, images, and ideas. This theory proposes the following: (1) risk attention varies based on knowledge of the health risk in the context of health-world views; (2) risk appraisals are influenced by affect, health-world views, cultural customs, and protocols that intersect with the health risk; (3) the strength of cultural beliefs, values, and images (cultural identity) mediates risk attention and risk appraisal, influencing the likelihood that persons will engage in health-promoting behaviours that may contradict cultural customs/protocols. Interventions guided by a culturally sensitive mid-range theory may improve behaviour-related health inequalities in vulnerable populations. The synthesis strategy is an intensive process for developing a culturally sensitive mid-range theory. Testing of the theory will ascertain its usefulness for reducing health inequalities in vulnerable groups. © 2012 Blackwell Publishing Ltd.

  14. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    NASA Technical Reports Server (NTRS)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    The theory of System Health Management (SHM) and of its operational subset Fault Management (FM) states that FM is implemented as a "meta" control loop, known as an FM Control Loop (FMCL). The FMCL detects that all or part of a system is now failed, or in the future will fail (that is, cannot be controlled within acceptable limits to achieve its objectives), and takes a control action (a response) to return the system to a controllable state. In terms of control theory, the effectiveness of each FMCL is estimated based on its ability to correctly estimate the system state, and on the speed of its response to the current or impending failure effects. This paper describes how this theory has been successfully applied on the National Aeronautics and Space Administration's (NASA) Space Launch System (SLS) Program to quantitatively estimate the effectiveness of proposed abort triggers so as to select the most effective suite to protect the astronauts from catastrophic failure of the SLS. The premise behind this process is to be able to quantitatively provide the value versus risk trade-off for any given abort trigger, allowing decision makers to make more informed decisions. All current and planned crewed launch vehicles have some form of vehicle health management system integrated with an emergency launch abort system to ensure crew safety. While the design can vary, the underlying principle is the same: detect imminent catastrophic vehicle failure, initiate launch abort, and extract the crew to safety. Abort triggers are the detection mechanisms that identify that a catastrophic launch vehicle failure is occurring or is imminent and cause the initiation of a notification to the crew vehicle that the escape system must be activated. While ensuring that the abort triggers provide this function, designers must also ensure that the abort triggers do not signal that a catastrophic failure is imminent when in fact the launch vehicle can successfully achieve orbit. That is
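
    The value-versus-risk trade described in this record can be caricatured in a few lines: score each candidate abort trigger by its estimated loss-of-crew (LOC) benefit minus a penalty for false aborts, then rank. All trigger names and numbers below are invented for illustration and are not SLS data.

        # Toy trigger-suite ranking: name -> (LOC reduction, false-abort probability)
        triggers = {
            "chamber-pressure": (0.30, 0.001),
            "attitude-rate":    (0.25, 0.004),
            "tank-pressure":    (0.20, 0.002),
            "thrust-shortfall": (0.15, 0.010),
        }

        def score(benefit, false_abort, cost_weight=20.0):
            """Net value: LOC benefit minus a penalty for aborting a good mission."""
            return benefit - cost_weight * false_abort

        ranked = sorted(triggers.items(), key=lambda kv: -score(*kv[1]))
        for name, (b, fa) in ranked:
            print(f"{name:18s} benefit={b:.2f} false-abort={fa:.3f} "
                  f"score={score(b, fa):+.3f}")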

  15. Characteristics of quantitative nursing research from 1990 to 2010.

    PubMed

    Yarcheski, Adela; Mahon, Noreen E

    2013-12-01

    To assess author credentials of quantitative research in nursing, the composition of the research teams, and the disciplinary focus of the theories tested. Nursing Research, Western Journal of Nursing Research, and Journal of Advanced Nursing were selected for this descriptive study; 1990, 1995, 2000, 2005, and 2010 were included. The final sample consisted of 484 quantitative research articles. From 1990 to 2010, there was an increase in first authors holding doctoral degrees, research from other countries, and funding. Solo authorship decreased; multi-authorship and multidisciplinary teams increased. Theories tested were mostly from psychology; the testing of nursing theory was modest. Multidisciplinary research far outdistanced interdisciplinary research. Quantitative nursing research can be characterized as multidisciplinary (distinct theories from different disciplines) rather than discipline-specific to nursing. Interdisciplinary (theories synthesized from different disciplines) research has been conducted minimally. This study provides information about the growth of the scientific knowledge base of nursing, which has implications for practice. © 2013 Sigma Theta Tau International.

  16. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    NASA Technical Reports Server (NTRS)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    SHM/FM theory has been successfully applied to the selection of the baseline set of abort triggers for the NASA SLS, and quantitative assessment played a useful role in the decision process. M&FM, which is new within NASA MSFC, required the most "new" work, as this quantitative analysis had never been done before; it required developing the methodology and tool to mechanize the process and established new relationships to the other groups. The process is now an accepted part of the SLS design process and will likely be applied to similar programs in the future at NASA MSFC. Future improvements include improving technical accuracy (differentiating crew survivability due to an abort from survivability even when no immediate abort occurs, such as a small explosion with little debris; accounting for the contingent dependence of secondary triggers on primary triggers; and allocating the "Δ LOC benefit" of each trigger when added to the previously selected triggers) and reducing future costs through the development of a specialized tool. The methodology can be applied to any manned or unmanned vehicle, in space or terrestrial.

  17. A dispersive treatment of Kℓ4 decays

    NASA Astrophysics Data System (ADS)

    Stoffer, Peter; Colangelo, Gilberto; Passemar, Emilie

    2017-01-01

    Kℓ4 decays have several features of interest: they allow an accurate measurement of ππ-scattering lengths; the decay is the best source for the determination of some low-energy constants of chiral perturbation theory (χPT); and one form factor of the decay is connected to the chiral anomaly. We present the results of our dispersive analysis of Kℓ4 decays, which provides a resummation of ππ- and Kπ-rescattering effects. The free parameters of the dispersion relation are fitted to the data of the high-statistics experiments E865 and NA48/2. By matching to χPT at NLO and NNLO, we determine the relevant low-energy constants. In contrast to a pure chiral treatment, the dispersion relation describes the observed curvature of one of the form factors, which we understand as an effect of rescattering beyond NNLO.

  18. High Energy Break-Up of Few-Nucleon Systems

    NASA Astrophysics Data System (ADS)

    Sargsian, Misak

    2008-03-01

    We discuss recent developments in the theory of high-energy two-body break-up reactions of few-nucleon systems. The characteristics of these reactions are such that the hard two-body quasielastic subprocess can be clearly separated from the accompanying soft subprocesses. We discuss in detail the hard rescattering model (HRM), in which hard photodisintegration develops in two stages. At first, the photon knocks out an energetic quark, which rescatters subsequently with a quark of the other nucleon. The latter provides a mechanism for sharing the initial high momentum of the photon between the two outgoing nucleons. Within the HRM we discuss hard break-up reactions involving D and 3He targets. Another development of the HRM is the prediction of a new helicity selection mechanism for hard two-body reactions, which was apparently confirmed in a recent JLab experiment.

  19. Levels of reconstruction as complementarity in mixed methods research: a social theory-based conceptual framework for integrating qualitative and quantitative research.

    PubMed

    Carroll, Linda J; Rothe, J Peter

    2010-09-01

    Like other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies, and using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.

  20. On the RNG theory of turbulence

    NASA Technical Reports Server (NTRS)

    Lam, S. H.

    1992-01-01

    The Yakhot and Orszag (1986) renormalization group (RNG) theory of turbulence has generated a number of scaling law constants in reasonable quantitative agreement with experiments. The theory itself is highly mathematical, and its assumptions and approximations are not easily appreciated. The present paper reviews the RNG theory and recasts it in more conventional terms using a distinctly different viewpoint. A new formulation based on an alternative interpretation of the origin of the random force is presented, showing that the artificially introduced epsilon in the original theory is an adjustable parameter, thus offering a plausible explanation for the remarkable record of quantitative success of the so-called epsilon-expansion procedure.

  1. Performance Theories for Sentence Coding: Some Quantitative Models

    ERIC Educational Resources Information Center

    Aaronson, Doris; And Others

    1977-01-01

    This study deals with the patterns of word-by-word reading times over a sentence when the subject must code the linguistic information sufficiently for immediate verbatim recall. A class of quantitative models is considered that would account for reading times at phrase breaks. (Author/RM)

  2. Another Curriculum Requirement? Quantitative Reasoning in Economics: Some First Steps

    ERIC Educational Resources Information Center

    O'Neill, Patrick B.; Flynn, David T.

    2013-01-01

    In this paper, we describe first steps toward focusing on quantitative reasoning in an intermediate microeconomic theory course. We find student attitudes toward quantitative aspects of economics improve over the duration of the course (as we would hope). Perhaps more importantly, student attitude toward quantitative reasoning improves, in…

  3. Quantitative results of stellar evolution and pulsation theories.

    NASA Technical Reports Server (NTRS)

    Fricke, K.; Stobie, R. S.; Strittmatter, P. A.

    1971-01-01

    The discrepancy between the masses of Cepheid variables deduced from evolution theory and pulsation theory is examined. The effect of input physics on evolutionary tracks is first discussed; in particular, changes in the opacity are considered. The sensitivity of pulsation masses to opacity changes and to the ascribed values of luminosity and effective temperature are then analyzed. The Cepheid mass discrepancy is discussed in the light of the results already obtained. Other astronomical evidence, including the mass-luminosity relation for main sequence stars, the solar neutrino flux, and cluster ages are also considered in an attempt to determine the most likely source of error in the event that substantial mass loss has not occurred.

  4. Electron-molecule scattering in a strong laser field: Two-center interference effects

    NASA Astrophysics Data System (ADS)

    Dakić, J.; Habibović, D.; Čerkić, A.; Busuladžić, M.; Milošević, D. B.

    2017-10-01

    Laser-assisted scattering of electrons on diatomic molecules is considered using the S-matrix theory within the second Born approximation. The first term of the expansion in powers of the scattering potential corresponds to the direct or single laser-assisted scattering of electrons on molecular targets, while the second term of this expansion corresponds to the laser-assisted rescattering or double scattering. The rescattered electrons may have considerably higher energies in the final state than those that scattered only once. For multicenter polyatomic molecules, scattering and rescattering may happen at any center and in any order. All these cases contribute to the scattering amplitude, and the interference of different contributions leads to an increase or a decrease of the differential cross section in particular electron energy regions. For diatomic molecules there are two such contributions for single scattering and four contributions for double scattering. Analyzing the spectra of the scattered electrons, we find two interesting effects. For certain molecular orientations, the plateaus in the electron energy spectrum, characteristic of laser-assisted electron-atom scattering, are replaced by a sequence of gradually declining maxima, caused by the two-center interference effects. The second effect is the appearance of symmetric U-shaped structures in the angle-resolved energy spectra, which are described very well by the analytical formulas we provide.

  5. [Quantitative assessment of urban ecosystem services flow based on entropy theory: A case study of Beijing, China].

    PubMed

    Li, Jing Xin; Yang, Li; Yang, Lei; Zhang, Chao; Huo, Zhao Min; Chen, Min Hao; Luan, Xiao Feng

    2018-03-01

    Quantitative evaluation of ecosystem services is a primary premise for rational resource exploitation and sustainable development. Examining ecosystem services flow provides a scientific method to quantify ecosystem services. We built an assessment indicator system based on land cover/land use under the framework of four types of ecosystem services, and the types of ecosystem services flow were reclassified. Using entropy theory, the degree of disorder and the developing trend of the indicators and of the urban ecosystem were quantitatively assessed. Beijing was chosen as the study area, and twenty-four indicators were selected for evaluation. The results showed that the entropy value of the Beijing urban ecosystem during 2004 to 2015 was 0.794 and the entropy flow was -0.024, suggesting a high degree of disorder, with the system near the verge of ill health. The system reached maximum entropy values three times, while the mean annual variation of the system entropy value increased gradually over three periods, indicating that human activities had negative effects on the urban ecosystem. Entropy flow reached its minimum value in 2007, implying that environmental quality was best in 2007. The determination coefficient for the fitting function of total permanent population in Beijing and urban ecosystem entropy flow was 0.921, indicating that urban ecosystem health was highly correlated with total permanent population.
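
    The entropy evaluation described above follows the general pattern of the entropy-weight method: normalize each indicator across years, then compute a Shannon-type entropy per indicator. The sketch below is a generic illustration of that computation (the function names and toy data are ours; the record does not give the actual indicator data):

        import numpy as np

        def indicator_entropy(X: np.ndarray) -> np.ndarray:
            """X: (n_years, n_indicators) matrix of non-negative indicator values.
            Returns the Shannon-type entropy of each indicator, normalized to [0, 1]."""
            f = X / X.sum(axis=0)   # each year's share of the indicator total
            terms = np.where(f > 0, f * np.log(np.where(f > 0, f, 1.0)), 0.0)
            return -terms.sum(axis=0) / np.log(X.shape[0])

        # Toy data: 12 years x 3 indicators; entropies near 1 mean an indicator is
        # close to uniform over time and carries little ordering information.
        rng = np.random.default_rng(0)
        print(indicator_entropy(rng.random((12, 3)) + 0.1))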

  6. Quantitative genetic versions of Hamilton's rule with empirical applications

    PubMed Central

    McGlothlin, Joel W.; Wolf, Jason B.; Brodie, Edmund D.; Moore, Allen J.

    2014-01-01

    Hamilton's theory of inclusive fitness revolutionized our understanding of the evolution of social interactions. Surprisingly, an incorporation of Hamilton's perspective into the quantitative genetic theory of phenotypic evolution has been slow, despite the popularity of quantitative genetics in evolutionary studies. Here, we discuss several versions of Hamilton's rule for social evolution from a quantitative genetic perspective, emphasizing its utility in empirical applications. Although evolutionary quantitative genetics offers methods to measure each of the critical parameters of Hamilton's rule, empirical work has lagged behind theory. In particular, we lack studies of selection on altruistic traits in the wild. Fitness costs and benefits of altruism can be estimated using a simple extension of phenotypic selection analysis that incorporates the traits of social interactants. We also discuss the importance of considering the genetic influence of the social environment, or indirect genetic effects (IGEs), in the context of Hamilton's rule. Research in social evolution has generated an extensive body of empirical work focusing—with good reason—almost solely on relatedness. We argue that quantifying the roles of social and non-social components of selection and IGEs, in addition to relatedness, is now timely and should provide unique additional insights into social evolution. PMID:24686930

  7. A dispersive treatment of Kℓ4 decays

    DOE PAGES

    Stoffer, Peter; Colangelo, Gilberto; Passemar, Emilie

    2017-01-01

    Kℓ4 decays have several features of interest: they allow an accurate measurement of ππ-scattering lengths; the decay is the best source for the determination of some low-energy constants of chiral perturbation theory (χPT); one form factor of the decay is connected to the chiral anomaly. Here, we present the results of our dispersive analysis of Kℓ4 decays, which provides a resummation of ππ- and Kπ-rescattering effects. The free parameters of the dispersion relation are fitted to the data of the high-statistics experiments E865 and NA48/2. By matching to χPT at NLO and NNLO, we determine the low-energy constants L^r_1, L^r_2, and L^r_3. In contrast to a pure chiral treatment, the dispersion relation describes the observed curvature of one of the Kℓ4 form factors, which we understand as an effect of rescattering beyond NNLO.

  8. Electromagnetic braking: A simple quantitative model

    NASA Astrophysics Data System (ADS)

    Levin, Yan; da Silveira, Fernando L.; Rizzato, Felipe B.

    2006-09-01

    A calculation is presented that quantitatively accounts for the terminal velocity of a cylindrical magnet falling through a long copper or aluminum pipe. The experiment and the theory are a dramatic illustration of Faraday's and Lenz's laws.
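
    The physics behind this demonstration reduces to a velocity-proportional eddy-current drag balancing gravity. A minimal sketch of the resulting terminal-velocity estimate follows (the drag coefficient k depends on the particular magnet and pipe and must be measured or derived, so the numbers here are purely illustrative):

        # Magnet falling in a conducting pipe: m*dv/dt = m*g - k*v,
        # so at terminal velocity the drag balances gravity and v_t = m*g/k.
        m = 0.02   # magnet mass (kg), illustrative
        g = 9.81   # gravitational acceleration (m/s^2)
        k = 2.5    # eddy-current drag coefficient (kg/s), illustrative
        v_terminal = m * g / k
        print(f"terminal velocity: {v_terminal:.3f} m/s")   # ~0.08 m/s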

  9. Using Expectancy Theory to quantitatively dissociate the neural representation of motivation from its influential factors in the human brain: An fMRI study.

    PubMed

    Kohli, Akshay; Blitzer, David N; Lefco, Ray W; Barter, Joseph W; Haynes, M Ryan; Colalillo, Sam A; Ly, Martina; Zink, Caroline F

    2018-05-08

    Researchers have yet to apply a formal operationalized theory of motivation to neurobiology that would more accurately and precisely define the neural activity underlying motivation. We overcome this challenge with the novel application of the Expectancy Theory of Motivation to human fMRI to identify brain activity that explicitly reflects motivation. Expectancy Theory quantitatively describes how individual constructs determine motivation by defining motivational force as the product of three variables: expectancy, the belief that effort will improve performance; instrumentality, the belief that successful performance leads to a particular outcome; and valence, the desirability of the outcome. Here, we manipulated the information conveyed by reward-predicting cues such that relative cue-evoked activity patterns could be statistically mapped to individual Expectancy Theory variables. The variable associated with activity in any voxel is only reported if it replicated between two groups of healthy participants. We found signals in the midbrain, ventral striatum, sensorimotor cortex, and visual cortex that map specifically to motivation itself, rather than to other factors. This is important because, for the first time, it empirically clarifies approach-motivation neural signals during reward anticipation. It also highlights the effectiveness of applying Expectancy Theory to neurobiology to probe the neural correlates of motivation more precisely and accurately than has been achievable previously. Copyright © 2018 Elsevier Inc. All rights reserved.
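
    The Expectancy Theory formalism summarized above reduces to a simple product. A minimal sketch (variable names and example values are ours, not the study's):

        # Vroom's Expectancy Theory: motivational force = expectancy x instrumentality x valence.
        def motivational_force(expectancy: float, instrumentality: float, valence: float) -> float:
            """expectancy and instrumentality are subjective probabilities in [0, 1];
            valence is the signed desirability of the outcome."""
            return expectancy * instrumentality * valence

        # Strong belief that effort improves performance (0.9), moderate belief that
        # performance yields the reward (0.6), strongly desired reward (+1.0).
        print(motivational_force(0.9, 0.6, 1.0))   # 0.54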

  10. The power of exact conditions in electronic structure theory

    NASA Astrophysics Data System (ADS)

    Bartlett, Rodney J.; Ranasinghe, Duminda S.

    2017-02-01

    Once electron correlation is included in an effective one-particle operator, one has a correlated orbital theory (COT). One such theory is Kohn-Sham density functional theory (KS-DFT), but there are others. Such methods have the prospect of redefining traditional Molecular Orbital (MO) theory by building a quantitative component upon its conceptual framework. This paper asks what conditions such a theory should satisfy and whether they can be met in practice. One such condition for a COT is that the orbital eigenvalues should satisfy an ionization theorem that generalizes Koopmans' approximation to the exact principal ionization potentials for every electron in a molecule. Guided by this principle, minimal parameterizations of KS-DFT are made that provide a good approximation to a quantitative MO theory.

  11. Quantitative steps in the evolution of metabolic organisation as specified by the Dynamic Energy Budget theory.

    PubMed

    Kooijman, S A L M; Troost, T A

    2007-02-01

    The Dynamic Energy Budget (DEB) theory quantifies the metabolic organisation of organisms on the basis of mechanistically inspired assumptions. We here sketch a scenario for how its various modules, such as maintenance, storage dynamics, development, differentiation and life stages could have evolved since the beginning of life. We argue that the combination of homeostasis and maintenance induced the development of reserves and that subsequent increases in the maintenance costs came with increases of the reserve capacity. Life evolved from a multiple reserves - single structure system (prokaryotes, many protoctists) to systems with multiple reserves and two structures (plants) or single reserve and single structure (animals). This had profound consequences for the possible effects of temperature on rates. We present an alternative explanation for what became known as the down-regulation of maintenance at high growth rates in microorganisms; the density of the limiting reserve increases with the growth rate, and reserves do not require maintenance while structure-specific maintenance costs are independent of the growth rate. This is also the mechanism behind the variation of the respiration rate with body size among species. The DEB theory specifies reserve dynamics on the basis of the requirements of weak homeostasis and partitionability. We here present a new and simple mechanism for this dynamics which accounts for the rejection of mobilised reserve by busy maintenance/growth machinery. This module, like quite a few other modules of DEB theory, uses the theory of Synthesising Units; we review recent progress in this field. The plasticity of membranes that evolved in early eukaryotes is a major step forward in metabolic evolution; we discuss quantitative aspects of the efficiency of phagocytosis relative to the excretion of digestive enzymes to illustrate its importance. Some processes of adaptation and gene expression can be understood in terms of allocation

  12. Quantitative prediction of solute strengthening in aluminium alloys.

    PubMed

    Leyson, Gerard Paul M; Curtin, William A; Hector, Louis G; Woodward, Christopher F

    2010-09-01

    Despite significant advances in computational materials science, a quantitative, parameter-free prediction of the mechanical properties of alloys has been difficult to achieve from first principles. Here, we present a new analytic theory that, with input from first-principles calculations, is able to predict the strengthening of aluminium by substitutional solute atoms. Solute-dislocation interaction energies in and around the dislocation core are first calculated using density functional theory and a flexible-boundary-condition method. An analytic model for the strength, or stress to move a dislocation, owing to the random field of solutes, is then presented. The theory, which has no adjustable parameters and is extendable to other metallic alloys, predicts both the energy barriers to dislocation motion and the zero-temperature flow stress, allowing for predictions of finite-temperature flow stresses. Quantitative comparisons with experimental flow stresses at temperature T=78 K are made for Al-X alloys (X=Mg, Si, Cu, Cr) and good agreement is obtained.

  13. Theory Development: A Bridge between Practice and Research

    ERIC Educational Resources Information Center

    Southern, Stephen; Devlin, James

    2010-01-01

    Theory development is an intentional process by which marriage and family counselors may bridge the gap between research and practice. The theory building process includes inductive and deductive forms of reasoning, qualitative and quantitative approaches to knowledge development, and diffusion of innovations. Grounded theory provides an…

  14. Behavioral Momentum Theory: Equations and Applications

    ERIC Educational Resources Information Center

    Nevin, John A.; Shahan, Timothy A.

    2011-01-01

    Behavioral momentum theory provides a quantitative account of how reinforcers experienced within a discriminative stimulus context govern the persistence of behavior that occurs in that context. The theory suggests that all reinforcers obtained in the presence of a discriminative stimulus increase resistance to change, regardless of whether those…

  15. Quantitative Literacy: Geosciences and Beyond

    NASA Astrophysics Data System (ADS)

    Richardson, R. M.; McCallum, W. G.

    2002-12-01

    Quantitative literacy seems like such a natural for the geosciences, right? The field has gone from its origin as a largely descriptive discipline to one where it is hard to imagine failing to bring a full range of mathematical tools to the solution of geological problems. Although there are many definitions of quantitative literacy, we have proposed one that is analogous to the UNESCO definition of conventional literacy: "A quantitatively literate person is one who, with understanding, can both read and represent quantitative information arising in his or her everyday life." Central to this definition is the concept that a curriculum for quantitative literacy must go beyond the basic ability to "read and write" mathematics and develop conceptual understanding. It is also critical that a curriculum for quantitative literacy be engaged with a context, be it everyday life, humanities, geoscience or other sciences, business, engineering, or technology. Thus, our definition works both within and outside the sciences. What role do geoscience faculty have in helping students become quantitatively literate? Is it our role, or that of the mathematicians? How does quantitative literacy vary between different scientific and engineering fields? Or between science and nonscience fields? We will argue that successful quantitative literacy curricula must be an across-the-curriculum responsibility. We will share examples of how quantitative literacy can be developed within a geoscience curriculum, beginning with introductory classes for nonmajors (using the Mauna Loa CO2 data set) through graduate courses in inverse theory (using singular value decomposition). We will highlight six approaches to across-the-curriculum efforts from national models: collaboration between mathematics and other faculty; gateway testing; intensive instructional support; workshops for nonmathematics faculty; quantitative reasoning requirement; and individual initiative by nonmathematics faculty.

  16. Revisiting the arthritogenic peptide theory: quantitative not qualitative changes in the peptide repertoire of HLA-B27 allotypes.

    PubMed

    Schittenhelm, Ralf B; Sian, Terry C C Lim Kam; Wilmann, Pascal G; Dudek, Nadine L; Purcell, Anthony W

    2015-03-01

    The association of HLA-B27 with spondyloarthropathy is one of the strongest documented for any autoimmune disease. A common hypothesis for this association is the arthritogenic peptide concept. This dictates that differences in the peptide binding preferences of disease-associated and non-disease-associated HLA-B27 allotypes underlie the presentation of bacterial and self-peptides, leading to cross-reactive T cell immunity and subsequent autoimmune attack of affected tissues. The aim of this study was to analyze and compare self-peptides from 8 HLA-B27 allotypes, to increase existing data sets of HLA-B27 ligands, to refine and compare their consensus-binding motifs, and to reveal similarities and differences in the peptide repertoire of the HLA-B27 subtypes. Qualitative differences in the peptides bound to the 8 most frequent HLA-B27 subtypes were determined by tandem mass spectrometry, and quantitative changes in allelic binding specificities were determined by highly sensitive and targeted multiple reaction monitoring mass spectrometry. We identified >7,500 major histocompatibility complex class I peptides derived from the 8 most common HLA-B27 allotypes (HLA-B*27:02 to HLA-B*27:09). We describe individual binding motifs for these alleles for the 9-12-mer ligands. The peptide repertoires of these closely related alleles showed significant overlap. Allelic polymorphisms resulting in changes in the amino acid composition of the antigen-binding cleft manifested largely as quantitative changes in the peptide cargo of these molecules. Absolute binding preferences of HLA-B27 allotypes do not explain disease association. The arthritogenic peptide theory needs to be reassessed in terms of quantitative changes in self-peptide presentation, T cell selection, and altered conformation of bound peptides. Copyright © 2015 by the American College of Rheumatology.

  17. Dynamic molecular structure retrieval from low-energy laser-induced electron diffraction spectra

    NASA Astrophysics Data System (ADS)

    Vu, Dinh-Duy T.; Phan, Ngoc-Loan T.; Hoang, Van-Hung; Le, Van-Hoang

    2017-12-01

    A recently developed quantitative rescattering theory showed that a laser-free elastic cross section can be separated from laser-induced electron diffraction (LIED) spectra. Based upon this idea, Blaga et al investigated the possibility of reconstructing molecular structure from LIED spectra (2012 Nature 483 7388). In that study, an independent atoms model (IAM) was used to interpret high-energy electron-molecule collisions induced by a mid-infrared laser. Our research aims to extend the application range of this structural retrieval method to low-energy spectra induced by the more common near-infrared laser sources. The IAM is insufficient in this case, so we switch to a more comprehensive model, the multiple scattering (MS) theory. From the original version, which concerns only neutral targets, we upgrade the model so that it is compatible with electron-ion collisions at low energy. With the available LIED experimental data for CO2 and O2, the upgraded MS theory is shown to be highly effective as a tool for molecular imaging from spectra induced by a near-infrared laser. The captured image corresponds to about 2 fs after ionization, shorter than the 4-6 fs obtained using the mid-infrared laser in Blaga's experiment.

  18. Quantifying falsifiability of scientific theories

    NASA Astrophysics Data System (ADS)

    Nemenman, Ilya

    I argue that the notion of falsifiability, a key concept in defining a valid scientific theory, can be quantified using Bayesian Model Selection, which is a standard tool in modern statistics. This relates falsifiability to the quantitative version of the statistical Occam's razor, and allows transforming some long-running arguments about validity of scientific theories from philosophical discussions to rigorous mathematical calculations.
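
    A toy version of this calculation (our own illustration, not the author's) makes the point concrete: a sharp, easily falsified model is rewarded when the data land where it predicts, because Bayesian model selection automatically penalizes the vaguer alternative:

        from math import comb

        def evidence_sharp(k: int, n: int) -> float:
            # M0: a sharp, highly falsifiable hypothesis - a fair coin, p = 0.5.
            return comb(n, k) * 0.5**n

        def evidence_vague(k: int, n: int) -> float:
            # M1: a vague hypothesis - p uniform on [0, 1]; marginalizing the
            # binomial likelihood over p gives exactly 1 / (n + 1).
            return 1.0 / (n + 1)

        # Bayes factor M0/M1 for 52 heads in 100 flips: the sharp model wins,
        # illustrating the statistical Occam's razor mentioned above.
        print(evidence_sharp(52, 100) / evidence_vague(52, 100))   # ~7.4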

  19. Resource Theory of Superposition

    NASA Astrophysics Data System (ADS)

    Theurer, T.; Killoran, N.; Egloff, D.; Plenio, M. B.

    2017-12-01

    The superposition principle lies at the heart of many nonclassical properties of quantum mechanics. Motivated by this, we introduce a rigorous resource theory framework for the quantification of superposition of a finite number of linearly independent states. This theory is a generalization of resource theories of coherence. We determine the general structure of operations which do not create superposition, find a fundamental connection to unambiguous state discrimination, and propose several quantitative superposition measures. Using this theory, we show that trace decreasing operations can be completed for free which, when specialized to the theory of coherence, resolves an outstanding open question and is used to address the free probabilistic transformation between pure states. Finally, we prove that linearly independent superposition is a necessary and sufficient condition for the faithful creation of entanglement in discrete settings, establishing a strong structural connection between our theory of superposition and entanglement theory.

  20. Wavelength and intensity dependence of recollision-enhanced multielectron effects in high-order harmonic generation

    NASA Astrophysics Data System (ADS)

    Abanador, Paul M.; Mauger, François; Lopata, Kenneth; Gaarde, Mette B.; Schafer, Kenneth J.

    2018-04-01

    Using a model molecular system (A2) with two active electrons restricted to one dimension, we examine high-order harmonic generation (HHG) enhanced by rescattering. Our results show that even at intensities well below the single ionization saturation, harmonics generated from the cation (A2+ ) can be significantly enhanced due to the rescattering of the electron that is initially ionized. This two-electron effect is manifested by the appearance of a secondary plateau and cutoff in the HHG spectrum, extending beyond the predicted cutoff in the single active electron approximation. We use our molecular model to investigate the wavelength dependence of rescattering enhanced HHG, which was first reported in a model atomic system [I. Tikhomirov, T. Sato, and K. L. Ishikawa, Phys. Rev. Lett. 118, 203202 (2017), 10.1103/PhysRevLett.118.203202]. We demonstrate that the HHG yield in the secondary cutoff is highly sensitive to the available electron rescattering energies as indicated by a dramatic scaling with respect to driving wavelength.

  1. [Quantitative analysis method based on fractal theory for medical imaging of normal brain development in infants].

    PubMed

    Li, Heheng; Luo, Liangping; Huang, Li

    2011-02-01

    The present paper aims to study the fractal spectrum of cerebral computerized tomography in 158 normal infants of different age groups, based on calculations from chaos theory. The distribution range in the neonatal period was 1.88-1.90 (mean = 1.8913 +/- 0.0064); it reached a stable condition at the level of 1.89-1.90 during 1-12 months of age (mean = 1.8927 +/- 0.0045); the normal range for 1-2 year-old infants was 1.86-1.90 (mean = 1.8863 +/- 0.0085); and the quantitative value remained invariant within 1.88-1.91 (mean = 1.8958 +/- 0.0083) during 2-3 years of age. ANOVA indicated no significant difference between boys and girls (F = 0.243, P > 0.05), but the difference between age groups was significant (F = 8.947, P < 0.001). The fractal dimension of cerebral computerized tomography in normal infants, computed by box-counting methods, remained stable between 1.86 and 1.91, indicating that certain attractor modes exist in pediatric brain development.
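
    The box method referred to here estimates the fractal dimension from how the number of occupied boxes scales with box size. A minimal sketch on a binary image (implementation and toy data are ours):

        import numpy as np

        def box_counting_dimension(img: np.ndarray, sizes=(2, 4, 8, 16, 32)) -> float:
            """Estimate the box-counting dimension of a binary 2D array by counting
            occupied boxes N(s) at several box sizes s and fitting log N(s) ~ -D log s."""
            counts = []
            for s in sizes:
                # Trim so the image tiles exactly, then count boxes containing any pixel.
                h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
                blocks = img[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
            slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
            return -slope

        # Sanity check: a filled square should give a dimension close to 2.
        print(box_counting_dimension(np.ones((64, 64), dtype=bool)))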

  2. Dynamical Systems Theory in Quantitative Psychology and Cognitive Science: A Fair Discrimination between Deterministic and Statistical Counterparts is Required.

    PubMed

    Gadomski, Adam; Ausloos, Marcel; Casey, Tahlia

    2017-04-01

    This article addresses a set of observations framed in both deterministic as well as statistical formal guidelines. It operates within the framework of nonlinear dynamical systems theory (NDS). It is argued that statistical approaches can manifest themselves ambiguously, creating practical discrepancies in psychological and cognitive data analyses both quantitatively and qualitatively; this is sometimes termed in the literature 'questionable research practices.' This communication points to the demand for a deeper awareness of the data's initial conditions, allowing researchers to focus on pertinent evolution constraints in such systems. It also considers whether the exponential (Malthus-type) or the algebraic (Pareto-type) statistical distribution ought to be effectively considered in practical interpretations. The role of repetitive specific behaviors by patients seeking treatment is examined within the NDS frame. The significance of these behaviors, which involve a certain memory effect, seems crucial in determining a patient's progression or regression. With this perspective, it is discussed how a sensitively applied hazardous or triggering factor can be helpful for well-controlled psychological strategic treatments; those attributable to obsessive-compulsive disorders or self-injurious behaviors are recalled in particular. There are both inherent criticality- and complexity-exploiting (reduced-variance based) relations between a therapist and a patient that can be intrinsically included in NDS theory.

  3. QUANTITATIVE DECISION TOOLS AND MANAGEMENT DEVELOPMENT PROGRAMS.

    ERIC Educational Resources Information Center

    BYARS, LLOYD L.; NUNN, GEOFFREY E.

    This article outlined the current status of quantitative methods and operations research (OR), sketched the strengths of training efforts and isolated weaknesses, and formulated workable criteria for evaluating the success of operations research training programs. A survey of 105 companies revealed that PERT, inventory control theory, and linear…

  4. QuantCrit: Rectifying Quantitative Methods through Critical Race Theory

    ERIC Educational Resources Information Center

    Garcia, Nichole M.; López, Nancy; Vélez, Verónica N.

    2018-01-01

    Critical race theory (CRT) in education centers, examines, and seeks to transform the relationship that undergirds race, racism, and power. CRT scholars have applied a critical race framework to advance research methodologies, namely qualitative interventions. Informed by this work, and 15 years later, this article reconsiders the possibilities of…

  5. Universality and predictability in molecular quantitative genetics.

    PubMed

    Nourmohammad, Armita; Held, Torsten; Lässig, Michael

    2013-12-01

    Molecular traits, such as gene expression levels or protein binding affinities, are increasingly accessible to quantitative measurement by modern high-throughput techniques. Such traits measure molecular functions and, from an evolutionary point of view, are important as targets of natural selection. We review recent developments in evolutionary theory and experiments that are expected to become building blocks of a quantitative genetics of molecular traits. We focus on universal evolutionary characteristics: these are largely independent of a trait's genetic basis, which is often at least partially unknown. We show that universal measurements can be used to infer selection on a quantitative trait, which determines its evolutionary mode of conservation or adaptation. Furthermore, universality is closely linked to predictability of trait evolution across lineages. We argue that universal trait statistics extends over a range of cellular scales and opens new avenues of quantitative evolutionary systems biology. Copyright © 2013. Published by Elsevier Ltd.

  6. Image change detection using paradoxical theory for patient follow-up quantitation and therapy assessment.

    PubMed

    David, Simon; Visvikis, Dimitris; Quellec, Gwénolé; Le Rest, Catherine Cheze; Fernandez, Philippe; Allard, Michèle; Roux, Christian; Hatt, Mathieu

    2012-09-01

    In clinical oncology, positron emission tomography (PET) imaging can be used to assess therapeutic response by quantifying the evolution of semi-quantitative values such as standardized uptake value, early during treatment or after treatment. Current guidelines do not include metabolically active tumor volume (MATV) measurements and derived parameters such as total lesion glycolysis (TLG) to characterize the response to the treatment. To achieve automatic MATV variation estimation during treatment, we propose an approach based on the change detection principle using the recent paradoxical theory, which models imprecision, uncertainty, and conflict between sources. It was applied here simultaneously to pre- and post-treatment PET scans. The proposed method was applied to both simulated and clinical datasets, and its performance was compared to adaptive thresholding applied separately on pre- and post-treatment PET scans. On simulated datasets, the adaptive threshold was associated with significantly higher classification errors than the developed approach. On clinical datasets, the proposed method led to results more consistent with the known partial responder status of these patients. The method requires accurate rigid registration of both scans which can be obtained only in specific body regions and does not explicitly model uptake heterogeneity. In further investigations, the change detection of intra-MATV tracer uptake heterogeneity will be developed by incorporating textural features into the proposed approach.

  7. An overview of quantitative approaches in Gestalt perception.

    PubMed

    Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H

    2016-09-01

    Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. On Measuring Quantitative Interpretations of Reasonable Doubt

    ERIC Educational Resources Information Center

    Dhami, Mandeep K.

    2008-01-01

    Beyond reasonable doubt represents a probability value that acts as the criterion for conviction in criminal trials. I introduce the membership function (MF) method as a new tool for measuring quantitative interpretations of reasonable doubt. Experiment 1 demonstrated that three different methods (i.e., direct rating, decision theory based, and…

  9. Advanced quantitative measurement methodology in physics education research

    NASA Astrophysics Data System (ADS)

    Wang, Jing

    This dissertation consists of three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions, and both are applied to Force Concept Inventory data obtained from students enrolled at The Ohio State University. Effort was made to examine the similarities and differences between the two theories, and possible explanations for the differences. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory; the IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on the measure of association for binary data. In quantitative assessment, binary data are often encountered because of their simplicity. The current popular measures of association fail under some extremely unbalanced conditions, yet the occurrence of these conditions is not rare in educational data. Two popular association measures, the Pearson correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages, and special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed, and typical misunderstandings and misusage of EFA are explored. The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment of scientific reasoning skills. The
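
    To make the CTT/IRT contrast in the first part concrete, a minimal sketch (toy data and parameter values are ours): CTT summarizes an item by a sample statistic, while IRT models the probability of a correct response as a function of latent ability:

        import numpy as np

        # CTT: item difficulty is the proportion of correct responses (sample-dependent).
        responses = np.array([1, 0, 1, 1, 0, 1, 1, 0, 1, 1])   # one item, ten students
        print(responses.mean())

        # IRT (2-parameter logistic): P(correct) given latent ability theta,
        # item discrimination a, and item difficulty b.
        def irt_2pl(theta: float, a: float, b: float) -> float:
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        print(irt_2pl(0.0, a=1.2, b=-0.3))   # probability at average ability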

  10. A Qualitative-Quantitative H-NMR Experiment for the Instrumental Analysis Laboratory.

    ERIC Educational Resources Information Center

    Phillips, John S.; Leary, James J.

    1986-01-01

    Describes an experiment combining qualitative and quantitative information from hydrogen nuclear magnetic resonance spectra. Reviews theory, discusses the experimental approach, and provides sample results. (JM)

  11. Revisiting final state interaction in charmless Bq→P P decays

    NASA Astrophysics Data System (ADS)

    Chua, Chun-Khiang

    2018-05-01

    Various new measurements in charmless B_{u,d,s} → PP modes, where P is a low-lying pseudoscalar meson, have been reported by Belle and LHCb. These include the rates of B0 → π0π0, ηπ0, Bs → η'η', B0 → K+K- and Bs0 → π+π- decays. Some of these modes are highly suppressed and are among the rarest B decays. Direct CP asymmetries in various modes are constantly updated. It is well known that direct CP asymmetries and the rates of suppressed modes are sensitive to final state interaction (FSI). As new measurements are reported and more data are collected, it is interesting and timely to revisit the rescattering effects in B_{u,d,s} → PP states. We perform a χ2 analysis with all available data on CP-averaged rates and CP asymmetries in B̄_{u,d,s} → PP decays. Our numerical results are compared to data and to those from the factorization approach. The quality of the fit improves significantly over the factorization results in the presence of rescattering. The relations between topological amplitudes and rescattering are explored, and they help to provide a better understanding of the effects of FSI. As suggested by U(3) symmetry on topological amplitudes and FSI, a vanishing exchange-rescattering scenario is considered. The exchange, annihilation, u-penguin, u-penguin-annihilation, and some electroweak-penguin amplitudes are enhanced significantly via annihilation and total-annihilation rescatterings. In particular, the u-penguin-annihilation amplitude is sizably enhanced by the tree amplitude via total-annihilation rescattering. These enhancements affect rates and CP asymmetries. The predictions can be checked in the near future.

  12. A quantitative test of population genetics using spatiogenetic patterns in bacterial colonies.

    PubMed

    Korolev, Kirill S; Xavier, João B; Nelson, David R; Foster, Kevin R

    2011-10-01

    It is widely accepted that population-genetics theory is the cornerstone of evolutionary analyses. Empirical tests of the theory, however, are challenging because of the complex relationships between space, dispersal, and evolution. Critically, we lack quantitative validation of the spatial models of population genetics. Here we combine analytics, on- and off-lattice simulations, and experiments with bacteria to perform quantitative tests of the theory. We study two bacterial species, the gut microbe Escherichia coli and the opportunistic pathogen Pseudomonas aeruginosa, and show that spatiogenetic patterns in colony biofilms of both species are accurately described by an extension of the one-dimensional stepping-stone model. We use one empirical measure, genetic diversity at the colony periphery, to parameterize our models and show that we can then accurately predict another key variable: the degree of short-range cell migration along an edge. Moreover, the model allows us to estimate other key parameters, including effective population size (density) at the expansion frontier. While our experimental system is a simplification of natural microbial communities, we argue that it constitutes proof of principle that the spatial models of population genetics can quantitatively capture organismal evolution.
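
    A minimal sketch of the one-dimensional stepping-stone dynamics underlying the model (parameters and implementation are ours; the paper's extension is more elaborate):

        import numpy as np

        rng = np.random.default_rng(0)

        def stepping_stone(n_demes=100, deme_size=50, m=0.1, generations=200):
            """1D stepping-stone model: each deme's allele frequency evolves by
            nearest-neighbour migration followed by binomial genetic drift."""
            p = np.full(n_demes, 0.5)   # two equally common alleles to start
            for _ in range(generations):
                left, right = np.roll(p, 1), np.roll(p, -1)
                p_mig = (1 - m) * p + (m / 2) * (left + right)   # migration step
                p = rng.binomial(deme_size, p_mig) / deme_size   # drift step
            return p

        # Mean local heterozygosity decays as drift produces monoallelic sectors,
        # the spatiogenetic demixing pattern seen at expanding colony edges.
        p = stepping_stone()
        print(np.mean(2 * p * (1 - p)))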

  13. Quantitative magnetic resonance micro-imaging methods for pharmaceutical research.

    PubMed

    Mantle, M D

    2011-09-30

    The use of magnetic resonance imaging (MRI) as a tool in pharmaceutical research is now well established and the current literature covers a multitude of different pharmaceutically relevant research areas. This review focuses on the use of quantitative magnetic resonance micro-imaging techniques and how they have been exploited to extract information that is of direct relevance to the pharmaceutical industry. The article is divided into two main areas. The first half outlines the theoretical aspects of magnetic resonance and deals with basic magnetic resonance theory, the effects of nuclear spin-lattice (T(1)), spin-spin (T(2)) relaxation and molecular diffusion upon image quantitation, and discusses the applications of rapid magnetic resonance imaging techniques. In addition to the theory, the review aims to provide some practical guidelines for the pharmaceutical researcher with an interest in MRI as to which MRI pulse sequences/protocols should be used and when. The second half of the article reviews the recent advances and developments that have appeared in the literature concerning the use of quantitative micro-imaging methods to pharmaceutically relevant research. Copyright © 2010 Elsevier B.V. All rights reserved.

  14. Theory of Semiconducting Superlattices and Microstructures

    DTIC Science & Technology

    1992-03-01

    The theory elucidates the various factors affecting deep levels, sets forth the conditions for obtaining shallow-deep transitions, and predicts that Si… The anion vacancy can be thought of as originating from Column O of the Periodic Table…

  15. Dark solitons, D-branes and noncommutative tachyon field theory

    NASA Astrophysics Data System (ADS)

    Giaccari, Stefano; Nian, Jun

    2017-11-01

    In this paper we discuss the boson/vortex duality by mapping the (3+1)D Gross-Pitaevskii theory into an effective string theory in the presence of boundaries. Via the effective string theory, we find the Seiberg-Witten map between the commutative and the noncommutative tachyon field theories, and consequently identify their soliton solutions with D-branes in the effective string theory. We perform various checks of the duality map and of the identification of soliton solutions. This new insight into the relation between the Gross-Pitaevskii theory and the effective string theory explains the similarity of these two systems at the quantitative level.

  16. Motivation to Speak English: A Self-Determination Theory Perspective

    ERIC Educational Resources Information Center

    Dincer, Ali; Yesilyurt, Savas

    2017-01-01

    Based on a modern motivation theory of learning, self-determination theory (SDT), this study aimed to investigate the relationships between English as a foreign language (EFL) learners' motivation to speak, autonomous regulation, autonomy support from teachers, and classroom engagement, with both quantitative and qualitative approaches. The…

  17. Real Fantasies in Mathematics Education: Numeracy, Quantitative Reasoners, and Transdisciplinary Wicked Problems

    ERIC Educational Resources Information Center

    Craig, Jeffrey Carl

    2017-01-01

    This dissertation has seven chapters. In chapter one, I discuss why I am doing this dissertation, my positionality, and how I learned from and with all of my committee members. Chapter two is where I situate my dissertation study through developing a social theory of quantitative literacy by translating a social theory of literacy (Barton…

  18. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    PubMed

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples' experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  19. Ionization, photoelectron dynamics and elastic scattering in relativistic, ultra-strong field

    NASA Astrophysics Data System (ADS)

    Luo, Sui

    wave-function spread. A relativistic rescattering enhancement occurs at 2 x 10^18 W/cm2, commensurate with relativistic motion of a classical electron in a single field cycle. The good agreement between the results and available experiments suggests the theoretical approach is well suited to modeling scattering in the ultrastrong intensity regime. We investigate the elastic scattering process as it changes from strong to ultrastrong fields using the photoelectron angular distributions from Ne, Ar, and Xe. Noble gas species with Hartree-Fock scattering potentials show a reduction in elastic rescattering with the increasing energy of ultrastrong fields. It is found that as the returning photoelectron energy increases, rescattering becomes the dominant mechanism behind the yield distribution as the emission angle for all the species extends from 0° to 90°. The relativistic effects and the magnetic field do not change the angular distribution until one is well into the Γ_r >> 1 regime, where the Lorentz deflection significantly reduces the yield. At the highest energies, the angular emission range narrows as the mechanism changes over to backscattering into narrow angles along the electric field.

  20. Loop Shaping Control Design for a Supersonic Propulsion System Model Using Quantitative Feedback Theory (QFT) Specifications and Bounds

    NASA Technical Reports Server (NTRS)

    Connolly, Joseph W.; Kopasakis, George

    2010-01-01

    This paper covers the propulsion system component modeling and controls development of an integrated mixed compression inlet and turbojet engine that will be used for an overall vehicle Aero-Propulso-Servo-Elastic (APSE) model. Using previously created nonlinear component-level propulsion system models, a linear integrated propulsion system model and loop shaping control design have been developed. The design includes both inlet normal shock position control and jet engine rotor speed control for a potential supersonic commercial transport. A preliminary investigation of the impacts of the aero-elastic effects on the incoming flow field to the propulsion system are discussed, however, the focus here is on developing a methodology for the propulsion controls design that prevents unstart in the inlet and minimizes the thrust oscillation experienced by the vehicle. Quantitative Feedback Theory (QFT) specifications and bounds, and aspects of classical loop shaping are used in the control design process. Model uncertainty is incorporated in the design to address possible error in the system identification mapping of the nonlinear component models into the integrated linear model.
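
    As a rough illustration of the frequency-domain bound checking that QFT involves (the toy plant, controller, and specification are ours, not the paper's propulsion model):

        import numpy as np

        w = np.logspace(-2, 2, 400)   # frequency grid (rad/s)
        s = 1j * w
        kp, ki, tau = 2.0, 1.0, 0.5
        C = kp + ki / s               # PI controller C(s) = kp + ki/s

        # QFT-style robustness check: for every plant in the uncertainty set,
        # keep the sensitivity |S| = |1/(1 + P*C)| below a specification bound.
        spec = 2.0                    # ~6 dB bound, illustrative
        ok = True
        for k in (0.5, 1.0, 2.0):     # sampled gain uncertainty of P(s) = k/(tau*s + 1)
            P = k / (tau * s + 1)
            ok &= bool(np.all(np.abs(1.0 / (1.0 + P * C)) < spec))
        print("sensitivity spec met for all plants:", ok)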

  1. On the classic and modern theories of matching.

    PubMed

    McDowell, J J

    2005-07-01

    Classic matching theory, which is based on Herrnstein's (1961) original matching equation and includes the well-known quantitative law of effect, is almost certainly false. The theory is logically inconsistent with known experimental findings, and experiments have shown that its central constant-k assumption is not tenable. Modern matching theory, which is based on the power function version of the original matching equation, remains tenable, although it has not been discussed or studied extensively. The modern theory is logically consistent with known experimental findings, it predicts the fact and details of the violation of the classic theory's constant-k assumption, and it accurately describes at least some data that are inconsistent with the classic theory.
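
    For concreteness (the equations are the standard published forms; the code and numbers are our own illustration), the classic theory's hyperbola and the modern power-function matching relation are:

        # Classic matching theory (Herrnstein's hyperbola): B = k*R / (R + Re),
        # with k assumed constant across conditions - the assumption shown untenable.
        def classic_response_rate(R: float, k: float = 100.0, Re: float = 20.0) -> float:
            return k * R / (R + Re)

        # Modern matching theory (power-function matching): B1/B2 = b * (R1/R2)**a,
        # where an exponent a < 1 describes undermatching.
        def modern_matching_ratio(R1: float, R2: float, a: float = 0.8, b: float = 1.0) -> float:
            return b * (R1 / R2) ** a

        print(classic_response_rate(60.0))         # responses/min at 60 reinforcers/hr
        print(modern_matching_ratio(60.0, 20.0))   # < 3.0, i.e. undermatching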

  2. Nuclear medicine and quantitative imaging research (instrumentation and quantitative methods of evaluation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1990-09-01

    This report summarizes goals and accomplishments of the research program supported under DOE Grant No. FG02-86ER60418, entitled Instrumentation and Quantitative Methods of Evaluation, with R. Beck, P.I. and M. Cooper, Co-P.I., during the period January 15, 1990 through September 1, 1990. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 7 figs.

  3. A synthesis theory for self-oscillating adaptive systems /SOAS/

    NASA Technical Reports Server (NTRS)

    Horowitz, I.; Smay, J.; Shapiro, A.

    1974-01-01

    A quantitative synthesis theory is presented for the Self-Oscillating Adaptive System (SOAS), whose nonlinear element has a static, odd character with hard saturation. The synthesis theory is based upon the quasilinear properties of the SOAS to forced inputs, which permit the extension of quantitative linear feedback theory to the SOAS. A reasonable definition of optimum design is shown to be the minimization of the limit cycle frequency. The great advantage of the SOAS is its zero sensitivity to pure gain changes. However, quasilinearity and control of the limit cycle amplitude at the system output impose additional constraints which partially or completely cancel this advantage, depending on the numerical values of the design parameters. By means of narrow-band filtering, an additional factor is introduced which permits a trade-off between filter complexity and limit cycle frequency minimization.

  4. Quantitative genetics of disease traits.

    PubMed

    Wray, N R; Visscher, P M

    2015-04-01

    John James authored two key papers on the theory of risk to relatives for binary disease traits and the relationship between parameters on the observed binary scale and an unobserved scale of liability (James Annals of Human Genetics, 1971; 35: 47; Reich, James and Morris Annals of Human Genetics, 1972; 36: 163). These two papers are John James' most cited papers (198 and 328 citations, November 2014). They have been influential in human genetics and have recently gained renewed popularity because of their relevance to the estimation of quantitative genetics parameters for disease traits using SNP data. In this review, we summarize the two early papers and put them into context. We show recent extensions of the theory for ascertained case-control data and review recent applications in human genetics. © 2015 Blackwell Verlag GmbH.
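
    The liability-scale relationship from the second paper is commonly used as a transformation between scales: h2_liability = h2_observed * K(1 - K)/z^2, where K is the prevalence and z is the standard normal density at the liability threshold. A minimal sketch (implementation ours):

        from math import exp, pi, sqrt
        from statistics import NormalDist

        def liability_h2(h2_obs: float, K: float) -> float:
            """Convert heritability on the observed 0/1 scale to the liability scale."""
            t = NormalDist().inv_cdf(1 - K)      # liability threshold for prevalence K
            z = exp(-t * t / 2) / sqrt(2 * pi)   # normal density at the threshold
            return h2_obs * K * (1 - K) / z**2

        # Example: 20% of observed-scale variance, 10% disease prevalence.
        print(liability_h2(0.20, 0.10))   # ~0.58 on the liability scale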

  5. Rapid quantitative chemical mapping of surfaces with sub-2 nm resolution

    NASA Astrophysics Data System (ADS)

    Lai, Chia-Yun; Perri, Saverio; Santos, Sergio; Garcia, Ricardo; Chiesa, Matteo

    2016-05-01

    We present a theory that exploits four observables in bimodal atomic force microscopy to produce maps of the Hamaker constant H. The quantitative H maps may be employed by the broader community to directly interpret the high resolution of standard bimodal AFM images as chemical maps while simultaneously quantifying chemistry in the non-contact regime. We further provide a simple methodology to optimize a range of operational parameters for which H is in the closest agreement with the Lifshitz theory in order to (1) simplify data acquisition and (2) generalize the methodology to any set of cantilever-sample systems.

  6. Developing Geoscience Students' Quantitative Skills

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2005-12-01

    Sophisticated quantitative skills are an essential tool for the professional geoscientist. While students learn many of these sophisticated skills in graduate school, it is increasingly important that they have a strong grounding in quantitative geoscience as undergraduates. Faculty have developed many strong approaches to teaching these skills in a wide variety of geoscience courses. A workshop in June 2005 brought together eight faculty teaching surface processes and climate change to discuss and refine activities they use and to publish them on the Teaching Quantitative Skills in the Geosciences website (serc.Carleton.edu/quantskills) for broader use. Workshop participants, in consultation with two mathematics faculty with expertise in math education, developed six review criteria to guide discussion: 1) Are the quantitative and geologic goals central and important? (e.g. problem solving, mastery of important skill, modeling, relating theory to observation); 2) Does the activity lead to better problem solving? 3) Are the quantitative skills integrated with geoscience concepts in a way that makes sense for the learning environment and supports learning both quantitative skills and geoscience? 4) Does the methodology support learning? (e.g. motivate and engage students; use multiple representations, incorporate reflection, discussion and synthesis) 5) Are the materials complete and helpful to students? 6) How well has the activity worked when used? Workshop participants found that reviewing each other's activities was very productive because they thought about new ways to teach, and the experience of reviewing helped them think about their own activity from a different point of view. The review criteria focused their thinking about the activity and would be equally helpful in the design of a new activity. We invite a broad international discussion of the criteria (serc.Carleton.edu/quantskills/workshop05/review.html). The teaching activities can be found on the

  7. A philosophy of science perspective on the quantitative analysis of behavior.

    PubMed

    Smith, Terry L

    2015-05-01

    B.F. Skinner argued that the science of behavior would progress more rapidly without appealing to theories of learning. He also suggested that theories in a quite different sense were possible, but that the science of behavior as of 1950 was not ready for them. The following analysis distinguishes between Skinner's two concepts of theory. It argues that theory in the second sense has arisen in the quantitative analysis of behavior. The attempt to give a dynamic account of the static regularities of this theory, however, has produced a theory in the first sense. Within its limited domain, this theory offers a rigorous alternative to cognitive accounts of behavior. Rather than distracting attention from actual behavior, it has now led to novel predictions about it. This article is part of a Special Issue entitled 'SQAB 2014'. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Genomic Quantitative Genetics to Study Evolution in the Wild.

    PubMed

    Gienapp, Phillip; Fior, Simone; Guillaume, Frédéric; Lasky, Jesse R; Sork, Victoria L; Csilléry, Katalin

    2017-12-01

    Quantitative genetic theory provides a means of estimating the evolutionary potential of natural populations. However, this approach was previously only feasible in systems where the genetic relatedness between individuals could be inferred from pedigrees or experimental crosses. The genomic revolution opened up the possibility of obtaining the realized proportion of genome shared among individuals in natural populations of virtually any species, which promises (more) accurate estimates of quantitative genetic parameters. Such a 'genomic' quantitative genetics approach relies on fewer assumptions, offers a greater methodological flexibility, and is thus expected to greatly enhance our understanding of evolution in natural populations, for example, in the context of adaptation to environmental change, eco-evolutionary dynamics, and biodiversity conservation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Boltzmann, Darwin and Directionality theory

    NASA Astrophysics Data System (ADS)

    Demetrius, Lloyd A.

    2013-09-01

    Boltzmann’s statistical thermodynamics is a mathematical theory which relates the macroscopic properties of aggregates of interacting molecules with the laws of their interaction. The theory is based on the concept of thermodynamic entropy, a statistical measure of the extent to which energy is spread throughout macroscopic matter. Macroscopic evolution of material aggregates is quantitatively explained in terms of the principle: Thermodynamic entropy increases as the composition of the aggregate changes under molecular collision. Darwin’s theory of evolution is a qualitative theory of the origin of species and the adaptation of populations to their environment. A central concept in the theory is fitness, a qualitative measure of the capacity of an organism to contribute to the ancestry of future generations. Macroscopic evolution of populations of living organisms can be qualitatively explained in terms of a neo-Darwinian principle: Fitness increases as the composition of the population changes under variation and natural selection. Directionality theory is a quantitative model of the Darwinian argument of evolution by variation and selection. This mathematical theory is based on the concept of evolutionary entropy, a statistical measure which describes the rate at which an organism appropriates energy from the environment and reinvests this energy into survivorship and reproduction. According to directionality theory, microevolutionary dynamics, that is, evolution by mutation and natural selection, can be quantitatively explained in terms of a directionality principle: Evolutionary entropy increases when the resources are diverse and of constant abundance, but decreases when the resource is singular and of variable abundance. This report reviews the analytical and empirical support for directionality theory, and invokes the microevolutionary dynamics of variation and selection to delineate the principles which govern macroevolutionary dynamics of speciation and

  10. Quantitative descriptions of generalized arousal, an elementary function of the vertebrate brain

    PubMed Central

    Quinkert, Amy Wells; Vimal, Vivek; Weil, Zachary M.; Reeke, George N.; Schiff, Nicholas D.; Banavar, Jayanth R.; Pfaff, Donald W.

    2011-01-01

    We review a concept of the most primitive, fundamental function of the vertebrate CNS, generalized arousal (GA). Three independent lines of evidence indicate the existence of GA: statistical, genetic, and mechanistic. Here we ask, is this concept amenable to quantitative analysis? Answering in the affirmative, four quantitative approaches have proven useful: (i) factor analysis, (ii) information theory, (iii) deterministic chaos, and (iv) application of a Gaussian equation. It strikes us that, to date, not just one but at least four different quantitative approaches seem necessary for describing different aspects of scientific work on GA. PMID:21555568

  11. Shaping Social Work Science: What Should Quantitative Researchers Do?

    ERIC Educational Resources Information Center

    Guo, Shenyang

    2015-01-01

    Based on a review of economists' debates on mathematical economics, this article discusses a key issue for shaping the science of social work--research methodology. The article describes three important tasks quantitative researchers need to fulfill in order to enhance the scientific rigor of social work research. First, to test theories using…

  12. Behavioral momentum theory: equations and applications.

    PubMed

    Nevin, John A; Shahan, Timothy A

    2011-01-01

    Behavioral momentum theory provides a quantitative account of how reinforcers experienced within a discriminative stimulus context govern the persistence of behavior that occurs in that context. The theory suggests that all reinforcers obtained in the presence of a discriminative stimulus increase resistance to change, regardless of whether those reinforcers are contingent on the target behavior, are noncontingent, or are even contingent on an alternative behavior. In this paper, we describe the equations that constitute the theory and address their application to issues of particular importance in applied settings. The theory provides a framework within which to consider the effects of interventions such as extinction, noncontingent reinforcement, differential reinforcement of alternative behavior, and other phenomena (e.g., resurgence). Finally, the theory predicts some counterintuitive and potentially counterproductive effects of alternative reinforcement, and can serve as an integrative guide for intervention when its terms are identified with the relevant conditions of applied settings.
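
    For orientation, the core resistance-to-change relation in this literature is often written in the following form (a minimal sketch assuming the usual Nevin-Grace notation, which the abstract itself does not reproduce): B_x is the response rate under a disruptor of magnitude x, B_0 the baseline rate, r the baseline reinforcement rate in the stimulus context, and a a sensitivity exponent.

```latex
\log\!\left(\frac{B_x}{B_0}\right) \;=\; \frac{-x}{r^{\,a}}
```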

  13. ECR plasma thruster research - Preliminary theory and experiments

    NASA Technical Reports Server (NTRS)

    Sercel, Joel C.; Fitzgerald, Dennis J.

    1989-01-01

    A preliminary theory of the operation of the electron-cyclotron-resonance (ECR) plasma thruster is described along with an outline of recent experiments. This work is presented to communicate the status of an ongoing research effort directed at developing a unified theory to quantitatively describe the operation of the ECR plasma thruster. The theory is presented as a set of nonlinear ordinary differential equations and boundary conditions which describe the plasma density, velocity, and electron temperature. Diagnostic tools developed to measure plasma conditions in the existing research device are described.

  14. Modelling Transposition Latencies: Constraints for Theories of Serial Order Memory

    ERIC Educational Resources Information Center

    Farrell, Simon; Lewandowsky, Stephan

    2004-01-01

    Several competing theories of short-term memory can explain serial recall performance at a quantitative level. However, most theories to date have not been applied to the accompanying pattern of response latencies, thus ignoring a rich and highly diagnostic aspect of performance. This article explores and tests the error latency predictions of…

  15. Evolutionary Game Theory Analysis of Tumor Progression

    NASA Astrophysics Data System (ADS)

    Wu, Amy; Liao, David; Sturm, James; Austin, Robert

    2014-03-01

    Evolutionary game theory applied to two interacting cell populations can yield quantitative predictions of the future densities of the two cell populations based on the initial interaction terms. We will discuss how, in a complex ecology, evolutionary game theory successfully predicts the future densities of strains of stromal and cancer cells (multiple myeloma), and discuss the possible clinical use of such analysis for predicting cancer progression. Supported by the National Science Foundation and the National Cancer Institute.
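
    A minimal sketch of the kind of calculation involved follows: replicator dynamics for two interacting cell types. The payoff matrix and initial frequencies are invented for illustration, not the interaction terms measured in the study.

```python
# Sketch: replicator dynamics for two interacting cell populations
# (e.g. cancer vs. stromal cells). Payoffs below are made up.
import numpy as np

A = np.array([[0.8, 1.2],   # payoff to cancer meeting (cancer, stroma)
              [1.0, 0.9]])  # payoff to stroma meeting (cancer, stroma)

def replicator_step(x, A, dt=0.01):
    """One Euler step of dx_i/dt = x_i * (f_i - mean fitness)."""
    f = A @ x          # fitness of each type at current frequencies
    phi = x @ f        # population mean fitness
    return x + dt * x * (f - phi)

x = np.array([0.1, 0.9])   # initial frequencies (cancer, stroma)
for _ in range(5000):
    x = replicator_step(x, A)
print("predicted long-run frequencies:", x.round(3))  # -> [0.6 0.4]
```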

  16. Test of Achievement in Quantitative Economics for Secondary Schools: Construction and Validation Using Item Response Theory

    ERIC Educational Resources Information Center

    Eleje, Lydia I.; Esomonu, Nkechi P. M.

    2018-01-01

    A test to measure achievement in quantitative economics among secondary school students was developed and validated in this study. The test is made up of 20 multiple-choice test items constructed based on quantitative economics sub-skills. Six research questions guided the study. Preliminary validation was done by two experienced teachers in…

  17. The Threat of Common Method Variance Bias to Theory Building

    ERIC Educational Resources Information Center

    Reio, Thomas G., Jr.

    2010-01-01

    The need for more theory building scholarship remains one of the pressing issues in the field of HRD. Researchers can employ quantitative, qualitative, and/or mixed methods to support vital theory-building efforts, understanding however that each approach has its limitations. The purpose of this article is to explore common method variance bias as…

  18. Employing Theories Far beyond Their Limits - Linear Dichroism Theory.

    PubMed

    Mayerhöfer, Thomas G

    2018-05-15

    Using linear polarized light, it is possible, in the case of ordered structures such as stretched polymers or single crystals, to determine the orientation of the transition moments of electronic and vibrational transitions. This not only helps to resolve overlapping bands, but also to assign the symmetry species of the transitions and to elucidate the structure. To perform spectral evaluation quantitatively, an approach sometimes called "Linear Dichroism Theory" is very often used. This approach links the relative orientation of the transition moment and polarization direction to the quantity absorbance. This linkage is highly questionable for several reasons. First of all, absorbance is a quantity that is by its definition not compatible with Maxwell's equations. Furthermore, absorbance seems not to be the quantity which is generally compatible with linear dichroism theory. In addition, linear dichroism theory disregards that it is not only the angle between transition moment and polarization direction, but also the angle between sample surface and transition moment, that influences band shape and intensity. Accordingly, the often invoked "magic angle" has never existed, and the orientation distribution influences spectra to a much higher degree than if linear dichroism theory held strictly. A last point that is completely ignored by linear dichroism theory is the fact that partially oriented or randomly-oriented samples usually consist of ordered domains. It is their size relative to the wavelength of light that can also greatly influence a spectrum. All these findings can help to elucidate orientation to a much higher degree by optical methods than currently thought possible by the users of linear dichroism theory. Hence, it is the goal of this contribution to point out these shortcomings of linear dichroism theory to its users to stimulate efforts to overcome the long-lasting stagnation of this important field. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA.

  19. A Quantitative Theory of Human Color Choices

    PubMed Central

    Komarova, Natalia L.; Jameson, Kimberly A.

    2013-01-01

    The system for colorimetry adopted by the Commission Internationale de l’Eclairage (CIE) in 1931, along with its subsequent improvements, represents a family of light mixture models that has served well for many decades for stimulus specification and reproduction when highly controlled color standards are important. Still, with regard to color appearance many perceptual and cognitive factors are known to contribute to color similarity, and, in general, to all cognitive judgments of color. Using experimentally obtained odd-one-out triad similarity judgments from 52 observers, we demonstrate that CIE-based models can explain a good portion (but not all) of the color similarity data. Color difference quantified by CIELAB ΔE explained behavior at levels of 81% (across all colors), 79% (across red colors), and 66% (across blue colors). We show that the unexplained variation cannot be ascribed to inter- or intra-individual variations among the observers, and points to the presence of additional factors shared by the majority of responders. Based on this, we create a quantitative model of a lexicographic semiorder type, which shows how different perceptual and cognitive influences can trade-off when making color similarity judgments. We show that by incorporating additional influences related to categorical and lightness and saturation factors, the model explains more of the triad similarity behavior, namely, 91% (all colors), 90% (reds), and 87% (blues). We conclude that distance in a CIE model is but the first of several layers in a hierarchy of higher-order cognitive influences that shape color triad choices. We further discuss additional mitigating influences outside the scope of CIE modeling, which can be incorporated in this framework, including well-known influences from language, stimulus set effects, and color preference bias. We also discuss universal and cultural aspects of the model as well as non-uniformity of the color space with respect to different
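
    A minimal sketch of the first layer of such a model follows: the CIE76 color difference and one plausible odd-one-out decision rule. The Lab coordinates and the "farthest from the closest pair" rule are illustrative assumptions; the paper's lexicographic-semiorder model layers further influences on top of this distance.

```python
# Sketch: CIELAB Delta-E (CIE76) scoring of an odd-one-out triad judgment.
import math
from itertools import combinations

def delta_e(c1, c2):
    """CIE76 color difference: Euclidean distance in (L*, a*, b*)."""
    return math.dist(c1, c2)

triad = {"A": (55.0, 60.0, 40.0),    # reddish (made-up coordinates)
         "B": (52.0, 55.0, 35.0),    # similar reddish
         "C": (60.0, -10.0, -40.0)}  # bluish

# The most similar pair has the smallest Delta-E; the remaining color is
# the model's odd-one-out prediction.
closest_pair = min(combinations(triad, 2),
                   key=lambda p: delta_e(triad[p[0]], triad[p[1]]))
odd_one_out = (set(triad) - set(closest_pair)).pop()
print("odd one out:", odd_one_out)  # -> C
```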

  20. Qualitative and quantitative reasoning about thermodynamics

    NASA Technical Reports Server (NTRS)

    Skorstad, Gordon; Forbus, Ken

    1989-01-01

    One goal of qualitative physics is to capture the tacit knowledge of engineers and scientists. It is shown how Qualitative Process theory can be used to express concepts of engineering thermodynamics. In particular, it is shown how to integrate qualitative and quantitative knowledge to solve textbook problems involving thermodynamic cycles, such as gas turbine plants and steam power plants. These ideas were implemented in a program called SCHISM. Its analysis of a sample textbook problem is described and plans for future work are discussed.

  1. Experimental Study of Exclusive H2(e,e'p)n Reaction Mechanisms at High Q2

    NASA Astrophysics Data System (ADS)

    Egiyan, K. S.; Asryan, G.; Gevorgyan, N.; Griffioen, K. A.; Laget, J. M.; Kuhn, S. E.; Adams, G.; Amaryan, M. J.; Ambrozewicz, P.; Anghinolfi, M.; Audit, G.; Avakian, H.; Bagdasaryan, H.; Baillie, N.; Ball, J. P.; Baltzell, N. A.; Barrow, S.; Batourine, V.; Battaglieri, M.; Bedlinskiy, I.; Bektasoglu, M.; Bellis, M.; Benmouna, N.; Berman, B. L.; Biselli, A. S.; Blaszczyk, L.; Bouchigny, S.; Boiarinov, S.; Bradford, R.; Branford, D.; Briscoe, W. J.; Brooks, W. K.; Bültmann, S.; Burkert, V. D.; Butuceanu, C.; Calarco, J. R.; Careccia, S. L.; Carman, D. S.; Cazes, A.; Chen, S.; Cole, P. L.; Collins, P.; Coltharp, P.; Cords, D.; Corvisiero, P.; Crabb, D.; Crede, V.; Cummings, J. P.; Dashyan, N.; de Masi, R.; de Vita, R.; de Sanctis, E.; Degtyarenko, P. V.; Denizli, H.; Dennis, L.; Deur, A.; Dharmawardane, K. V.; Dickson, R.; Djalali, C.; Dodge, G. E.; Donnelly, J.; Doughty, D.; Dugger, M.; Dytman, S.; Dzyubak, O. P.; Egiyan, H.; El Fassi, L.; Elouadrhiri, L.; Eugenio, P.; Fatemi, R.; Fedotov, G.; Feldman, G.; Feuerbach, R. J.; Fersch, R.; Garçon, M.; Gavalian, G.; Gilfoyle, G. P.; Giovanetti, K. L.; Girod, F. X.; Goetz, J. T.; Gonenc, A.; Gordon, C. I. O.; Gothe, R. W.; Guidal, M.; Guillo, M.; Guler, N.; Guo, L.; Gyurjyan, V.; Hadjidakis, C.; Hafidi, K.; Hakobyan, H.; Hakobyan, R. S.; Hanretty, C.; Hardie, J.; Hersman, F. W.; Hicks, K.; Hleiqawi, I.; Holtrop, M.; Hyde-Wright, C. E.; Ilieva, Y.; Ireland, D. G.; Ishkhanov, B. S.; Isupov, E. L.; Ito, M. M.; Jenkins, D.; Jo, H. S.; Joo, K.; Juengst, H. G.; Kalantarians, N.; Kellie, J. D.; Khandaker, M.; Kim, W.; Klein, A.; Klein, F. J.; Klimenko, A. V.; Kossov, M.; Krahn, Z.; Kramer, L. H.; Kubarovsky, V.; Kuhn, J.; Kuleshov, S. V.; Lachniet, J.; Langheinrich, J.; Lawrence, D.; Li, Ji; Livingston, K.; Lu, H. Y.; MacCormick, M.; Marchand, C.; Markov, N.; Mattione, P.; McAleer, S.; McKinnon, B.; McNabb, J. W. C.; Mecking, B. A.; Mehrabyan, S.; Melone, J. J.; Mestayer, M. D.; Meyer, C. A.; Mibe, T.; Mikhailov, K.; Minehart, R.; Mirazita, M.; Miskimen, R.; Mokeev, V.; Moriya, K.; Morrow, S. A.; Moteabbed, M.; Mueller, J.; Munevar, E.; Mutchler, G. S.; Nadel-Turonski, P.; Nasseripour, R.; Niccolai, S.; Niculescu, G.; Niculescu, I.; Niczyporuk, B. B.; Niroula, M. R.; Niyazov, R. A.; Nozar, M.; O'Rielly, G. V.; Osipenko, M.; Ostrovidov, A. I.; Park, K.; Pasyuk, E.; Paterson, C.; Anefalos Pereira, S.; Pierce, J.; Pivnyuk, N.; Pocanic, D.; Pogorelko, O.; Pozdniakov, S.; Preedom, B. M.; Price, J. W.; Prok, Y.; Protopopescu, D.; Raue, B. A.; Riccardi, G.; Ricco, G.; Ripani, M.; Ritchie, B. G.; Ronchetti, F.; Rosner, G.; Rossi, P.; Sabatié, F.; Salamanca, J.; Salgado, C.; Santoro, J. P.; Sapunenko, V.; Schumacher, R. A.; Serov, V. S.; Sharabian, Y. G.; Shvedunov, N. V.; Skabelin, A. V.; Smith, E. S.; Smith, L. C.; Sober, D. I.; Sokhan, D.; Stavinsky, A.; Stepanyan, S. S.; Stepanyan, S.; Stokes, B. E.; Stoler, P.; Strauch, S.; Taiuti, M.; Tedeschi, D. J.; Thoma, U.; Tkabladze, A.; Tkachenko, S.; Todor, L.; Tur, C.; Ungaro, M.; Vineyard, M. F.; Vlassov, A. V.; Watts, D. P.; Weinstein, L. B.; Weygand, D. P.; Williams, M.; Wolin, E.; Wood, M. H.; Yegneswaran, A.; Zana, L.; Zhang, J.; Zhao, B.; Zhao, Z. W.

    2007-06-01

    The reaction H2(e,e'p)n has been studied with full kinematic coverage for photon virtuality 1.75<Q2<5.5 GeV2. Comparisons with theory indicate that for very low values of neutron recoil momentum (pn<100MeV/c) the neutron is primarily a spectator and the reaction can be described by the plane-wave impulse approximation. For 100<pn<750MeV/c, rescattering dominates the cross section, while Δ production followed by the NΔ→NN transition is the primary contribution at higher momenta.

  2. LECTURES ON DECISION THEORY,

    DTIC Science & Technology

    These lecture notes deal with the mathematical theory of decision-making, i.e., mathematical models of situations in which there is a set of...individual and group decision-making as a quantitative science, in contrast with a field such as physics, suggests that mathematical theorizing on...phenomena of decision-making is very much an exploratory enterprise and that existing models have limited generality and applicability. The purpose is to

  3. Theory, Research, Practice, Caring, and Spirituality: Zen and the Art of Educational Administration

    ERIC Educational Resources Information Center

    Place, A. William

    2005-01-01

    Educational theory should be the foundation for most practice and research, but this is not a one-way relationship. Good inductive research may help to develop theory instead of coming from it. The relationship of theory to research can be different in qualitative research than in quantitative research. Practice can and should also inform and be…

  4. Scaling theory of magnetoresistance and carrier localization in Ga1-xMnxAs.

    PubMed

    Moca, C P; Sheu, B L; Samarth, N; Schiffer, P; Janko, B; Zarand, G

    2009-04-03

    We compare experimental resistivity data on Ga1-xMnxAs films with theoretical calculations using a scaling theory for strongly disordered ferromagnets. The characteristic features of the temperature dependent resistivity can be quantitatively understood through this approach as originating from the close vicinity of the metal-insulator transition. However, accounting for thermal fluctuations is crucial for a quantitative description of the magnetic field induced changes in resistance. While the noninteracting scaling theory is in reasonable agreement with the data, we find clear evidence for interaction effects at low temperatures.

  5. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    PubMed

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  6. Design of Learning Model of Logic and Algorithms Based on APOS Theory

    ERIC Educational Resources Information Center

    Hartati, Sulis Janu

    2014-01-01

    The research questions were "what are the characteristics of a learning model of logic and algorithms according to APOS theory" and "whether or not this learning model can improve students' learning outcomes". This research was conducted by exploration and a quantitative approach. Exploration was used in constructing theory about the…

  7. The moon illusion: II. A reference theory.

    PubMed

    Baird, J C

    1982-09-01

    The present theory provides explanations for the moon illusion and related issues involving size and distance perception in natural, outdoor settings. Although some assumptions of previous theories are rejected, other pivotal aspects are retained in this formulation. In particular, the present theory states that both the sky and ground are important referents in judging the spatial extent of the moon. Neither factor alone can account for all the available data, but quantitative models incorporating both factors do quite well when applied to the parametric findings of Holway and Boring, as well as to the results obtained by Kaufman and Rock. The reference theory and its associated class of specific models suggest new theoretical directions and experimental tests to narrow yet further the selection of appropriate explanations for one of visual perception's oldest unsolved puzzles.

  8. Conceptual Diversity, Moderators, and Theoretical Issues in Quantitative Studies of Cultural Capital Theory

    ERIC Educational Resources Information Center

    Tan, Cheng Yong

    2017-01-01

    The present study reviewed quantitative empirical studies examining the relationship between cultural capital and student achievement. Results showed that researchers had conceptualized and measured cultural capital in different ways. It is argued that the more holistic understanding of the construct beyond highbrow cultural consumption must be…

  9. A general theory of interference fringes in x-ray phase grating imaging.

    PubMed

    Yan, Aimin; Wu, Xizeng; Liu, Hong

    2015-06-01

    The authors note that the concept of the Talbot self-image distance in x-ray phase grating interferometry is indeed not well defined for polychromatic x-rays, because both the grating phase shift and the fractional Talbot distances are all x-ray wavelength-dependent. For x-ray interferometry optimization, there is a need for a quantitative theory that is able to predict if a good intensity modulation is attainable at a given grating-to-detector distance. In this work, the authors set out to meet this need. In order to apply Fourier analysis directly to the intensity fringe patterns of two-dimensional and one-dimensional phase grating interferometers, the authors start their derivation from a general phase space theory of x-ray phase-contrast imaging. Unlike previous Fourier analyses, the authors evolved the Wigner distribution to obtain closed-form expressions of the Fourier coefficients of the intensity fringes for any grating-to-detector distance, even if it is not a fractional Talbot distance. The developed theory determines the visibility of any diffraction order as a function of the grating-to-detector distance, the phase shift of the grating, and the x-ray spectrum. The authors demonstrate that the visibilities of diffraction orders can serve as the indicators of the underlying interference intensity modulation. Applying the theory to the conventional and inverse geometry configurations of single-grating interferometers, the authors demonstrated that the proposed theory provides a quantitative tool for the grating interferometer optimization with or without the Talbot-distance constraints. In this work, the authors developed a novel theory of the interference intensity fringes in phase grating x-ray interferometry. This theory provides a quantitative tool in design optimization of phase grating x-ray interferometers.
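
    A brute-force numerical sketch of the quantity at stake, fringe visibility at an arbitrary grating-to-detector distance, is given below: paraxial (Fresnel) propagation of a π/2 phase grating followed by Fourier analysis of the intensity pattern. The wavelength, period, and grid size are assumed values; the paper's closed-form coefficients replace exactly this kind of numerical check.

```python
# Sketch: fringe visibility vs. grating-to-detector distance for a pi/2
# binary phase grating, via paraxial propagation. Geometry is illustrative.
import numpy as np

lam = 0.5e-10        # 0.5 angstrom x-rays (~25 keV), assumed
p = 4.0e-6           # grating period, assumed
N, periods = 4096, 64
x = np.linspace(0, periods * p, N, endpoint=False)
u0 = np.exp(1j * (np.pi / 2) * (np.sin(2 * np.pi * x / p) > 0))

f = np.fft.fftfreq(N, d=x[1] - x[0])

def visibility(z):
    """First-order fringe visibility 2|c1|/c0 of the intensity at distance z."""
    u = np.fft.ifft(np.fft.fft(u0) * np.exp(-1j * np.pi * lam * z * f**2))
    I = np.abs(u)**2
    c0 = I.mean()
    c1 = np.abs((I * np.exp(-2j * np.pi * x / p)).mean())
    return 2 * c1 / c0

zT = 2 * p**2 / lam  # Talbot distance
for frac in (1/16, 1/8, 1/4, 1/2):
    print(f"z = {frac:.4f} zT : visibility = {visibility(frac * zT):.3f}")
```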

  10. Toward a quantitative approach to migrants integration

    NASA Astrophysics Data System (ADS)

    Barra, A.; Contucci, P.

    2010-03-01

    Migration phenomena and all the related issues, like the integration of different social groups, are intrinsically complex problems since they strongly depend on several competitive mechanisms such as economic factors, cultural differences and many others. By identifying a few essential assumptions, and using the statistical mechanics of complex systems, we propose a novel quantitative approach that provides a minimal theory for those phenomena. We show that the competitive interactions in decision making between a population of N host citizens and P immigrants, a bi-partite spin glass, give rise to a social consciousness inside the host community in the sense of the associative memory of neural networks. The theory leads to a natural quantitative definition of migrants' "integration" inside the community. From the technical point of view, this minimal picture assumes, as control parameters, only general notions like the strength of the random interactions, the ratio between the sizes of the two parties and the cultural influence. A few steps toward more refined models, which include the kind of experiences felt, some structure on the random interaction topology (such as dilution, to avoid the plain mean-field approach) and correlations between the experiences felt by the two parties (biasing the distribution of the couplings), are discussed at the end, where we show the robustness of our approach.

  11. Renormalization Group Theory for the Imbalanced Fermi Gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gubbels, K. B.; Stoof, H. T. C.

    2008-04-11

    We formulate a Wilsonian renormalization group theory for the imbalanced Fermi gas. The theory is able to recover quantitatively well-established results in both the weak-coupling and the strong-coupling (unitarity) limits. We determine for the latter case the line of second-order phase transitions of the imbalanced Fermi gas and, in particular, the location of the tricritical point. We obtain good agreement with the recent experiments of Y. Shin et al. [Nature (London) 451, 689 (2008)].

  12. Secondary Schools Principals and Their Job Satisfaction: A Test of Process Theories

    ERIC Educational Resources Information Center

    Maforah, Tsholofelo Paulinah

    2015-01-01

    The study aims to test the validity of process theories on the job satisfaction of previously disadvantaged Secondary School principals in the North West province. A mixed-method approach consisting of both quantitative and qualitative methods was used for the study. A questionnaire was administered during the quantitative phase with a sample that…

  13. Corporatized Higher Education: A Quantitative Study Examining Faculty Motivation Using Self-Determination Theory

    ERIC Educational Resources Information Center

    Brown, Aaron D.

    2016-01-01

    The intent of this research is to offer a quantitative analysis of self-determined faculty motivation within the current corporate model of higher education across public and private research universities. With such a heightened integration of accountability structures, external reward systems, and the ongoing drive for more money and…

  14. Quantitative design of emergency monitoring network for river chemical spills based on discrete entropy theory.

    PubMed

    Shi, Bin; Jiang, Jiping; Sivakumar, Bellie; Zheng, Yi; Wang, Peng

    2018-05-01

    Field monitoring strategy is critical for disaster preparedness and watershed emergency environmental management. However, developing such a strategy is highly challenging. Despite the efforts and progress thus far, no definitive guidelines or solutions are available worldwide for quantitatively designing a monitoring network in response to river chemical spill incidents, except general rules based on administrative divisions or arbitrary interpolation on routine monitoring sections. To address this gap, a novel framework for spatial-temporal network design was proposed in this study. The framework combines contaminant transport modelling with discrete entropy theory and spectral analysis. The water quality model was applied to forecast the spatio-temporal distribution of the contaminant after a spill, and then corresponding information transfer indexes (ITIs) and Fourier approximation periodic functions were estimated as critical measures for setting sampling locations and times. The results indicate that the framework can produce scientific emergency-monitoring preparedness plans based on scenario analysis of spill risks, as well as rapid designs when an incident occurs without prior preparation. The framework was applied to a hypothetical spill case based on a tracer experiment and a real nitrobenzene spill incident case to demonstrate its suitability and effectiveness. The newly-designed temporal-spatial monitoring network captured major pollution information at relatively low costs. It showed obvious benefits for follow-up early-warning and treatment as well as for aftermath recovery and assessment. The underlying drivers of ITIs as well as the limitations and uncertainty of the approach were analyzed based on the case studies. Comparison with existing monitoring network design approaches, management implications, and generalized applicability were also discussed. Copyright © 2018 Elsevier Ltd. All rights reserved.
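
    The paper's exact ITI definition is not reproduced in the abstract, but the generic discrete-entropy building block such a framework rests on, mutual information between binned concentration series, can be sketched as follows. The pollutograph shapes and bin count are invented for illustration.

```python
# Sketch: mutual information between the concentration series at the spill
# site and at a candidate monitoring section, from binned (discrete) data.
import numpy as np

def mutual_information(x, y, bins=16):
    """I(X;Y) in bits from two equally long concentration series."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

# Toy example: a translated, dispersed copy of the source pollutograph
# shares far more information with it than pure noise does.
t = np.linspace(0, 10, 500)
source = np.exp(-(t - 2.0)**2)
downstream = np.exp(-(t - 5.0)**2 / 2.0) + 0.05 * np.random.randn(t.size)
noise = np.random.randn(t.size)
print(mutual_information(source, downstream), mutual_information(source, noise))
```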

  15. What Is True Halving in the Payoff Matrix of Game Theory?

    PubMed

    Ito, Hiromu; Katsumata, Yuki; Hasegawa, Eisuke; Yoshimura, Jin

    2016-01-01

    In game theory, there are two social interpretations of rewards (payoffs) for decision-making strategies: (1) the interpretation based on the utility criterion derived from expected utility theory and (2) the interpretation based on the quantitative criterion (amount of gain) derived from validity in the empirical context. A dynamic decision theory has recently been developed in which dynamic utility is a conditional (state) variable that is a function of the current wealth of a decision maker. We applied dynamic utility to the equal division in dove-dove contests in the hawk-dove game. Our results indicate that under the utility criterion, the half-share of utility becomes proportional to a player's current wealth. Our results are consistent with studies of the sense of fairness in animals, which indicate that the quantitative criterion has greater validity than the utility criterion. We also find that traditional analyses of repeated games must be reevaluated.

  16. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  17. Quantitative Skills as a Graduate Learning Outcome: Exploring Students' Evaluative Expertise

    ERIC Educational Resources Information Center

    Matthews, Kelly E.; Adams, Peter; Goos, Merrilyn

    2017-01-01

    In the biosciences, quantitative skills are an essential graduate learning outcome. Efforts to evidence student attainment at the whole of degree programme level are rare and making sense of such data is complex. We draw on assessment theories from Sadler (evaluative expertise) and Boud (sustainable assessment) to interpret final-year bioscience…

  18. Cleavage Entropy as Quantitative Measure of Protease Specificity

    PubMed Central

    Fuchs, Julian E.; von Grafenstein, Susanne; Huber, Roland G.; Margreiter, Michael A.; Spitzer, Gudrun M.; Wallnoefer, Hannes G.; Liedl, Klaus R.

    2013-01-01

    A purely information theory-guided approach to quantitatively characterize protease specificity is established. We calculate an entropy value for each protease subpocket based on sequences of cleaved substrates extracted from the MEROPS database. We compare our results with known subpocket specificity profiles for individual proteases and protease groups (e.g. serine proteases, metallo proteases) and reflect them quantitatively. Summation of subpocket-wise cleavage entropy contributions yields a measure for overall protease substrate specificity. This total cleavage entropy allows ranking of different proteases with respect to their specificity, separating unspecific digestive enzymes showing high total cleavage entropy from specific proteases involved in signaling cascades. The development of a quantitative cleavage entropy score allows an unbiased comparison of subpocket-wise and overall protease specificity. Thus, it enables assessment of relative importance of physicochemical and structural descriptors in protease recognition. We present an exemplary application of cleavage entropy in tracing substrate specificity in protease evolution. This highlights the wide range of substrate promiscuity within homologue proteases and hence the heavy impact of a limited number of mutations on individual substrate specificity. PMID:23637583
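
    A minimal sketch of the described construction follows: per-subpocket Shannon entropy from aligned cleaved-substrate sequences, summed over positions. The toy substrate alignment and the normalization by log2(20) are assumptions for illustration.

```python
# Sketch: subpocket-wise cleavage entropy summed to a total specificity score.
import math
from collections import Counter

substrates = ["AVLKS", "AVLRS", "GVLKT", "AILKS"]  # toy aligned substrates

def cleavage_entropy(seqs):
    """Sum over positions of normalized Shannon entropy (0 = strict, 1 = random)."""
    total = 0.0
    for pos in range(len(seqs[0])):
        counts = Counter(s[pos] for s in seqs)
        n = sum(counts.values())
        h = -sum((c / n) * math.log2(c / n) for c in counts.values())
        total += h / math.log2(20)  # normalize by the 20 amino acids
    return total

print(f"total cleavage entropy: {cleavage_entropy(substrates):.2f}")
```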

  19. Solitonic Dispersive Hydrodynamics: Theory and Observation

    NASA Astrophysics Data System (ADS)

    Maiden, Michelle D.; Anderson, Dalton V.; Franco, Nevil A.; El, Gennady A.; Hoefer, Mark A.

    2018-04-01

    Ubiquitous nonlinear waves in dispersive media include localized solitons and extended hydrodynamic states such as dispersive shock waves. Despite their physical prominence and the development of thorough theoretical and experimental investigations of each separately, experiments and a unified theory of solitons and dispersive hydrodynamics are lacking. Here, a general soliton-mean field theory is introduced and used to describe the propagation of solitons in macroscopic hydrodynamic flows. Two universal adiabatic invariants of motion are identified that predict trapping or transmission of solitons by hydrodynamic states. The result of solitons incident upon smooth expansion waves or compressive, rapidly oscillating dispersive shock waves is the same, an effect termed hydrodynamic reciprocity. Experiments on viscous fluid conduits quantitatively confirm the soliton-mean field theory with broader implications for nonlinear optics, superfluids, geophysical fluids, and other dispersive hydrodynamic media.

  20. Quantitative influence of risk factors on blood glucose level.

    PubMed

    Chen, Songjing; Luo, Senlin; Pan, Limin; Zhang, Tiemei; Han, Longfei; Zhao, Haixiu

    2014-01-01

    The aim of this study is to quantitatively analyze the influence of risk factors on the blood glucose level, and to provide a theoretical basis for understanding the characteristics of blood glucose change and confirming the intervention indices for type 2 diabetes. A quantitative method is proposed to analyze the influence of risk factors on blood glucose using a back-propagation (BP) neural network. Ten risk factors are screened first. Then the cohort is divided into nine groups by gender and age. According to the minimum-error principle, nine BP models are trained respectively. The quantitative values of the influence of different risk factors on the blood glucose change can be obtained by sensitivity calculation. The experimental results indicate that weight is the leading cause of blood glucose change (0.2449). The next most influential factors are cholesterol, age and triglyceride. The total ratio of these four factors reaches 77% of the nine screened risk factors. The sensitivity sequences can provide a judgment method for individual intervention. This method can be applied to quantitative risk-factor analysis of other diseases and can potentially be used by clinical practitioners to identify high-risk populations for type 2 diabetes as well as other diseases.
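
    A minimal sketch of this pipeline on synthetic data is given below: a back-propagation regressor plus perturbation-based sensitivity ranking. The feature names, the synthetic cohort, and the one-standard-deviation perturbation rule are assumptions standing in for the paper's exact protocol.

```python
# Sketch: BP-network sensitivity analysis of risk factors on synthetic data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
names = ["weight", "cholesterol", "age", "triglyceride"]
X = rng.normal(size=(500, 4))
y = 0.5*X[:, 0] + 0.3*X[:, 1] + 0.2*X[:, 2] + 0.1*X[:, 3] \
    + 0.1*rng.normal(size=500)

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                   random_state=0).fit(X, y)

# Sensitivity: mean absolute change in predicted glucose when one input is
# shifted by one standard deviation, all others held fixed.
base = net.predict(X)
sens = {}
for j, name in enumerate(names):
    Xp = X.copy()
    Xp[:, j] += X[:, j].std()
    sens[name] = float(np.mean(np.abs(net.predict(Xp) - base)))

for name, s in sorted(sens.items(), key=lambda kv: -kv[1]):
    print(f"{name:13s} {s:.3f}")
```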

  1. Ranking Theory and Conditional Reasoning.

    PubMed

    Skovgaard-Olsen, Niels

    2016-05-01

    Ranking theory is a formal epistemology that has been developed in over 600 pages in Spohn's recent book The Laws of Belief, which aims to provide a normative account of the dynamics of beliefs that presents an alternative to current probabilistic approaches. It has long been well received in the AI community, but it has not yet found application in experimental psychology. The purpose of this paper is to derive clear, quantitative predictions by exploiting a parallel between ranking theory and a statistical model called logistic regression. This approach is illustrated by the development of a model for the conditional inference task using Spohn's (2013) ranking theoretic approach to conditionals. Copyright © 2015 Cognitive Science Society, Inc.
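
    To make the parallel concrete, here is a minimal Python sketch of Spohn-style negative and two-sided ranks over a toy set of worlds. The bird/flies example and the specific rank assignments are invented; the logistic-regression analogy enters through the two-sided rank τ playing a role akin to a log-odds.

```python
# Sketch: Spohn ranking functions over four toy worlds.
INF = float("inf")

# Negative rank kappa: degree of disbelief in each world (0 = not disbelieved)
kappa_world = {("bird", "flies"): 0,
               ("bird", "not-flies"): 2,
               ("not-bird", "flies"): 1,
               ("not-bird", "not-flies"): 0}

def kappa(prop):
    """kappa(A) = min over worlds in A; kappa(empty set) = infinity."""
    ranks = [r for w, r in kappa_world.items() if prop(w)]
    return min(ranks) if ranks else INF

def kappa_cond(b, a):
    """Conditional rank kappa(B|A) = kappa(A and B) - kappa(A)."""
    return kappa(lambda w: a(w) and b(w)) - kappa(a)

def tau_cond(b, a):
    """Two-sided rank tau(B|A) = kappa(not-B|A) - kappa(B|A); positive
    values mean B is believed given A, much like a positive log-odds."""
    return kappa_cond(lambda w: not b(w), a) - kappa_cond(b, a)

bird = lambda w: w[0] == "bird"
flies = lambda w: w[1] == "flies"
print(tau_cond(flies, bird))  # 2: "if it is a bird, it flies" is believed
```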

  2. Theory for the solvation of nonpolar solutes in water

    NASA Astrophysics Data System (ADS)

    Urbic, T.; Vlachy, V.; Kalyuzhnyi, Yu. V.; Dill, K. A.

    2007-11-01

    We recently developed an angle-dependent Wertheim integral equation theory (IET) of the Mercedes-Benz (MB) model of pure water [Silverstein et al., J. Am. Chem. Soc. 120, 3166 (1998)]. Our approach treats explicitly the coupled orientational constraints within water molecules. The analytical theory offers the advantage of being less computationally expensive than Monte Carlo simulations by two orders of magnitude. Here we apply the angle-dependent IET to studying the hydrophobic effect, the transfer of a nonpolar solute into MB water. We find that the theory reproduces the Monte Carlo results qualitatively for cold water and quantitatively for hot water.

  3. Theory for the solvation of nonpolar solutes in water.

    PubMed

    Urbic, T; Vlachy, V; Kalyuzhnyi, Yu V; Dill, K A

    2007-11-07

    We recently developed an angle-dependent Wertheim integral equation theory (IET) of the Mercedes-Benz (MB) model of pure water [Silverstein et al., J. Am. Chem. Soc. 120, 3166 (1998)]. Our approach treats explicitly the coupled orientational constraints within water molecules. The analytical theory offers the advantage of being less computationally expensive than Monte Carlo simulations by two orders of magnitude. Here we apply the angle-dependent IET to studying the hydrophobic effect, the transfer of a nonpolar solute into MB water. We find that the theory reproduces the Monte Carlo results qualitatively for cold water and quantitatively for hot water.

  4. Empirical and pragmatic adequacy of grounded theory: Advancing nurse empowerment theory for nurses' practice.

    PubMed

    Udod, Sonia A; Racine, Louise

    2017-12-01

    To draw on the findings of a grounded theory study aimed at exploring how power is exercised in nurse-manager relationships in the hospital setting, this paper examines the empirical and pragmatic adequacy of grounded theory as a methodology to advance the concept of empowerment in the area of nursing leadership and management. The evidence on staff nurse empowerment has highlighted the magnitude of individual and organisational outcomes, but has not fully explicated the micro-level processes underlying how power is exercised, shared or created within the nurse-manager relationship. Although grounded theory is a widely adopted nursing research methodology, it remains less used in nursing leadership because of the dominance of quantitative approaches to research. Grounded theory methodology provides the empirical and pragmatic relevance to inform nursing practice and policy. Grounded theory is a relevant qualitative approach to use in leadership research as it provides a fine and detailed analysis of the process underlying complexity and bureaucracy. Discursive paper. A critical examination of the empirical and pragmatic relevance of grounded theory by Corbin and Strauss as a method for analysing and solving problems in nurses' practice is provided. This paper provides evidence to support the empirical and pragmatic adequacy of grounded theory methodology. Although the application of the ontological, epistemological and methodological assumptions of grounded theory is challenging, this methodology is useful to address real-life problems in nursing practice by developing theoretical explanations of nurse empowerment, or lack thereof, in the workplace. Grounded theory represents a relevant methodology to inform nursing leadership research. Grounded theory is anchored in the reality of practice. The strength of grounded theory is to provide results that can be readily applied to clinical practice and policy as they arise from problems that affect practice and that

  5. Discriminant content validity: a quantitative methodology for assessing content of theory-based measures, with illustrative applications.

    PubMed

    Johnston, Marie; Dixon, Diane; Hart, Jo; Glidewell, Liz; Schröder, Carin; Pollard, Beth

    2014-05-01

    In studies involving theoretical constructs, it is important that measures have good content validity and that there is not contamination of measures by content from other constructs. While reliability and construct validity are routinely reported, to date, there has not been a satisfactory, transparent, and systematic method of assessing and reporting content validity. In this paper, we describe a methodology of discriminant content validity (DCV) and illustrate its application in three studies. Discriminant content validity involves six steps: construct definition, item selection, judge identification, judgement format, single-sample test of content validity, and assessment of discriminant items. In three studies, these steps were applied to a measure of illness perceptions (IPQ-R) and control cognitions. The IPQ-R performed well with most items being purely related to their target construct, although timeline and consequences had small problems. By contrast, the study of control cognitions identified problems in measuring constructs independently. In the final study, direct estimation response formats for theory of planned behaviour constructs were found to have as good DCV as Likert format. The DCV method allowed quantitative assessment of each item and can therefore inform the content validity of the measures assessed. The methods can be applied to assess content validity before or after collecting data to select the appropriate items to measure theoretical constructs. Further, the data reported for each item in Appendix S1 can be used in item or measure selection. Statement of contribution What is already known on this subject? There are agreed methods of assessing and reporting construct validity of measures of theoretical constructs, but not their content validity. Content validity is rarely reported in a systematic and transparent manner. What does this study add? The paper proposes discriminant content validity (DCV), a systematic and transparent method

  6. What Is True Halving in the Payoff Matrix of Game Theory?

    PubMed Central

    Hasegawa, Eisuke; Yoshimura, Jin

    2016-01-01

    In game theory, there are two social interpretations of rewards (payoffs) for decision-making strategies: (1) the interpretation based on the utility criterion derived from expected utility theory and (2) the interpretation based on the quantitative criterion (amount of gain) derived from validity in the empirical context. A dynamic decision theory has recently been developed in which dynamic utility is a conditional (state) variable that is a function of the current wealth of a decision maker. We applied dynamic utility to the equal division in dove-dove contests in the hawk-dove game. Our results indicate that under the utility criterion, the half-share of utility becomes proportional to a player’s current wealth. Our results are consistent with studies of the sense of fairness in animals, which indicate that the quantitative criterion has greater validity than the utility criterion. We also find that traditional analyses of repeated games must be reevaluated. PMID:27487194

  7. Louis Guttman's Contributions to Classical Test Theory

    ERIC Educational Resources Information Center

    Zimmerman, Donald W.; Williams, Richard H.; Zumbo, Bruno D.; Ross, Donald

    2005-01-01

    This article focuses on Louis Guttman's contributions to the classical theory of educational and psychological tests, one of the lesser known of his many contributions to quantitative methods in the social sciences. Guttman's work in this field provided a rigorous mathematical basis for ideas that, for many decades after Spearman's initial work,…

  8. Deuteron Compton scattering below pion photoproduction threshold

    NASA Astrophysics Data System (ADS)

    Levchuk, M. I.; L'vov, A. I.

    2000-07-01

    Deuteron Compton scattering below pion photoproduction threshold is considered in the framework of the nonrelativistic diagrammatic approach with the Bonn OBE potential. A complete gauge-invariant set of diagrams is taken into account which includes resonance diagrams without and with NN-rescattering and diagrams with one- and two-body seagulls. The seagull operators are analyzed in detail, and their relations with free- and bound-nucleon polarizabilities are discussed. It is found that both dipole and higher-order polarizabilities of the nucleon are required for a quantitative description of recent experimental data. An estimate of the isospin-averaged dipole electromagnetic polarizabilities of the nucleon and the polarizabilities of the neutron is obtained from the data.

  9. Semi-quantitative spectrographic analysis and rank correlation in geochemistry

    USGS Publications Warehouse

    Flanagan, F.J.

    1957-01-01

    The rank correlation coefficient, rs, which involves less computation than the product-moment correlation coefficient, r, can be used to indicate the degree of relationship between two elements. The method is applicable in situations where the assumptions underlying normal distribution correlation theory may not be satisfied. Semi-quantitative spectrographic analyses which are reported as grouped or partly ranked data can be used to calculate rank correlations between elements. © 1957.
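
    A minimal sketch of the computation on grouped (tied) data of the kind described, using SciPy; the ordinal abundance classes below are invented for illustration.

```python
# Sketch: Spearman's rank correlation between two elements reported in
# semi-quantitative (grouped) abundance classes, where ties are common.
from scipy.stats import spearmanr

element_a = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]  # abundance class, 1 = lowest
element_b = [1, 2, 2, 3, 3, 3, 4, 5, 4, 5]

rs, p = spearmanr(element_a, element_b)
print(f"r_s = {rs:.2f}, p = {p:.4f}")
```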

  10. Analyzing force concept inventory with item response theory

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Bao, Lei

    2010-10-01

    Item response theory is a popular assessment method used in education. It rests on the assumption of a probability framework that relates students' innate ability and their performance on test questions. Item response theory transforms students' raw test scores into a scaled proficiency score, which can be used to compare results obtained with different test questions. The scaled score also addresses the issues of ceiling effects and guessing, which commonly exist in quantitative assessment. We used item response theory to analyze the force concept inventory (FCI). Our results show that item response theory can be useful for analyzing physics concept surveys such as the FCI and produces results about the individual questions and student performance that are beyond the capability of classical statistics. The theory yields detailed measurement parameters regarding the difficulty, discrimination features, and probability of correct guess for each of the FCI questions.
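
    As a concrete anchor, here is a minimal sketch of the three-parameter logistic (3PL) model underlying such an analysis, with a grid maximum-likelihood ability estimate. All item parameters and the response pattern are hypothetical, not fitted FCI values.

```python
# Sketch: 3PL item response model and a grid MLE of student ability theta.
import numpy as np

def p_correct(theta, a, b, c):
    """3PL: guessing floor c, difficulty b, discrimination a."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

# (a, b, c) for five hypothetical items
items = np.array([[1.2, -1.0, 0.20],
                  [0.8,  0.0, 0.25],
                  [1.5,  0.5, 0.20],
                  [1.0,  1.0, 0.15],
                  [2.0,  1.5, 0.20]])
responses = np.array([1, 1, 1, 0, 0])  # one student's right/wrong pattern

theta_grid = np.linspace(-4, 4, 801)
P = p_correct(theta_grid[:, None], items[:, 0], items[:, 1], items[:, 2])
loglik = (responses * np.log(P) + (1 - responses) * np.log(1 - P)).sum(axis=1)
print(f"estimated ability: {theta_grid[np.argmax(loglik)]:.2f}")
```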

  11. Prospect theory in the health domain: a quantitative assessment.

    PubMed

    Attema, Arthur E; Brouwer, Werner B F; L'Haridon, Olivier

    2013-12-01

    It is well-known that expected utility (EU) has empirical deficiencies. Cumulative prospect theory (CPT) has developed as an alternative with more descriptive validity. However, CPT's full function had not yet been quantified in the health domain. This paper is therefore the first to simultaneously measure utility of life duration, probability weighting, and loss aversion in this domain. We observe loss aversion and risk aversion for gains and losses, which for gains can be explained by probabilistic pessimism. Utility for gains is almost linear. For losses, we find less weighting of probability 1/2 and concave utility. This contrasts with the common finding of convex utility for monetary losses. However, CPT was proposed to explain choices among lotteries involving monetary outcomes. Life years are arguably very different from monetary outcomes and need not generate convex utility for losses. Moreover, utility of life duration reflects discounting, causing concave utility. Copyright © 2013 Elsevier B.V. All rights reserved.
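
    For readers unfamiliar with CPT's ingredients, a minimal sketch of the Tversky-Kahneman value and probability-weighting functions follows. The parameters (α = β = 0.88, λ = 2.25, γ = 0.61) are the classic money-domain estimates, quoted only as a reference point; this paper's health-domain estimates differ (e.g., near-linear utility for gains).

```python
# Sketch: CPT pieces for a single-outcome gamble, Tversky-Kahneman forms.
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Power value function with loss aversion coefficient lam."""
    return x**alpha if x >= 0 else -lam * (-x)**beta

def weight(p, gamma=0.61):
    """Inverse-S probability weighting function."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# CPT evaluation of "gain 10 life-years with probability 1/2, else nothing"
p, x = 0.5, 10.0
print(weight(p) * value(x))
```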

  12. Analytical progress in the theory of vesicles under linear flow

    NASA Astrophysics Data System (ADS)

    Farutin, Alexander; Biben, Thierry; Misbah, Chaouqi

    2010-06-01

    Vesicles are becoming a quite popular model for the study of red blood cells. This is a free boundary problem which is rather difficult to handle theoretically. Quantitative computational approaches also constitute a challenge. In addition, with numerical studies, it is not easy to scan within a reasonable time the whole parameter space. Therefore, having quantitative analytical results is an essential advance that provides deeper understanding of observed features and can be used to accompany and possibly guide further numerical development. In this paper, shape evolution equations for a vesicle in a shear flow are derived analytically to cubic precision (previous theories were quadratic) with regard to the deformation of the vesicle relative to a spherical shape. The phase diagram distinguishing regions of parameters where different types of motion (tank treading, tumbling, and vacillating breathing (VB)) are manifested is presented. This theory reveals unsuspected features: including higher order terms and harmonics (even if they are not directly excited by the shear flow) is necessary, however close the shape is to a sphere. Not only does this theory cure a quite large quantitative discrepancy between previous theories and recent experiments and numerical studies, but it also reveals a new phenomenon: the VB mode band in parameter space, which is believed to saturate after a moderate shear rate, exhibits a striking widening beyond a critical shear rate. The widening results from the excitation of the fourth-order harmonic. The obtained phase diagram is in remarkably good agreement with recent three-dimensional numerical simulations based on the boundary integral formulation. Comparison of our results with experiments is systematically made.

  13. Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences.

    PubMed

    Harman, Elena; Azzam, Tarek

    2018-02-01

    This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants are asked to read an interview transcript and identify whether program theory components (Activities and Outcomes) are discussed and to highlight the most relevant passage about that component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available to validate program theory using qualitative data, thus strengthening the theory-driven approach. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. DOE Theory Graduate Student Fellowship: Gustavo Marques Tavares

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmaltz, Martin

    2015-12-30

    Marques Tavares was awarded a fellowship for his proposal “The ttbar asymmetry and beyond” starting in September 2012. This is the final report summarizing the research activities and accomplishments achieved with this grant support. With support from the DOE graduate fellowship, Marques Tavares, Katz and Xu at BU have investigated a new technique for obtaining quantitative results in strongly coupled field theories with broken conformal invariance. Such theories are especially interesting as they may be candidates for physics beyond the standard model, with possible applications to strongly coupled electroweak symmetry breaking. However, because of the strong coupling, even qualitative results about the spectrum of such theories are not rigorously understood.

  15. Burnett-Cattaneo continuum theory for shock waves.

    PubMed

    Holian, Brad Lee; Mareschal, Michel; Ravelo, Ramon

    2011-02-01

    We model strong shock-wave propagation, both in the ideal gas and in the dense Lennard-Jones fluid, using a refinement of earlier work, which accounts for the cold compression in the early stages of the shock rise by a nonlinear, Burnett-like, strain-rate dependence of the thermal conductivity, and relaxation of kinetic-temperature components on the hot, compressed side of the shock front. The relaxation of the disequilibrium among the three components of the kinetic temperature, namely, the difference between the component in the direction of a planar shock wave and those in the transverse directions, particularly in the region near the shock front, is accomplished at a much more quantitative level by a rigorous application of the Cattaneo-Maxwell relaxation equation to a reference solution, namely, the steady shock-wave solution of linear Navier-Stokes-Fourier theory, along with the nonlinear Burnett heat-flux term. Our new continuum theory is in nearly quantitative agreement with nonequilibrium molecular-dynamics simulations under strong shock-wave conditions, using relaxation parameters obtained from the reference solution. ©2011 American Physical Society
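
    For reference, the Cattaneo-Maxwell relaxation law invoked above replaces Fourier's instantaneous conduction law q = −κ∇T with a finite thermal relaxation time τ. It is shown here in its standard isotropic textbook form; the paper applies the same relaxation idea to the disequilibrium among the kinetic-temperature components.

```latex
\tau \, \frac{\partial \mathbf{q}}{\partial t} + \mathbf{q} = -\kappa \, \nabla T
```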

  16. Quantitative and Qualitative Sources of Affect: How Unexpectedness and Valence Relate to Pleasantness and Preference. Technical Report No. 293.

    ERIC Educational Resources Information Center

    Iran-Nejad, Asghar; Ortony, Andrew

    Optimal-level theories maintain that the quality of affect is a function of a quantitative arousal potential dimension. An alternative view is that the quantitative dimension merely modulates preexisting qualitative properties and is therefore only responsible for changes in the degree of affect. Thus, the quality of affect, whether it is positive…

  17. A quantitative description for efficient financial markets

    NASA Astrophysics Data System (ADS)

    Immonen, Eero

    2015-09-01

    In this article we develop a control system model for describing efficient financial markets. We define the efficiency of a financial market in quantitative terms by robust asymptotic price-value equality in this model. By invoking the Internal Model Principle of robust output regulation theory we then show that under No Bubble Conditions, in the proposed model, the market is efficient if and only if the following conditions hold true: (1) the traders, as a group, can identify any mispricing in asset value (even if no one single trader can do it accurately), and (2) the traders, as a group, incorporate an internal model of the value process (again, even if no one single trader knows it). This main result of the article, which deliberately avoids the requirement for investor rationality, demonstrates, in quantitative terms, that the more transparent the markets are, the more efficient they are. An extensive example is provided to illustrate the theoretical development.

  18. General quantitative genetic methods for comparative biology: phylogenies, taxonomies and multi-trait models for continuous and categorical characters.

    PubMed

    Hadfield, J D; Nakagawa, S

    2010-03-01

    Although many of the statistical techniques used in comparative biology were originally developed in quantitative genetics, subsequent development of comparative techniques has progressed in relative isolation. Consequently, many of the new and planned developments in comparative analysis already have well-tested solutions in quantitative genetics. In this paper, we take three recent publications that develop phylogenetic meta-analysis, either implicitly or explicitly, and show how they can be considered as quantitative genetic models. We highlight some of the difficulties with the proposed solutions, and demonstrate that standard quantitative genetic theory and software offer solutions. We also show how results from Bayesian quantitative genetics can be used to create efficient Markov chain Monte Carlo algorithms for phylogenetic mixed models, thereby extending their generality to non-Gaussian data. Of particular utility is the development of multinomial models for analysing the evolution of discrete traits, and the development of multi-trait models in which traits can follow different distributions. Meta-analyses often include a nonrandom collection of species for which the full phylogenetic tree has only been partly resolved. Using missing data theory, we show how the presented models can be used to correct for nonrandom sampling and show how taxonomies and phylogenies can be combined to give a flexible framework with which to model dependence.

  19. EvolQG - An R package for evolutionary quantitative genetics

    PubMed Central

    Melo, Diogo; Garcia, Guilherme; Hubbe, Alex; Assis, Ana Paula; Marroig, Gabriel

    2016-01-01

    We present an open source package for performing evolutionary quantitative genetics analyses in the R environment for statistical computing. Evolutionary theory shows that evolution depends critically on the available variation in a given population. When dealing with many quantitative traits this variation is expressed in the form of a covariance matrix, particularly the additive genetic covariance matrix or sometimes the phenotypic matrix, when the genetic matrix is unavailable and there is evidence the phenotypic matrix is sufficiently similar to the genetic matrix. Given this mathematical representation of available variation, the EvolQG package provides functions for calculation of relevant evolutionary statistics; estimation of sampling error; corrections for this error; matrix comparison via correlations, distances and matrix decomposition; analysis of modularity patterns; and functions for testing evolutionary hypotheses on taxa diversification. PMID:27785352

  20. Transcending the Quantitative-Qualitative Divide with Mixed Methods Research: A Multidimensional Framework for Understanding Congruence and Completeness in the Study of Values

    ERIC Educational Resources Information Center

    McLafferty, Charles L., Jr.; Slate, John R.; Onwuegbuzie, Anthony J.

    2010-01-01

    Quantitative research dominates published literature in the helping professions. Mixed methods research, which integrates quantitative and qualitative methodologies, has received a lukewarm reception. The authors address the iterative separation that infuses theory, praxis, philosophy, methodology, training, and public perception and propose a…

  1. Dynamic Self-Consistent Field Theories for Polymer Blends and Block Copolymers

    NASA Astrophysics Data System (ADS)

    Kawakatsu, Toshihiro

    Understanding the behavior of phase-separated domain structures and the rheological properties of multi-component polymeric systems requires detailed information on the dynamics of domains and on the conformations of the constituent polymer chains. Self-consistent field (SCF) theory is a useful tool for treating such problems because the conformational entropy of polymer chains in inhomogeneous systems can be evaluated quantitatively using this theory. However, when we turn our attention to dynamic properties in a non-equilibrium state, the basic assumption of the SCF theory, i.e. the assumption of equilibrium chain conformation, breaks down. In order to avoid this difficulty, dynamic SCF theories were developed. In this chapter, we give a brief review of recent developments in dynamic SCF theories and discuss where the cutting edge of this theory lies.

  2. Comparing theories of reference-dependent choice.

    PubMed

    Bhatia, Sudeep

    2017-09-01

    Preferences are influenced by the presence or absence of salient choice options, known as reference points. This behavioral tendency is traditionally attributed to the loss aversion and diminishing sensitivity assumptions of prospect theory. In contrast, some psychological research suggests that reference dependence is caused by attentional biases that increase the subjective weighting of the reference point's primary attributes. Although both theories are able to successfully account for behavioral findings involving reference dependence, this article shows that these theories make diverging choice predictions when available options are inferior to the reference point. It presents the results of 2 studies that use settings with inferior choice options to compare these 2 theories. The analysis involves quantitative fits to participant-level choice data, and the results indicate that most participants are better described by models with attentional bias than they are by models with loss aversion and diminishing sensitivity. These differences appear to be caused by violations of loss aversion and diminishing sensitivity in losses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Combining qualitative and quantitative research within mixed method research designs: a methodological review.

    PubMed

    Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh

    2011-03-01

    It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be a topic of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings, help researchers to clarify their theoretical propositions and the basis of their results. This can offer a better understanding of the links between theory and

  4. Limits of quantitation - Yet another suggestion

    NASA Astrophysics Data System (ADS)

    Carlson, Jill; Wysoczanski, Artur; Voigtman, Edward

    2014-06-01

    The work presented herein suggests that the limit of quantitation concept may be rendered substantially less ambiguous and ultimately more useful as a figure of merit by basing it upon the significant figure and relative measurement error ideas due to Coleman, Auses and Gram, coupled with the correct instantiation of Currie's detection limit methodology. Simple theoretical results are presented for a linear, univariate chemical measurement system with homoscedastic Gaussian noise, and these are tested against both Monte Carlo computer simulations and laser-excited molecular fluorescence experimental results. Good agreement among experiment, theory and simulation is obtained and an easy extension to linearly heteroscedastic Gaussian noise is also outlined.
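
    As a rough illustration of the Currie-style construction referenced above, the sketch below simulates a linear, univariate measurement system with homoscedastic Gaussian noise and checks the decision and detection levels by Monte Carlo. All parameter values (slope, noise level, error rates) are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np
    from scipy import stats

    # Illustrative parameters (assumed, not from the paper)
    slope, sigma = 2.0, 0.5            # calibration slope; noise std. dev. (homoscedastic)
    alpha = beta = 0.05                # target false-positive / false-negative rates

    # Currie-style decision level (response domain) and detection limit (content domain)
    z_a, z_b = stats.norm.ppf(1 - alpha), stats.norm.ppf(1 - beta)
    L_C = z_a * sigma                  # decision threshold on the net response
    x_D = (z_a + z_b) * sigma / slope  # detection limit in concentration units

    # Monte Carlo check of the realized error rates
    rng = np.random.default_rng(0)
    blanks = rng.normal(0.0, sigma, 100_000)
    samples = rng.normal(slope * x_D, sigma, 100_000)
    print(f"false positives: {np.mean(blanks > L_C):.3f} (target {alpha})")
    print(f"false negatives: {np.mean(samples <= L_C):.3f} (target {beta})")
    ```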

  5. Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?

    PubMed

    Gizak, Agnieszka; Rakus, Dariusz

    2016-01-11

    Molecular and cellular biology methodology is traditionally based on the reasoning called "the mechanistic explanation". In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge about all parameters of a studied system. However, in practice, due to the systems' complexity, this requirement is rarely, if ever, fulfilled. Typically, it is limited to quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites), and a qualitative or semi-quantitative description of changes in expression or post-translational modifications of selected proteins. A quantitative proteomics approach makes it possible to characterize the entire proteome of a biological system quantitatively, in terms of protein titers as well as their post-translational modifications. This enables not only more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.

  6. Mode-specific tunneling using the Qim path: theory and an application to full-dimensional malonaldehyde.

    PubMed

    Wang, Yimin; Bowman, Joel M

    2013-10-21

    We present a theory of mode-specific tunneling that makes use of the general tunneling path along the imaginary-frequency normal mode of the saddle point, Qim, and the associated relaxed potential, V(Qim) [Y. Wang and J. M. Bowman, J. Chem. Phys. 129, 121103 (2008)]. The novel aspect of the theory is the projection of the normal modes of a minimum onto the Qim path and the determination of turning points on V(Qim). From that projection, the change in tunneling upon mode excitation can be calculated. If the projection is zero, no enhancement of tunneling is predicted. In that case vibrationally adiabatic (VA) theory could apply. However, if the projection is large then VA theory is not applicable. The approach is applied to mode-specific tunneling in full-dimensional malonaldehyde, using an accurate full-dimensional potential energy surface. Results are in semi-quantitative agreement with experiment for modes that show large enhancement of the tunneling, relative to the ground state tunneling splitting. For the six out-of-plane modes, which have zero projection on the planar Qim path, VA theory does apply, and results from that theory agree qualitatively and even semi-quantitatively with experiment. We also verify the failure of simple VA theory for modes that show large enhancement of tunneling.

  7. Bad Questions: An Essay Involving Item Response Theory

    ERIC Educational Resources Information Center

    Thissen, David

    2016-01-01

    David Thissen, a professor in the Department of Psychology and Neuroscience, Quantitative Program at the University of North Carolina, has consulted and served on technical advisory committees for assessment programs that use item response theory (IRT) over the past couple decades. He has come to the conclusion that there are usually two purposes…

  8. Measurement Models for Reasoned Action Theory.

    PubMed

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-03-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.

  9. Measurement Models for Reasoned Action Theory

    PubMed Central

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin

    2012-01-01

    Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach. PMID:23243315

  10. Emission source functions in heavy ion collisions

    NASA Astrophysics Data System (ADS)

    Shapoval, V. M.; Sinyukov, Yu. M.; Karpenko, Iu. A.

    2013-12-01

    Three-dimensional pion and kaon emission source functions are extracted from hydrokinetic model (HKM) simulations of central Au+Au collisions at the top Relativistic Heavy Ion Collider (RHIC) energy √s_NN = 200 GeV. The model describes well the experimental data, previously obtained by the PHENIX and STAR collaborations using the imaging technique. In particular, the HKM reproduces the non-Gaussian heavy tails of the source function in the pair transverse momentum (out) and beam (long) directions, observed in the pion case and practically absent for kaons. The role of rescatterings and long-lived resonance decays in forming the mentioned long-range tails is investigated. The particle rescattering contribution to the out tail seems to be dominating. The model calculations also show substantial relative emission times between pions (with mean value 13 fm/c in the longitudinally comoving system), including those coming from resonance decays and rescatterings. A prediction is made for the source functions in Large Hadron Collider (LHC) Pb+Pb collisions at √s_NN = 2.76 TeV, which are still not extracted from the measured correlation functions.

  11. 2D problems of surface growth theory with applications to additive manufacturing

    NASA Astrophysics Data System (ADS)

    Manzhirov, A. V.; Mikhin, M. N.

    2018-04-01

    We study 2D problems of surface growth theory of deformable solids and their applications to the analysis of the stress-strain state of AM fabricated products and structures. Statements of the problems are given, and a solution method based on the approaches of the theory of functions of a complex variable is suggested. Computations are carried out for model problems. Qualitative and quantitative results are discussed.

  12. Recycling and Ambivalence: Quantitative and Qualitative Analyses of Household Recycling among Young Adults

    ERIC Educational Resources Information Center

    Ojala, Maria

    2008-01-01

    Theories about ambivalence, as well as quantitative and qualitative empirical approaches, are applied to obtain an understanding of recycling among young adults. A questionnaire was mailed to 422 Swedish young people. Regression analyses showed that a mix of negative emotions (worry) and positive emotions (hope and joy) about the environmental…

  13. A quantitative quantum chemical model of the Dewar-Knott color rule for cationic diarylmethanes

    NASA Astrophysics Data System (ADS)

    Olsen, Seth

    2012-04-01

    We document the quantitative manifestation of the Dewar-Knott color rule in a four-electron, three-orbital state-averaged complete active space self-consistent field (SA-CASSCF) model of a series of bridge-substituted cationic diarylmethanes. We show that the lowest excitation energies calculated using multireference perturbation theory based on the model are linearly correlated with the development of hole density in an orbital localized on the bridge, and the depletion of pair density in the same orbital. We quantitatively express the correlation in the form of a generalized Hammett equation.
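
    A minimal sketch of the kind of linear correlation described above: fitting a generalized Hammett-type relation between excitation energy and bridge-orbital hole density. The numbers are invented placeholders, not the paper's SA-CASSCF results.

    ```python
    import numpy as np

    # Hypothetical hole densities (e) and excitation energies (eV) for a dye series
    q_hole = np.array([0.10, 0.18, 0.25, 0.33, 0.41])
    E_exc = np.array([2.95, 2.71, 2.52, 2.28, 2.06])

    # Generalized Hammett-type linear relation: E_exc = a * q_hole + b
    a, b = np.polyfit(q_hole, E_exc, 1)
    r = np.corrcoef(q_hole, E_exc)[0, 1]
    print(f"slope = {a:.2f} eV/e, intercept = {b:.2f} eV, r = {r:.4f}")
    ```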

  14. Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods.

    PubMed

    Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C W; Lipiński, Wojciech; Bischof, John C

    2016-07-21

    Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.
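
    The comparison rests on simple bookkeeping: per-particle heat generation is the product of the absorption cross section and the local irradiance. The sketch below illustrates that bookkeeping with hypothetical cross-section values; real values would come from optical calculations (e.g., Mie theory for spheres) or measurement.

    ```python
    # Per-particle photothermal heat generation: Q = C_abs * I.
    # Cross sections below are hypothetical placeholders, not measured data.
    I = 1.0e4            # irradiance at the particle, W/m^2 (assumed)
    C_abs = {
        "GNS": 8.0e-15,  # m^2, assumed nanosphere absorption cross section
        "GNR": 2.0e-14,  # m^2, assumed nanorod cross section at peak absorption
    }
    for name, C in C_abs.items():
        print(f"{name}: Q = {C * I:.2e} W per particle")

    # Polydispersity enters by averaging C_abs over the size/shape distribution
    # instead of using the nominal value; for nanorods the plasmon peak shifts
    # with aspect ratio, so the ensemble average at a fixed wavelength can drop sharply.
    ```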

  15. Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods

    NASA Astrophysics Data System (ADS)

    Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C. W.; Lipiński, Wojciech; Bischof, John C.

    2016-07-01

    Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.

  16. The Theories of Turbulence

    NASA Technical Reports Server (NTRS)

    Bass, J; Agostini, L

    1955-01-01

    The theory of turbulence reached its full growth at the end of the 19th century as a result of the work by Boussinesq and Reynolds. It then underwent a long period of stagnation which ended under the impulse given to it by the development of wind tunnels caused by the needs of aviation. Numerous researchers attempted to put Reynolds' elementary statistical theory into a more precise form. During the war, some isolated scientists - von Weizsacker and Heisenberg in Germany, Kolmogoroff in Russia, Onsager in the U.S.A. - started a program of research. By a system of assumptions which make it possible to approach the structure of turbulence in well-defined limiting conditions quantitatively, they obtained a certain number of laws on the correlations and the spectrum. Since later reports have improved the mathematical language of turbulence, it was deemed advisable to start with a detailed account of the mathematical methods applicable to turbulence, inspired at first by the work of the French school, above all for the basic principles, then by the work of the foreign schools, above all for the theory of the spectrum.

  17. Quantitative Characterization of Nanostructured Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. Frank

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  18. Wellness: A Review of Theory and Measurement for Counselors

    ERIC Educational Resources Information Center

    Roscoe, Lauren J.

    2009-01-01

    Wellness is considered the paradigm of counseling and development (J. E. Myers, 1991, 1992). However, researchers have failed to agree on a definition or on the dimensional structure of wellness. Furthermore, existing quantitative wellness instruments are inadequate for capturing the complexity of wellness. The author reviews wellness theory and…

  19. Benchmarking successional progress in a quantitative food web.

    PubMed

    Boit, Alice; Gaedke, Ursula

    2014-01-01

    Central to ecology and ecosystem management, succession theory aims to mechanistically explain and predict the assembly and development of ecological communities. Yet processes at lower hierarchical levels, e.g. at the species and functional group level, are rarely mechanistically linked to the under-investigated system-level processes which drive changes in ecosystem properties and functioning and are comparable across ecosystems. As a model system for secondary succession, seasonal plankton succession during the growing season is readily observable and largely driven autogenically. We used a long-term dataset from large, deep Lake Constance comprising biomasses, auto- and heterotrophic production, food quality, functional diversity, and mass-balanced food webs of the energy and nutrient flows between functional guilds of plankton and partly fish. Extracting population- and system-level indices from this dataset, we tested current hypotheses about the directionality of successional progress which are rooted in ecosystem theory, the metabolic theory of ecology, quantitative food web theory, thermodynamics, and information theory. Our results indicate that successional progress in Lake Constance is quantifiable, passing through predictable stages. Mean body mass, functional diversity, predator-prey weight ratios, trophic positions, system residence times of carbon and nutrients, and the complexity of the energy flow patterns increased during succession. In contrast, both the mass-specific metabolic activity and the system export decreased, while the succession rate exhibited a bimodal pattern. The weighted connectance introduced here represents a suitable index for assessing the evenness and interconnectedness of energy flows during succession. Diverging from earlier predictions, ascendency and eco-exergy did not increase during succession. Linking aspects of functional diversity to metabolic theory and food web complexity, we reconcile previously disjoint bodies of

  20. Benchmarking Successional Progress in a Quantitative Food Web

    PubMed Central

    Boit, Alice; Gaedke, Ursula

    2014-01-01

    Central to ecology and ecosystem management, succession theory aims to mechanistically explain and predict the assembly and development of ecological communities. Yet processes at lower hierarchical levels, e.g. at the species and functional group level, are rarely mechanistically linked to the under-investigated system-level processes which drive changes in ecosystem properties and functioning and are comparable across ecosystems. As a model system for secondary succession, seasonal plankton succession during the growing season is readily observable and largely driven autogenically. We used a long-term dataset from large, deep Lake Constance comprising biomasses, auto- and heterotrophic production, food quality, functional diversity, and mass-balanced food webs of the energy and nutrient flows between functional guilds of plankton and partly fish. Extracting population- and system-level indices from this dataset, we tested current hypotheses about the directionality of successional progress which are rooted in ecosystem theory, the metabolic theory of ecology, quantitative food web theory, thermodynamics, and information theory. Our results indicate that successional progress in Lake Constance is quantifiable, passing through predictable stages. Mean body mass, functional diversity, predator-prey weight ratios, trophic positions, system residence times of carbon and nutrients, and the complexity of the energy flow patterns increased during succession. In contrast, both the mass-specific metabolic activity and the system export decreased, while the succession rate exhibited a bimodal pattern. The weighted connectance introduced here represents a suitable index for assessing the evenness and interconnectedness of energy flows during succession. Diverging from earlier predictions, ascendency and eco-exergy did not increase during succession. Linking aspects of functional diversity to metabolic theory and food web complexity, we reconcile previously disjoint bodies of

  1. Quantitative first-principles theory of interface absorption in multilayer heterostructures

    DOE PAGES

    Hachtel, Jordan A.; Sachan, Ritesh; Mishra, Rohan; ...

    2015-09-03

    The unique chemical bonds and electronic states of interfaces result in optical properties that are different from those of the constituting bulk materials. In the nanoscale regime, the interface effects can be dominant and impact the optical response of devices. Using density functional theory (DFT), the interface effects can be calculated, but DFT is computationally limited to small systems. In this paper, we describe a method to combine DFT with macroscopic methodologies to extract the interface effect on absorption in a consistent and quantifiable manner. The extracted interface effects are an independent parameter and can be applied to more complicated systems. Finally, we demonstrate, using NiSi 2/Si heterostructures, that by varying the relative volume fractions of interface and bulk, we can tune the spectral range of the heterostructure absorption.

  2. Toward a quantitative theory of food consumption choices and body weight.

    PubMed

    Buttet, Sebastien; Dolar, Veronika

    2015-04-01

    We propose a calibrated dynamic model of food consumption choices and body weight to study changes in daily caloric intake, weight, and the away-from-home share of calories consumed by adult men and women in the U.S. during the period between 1971 and 2006. Calibration reveals substantial preference heterogeneity between men and women. For example, utility losses stemming from weight gains are ten times greater for women compared to men. Counterfactual experiments show that changes in food prices and household income account for half of the increase in weight of adult men, but only a small fraction of women's weight. We argue that quantitative models of food consumption choices and body weight have a unique role to play in future research in the economics of obesity. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    ERIC Educational Resources Information Center

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  4. Interfacial self-healing of nanocomposite hydrogels: Theory and experiment

    NASA Astrophysics Data System (ADS)

    Wang, Qiming; Gao, Zheming; Yu, Kunhao

    2017-12-01

    Polymers with dynamic bonds are able to self-heal their fractured interfaces and restore their mechanical strength. It has remained largely elusive how to analytically model this self-healing behavior and construct a mechanistic relationship between the self-healing properties (e.g., healed interfacial strength and equilibrium healing time) and the material compositions and healing conditions. Here, we take a self-healable nanocomposite hydrogel as an example to illustrate an interfacial self-healing theory for hydrogels with dynamic bonds. In the theory, we consider that free polymer chains diffuse across the interface and re-form crosslinks to bridge the interface. We analytically reveal that the healed strengths of nanocomposite hydrogels increase with the healing time in an error-function-like form. The equilibrium self-healing time for full-strength recovery decreases with temperature and increases with the nanoparticle concentration. We further analytically reveal that the healed interfacial strength decreases with increasing delay time before the healing process. The theoretical results quantitatively match our experiments on nanosilica hydrogels, and also agree well with other researchers' experiments on nanoclay hydrogels. We expect that this theory will open promising avenues for a quantitative understanding of the self-healing mechanics of various polymers with dynamic bonds, and offer insights for designing high-performance self-healing polymers.
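
    A hedged sketch of the functional form reported above, assuming the healed strength recovers as an error function of the square root of normalized healing time; the saturation strength and time constant are invented for illustration.

    ```python
    import numpy as np
    from scipy.special import erf

    sigma_inf = 50.0  # kPa, assumed fully healed interfacial strength
    tau = 2.0         # h, assumed characteristic chain-diffusion/re-crosslinking time

    t = np.linspace(0.0, 10.0, 6)              # healing times, h
    sigma = sigma_inf * erf(np.sqrt(t / tau))  # error-function-like recovery
    for ti, si in zip(t, sigma):
        print(f"t = {ti:4.1f} h -> healed strength ~ {si:5.1f} kPa")
    ```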

  5. Playing off the curve - testing quantitative predictions of skill acquisition theories in development of chess performance.

    PubMed

    Gaschler, Robert; Progscha, Johanna; Smallbone, Kieran; Ram, Nilam; Bilalić, Merim

    2014-01-01

    Learning curves have been proposed as an adequate description of learning processes, no matter whether the processes manifest within minutes or across years. Different mechanisms underlying skill acquisition can lead to differences in the shape of learning curves. In the current study, we analyze the tournament performance data of 1383 chess players who begin competing at young age and play tournaments for at least 10 years. We analyze the performance development with the goal of testing the adequacy of learning curves, and the skill acquisition theories they are based on, for describing and predicting expertise acquisition. On the one hand, we show that the skill acquisition theories implying a negative exponential learning curve do a better job in both describing early performance gains and predicting later trajectories of chess performance than those theories implying a power function learning curve. On the other hand, the learning curves of a large proportion of players show systematic qualitative deviations from the predictions of either type of skill acquisition theory. While skill acquisition theories predict larger performance gains in early years and smaller gains in later years, a substantial number of players begin to show substantial improvements with a delay of several years (and no improvement in the first years), deviations not fully accounted for by quantity of practice.
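
    The model comparison can be sketched as follows: fit a negative-exponential and a power-function learning curve to a single (synthetic) rating trajectory and compare residuals. Functional forms and parameters are illustrative; the study itself analyzed 1383 real players.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def exponential(t, a, b, r):        # negative exponential: asymptote a, gain b, rate r
        return a - b * np.exp(-r * t)

    def power_law(t, a, b, r):          # power function alternative
        return a - b * (t + 1.0) ** (-r)

    t = np.arange(0, 11, dtype=float)   # years of tournament play
    rng = np.random.default_rng(1)
    rating = 2000 - 600 * np.exp(-0.5 * t) + rng.normal(0, 15, t.size)  # synthetic player

    for f in (exponential, power_law):
        p, _ = curve_fit(f, t, rating, p0=(2000, 600, 0.5), maxfev=10_000)
        sse = np.sum((rating - f(t, *p)) ** 2)
        print(f"{f.__name__:12s} SSE = {sse:8.1f}")
    ```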

  6. Resource theory of non-Gaussian operations

    NASA Astrophysics Data System (ADS)

    Zhuang, Quntao; Shor, Peter W.; Shapiro, Jeffrey H.

    2018-05-01

    Non-Gaussian states and operations are crucial for various continuous-variable quantum information processing tasks. To quantitatively understand non-Gaussianity beyond states, we establish a resource theory for non-Gaussian operations. In our framework, we consider Gaussian operations as free operations, and non-Gaussian operations as resources. We define entanglement-assisted non-Gaussianity generating power and show that it is a monotone that is nonincreasing under the set of free superoperations, i.e., concatenation and tensoring with Gaussian channels. For conditional unitary maps, this monotone can be analytically calculated. As examples, we show that the non-Gaussianity of ideal photon-number subtraction and photon-number addition equal the non-Gaussianity of the single-photon Fock state. Based on our non-Gaussianity monotone, we divide non-Gaussian operations into two classes: (i) the finite non-Gaussianity class, e.g., photon-number subtraction, photon-number addition, and all Gaussian-dilatable non-Gaussian channels; and (ii) the diverging non-Gaussianity class, e.g., the binary phase-shift channel and the Kerr nonlinearity. This classification also implies that not all non-Gaussian channels are exactly Gaussian dilatable. Our resource theory enables a quantitative characterization and a first classification of non-Gaussian operations, paving the way towards the full understanding of non-Gaussianity.

  7. Angle-resolved high-order above-threshold ionization of a molecule: sensitive tool for molecular characterization.

    PubMed

    Busuladzić, M; Gazibegović-Busuladzić, A; Milosević, D B; Becker, W

    2008-05-23

    The strong-field approximation for ionization of diatomic molecules by an intense laser field is generalized to include rescattering of the ionized electron off the various centers of its molecular parent ion. The resulting spectrum and its interference structure strongly depend on the symmetry of the ground state molecular orbital. For N2, if the laser polarization is perpendicular to the molecular axis, we observe a distinct minimum in the emission spectrum, which survives focal averaging and allows determination of, e.g., the internuclear separation. In contrast, for O2, rescattering is absent in the same situation.

  8. Representations of Complexity: How Nature Appears in Our Theories

    PubMed Central

    2013-01-01

    In science we study processes in the material world. The way these processes operate can be discovered by conducting experiments that activate them, and findings from such experiments can lead to functional complexity theories of how the material processes work. The results of a good functional theory will agree with experimental measurements, but the theory may not incorporate in its algorithmic workings a representation of the material processes themselves. Nevertheless, the algorithmic operation of a good functional theory may be said to make contact with material reality by incorporating the emergent computations the material processes carry out. These points are illustrated in the experimental analysis of behavior by considering an evolutionary theory of behavior dynamics, the algorithmic operation of which does not correspond to material features of the physical world, but the functional output of which agrees quantitatively and qualitatively with findings from a large body of research with live organisms. PMID:28018044

  9. Quantitative Image Restoration in Bright Field Optical Microscopy.

    PubMed

    Gutiérrez-Medina, Braulio; Sánchez Miranda, Manuel de Jesús

    2017-11-07

    Bright field (BF) optical microscopy is regarded as a poor method for observing unstained biological samples due to intrinsically low image contrast. We introduce quantitative image restoration in bright field (QRBF), a digital image processing method that restores out-of-focus BF images of unstained cells. Our procedure is based on deconvolution, using a point spread function modeled from theory. By comparing with reference images of bacteria observed in fluorescence, we show that QRBF faithfully recovers shape and enables quantifying the size of individual cells, even from a single input image. We applied QRBF in a high-throughput image cytometer to assess shape changes in Escherichia coli during hyperosmotic shock, finding size heterogeneity. We demonstrate that QRBF is also applicable to eukaryotic cells (yeast). Altogether, digital restoration emerges as a straightforward alternative to methods designed to generate contrast in BF imaging for quantitative analysis. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
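
    A minimal sketch in the spirit of QRBF, assuming a Gaussian stand-in for the theory-derived point spread function and using Richardson-Lucy deconvolution as a generic restoration step; the paper's exact algorithm may differ.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve
    from skimage import restoration

    # Hypothetical Gaussian PSF; a real model would come from diffraction theory
    x = np.arange(-7, 8)
    xx, yy = np.meshgrid(x, x)
    psf = np.exp(-(xx**2 + yy**2) / (2 * 2.0**2))
    psf /= psf.sum()

    # Synthetic "cell" (bright disc) blurred by the PSF to mimic defocus
    yy2, xx2 = np.mgrid[:64, :64]
    scene = (((xx2 - 32) ** 2 + (yy2 - 32) ** 2) < 10 ** 2).astype(float)
    blurred = fftconvolve(scene, psf, mode="same")

    # Deconvolve; the keyword is num_iter in recent scikit-image versions
    restored = restoration.richardson_lucy(blurred, psf, num_iter=30)
    print(f"peak before/after restoration: {blurred.max():.2f} / {restored.max():.2f}")
    ```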

  10. Quantitative genetics

    USDA-ARS?s Scientific Manuscript database

    The majority of economically important traits targeted for cotton improvement are quantitatively inherited. In this chapter, the current state of cotton quantitative genetics is described and separated into four components. These components include: 1) traditional quantitative inheritance analysis, ...

  11. Investment appraisal using quantitative risk analysis.

    PubMed

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
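
    A toy sketch of the risk-adjusted net present value idea: compare expected discounted fire losses with and without the safety system and subtract the investment cost. All probabilities and monetary values are invented for illustration.

    ```python
    # All probabilities and monetary values are invented for illustration.
    years, rate = 20, 0.05
    investment = 100_000.0                     # cost of the fire safety system
    loss_if_fire = 2_000_000.0                 # assumed loss given a fire
    p_fire = {"without": 0.02, "with": 0.005}  # assumed annual fire probabilities

    def expected_discounted_losses(p):
        """Expected annual loss p * L, discounted over the planning horizon."""
        return sum(p * loss_if_fire / (1 + rate) ** t for t in range(1, years + 1))

    risk_reduction = (expected_discounted_losses(p_fire["without"])
                      - expected_discounted_losses(p_fire["with"]))
    print(f"risk-adjusted NPV of the investment: {risk_reduction - investment:,.0f}")
    ```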

  12. Drift mobility of photo-electrons in organic molecular crystals: Quantitative comparison between theory and experiment

    NASA Astrophysics Data System (ADS)

    Reineker, P.; Kenkre, V. M.; Kühne, R.

    1981-08-01

    A quantitative comparison of a simple theoretical prediction for the drift mobility of photo-electrons in organic molecular crystals, calculated within the model of coupled band-like and hopping motion, with the experiments on naphthalene by Schein et al. and Karl et al. is given.

  13. The nucleon as a test case to calculate vector-isovector form factors at low energies

    NASA Astrophysics Data System (ADS)

    Leupold, Stefan

    2018-01-01

    Extending a recent suggestion for hyperon form factors to the nucleon case, dispersion theory is used to relate the low-energy vector-isovector form factors of the nucleon to the pion vector form factor. The additionally required input, i.e., the pion-nucleon scattering amplitudes, is determined from relativistic next-to-leading-order (NLO) baryon chiral perturbation theory including the nucleons and optionally the Delta baryons. Two methods to include pion rescattering are compared: a) solving the Muskhelishvili-Omnès (MO) equation and b) using an N/D approach. It turns out that the results differ strongly from each other. Furthermore the results are compared to a fully dispersive calculation of the (subthreshold) pion-nucleon amplitudes based on Roy-Steiner (RS) equations. In full agreement with the findings from the hyperon sector it turns out that the inclusion of Delta baryons is not an option but a necessity to obtain reasonable results. The magnetic isovector form factor depends strongly on a low-energy constant of the NLO Lagrangian. If it is adjusted such that the corresponding magnetic radius is reproduced, then the results for the corresponding pion-nucleon scattering amplitude (based on the MO equation) agree very well with the RS results. Also in the electric sector the Delta degrees of freedom are needed to obtain the correct order of magnitude for the isovector charge and the corresponding electric radius. Yet quantitative agreement is not achieved. If the subtraction constant that appears in the solution of the MO equation is not taken from nucleon+Delta chiral perturbation theory but adjusted such that the electric radius is reproduced, then one obtains also in this sector a pion-nucleon scattering amplitude that agrees well with the RS results.

  14. Cancer Theory from Systems Biology Point of View

    NASA Astrophysics Data System (ADS)

    Wang, Gaowei; Tang, Ying; Yuan, Ruoshi; Ao, Ping

    In our previous work, we proposed a novel cancer theory, the endogenous network theory, to understand the mechanism underlying cancer genesis and development. Recently, we applied this theory to hepatocellular carcinoma (HCC). A core endogenous network of the hepatocyte was established by integrating the current molecular-level understanding of the hepatocyte. The quantitative description of the endogenous network consists of a set of stochastic differential equations which can generate many local attractors with obvious or non-obvious biological functions. By comparing with clinical observations and experimental data, the results showed that two robust attractors from the model reproduce the main known features of the normal hepatocyte and the cancerous hepatocyte, respectively, at both the modular and molecular level. In light of our theory, the genesis and progression of cancer is viewed as a transition from the normal attractor to the HCC attractor. A set of new insights into understanding cancer genesis and progression, and into strategies for cancer prevention, cure, and care, is provided.

  15. Quantitative theory of electroosmotic flow in fused-silica capillaries using an extended site-dissociation--site-binding model.

    PubMed

    Zhou, Marilyn X; Foley, Joe P

    2006-03-15

    To optimize separations in capillary electrophoresis, it is important to control the electroosmotic mobility of the running buffer and the factors that affect it. Through the application of a site-dissociation-site-binding model, we demonstrated that the electroosmotic mobility could be controlled qualitatively and quantitatively by the parameters related to the physical and chemical properties of the running buffer: pH, cation valence, ionic strength, viscosity, activity, and dissociation constant. Our study illustrated that the logarithm of the number of apparent silanol sites on a fused-silica surface has a linear relationship with the pH of a buffer solution. The extension of the chemical kinetics approach allowed us to obtain the thickness of the electrical double layer when multivalent inorganic cations are present with monovalent cations in a buffer solution, and we found that the thickness of the electrical double layer does not depend on the charge of anions. The general equation to predict the electroosmotic mobility suggested here also indicates the increase of electroosmotic mobility with temperature. The general equation was experimentally verified by three buffer scenarios: (i) buffers containing only monovalent cations; (ii) buffers containing multivalent inorganic cations; and (iii) buffers containing cations and neutral additives. The general equation can explain the experimental observations of (i) a maximum electroosmotic mobility for the first scenario as the pH was varied at constant ionic strength and (ii) the inversion and maximum value of the electroosmotic mobility for the second scenario when the concentration of divalent cations was varied at constant pH. A good agreement between theory and experiment was obtained for each scenario.
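
    Two quantities central to such models can be estimated from textbook relations, as sketched below: the Debye thickness of the electrical double layer as a function of ionic strength (1:1 electrolyte), and a Helmholtz-Smoluchowski estimate of the electroosmotic mobility. The zeta potential value is an assumed placeholder, not a fitted parameter from the paper.

    ```python
    import numpy as np

    eps = 78.4 * 8.854e-12    # permittivity of water at 25 C, F/m
    kT = 1.381e-23 * 298.15   # thermal energy, J
    e, N_A = 1.602e-19, 6.022e23
    eta = 0.89e-3             # viscosity of water, Pa*s
    zeta = -0.05              # V, assumed zeta potential of fused silica

    for c in (0.001, 0.01, 0.1):                  # ionic strength, mol/L (1:1 electrolyte)
        n = c * 1000 * N_A                        # ion number density, m^-3
        debye = np.sqrt(eps * kT / (2 * e**2 * n))
        print(f"I = {c:5.3f} M -> Debye length = {debye * 1e9:5.2f} nm")

    mu_eo = -eps * zeta / eta                     # Helmholtz-Smoluchowski estimate
    print(f"electroosmotic mobility ~ {mu_eo:.2e} m^2/(V s)")
    ```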

  16. Systematic expansion in the order parameter for replica theory of the dynamical glass transition.

    PubMed

    Jacquin, Hugo; Zamponi, Francesco

    2013-03-28

    It has been shown recently that predictions from mode-coupling theory for the glass transition of hard-spheres become increasingly bad when dimensionality increases, whereas replica theory predicts a correct scaling. Nevertheless if one focuses on the regime around the dynamical transition in three dimensions, mode-coupling results are far more convincing than replica theory predictions. It thus seems necessary to reconcile the two theoretical approaches in order to obtain a theory that interpolates between low-dimensional, mode-coupling results, and "mean-field" results from replica theory. Even though quantitative results for the dynamical transition issued from replica theory are not accurate in low dimensions, two different approximation schemes, the small cage expansion and the replicated hypernetted-chain (RHNC) approximation, provide the correct qualitative picture for the transition, namely, a discontinuous jump of a static order parameter from zero to a finite value. The purpose of this work is to develop a systematic expansion around the RHNC result in powers of the static order parameter, and to calculate the first correction in this expansion. Interestingly, this correction involves the static three-body correlations of the liquid. More importantly, we separately demonstrate that higher order terms in the expansion are quantitatively relevant at the transition, and that the usual mode-coupling kernel, involving two-body direct correlation functions of the liquid, cannot be recovered from static computations.

  17. A quantitative theory of gamma synchronization in macaque V1.

    PubMed

    Lowet, Eric; Roberts, Mark J; Peter, Alina; Gips, Bart; De Weerd, Peter

    2017-08-31

    Gamma-band synchronization coordinates brief periods of excitability in oscillating neuronal populations to optimize information transmission during sensation and cognition. Commonly, a stable, shared frequency over time is considered a condition for functional neural synchronization. Here, we demonstrate the opposite: instantaneous frequency modulations are critical to regulate phase relations and synchronization. In monkey visual area V1, nearby local populations driven by different visual stimulation showed different gamma frequencies. When similar enough, these frequencies continually attracted and repulsed each other, which enabled preferred phase relations to be maintained in periods of minimized frequency difference. Crucially, the precise dynamics of frequencies and phases across a wide range of stimulus conditions was predicted from a physics theory that describes how weakly coupled oscillators influence each other's phase relations. Hence, the fundamental mathematical principle of synchronization through instantaneous frequency modulations applies to gamma in V1 and is likely generalizable to other brain regions and rhythms.
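
    The physics theory invoked here is the standard weakly-coupled-oscillator (Kuramoto/Adler) picture. The sketch below integrates two phase oscillators with a small gamma-band detuning and checks the locked phase relation against the Adler prediction; frequencies and coupling are illustrative, not fitted to V1 data.

    ```python
    import numpy as np

    f1, f2 = 60.0, 63.0   # Hz, gamma peak frequencies of two local populations (assumed)
    K = 5.0               # Hz, effective coupling strength (assumed)
    dt, T = 1e-4, 2.0     # Euler step and total time, s

    theta = np.array([0.0, 0.5])
    for _ in range(int(T / dt)):
        drift = 2 * np.pi * np.array([f1, f2])
        drift[0] += 2 * np.pi * K * np.sin(theta[1] - theta[0])
        drift[1] += 2 * np.pi * K * np.sin(theta[0] - theta[1])
        theta = theta + drift * dt

    # |f2 - f1| < 2K, so the pair locks; Adler theory gives sin(dphi) = (f2 - f1)/(2K)
    phase_diff = (theta[1] - theta[0]) % (2 * np.pi)
    print(f"locked phase difference ~ {phase_diff:.3f} rad, "
          f"Adler prediction {np.arcsin((f2 - f1) / (2 * K)):.3f} rad")
    ```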

  18. A quantitative theory of gamma synchronization in macaque V1

    PubMed Central

    Roberts, Mark J; Peter, Alina; Gips, Bart; De Weerd, Peter

    2017-01-01

    Gamma-band synchronization coordinates brief periods of excitability in oscillating neuronal populations to optimize information transmission during sensation and cognition. Commonly, a stable, shared frequency over time is considered a condition for functional neural synchronization. Here, we demonstrate the opposite: instantaneous frequency modulations are critical to regulate phase relations and synchronization. In monkey visual area V1, nearby local populations driven by different visual stimulation showed different gamma frequencies. When similar enough, these frequencies continually attracted and repulsed each other, which enabled preferred phase relations to be maintained in periods of minimized frequency difference. Crucially, the precise dynamics of frequencies and phases across a wide range of stimulus conditions was predicted from a physics theory that describes how weakly coupled oscillators influence each other’s phase relations. Hence, the fundamental mathematical principle of synchronization through instantaneous frequency modulations applies to gamma in V1 and is likely generalizable to other brain regions and rhythms. PMID:28857743

  19. Quantitative analysis of the Dermott-Gold theory for Uranus's rings

    NASA Technical Reports Server (NTRS)

    Aksnes, K.

    1977-01-01

    A summary is presented of an investigation which supplements the largely qualitative analysis conducted by Dermott and Gold (1977). Dermott and Gold have attempted to explain the locations of Uranus's rings in terms of resonances between ring particles and pairs of satellites. An equation of motion, analogous to that of a pendulum, is derived, taking into account a study by Wilkens (1933) of possible three-body resonances involving one minor and two major planets. Dermott and Gold had concluded that the observed pattern is probably due primarily to the effect of the Ariel-Titania and Ariel-Oberon pairs. However, on the basis of the values derived in the reported investigation it is seen that Miranda plays the key role rather than Ariel, in spite of the small mass of the former. It is concluded that a decisive test of the Dermott-Gold theory has to await further observational details concerning Uranus's rings.

  20. Some directions in ecological theory.

    PubMed

    Kendall, Bruce E

    2015-12-01

    The role of theory within ecology has changed dramatically in recent decades. Once primarily a source of qualitative conceptual framing, ecological theories and models are now often used to develop quantitative explanations of empirical patterns and to project future dynamics of specific ecological systems. In this essay, I recount my own experience of this transformation, in which accelerating computing power and the widespread incorporation of stochastic processes into ecological theory combined to create some novel integration of mathematical and statistical models. This stronger integration drives theory towards incorporating more biological realism, and I explore ways in which we can grapple with that realism to generate new general theoretical insights. This enhanced realism, in turn, may lead to frameworks for projecting ecological responses to anthropogenic change, which is, arguably, the central challenge for 21st-century ecology. In an era of big data and synthesis, ecologists are increasingly seeking to infer causality from observational data; but conventional biometry provides few tools for this project. This is a realm where theorists can and should play an important role, and I close by pointing towards some analytical and philosophical approaches developed in our sister discipline of economics that address this very problem. While I make no grand prognostications about the likely discoveries of ecological theory over the coming century, you will find in this essay a scattering of more or less far-fetched ideas that I, at least, think are interesting and (possibly) fruitful directions for our field.

  1. A Unified Theory of Impact Crises and Mass Extinctions: Quantitative Tests

    NASA Technical Reports Server (NTRS)

    Rampino, Michael R.; Haggerty, Bruce M.; Pagano, Thomas C.

    1997-01-01

    Several quantitative tests of a general hypothesis linking impacts of large asteroids and comets with mass extinctions of life are possible based on astronomical data, impact dynamics, and geological information. The waiting times of large-body impacts on the Earth, derived from the flux of Earth-crossing asteroids and comets and from the estimated size of impacts capable of causing large-scale environmental disasters, predict that impacts of objects greater than or equal to 5 km in diameter (greater than or equal to 10^7 Mt TNT equivalent) could be sufficient to explain the record of approximately 25 extinction pulses in the last 540 Myr, with the 5 recorded major mass extinctions related to impacts of the largest objects of greater than or equal to 10 km in diameter (greater than or equal to 10^8 Mt events). Smaller impacts (approximately 10^6 Mt), with significant regional environmental effects, could be responsible for the lesser boundaries in the geologic record.
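
    The waiting-time argument can be sketched with a simple Poisson estimate: dividing the Phanerozoic span by an assumed mean recurrence interval gives the expected number of impacts in each size class. The recurrence intervals below are illustrative assumptions, consistent in order of magnitude with the counts quoted above, not the paper's fitted values.

    ```python
    # Assumed mean recurrence intervals (Myr); illustrative, not the paper's fits
    phanerozoic_myr = 540.0
    mean_wait = {">= 5 km (~1e7 Mt)": 20.0, ">= 10 km (~1e8 Mt)": 100.0}

    for size, tau in mean_wait.items():
        n_expected = phanerozoic_myr / tau  # Poisson mean over the Phanerozoic
        print(f"impacts {size}: ~{n_expected:.0f} expected in {phanerozoic_myr:.0f} Myr")
    ```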

  2. Enhancing quantitative approaches for assessing community resilience

    USGS Publications Warehouse

    Chuang, W. C.; Garmestani, A.S.; Eason, T. N.; Spanbauer, T. L.; Fried-Peterson, H. B.; Roberts, C.P.; Sundstrom, Shana M.; Burnett, J.L.; Angeler, David G.; Chaffin, Brian C.; Gunderson, L.; Twidwell, Dirac; Allen, Craig R.

    2018-01-01

    Scholars from many different intellectual disciplines have attempted to measure, estimate, or quantify resilience. However, there is growing concern that lack of clarity on the operationalization of the concept will limit its application. In this paper, we discuss the theory, research development and quantitative approaches in ecological and community resilience. Upon noting the lack of methods that quantify the complexities of the linked human and natural aspects of community resilience, we identify several promising approaches within the ecological resilience tradition that may be useful in filling these gaps. Further, we discuss the challenges for consolidating these approaches into a more integrated perspective for managing social-ecological systems.

  3. Enhancing quantitative approaches for assessing community resilience.

    PubMed

    Chuang, W C; Garmestani, A; Eason, T N; Spanbauer, T L; Fried-Petersen, H B; Roberts, C P; Sundstrom, S M; Burnett, J L; Angeler, D G; Chaffin, B C; Gunderson, L; Twidwell, D; Allen, C R

    2018-05-01

    Scholars from many different intellectual disciplines have attempted to measure, estimate, or quantify resilience. However, there is growing concern that lack of clarity on the operationalization of the concept will limit its application. In this paper, we discuss the theory, research development and quantitative approaches in ecological and community resilience. Upon noting the lack of methods that quantify the complexities of the linked human and natural aspects of community resilience, we identify several promising approaches within the ecological resilience tradition that may be useful in filling these gaps. Further, we discuss the challenges for consolidating these approaches into a more integrated perspective for managing social-ecological systems. Published by Elsevier Ltd.

  4. The Perception of Prototypical Motion: Synchronization Is Enhanced with Quantitatively Morphed Gestures of Musical Conductors

    ERIC Educational Resources Information Center

    Wollner, Clemens; Deconinck, Frederik J. A.; Parkinson, Jim; Hove, Michael J.; Keller, Peter E.

    2012-01-01

    Aesthetic theories have long suggested perceptual advantages for prototypical exemplars of a given class of objects or events. Empirical evidence confirmed that morphed (quantitatively averaged) human faces, musical interpretations, and human voices are preferred over most individual ones. In this study, biological human motion was morphed and…

  5. A Critical Quantitative Examination of the Relationship between Constructs of Engagement and Latino College Completion

    ERIC Educational Resources Information Center

    Raynor, Samantha L.

    2017-01-01

    Investigating the validity and applicability of student success theories for minority students uncovers the nuance and context of student experiences. This study examines the validity and applicability of student engagement and involvement for Latino students. Specifically, this study employs a critical quantitative lens to question current…

  6. Theory of asymmetric tunneling in the cuprate superconductors

    NASA Astrophysics Data System (ADS)

    Anderson, P. W.; Ong, N. P.

    2006-01-01

    We explain quantitatively, within the Gutzwiller-Resonating Valence Bond theory, the puzzling observation of tunneling conductivity between a metallic point and a cuprate high-Tc superconductor which is markedly asymmetric between positive and negative voltage biases. The asymmetric part does not have a ‘coherence peak’ but does show structure due to the gap. The fit to data is satisfactory within the over-simplifications of the theory; in particular, it explains the marked ‘peak-dip-hump’ structure observed on the hole side and a number of other qualitative observations. This asymmetry is strong evidence for the projective nature of the ground state and hence for ‘t-J’ physics.

  7. Quantitative analysis of fracture surface by roughness and fractal method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, X.W.; Tian, J.F.; Kang, Y.

    1995-09-01

    In recent years there has been extensive research and great development in Quantitative Fractography, which acts as an integral part of fractographic analysis. A prominent technique for studying the fracture surface is based on fracture profile generation, and the major means for characterizing the profile quantitatively are roughness and fractal methods. In this way, quantitative indexes such as the roughness parameters R_L for the profile and R_S for the surface, and the fractal dimensions D_L for the profile and D_S for the surface, can be measured. Given the relationships between these indexes and the mechanical properties of materials, it is possible to achieve the goal of protecting materials from fracture. But, as the case stands, the theory and experimental technology of quantitative fractography are still imperfect and remain to be studied further. Recently, Gokhale and Underwood et al. proposed an assumption-free method for estimating the surface roughness by vertically sectioning the fracture surface with sections at an angle of 120° to each other, which can be expressed as $R_S = \overline{R_L \cdot \Psi}$, where $\Psi$ is the profile structure factor. This method is based on classical stereological principles and was verified with the aid of computer simulations for some ruled surfaces. The results are considered to be applicable to fracture surfaces of arbitrary complexity and anisotropy. In order to extend the applications of this method in quantitative fractography, the authors studied roughness and fractal methods based on it by performing quantitative measurements on some typical low-temperature impact fractures.

  8. The Use of Modelling for Theory Building in Qualitative Analysis

    ERIC Educational Resources Information Center

    Briggs, Ann R. J.

    2007-01-01

    The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…

  9. Quantitative passive soil vapor sampling for VOCs--part 1: theory.

    PubMed

    McAlary, Todd; Wang, Xiaomin; Unger, Andre; Groenevelt, Hester; Górecki, Tadeusz

    2014-03-01

    Volatile organic compounds are the primary chemicals of concern at many contaminated sites and soil vapor sampling and analysis is a valuable tool for assessing the nature and extent of contamination. Soil gas samples are typically collected by applying vacuum to a probe in order to collect a whole-gas sample, or by drawing gas through a tube filled with an adsorbent (active sampling). There are challenges associated with flow and vacuum levels in low permeability materials, and leak prevention and detection during active sample collection can be cumbersome. Passive sampling has been available as an alternative to conventional gas sample collection for decades, but quantitative relationships between the mass of chemicals sorbed, the soil vapor concentrations, and the sampling time have not been established. This paper presents transient and steady-state mathematical models of radial vapor diffusion to a drilled hole and considerations for passive sampler sensitivity and practical sampling durations. The results indicate that uptake rates in the range of 0.1 to 1 mL min(-1) will minimize the starvation effect for most soil moisture conditions and provide adequate sensitivity for human health risk assessment with a practical sampling duration. This new knowledge provides a basis for improved passive soil vapor sampler design.
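
    The steady-state limit of the radial-diffusion model described above can be summarized in a few lines. The sketch below estimates the diffusive supply rate of vapor to a borehole and compares it with the 0.1 to 1 mL min(-1) uptake range quoted; the Millington-Quirk tortuosity model, the radius of influence, and all parameter values are illustrative assumptions, not the paper's calibrated inputs.

        import math

        D_air = 0.08      # cm^2/s, free-air diffusion coefficient (typical VOC)
        porosity = 0.35   # total soil porosity
        theta_a = 0.25    # air-filled porosity
        L = 10.0          # cm, exposed borehole interval
        r_void = 1.0      # cm, borehole radius
        r_inf = 10.0      # cm, assumed radius of influence

        # Millington-Quirk effective diffusion coefficient (assumed model)
        D_eff = D_air * theta_a ** (10.0 / 3.0) / porosity ** 2

        # Steady-state radial-diffusion supply of soil gas to the void
        # (volume per unit time); a supply far above the sampler uptake
        # rate of 0.1-1 mL/min suggests the starvation effect is small.
        supply = 2.0 * math.pi * D_eff * L / math.log(r_inf / r_void)  # cm^3/s
        print(f"D_eff  = {D_eff:.2e} cm^2/s")
        print(f"supply = {supply * 60.0:.1f} mL/min")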

  10. Using Perturbation Theory to Reduce Noise in Diffusion Tensor Fields

    PubMed Central

    Bansal, Ravi; Staib, Lawrence H.; Xu, Dongrong; Laine, Andrew F.; Liu, Jun; Peterson, Bradley S.

    2009-01-01

    We propose the use of Perturbation theory to reduce noise in Diffusion Tensor (DT) fields. Diffusion Tensor Imaging (DTI) encodes the diffusion of water molecules along different spatial directions in a positive-definite, 3 × 3 symmetric tensor. Eigenvectors and eigenvalues of DTs allow the in vivo visualization and quantitative analysis of white matter fiber bundles across the brain. The validity and reliability of these analyses are limited, however, by the low spatial resolution and low Signal-to-Noise Ratio (SNR) in DTI datasets. Our procedures can be applied to improve the validity and reliability of these quantitative analyses by reducing noise in the tensor fields. We model a tensor field as a three-dimensional Markov Random Field and then compute the likelihood and the prior terms of this model using Perturbation theory. The prior term constrains the tensor field to be smooth, whereas the likelihood term constrains the smoothed tensor field to be similar to the original field. Thus, the proposed method generates a smoothed field that is close in structure to the original tensor field. We evaluate the performance of our method both visually and quantitatively using synthetic and real-world datasets. We quantitatively assess the performance of our method by computing the SNR for eigenvalues and the coherence measures for eigenvectors of DTs across tensor fields. In addition, we quantitatively compare the performance of our procedures with the performance of one method that uses a Riemannian distance to compute the similarity between two tensors, and with another method that reduces noise in tensor fields by anisotropically filtering the diffusion weighted images that are used to estimate diffusion tensors. These experiments demonstrate that our method significantly increases the coherence of the eigenvectors and the SNR of the eigenvalues, while simultaneously preserving the fine structure and boundaries between homogeneous regions, in the smoothed tensor fields.
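
    The likelihood-plus-prior trade-off described above can be illustrated with a deliberately simplified relaxation: each tensor is pulled toward its neighborhood average (smoothness prior) while being anchored to the measured tensor (likelihood). This generic MRF-style sketch, with periodic boundaries for brevity, stands in for, and is much cruder than, the paper's perturbation-theory estimator.

        import numpy as np

        def smooth_tensor_field(T, lam=0.5, iters=10):
            """T: array of shape (X, Y, Z, 3, 3); lam weights the prior term."""
            S = T.copy()
            for _ in range(iters):
                # 6-neighbor average along each spatial axis (periodic wrap)
                nb = np.zeros_like(S)
                for ax in range(3):
                    nb += np.roll(S, 1, axis=ax) + np.roll(S, -1, axis=ax)
                nb /= 6.0
                # closed-form minimizer of ||S - T||^2 + lam * ||S - nb||^2
                S = (T + lam * nb) / (1.0 + lam)
            return S

        rng = np.random.default_rng(0)
        field = np.eye(3) + 0.1 * rng.standard_normal((8, 8, 8, 3, 3))
        field = 0.5 * (field + np.swapaxes(field, -1, -2))  # keep symmetric
        print(np.linalg.eigvalsh(smooth_tensor_field(field)[4, 4, 4]))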

  11. Hard photodisintegration of 3He into a p d pair

    NASA Astrophysics Data System (ADS)

    Maheswari, Dhiraj; Sargsian, Misak M.

    2017-02-01

    The recent measurements of high energy photodisintegration of a 3He nucleus to a p d pair at 90° center of mass demonstrated an energy scaling consistent with the quark counting rule with an unprecedentedly large exponent of s^-17. To understand the underlying mechanism of this process, we extended the theoretical formalism of the hard rescattering mechanism (HRM) to calculate the γ3He → pd reaction. In the HRM the incoming high energy photon strikes a quark from one of the nucleons in the target, which subsequently undergoes hard rescattering with the quarks from the other nucleons, generating a hard two-body system in the final state of the reaction. Within the HRM we derived the parameter-free expression for the differential cross section of the reaction, which is expressed through the 3He → pd transition spectral function, the cross section of hard pd → pd scattering, and the effective charge of the quarks being interchanged during the hard rescattering process. The numerical estimates of all these factors resulted in a magnitude of the cross section that is in surprisingly good agreement with the data.
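
    The quoted s^-17 scaling is exactly what the constituent counting rule gives for this reaction once the elementary fields are tallied (1 photon, 9 quarks in 3He, 3 in the proton, 6 in the deuteron); as a sketch of that standard counting:

        % Constituent counting rule applied to gamma + 3He -> p + d
        % at fixed center-of-mass angle.
        \[
            \left.\frac{d\sigma}{dt}\right|_{\theta_{\mathrm{c.m.}}}
            \;\propto\; s^{\,2-n} \;=\; s^{\,2-19} \;=\; s^{-17},
            \qquad n = 1 + 9 + 3 + 6 = 19 .
        \]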

  12. Influence factors and prediction of stormwater runoff of urban green space in Tianjin, China: laboratory experiment and quantitative theory model.

    PubMed

    Yang, Xu; You, Xue-Yi; Ji, Min; Nima, Ciren

    2013-01-01

    The effects of limiting factors such as rainfall intensity, rainfall duration, grass type and vegetation coverage on the stormwater runoff of urban green space were investigated in Tianjin. The prediction equation for stormwater runoff was established by quantitative theory using lab experimental data from soil columns. It was validated by three field experiments, with relative errors between predicted and measured stormwater runoff of 1.41, 1.52 and 7.35%, respectively. The results implied that the prediction equation can be used to forecast the stormwater runoff of urban green space. Range and variance analysis indicated that the order of the limiting factors is rainfall intensity > grass type > rainfall duration > vegetation coverage. The combination yielding the least runoff from green land in the present study is rainfall intensity 60.0 mm/h, duration 60.0 min, the grass Festuca arundinacea, and vegetation coverage 90.0%. When the intensity and duration of rainfall are 60.0 mm/h and 90.0 min, the predicted volumetric runoff coefficient is 0.23 with Festuca arundinacea at 90.0% vegetation coverage. The present approach indicates that green space is an effective means of reducing stormwater runoff; the conclusions apply mainly to Tianjin and to semi-arid areas where precipitation falls chiefly in summer with long intervals between rainfall events.
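
    A prediction equation of the "quantitative theory" kind referenced above is, in essence, a regression on categorical factor levels encoded as dummy variables. The sketch below shows only the mechanics; the rows, grass names, and runoff values are fabricated placeholders, not the Tianjin measurements.

        import numpy as np

        # (intensity mm/h, duration min, grass, coverage %) -> runoff coeff
        rows = [
            (30, 30, "festuca", 60, 0.08),
            (30, 60, "poa",     90, 0.10),
            (60, 60, "festuca", 90, 0.23),
            (60, 90, "poa",     60, 0.35),
            (90, 30, "poa",     90, 0.41),
            (90, 90, "festuca", 30, 0.55),
        ]
        levels = {0: [30, 60, 90], 1: [30, 60, 90],
                  2: ["festuca", "poa"], 3: [30, 60, 90]}

        def encode(row):
            x = [1.0]  # intercept
            for j, lv in levels.items():
                # dummy-code each factor, dropping its first level
                x += [1.0 if row[j] == v else 0.0 for v in lv[1:]]
            return x

        X = np.array([encode(r) for r in rows])
        y = np.array([r[4] for r in rows])
        # least squares returns the minimum-norm fit if underdetermined
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        print("predicted runoff:", X @ beta)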

  13. Transition-state theory predicts clogging at the microscale

    NASA Astrophysics Data System (ADS)

    Laar, T. Van De; Klooster, S. Ten; Schroën, K.; Sprakel, J.

    2016-06-01

    Clogging is one of the main failure mechanisms encountered in industrial processes such as membrane filtration. Our understanding of the factors that govern the build-up of fouling layers and the emergence of clogs is largely incomplete, so that prevention of clogging remains an immense and costly challenge. In this paper we use a microfluidic model combined with quantitative real-time imaging to explore the influence of pore geometry and particle interactions on suspension clogging in constrictions, two crucial factors which remain relatively unexplored. We find a distinct dependence of the clogging rate on the entrance angle to a membrane pore which we explain quantitatively by deriving a model, based on transition-state theory, which describes the effect of viscous forces on the rate with which particles accumulate at the channel walls. With the same model we can also predict the effect of the particle interaction potential on the clogging rate. In both cases we find excellent agreement between our experimental data and theory. A better understanding of these clogging mechanisms and the influence of design parameters could form a stepping stone to delay or prevent clogging by rational membrane design.
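
    A toy version of the transition-state picture above: deposition at a pore wall is an activated event whose barrier is effectively lowered by viscous work, and that work grows as streamlines concentrate at sharper entrance angles. Both the functional form and the numbers below are illustrative assumptions, not the paper's fitted model.

        import math

        def clogging_rate(e_barrier_kT, drag_work_kT, attempt_rate=1.0):
            """Arrhenius/Kramers-style rate: attempts * exp(-(barrier - drag)/kT)."""
            return attempt_rate * math.exp(-(e_barrier_kT - drag_work_kT))

        # Hypothetical mapping from entrance angle to viscous work (in kT)
        for angle_deg in (15, 45, 90):
            drag = 2.0 * math.sin(math.radians(angle_deg))
            rate = clogging_rate(e_barrier_kT=6.0, drag_work_kT=drag)
            print(f"{angle_deg:3d} deg -> relative clogging rate {rate:.2e}")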

  14. An Attractor Network in the Hippocampus: Theory and Neurophysiology

    ERIC Educational Resources Information Center

    Rolls, Edmund T.

    2007-01-01

    A quantitative computational theory of the operation of the CA3 system as an attractor or autoassociation network is described. Based on the proposal that CA3-CA3 autoassociative networks are important for episodic or event memory in which space is a component (place in rodents and spatial view in primates), it has been shown behaviorally that the…

  15. Dependence of quantitative accuracy of CT perfusion imaging on system parameters

    NASA Astrophysics Data System (ADS)

    Li, Ke; Chen, Guang-Hong

    2017-03-01

    Deconvolution is a popular method for calculating parametric perfusion parameters from four-dimensional CT perfusion (CTP) source images. During the deconvolution process, the four-dimensional space is squeezed into three-dimensional space by removing the temporal dimension, and prior knowledge is often used to suppress noise associated with the process. These additional complexities confound understanding of the deconvolution-based CTP imaging system and of how its quantitative accuracy depends on the parameters and sub-operations involved in the image formation process. Meanwhile, there has been a strong clinical need to answer this question, as physicians often rely heavily on the quantitative values of perfusion parameters to make diagnostic decisions, particularly during emergent clinical situations (e.g. diagnosis of acute ischemic stroke). The purpose of this work was to develop a theoretical framework that quantitatively relates the quantification accuracy of parametric perfusion parameters to the CTP acquisition and post-processing parameters. This goal was achieved with the help of a cascaded systems analysis for deconvolution-based CTP imaging systems. Based on the cascaded systems analysis, the quantitative relationship between regularization strength, source image noise, the arterial input function, and the quantification accuracy of perfusion parameters was established. The theory could potentially be used to guide developments of CTP imaging technology for better quantification accuracy and lower radiation dose.
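
    For concreteness, the deconvolution step at the center of this analysis can be sketched in a few lines: build a convolution matrix from the arterial input function (AIF), invert it with truncated SVD (the truncation threshold playing the role of the regularization strength discussed above), and read CBF off the flow-scaled residue function. The curves and threshold below are synthetic, not a clinical pipeline.

        import numpy as np

        dt = 1.0                                  # s, frame spacing
        t = np.arange(0, 40, dt)
        aif = np.exp(-((t - 10.0) / 3.0) ** 2)    # synthetic arterial input
        residue = np.exp(-t / 8.0)                # true residue, MTT = 8 s
        cbf_true = 0.6
        tissue = cbf_true * dt * np.convolve(aif, residue)[: len(t)]
        tissue += 0.005 * np.random.default_rng(1).standard_normal(len(t))

        # lower-triangular convolution matrix built from the AIF
        A = dt * np.array([[aif[i - j] if i >= j else 0.0
                            for j in range(len(t))] for i in range(len(t))])

        U, s, Vt = np.linalg.svd(A)
        # truncation threshold = regularization strength
        s_inv = np.where(s > 0.15 * s.max(), 1.0 / s, 0.0)
        k = Vt.T @ np.diag(s_inv) @ U.T @ tissue  # flow-scaled residue
        print(f"CBF estimate: {k.max():.2f} (true {cbf_true})")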

  16. Quantitative reflectance spectroscopy of buddingtonite from the Cuprite mining district, Nevada

    NASA Technical Reports Server (NTRS)

    Felzer, Benjamin; Hauff, Phoebe; Goetz, Alexander F. H.

    1994-01-01

    Buddingtonite, an ammonium-bearing feldspar diagnostic of volcanic-hosted alteration, can be identified and, in some cases, quantitatively measured using short-wave infrared (SWIR) reflectance spectroscopy. In this study over 200 samples from Cuprite, Nevada, were evaluated by X-ray diffraction, chemical analysis, scanning electron microscopy, and SWIR reflectance spectroscopy with the objective of developing a quantitative remote-sensing technique for rapid determination of the amount of ammonium or buddingtonite present, and its distribution across the site. Based upon the Hapke theory of radiative transfer from particulate surfaces, spectra from quantitative, physical mixtures were compared with computed mixture spectra. We hypothesized that the concentration of ammonium in each sample is related to the size and shape of the ammonium absorption bands and tested this hypothesis for samples of relatively pure buddingtonite. We found that the band depth of the 2.12-micron NH4 feature is linearly related to the NH4 concentration for the Cuprite buddingtonite, and that the relationship is approximately exponential for a larger range of NH4 concentrations. Associated minerals such as smectite and jarosite suppress the depth of the 2.12-micron NH4 absorption band. Quantitative reflectance spectroscopy is possible when the effects of these associated minerals are also considered.
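
    The band-depth measurement underlying this calibration is straightforward to sketch: remove a straight-line continuum between the band shoulders, measure the relative depth at 2.12 microns, and fit depth against concentration. The spectra, shoulder wavelengths, and coefficients below are synthetic placeholders, not the Cuprite data.

        import numpy as np

        wl = np.linspace(2.0, 2.3, 151)           # wavelength grid, microns
        band_center = 2.12

        def band_depth(reflectance, left=2.06, right=2.19):
            i_l, i_r = np.searchsorted(wl, (left, right))
            i_c = np.searchsorted(wl, band_center)
            # straight-line continuum between the band shoulders
            cont = np.interp(wl, [wl[i_l], wl[i_r]],
                             [reflectance[i_l], reflectance[i_r]])
            return 1.0 - reflectance[i_c] / cont[i_c]

        # synthetic spectra: absorption depth proportional to NH4 content
        nh4 = np.array([0.5, 1.0, 2.0, 4.0])      # arbitrary units
        depths = []
        for c in nh4:
            r = 0.6 - 0.03 * c * np.exp(-((wl - band_center) / 0.02) ** 2)
            depths.append(band_depth(r))
        slope, intercept = np.polyfit(depths, nh4, 1)
        print(f"NH4 ~ {slope:.1f} * depth + {intercept:.2f}")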

  17. Use of Theory-Driven Research in Counseling: Investigating Three Counseling Psychology Journals from 1990 to 1999

    ERIC Educational Resources Information Center

    Karr, Carolyn A.; Larson, Lisa M.

    2005-01-01

    Three major journals in counseling psychology were sampled from 1990 to 1999 to assess the percentage of quantitative, empirical articles that were theory driven. Only 43% of the studies utilized a theory or model, and 57% predicted the relation between the variables, with few studies specifying the strength of the relation. Studies sampled in the…

  18. Quantitative Simulation of QARBM Challenge Events During Radiation Belt Enhancements

    NASA Astrophysics Data System (ADS)

    Li, W.; Ma, Q.; Thorne, R. M.; Bortnik, J.; Chu, X.

    2017-12-01

    Various physical processes are known to affect energetic electron dynamics in the Earth's radiation belts, but their quantitative effects at different times and locations in space need further investigation. This presentation focuses on the quantitative roles of the various physical processes that affect Earth's radiation belt electron dynamics during radiation belt enhancement challenge events (storm-time vs. non-storm-time) selected by the GEM Quantitative Assessment of Radiation Belt Modeling (QARBM) focus group. We construct realistic global distributions of whistler-mode chorus waves, adopt various versions of radial diffusion models (statistical and event-specific), and use the global evolution of other potentially important plasma waves, including plasmaspheric hiss, magnetosonic waves, and electromagnetic ion cyclotron waves, from all available multi-satellite measurements. These state-of-the-art wave properties and distributions on a global scale are used to calculate diffusion coefficients, which are then adopted as inputs to simulate the dynamical electron evolution using a 3D diffusion simulation during the storm-time and the non-storm-time acceleration events, respectively. We explore the similarities and differences in the dominant physical processes that cause radiation belt electron dynamics during the storm-time and non-storm-time acceleration events. The quantitative role of each physical process is determined by comparing against the Van Allen Probes electron observations at different energies, pitch angles, and L-MLT regions. This quantitative comparison further indicates instances when quasilinear theory is sufficient to explain the observed electron dynamics or when nonlinear interaction is required to reproduce the energetic electron evolution observed by the Van Allen Probes.
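
    Simulations of this kind typically solve a Fokker-Planck equation for the phase-space density f. A standard schematic form, shown here for orientation rather than as the authors' exact implementation, combines radial diffusion with local wave-particle diffusion and losses:

        % Schematic radiation-belt Fokker-Planck equation: radial diffusion
        % plus local (pitch-angle, energy) diffusion and a loss term.
        \[
            \frac{\partial f}{\partial t}
            \;=\; L^{2}\,\frac{\partial}{\partial L}\!\left(
                  \frac{D_{LL}}{L^{2}}\,\frac{\partial f}{\partial L}\right)
            \;+\; \bigl(\text{pitch-angle and energy diffusion terms}\bigr)
            \;-\; \frac{f}{\tau_{\mathrm{loss}}}\, .
        \]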

  19. The electromagnetic Sigma-to-Lambda hyperon transition form factors at low energies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granados, Carlos; Leupold, Stefan; Perotti, Elisabetta

    Using dispersion theory the low-energy electromagnetic form factors for the transition of a Sigma to a Lambda hyperon are related to the pion vector form factor. The additionally required input, i.e. the two-pion-Sigma-Lambda amplitudes, are determined from relativistic next-to-leading-order (NLO) baryon chiral perturbation theory including the baryons from the octet and optionally from the decuplet. Pion rescattering is again taken into account by dispersion theory. It turns out that the inclusion of decuplet baryons is not an option but a necessity to obtain reasonable results. The electric transition form factor remains very small in the whole low-energy region. The magnetic transition form factor depends strongly on one not very well determined low-energy constant of the NLO Lagrangian. Furthermore, one obtains reasonable predictive power if this low-energy constant is determined from a measurement of the magnetic transition radius. Such a measurement can be performed at the future Facility for Antiproton and Ion Research (FAIR).

  20. The electromagnetic Sigma-to-Lambda hyperon transition form factors at low energies

    DOE PAGES

    Granados, Carlos; Leupold, Stefan; Perotti, Elisabetta

    2017-06-09

    Using dispersion theory the low-energy electromagnetic form factors for the transition of a Sigma to a Lambda hyperon are related to the pion vector form factor. The additionally required input, i.e. the two-pion-Sigma-Lambda amplitudes, are determined from relativistic next-to-leading-order (NLO) baryon chiral perturbation theory including the baryons from the octet and optionally from the decuplet. Pion rescattering is again taken into account by dispersion theory. It turns out that the inclusion of decuplet baryons is not an option but a necessity to obtain reasonable results. The electric transition form factor remains very small in the whole low-energy region. The magnetic transition form factor depends strongly on one not very well determined low-energy constant of the NLO Lagrangian. Furthermore, one obtains reasonable predictive power if this low-energy constant is determined from a measurement of the magnetic transition radius. Such a measurement can be performed at the future Facility for Antiproton and Ion Research (FAIR).
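
    The dispersive machinery invoked in the two records above rests on relations of the generic once-subtracted form below, with the two-pion cut starting at 4m_pi^2; the paper's actual subtraction scheme and input amplitudes are more elaborate than this sketch.

        % Generic once-subtracted dispersion relation for a transition form
        % factor F(q^2); the two-pion cut starts at s = 4 m_pi^2.
        \[
            F(q^{2}) \;=\; F(0) \;+\; \frac{q^{2}}{\pi}
            \int_{4m_{\pi}^{2}}^{\infty}
            \frac{\operatorname{Im} F(s)}{s\,\bigl(s - q^{2} - i\epsilon\bigr)}\,
            \mathrm{d}s \, .
        \]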

  1. ON NORRIS' THEORY FOR THE SHAPE OF THE MAMMALIAN ERYTHROCYTE

    PubMed Central

    Ponder, Eric

    1934-01-01

    This paper is concerned with an attempt to put Norris' theory for the shape of the mammalian erythrocyte into a quantitative form. The theory supposes that the biconcave form of the cell is brought about by an expansive force enlarging the surface, and is also supposed to apply to the formation of the myelin forms of lecithin. The attempt is not successful, and is published merely because it is suggestive. Various points regarding the shape of the cell, the curvature of its surface, and the kind of system to which Norris' theory might be supposed to apply, are discussed, and an empirical formula is given for the curve which bounds the cross-section of the cell. This empirical formula describes the shape almost to perfection. PMID:19872803

  2. Protein Signaling Networks from Single Cell Fluctuations and Information Theory Profiling

    PubMed Central

    Shin, Young Shik; Remacle, F.; Fan, Rong; Hwang, Kiwook; Wei, Wei; Ahmad, Habib; Levine, R.D.; Heath, James R.

    2011-01-01

    Protein signaling networks among cells play critical roles in a host of pathophysiological processes, from inflammation to tumorigenesis. We report on an approach that integrates microfluidic cell handling, in situ protein secretion profiling, and information theory to determine an extracellular protein-signaling network and the role of perturbations. We assayed 12 proteins secreted from human macrophages that were subjected to lipopolysaccharide challenge, which emulates the macrophage-based innate immune responses against Gram-negative bacteria. We characterize the fluctuations in protein secretion of single cells, and of small cell colonies (n = 2, 3,···), as a function of colony size. Measuring the fluctuations permits a validation of the conditions required for the application of a quantitative version of the Le Chatelier's principle, as derived using information theory. This principle provides a quantitative prediction of the role of perturbations and allows a characterization of a protein-protein interaction network. PMID:21575571

  3. Kinetic theory of Lennard-Jones fluids

    NASA Astrophysics Data System (ADS)

    Leegwater, Jan A.

    1991-12-01

    A kinetic theory that describes the time evolution of a fluid consisting of Lennard-Jones particles at all densities is proposed. The kinetic equation assumes binary collisions, but takes into account the finite time duration of a collision. Furthermore, it is an extension of a kinetic equation for the square well fluid as well as the hard sphere Enskog theory. In the low density limit, the Boltzmann theory is obtained. It is shown that the proposed theory obeys all the conservation laws. The exchange of potential and kinetic energies is studied and it is shown that at high density this is a fast process. The dominant mechanism for energy exchange is found to be collisions at the strongly repulsive part of the potential that are disturbed by third particles. The kinetic equation is also used to calculate the Green-Kubo integrands for shear viscosity and heat conductivity. The major structures found in molecular dynamics simulations are reproduced at intermediate densities quantitatively and at high density semiquantitatively. It is found that at high density, not only correlated collisions have to be taken into account, but that even the concept of collisions in the sense of sudden changes in the velocity is no longer useful.
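
    The Green-Kubo integrands mentioned above enter, for shear viscosity, through eta = (V / k_B T) times the time integral of the stress autocorrelation function. The sketch below evaluates that integral on a synthetic stress signal (an Ornstein-Uhlenbeck stand-in for simulation output, in reduced units) so the estimate can be checked against a known answer.

        import numpy as np

        rng = np.random.default_rng(2)
        dt, n = 0.005, 20000
        tau_c, sigma = 0.05, 1.0       # correlation time, stress amplitude

        # Ornstein-Uhlenbeck stand-in for the off-diagonal stress sigma_xy
        a = np.exp(-dt / tau_c)
        noise = rng.standard_normal(n)
        s_xy = np.empty(n)
        s_xy[0] = 0.0
        for i in range(1, n):
            s_xy[i] = a * s_xy[i - 1] + sigma * np.sqrt(1 - a * a) * noise[i]

        def autocorr(x, maxlag):
            return np.array([np.mean(x[: n - k] * x[k:]) for k in range(maxlag)])

        V_over_kT = 1.0                # prefactor V/(k_B T), reduced units
        acf = autocorr(s_xy, maxlag=200)
        # trapezoidal Green-Kubo integral of the autocorrelation function
        eta = V_over_kT * dt * (acf[0] / 2 + acf[1:-1].sum() + acf[-1] / 2)
        print(f"eta ~ {eta:.3f}  (OU reference: {sigma**2 * tau_c:.3f})")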

  4. Physics Content and Pedagogical Changes: Ramification of Theory and Practice

    ERIC Educational Resources Information Center

    Cobbinah, Charles; Bayaga, Anass

    2017-01-01

    The aim of this study was to explore physics teachers' ramification of theory and practices as a result of physics content and pedagogical changes in the Further Education and Training (FET) phase. The researchers adopted the mixed method research approach. The quantitative aspect involved 109 physics teachers and the qualitative approach used ten…

  5. Three quantitative approaches to the diagnosis of abdominal pain in children: practical applications of decision theory.

    PubMed

    Klein, M D; Rabbani, A B; Rood, K D; Durham, T; Rosenberg, N M; Bahr, M J; Thomas, R L; Langenburg, S E; Kuhns, L R

    2001-09-01

    The authors compared 3 quantitative methods for assisting clinicians in the differential diagnosis of abdominal pain in children, where the most common important endpoint is whether the patient has appendicitis. Pretest probabilities in different age and sex groups were determined to perform Bayesian analysis, binary logistic regression was used to determine which variables were statistically significantly likely to contribute to a diagnosis, and recursive partitioning was used to build decision trees with quantitative endpoints. The records of all children (1,208) seen at a large urban emergency department (ED) with a chief complaint of abdominal pain were reviewed retrospectively shortly after the encounter (24 to 72 hours). Attempts were made to contact all the patients' families to determine an accurate final diagnosis. A total of 1,008 (83%) families were contacted. Data were analyzed by calculation of the posttest probability, recursive partitioning, and binary logistic regression. In all groups the most common diagnosis was abdominal pain (ICD-9 code 789). After this, however, the order of the most common final diagnoses for abdominal pain varied significantly. The entire group had a pretest probability of appendicitis of 0.06. This varied with age and sex, from 0.02 in boys 2 to 5 years old to 0.16 in boys older than 12 years. In boys age 5 to 12, recursive partitioning and binary logistic regression agreed on guarding and anorexia as important variables. Guarding and tenderness were important in girls age 5 to 12. In boys older than 12, both agreed on guarding and anorexia. Using sensitivities and specificities from the literature, computed tomography improved the posttest probability for the group from 0.06 to 0.33; ultrasound improved it from 0.06 to 0.48; and barium enema improved it from 0.06 to 0.58. Knowing the pretest probabilities in a specific population allows the physician to evaluate the likely diagnoses first. Other quantitative methods can help
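
    The post-test probabilities quoted above follow from Bayes' rule in odds form. The sensitivity/specificity values below are illustrative placeholders chosen only so the output lands near the quoted figures; the paper's actual literature values are not reproduced here.

        def posttest(pretest, sens, spec):
            """Post-test probability after a positive result, via odds form."""
            lr_pos = sens / (1.0 - spec)        # positive likelihood ratio
            odds = pretest / (1.0 - pretest)    # pre-test odds
            post_odds = odds * lr_pos
            return post_odds / (1.0 + post_odds)

        pre = 0.06                              # group pretest probability
        for name, sens, spec in [("CT", 0.94, 0.89), ("ultrasound", 0.85, 0.94)]:
            print(f"{name:10s}: {posttest(pre, sens, spec):.2f}")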

  6. Leptokurtic portfolio theory

    NASA Astrophysics Data System (ADS)

    Kitt, R.; Kalda, J.

    2006-03-01

    The question of the optimal portfolio is addressed. The conventional Markowitz portfolio optimisation is discussed and the shortcomings due to non-Gaussian security returns are outlined. A method is proposed to minimise the likelihood of extreme non-Gaussian drawdowns of the portfolio value. The theory is called leptokurtic because it minimises the effects of the “fat tails” of returns. The leptokurtic portfolio theory provides an optimal portfolio for investors who define their risk-aversion as unwillingness to experience sharp drawdowns in asset prices. Two types of risk in asset returns are defined: a fluctuation risk, which has a Gaussian distribution, and a drawdown risk, which deals with the distribution tails. These risks are quantitatively measured by defining the “noise kernel”, an ellipsoidal cloud of points in the space of asset returns. The size of the ellipse is controlled with the threshold parameter: the larger the threshold parameter, the larger the returns that are accepted as normal fluctuations. The return vectors falling into the kernel are used for the calculation of fluctuation risk. Analogously, the data points falling outside the kernel are used for the calculation of drawdown risk. As a result the portfolio optimisation problem becomes three-dimensional: in addition to the return, there are two types of risk involved. The optimal portfolio for drawdown-averse investors is the portfolio minimising variance outside the noise kernel. The theory has been tested with MSCI North America, Europe and Pacific total return stock indices.
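
    The noise-kernel split can be sketched directly: classify each return vector by its Mahalanobis distance from the center, then estimate fluctuation risk from the points inside the ellipsoid and drawdown risk from those outside. The data and threshold parameter below are synthetic.

        import numpy as np

        rng = np.random.default_rng(3)
        returns = rng.standard_t(df=3, size=(1000, 3)) * 0.01  # fat tails

        mu = returns.mean(axis=0)
        cov = np.cov(returns, rowvar=False)
        # squared Mahalanobis distance of each return vector
        d2 = np.einsum("ij,jk,ik->i",
                       returns - mu, np.linalg.inv(cov), returns - mu)

        threshold = 3.0 ** 2            # kernel size (threshold parameter)
        inside = returns[d2 <= threshold]
        outside = returns[d2 > threshold]

        fluctuation_risk = np.cov(inside, rowvar=False)   # Gaussian-like core
        drawdown_risk = np.cov(outside, rowvar=False)     # tail contribution
        print(len(inside), "core points,", len(outside), "tail points")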

  7. Pro-Social Behavior Amongst Students of Tertiary Institutions: An Explorative and a Quantitative Approach

    ERIC Educational Resources Information Center

    Quain, Samuel; Yidana, Xiaaba Dantallah; Ambotumah, Bernard Baba; Mensah-Livivnstone, Ike Joe Nii Annang

    2016-01-01

    The purpose of this paper was to explore antecedents of pro-social behavior amongst university students, using a private university as a case study. Following an explorative research, the study was guided by some theories relating to the phenomenon, focusing on gender and location factors. A quantitative approach was used in the follow up to the…

  8. Quantitative estimation of minimum offset for multichannel surface-wave survey with actively exciting source

    USGS Publications Warehouse

    Xu, Y.; Xia, J.; Miller, R.D.

    2006-01-01

    Multichannel analysis of surface waves is a developing method widely used in shallow subsurface investigations. The field procedures and related parameters are very important for successful applications. Among these parameters, the source-receiver offset range is seldom discussed in theory and is normally determined by empirical or semi-quantitative methods in current practice. This paper discusses the problem from a theoretical perspective. A formula for quantitatively evaluating the minimum offset for a layered homogeneous elastic model was developed. The analytical results based on simple models and experimental data demonstrate that the formula is correct for surface wave surveys in near-surface applications. © 2005 Elsevier B.V. All rights reserved.

  9. Strong dynamics and lattice gauge theory

    NASA Astrophysics Data System (ADS)

    Schaich, David

    In this dissertation I use lattice gauge theory to study models of electroweak symmetry breaking that involve new strong dynamics. Electroweak symmetry breaking (EWSB) is the process by which elementary particles acquire mass. First proposed in the 1960s, this process has been clearly established by experiments, and can now be considered a law of nature. However, the physics underlying EWSB is still unknown, and understanding it remains a central challenge in particle physics today. A natural possibility is that EWSB is driven by the dynamics of some new, strongly-interacting force. Strong interactions invalidate the standard analytical approach of perturbation theory, making these models difficult to study. Lattice gauge theory is the premier method for obtaining quantitatively-reliable, nonperturbative predictions from strongly-interacting theories. In this approach, we replace spacetime by a regular, finite grid of discrete sites connected by links. The fields and interactions described by the theory are likewise discretized, and defined on the lattice so that we recover the original theory in continuous spacetime on an infinitely large lattice with sites infinitesimally close together. The finite number of degrees of freedom in the discretized system lets us simulate the lattice theory using high-performance computing. Lattice gauge theory has long been applied to quantum chromodynamics, the theory of strong nuclear interactions. Using lattice gauge theory to study dynamical EWSB, as I do in this dissertation, is a new and exciting application of these methods. Of particular interest is non-perturbative lattice calculation of the electroweak S parameter. Experimentally S ≈ -0.15(10), which tightly constrains dynamical EWSB. On the lattice, I extract S from the momentum-dependence of vector and axial-vector current correlators. I created and applied computer programs to calculate these correlators and analyze them to determine S. I also calculated the masses
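
    The extraction described above works with the vector-minus-axial vacuum polarization; in one common convention (normalizations differ across the literature, so treat this as a sketch rather than the dissertation's exact definition):

        % One common convention for the electroweak S parameter in terms of
        % the vector and axial-vector vacuum-polarization functions.
        \[
            S \;=\; 4\pi\,\bigl[\Pi'_{VV}(0) - \Pi'_{AA}(0)\bigr]
              \;=\; 4\pi\,\frac{d}{dQ^{2}}
              \Bigl[\Pi_{VV}(Q^{2}) - \Pi_{AA}(Q^{2})\Bigr]\Bigr|_{Q^{2}=0}\, .
        \]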

  10. Time-dependent mean-field theory for x-ray near-edge spectroscopy

    NASA Astrophysics Data System (ADS)

    Bertsch, G. F.; Lee, A. J.

    2014-02-01

    We derive equations of motion for calculating the near-edge x-ray absorption spectrum in molecules and condensed matter, based on a two-determinant approximation and Dirac's variational principle. The theory provides an exact solution for the linear response when the Hamiltonian or energy functional has only diagonal interactions in some basis. We numerically solve the equations to compare with the Mahan-Nozières-De Dominicis theory of the edge singularity in metallic conductors. Our extracted power-law exponents are similar to those of the analytic theory, but are not in quantitative agreement. The calculational method can be readily generalized to treat Kohn-Sham Hamiltonians with electron-electron interactions derived from correlation-exchange potentials.

  11. Social Comparison and Body Image in Adolescence: A Grounded Theory Approach

    ERIC Educational Resources Information Center

    Krayer, A.; Ingledew, D. K.; Iphofen, R.

    2008-01-01

    This study explored the use of social comparison appraisals in adolescents' lives with particular reference to enhancement appraisals which can be used to counter threats to the self. Social comparison theory has been increasingly used in quantitative research to understand the processes through which societal messages about appearance influence…

  12. Continuum theory of gene expression waves during vertebrate segmentation.

    PubMed

    Jörg, David J; Morelli, Luis G; Soroldoni, Daniele; Oates, Andrew C; Jülicher, Frank

    2015-09-01

    The segmentation of the vertebrate body plan during embryonic development is a rhythmic and sequential process governed by genetic oscillations. These genetic oscillations give rise to traveling waves of gene expression in the segmenting tissue. Here we present a minimal continuum theory of vertebrate segmentation that captures the key principles governing the dynamic patterns of gene expression including the effects of shortening of the oscillating tissue. We show that our theory can quantitatively account for the key features of segmentation observed in zebrafish, in particular the shape of the wave patterns, the period of segmentation and the segment length as a function of time.

  13. Continuum theory of gene expression waves during vertebrate segmentation

    PubMed Central

    Jörg, David J; Morelli, Luis G; Soroldoni, Daniele; Oates, Andrew C; Jülicher, Frank

    2015-01-01

    The segmentation of the vertebrate body plan during embryonic development is a rhythmic and sequential process governed by genetic oscillations. These genetic oscillations give rise to traveling waves of gene expression in the segmenting tissue. Here we present a minimal continuum theory of vertebrate segmentation that captures the key principles governing the dynamic patterns of gene expression including the effects of shortening of the oscillating tissue. We show that our theory can quantitatively account for the key features of segmentation observed in zebrafish, in particular the shape of the wave patterns, the period of segmentation and the segment length as a function of time. PMID:28725158

  14. Closing in on chemical bonds by opening up relativity theory.

    PubMed

    Whitney, Cynthia K

    2008-03-01

    This paper develops a connection between the phenomenology of chemical bonding and the theory of relativity. Empirical correlations between electron numbers in atoms and chemical bond stabilities in molecules are first reviewed and extended. Quantitative chemical bond strengths are then related to ionization potentials in elements. Striking patterns in ionization potentials are revealed when the data are viewed in an element-independent way, where element-specific details are removed via an appropriate scaling law. The scale factor involved is not explained by quantum mechanics; it is revealed only when one goes back further, to the development of Einstein's special relativity theory.

  15. Explanation of asymmetric dynamics of human water consumption in arid regions: prospect theory versus expected utility theory

    NASA Astrophysics Data System (ADS)

    Tian, F.; Lu, Y.

    2017-12-01

    Based on socioeconomic and hydrological data from three arid inland basins, together with error analysis, the dynamics of human water consumption (HWC) are shown to be asymmetric: HWC increases rapidly in wet periods but holds steady or decreases only slightly in dry periods. Beyond the qualitative explanation, namely that abundant water availability in wet periods spurs rapid growth in HWC, while in dry periods the now-expanded economy is sustained through over-exploitation, two quantitative models are established and tested, based on expected utility theory (EUT) and prospect theory (PT), respectively. EUT states that humans make decisions based on the total expected utility, namely the sum over possible outcomes of the utility function multiplied by the probability of each outcome, while PT states that the utility function is defined over gains and losses separately, and that probability should be replaced by a probability weighting function.
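
    The PT side of such a comparison is typically built from the standard Kahneman-Tversky ingredients: an S-shaped value function over gains and losses and an inverse-S probability weighting function. The sketch below uses the commonly cited 1992 parameter estimates, which are illustrative here and not fitted to the basins studied; a single weighting function is used for both gains and losses for brevity.

        def value(x, alpha=0.88, beta=0.88, lam=2.25):
            """S-shaped value function: concave for gains, convex and
            steeper (loss aversion lam) for losses."""
            return x ** alpha if x >= 0 else -lam * (-x) ** beta

        def weight(p, gamma=0.61):
            """Inverse-S probability weighting function."""
            return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

        # PT evaluates prospects with w(p) * v(x); EUT uses p * u(x) instead.
        gain, loss, p_wet = 10.0, -10.0, 0.3
        pt = weight(p_wet) * value(gain) + weight(1 - p_wet) * value(loss)
        eut = p_wet * value(gain) + (1 - p_wet) * value(loss)
        print(f"PT evaluation: {pt:.2f}   EUT-style evaluation: {eut:.2f}")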

  16. Prediction of tautomer ratios by embedded-cluster integral equation theory

    NASA Astrophysics Data System (ADS)

    Kast, Stefan M.; Heil, Jochen; Güssregen, Stefan; Schmidt, K. Friedemann

    2010-04-01

    The "embedded cluster reference interaction site model" (EC-RISM) approach combines statistical-mechanical integral equation theory and quantum-chemical calculations for predicting thermodynamic data for chemical reactions in solution. The electronic structure of the solute is determined self-consistently with the structure of the solvent that is described by 3D RISM integral equation theory. The continuous solvent-site distribution is mapped onto a set of discrete background charges ("embedded cluster") that represent an additional contribution to the molecular Hamiltonian. The EC-RISM analysis of the SAMPL2 challenge set of tautomers proceeds in three stages. Firstly, the group of compounds for which quantitative experimental free energy data was provided was taken to determine appropriate levels of quantum-chemical theory for geometry optimization and free energy prediction. Secondly, the resulting workflow was applied to the full set, allowing for chemical interpretations of the results. Thirdly, disclosure of experimental data for parts of the compounds facilitated a detailed analysis of methodical issues and suggestions for future improvements of the model. Without specifically adjusting parameters, the EC-RISM model yields the smallest value of the root mean square error for the first set (0.6 kcal mol-1) as well as for the full set of quantitative reaction data (2.0 kcal mol-1) among the SAMPL2 participants.

  17. Effect of Zb states on ϒ(3S) → ϒ(1S)ππ decays

    NASA Astrophysics Data System (ADS)

    Chen, Yun-Hua; Daub, Johanna T.; Guo, Feng-Kun; Kubis, Bastian; Meißner, Ulf-G.; Zou, Bing-Song

    2016-02-01

    Within the framework of dispersion theory, we analyze the dipion transitions between the lightest ϒ states, ϒ(nS) → ϒ(mS)ππ with m < n. The strong pion-pion rescattering effects are taken into account in a model-independent way using dispersion theory. We confirm that matching the dispersive representation to the leading chiral amplitude alone cannot reproduce the peculiar two-peak ππ mass spectrum of the decay ϒ(3S) → ϒ(1S)ππ. The existence of the bottomoniumlike Zb states can naturally explain this anomaly. We also point out the necessity of a proper extraction of the coupling strengths of the Zb states to ϒ(nS)π, which is only possible if a Flatté-like parametrization is used in the data analysis for the Zb states.
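
    The Flatté-like parametrization advocated above generalizes a Breit-Wigner by giving the width an energy dependence through the open channels; schematically (a sketch of the generic form, not the paper's exact amplitude):

        % Schematic Flatte-like line shape: a Breit-Wigner whose width is
        % replaced by energy-dependent terms for two open channels i = 1, 2.
        \[
            f(E) \;\propto\;
            \frac{1}{E - E_{0}
                + \tfrac{i}{2}\bigl[g_{1}k_{1}(E) + g_{2}k_{2}(E)\bigr]},
            \qquad
            k_{i}(E) \;=\; \sqrt{2\mu_{i}\,\bigl(E - E_{\mathrm{thr},i}\bigr)}\, .
        \]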

  18. School Type and Academic Culture: Evidence for the Differentiation-Polarization Theory

    ERIC Educational Resources Information Center

    Van Houtte, Mieke

    2006-01-01

    Several decades ago it was shown that the differentiation of pupils into tracks and streams led to a polarization into "anti-school" and "pro-school" cultures. Support for this differentiation-polarization theory is mainly based on case studies. This paper presents findings of a quantitative study in Belgium (Flanders).…

  19. Quantitative nephelometry

    MedlinePlus

    Quantitative nephelometry is a lab test to quickly and ...

  20. Transverse charge and magnetization densities: Improved chiral predictions down to b = 1 fm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alarcon, Jose Manuel; Hiller Blin, Astrid N.; Vicente Vacas, Manuel J.

    The transverse charge and magnetization densities provide insight into the nucleon's inner structure. In the periphery, the isovector components are clearly dominant, and can be computed in a model-independent way by means of a combination of chiral effective field theory (χEFT) and dispersion analysis. With a novel N/D method, we incorporate the pion electromagnetic form factor data into the χEFT calculation, thus taking into account the pion-rescattering effects and the ρ-meson pole. As a consequence, we are able to reliably compute the densities down to distances b ≈ 1 fm, therefore achieving a dramatic improvement of the results compared to traditional χEFT calculations, while remaining predictive and having controlled uncertainties.

  1. Structure of the Roper resonance from lattice QCD constraints

    NASA Astrophysics Data System (ADS)

    Wu, Jia-jun; Leinweber, Derek B.; Liu, Zhan-wei; Thomas, Anthony W.

    2018-05-01

    Two different effective field theory descriptions of the pion-nucleon scattering data are constructed to describe the region of the Roper resonance. In one, the resonance is the result of strong rescattering between coupled meson-baryon channels, while in the other the resonance has a large bare-baryon (or quark-model-like) component. The predictions of these two scenarios are compared with the latest lattice QCD simulation results in this channel. We find that the second scenario is not consistent with lattice QCD results, whereas the first agrees with those constraints. In that preferred scenario, the mass of the quark-model-like state is approximately 2 GeV, with the infinite-volume Roper resonance best described as a resonance generated dynamically through strongly coupled meson-baryon channels.

  2. Lost in translation? The hazards of applying social constructionism to quantitative research on sexual orientation development.

    PubMed

    Robboy, Caroline Alex

    2002-01-01

    This article explores the hazards faced by social constructionists who attempt to conduct quantitative research on sexual orientation development. By critically reviewing two quantitative research studies, this article explores the ways in which the very nature of social constructionist arguments may be incongruous with the methodological requirements of quantitative studies. I suggest this conflict is a result of the differing natures of these two modes of scholarly inquiry. While research requires the acceptance of certain analytical categories, the strength of social constructionism comes from its reflexive scrutiny and problematization of those very categories. Ultimately, social constructionists who try to apply their theories/perspectives must necessarily conform to the methodological constraints of quantitative research. The intent of this article is not to suggest that it is futile or self-contradictory for social constructionists to attempt empirical research, but that these are two distinct modes of scholarly inquiry which can, and should, co-exist in a dialectical relationship to each other.

  3. A Quantitative Comparison of Leading-edge Vortices in Incompressible and Supersonic Flows

    NASA Technical Reports Server (NTRS)

    Wang, F. Y.; Milanovic, I. M.; Zaman, K. B. M. Q.

    2002-01-01

    When quantitative data on delta-wing vortices are required for design purposes, low-speed results have often been extrapolated to configurations intended for supersonic operation. This practice stems from a lack of database owing to difficulties that plague measurement techniques in high-speed flows. In the present paper an attempt is made to examine this practice by comparing quantitative data on the near-wake properties of such vortices in incompressible and supersonic flows. The incompressible flow data are obtained in experiments conducted in a low-speed wind tunnel. Detailed flow-field properties, including vorticity and turbulence characteristics, obtained by hot-wire and pressure probe surveys are documented. These data are compared, wherever possible, with available data from a past work for a Mach 2.49 flow with the same wing geometry and angles of attack. The results indicate that quantitative similarities exist in the distributions of total pressure and swirl velocity. However, the streamwise velocity of the core exhibits different trends. The axial flow characteristics of the vortices in the two regimes are examined, and a candidate theory is discussed.

  4. Ethnic Disparities in Graduate Education: A Selective Review of Quantitative Research, Social Theory, and Quality Initiatives

    ERIC Educational Resources Information Center

    Franklin, Somer L.; Slate, John R.; Joyner, Sheila A.

    2014-01-01

    In this article, we analyzed research studies in the field of graduate education. In particular, we explored the issue of inequity in graduate education through three key lenses of social science analyses. Furthermore, we analyzed selected quantitative research studies that undertook a comparative examination of aggregate trends in enrollment and…

  5. Development of a dynamic computational model of social cognitive theory.

    PubMed

    Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C

    2016-12-01

    Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
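
    The fluid-analogy idea can be conveyed with a single "reservoir" equation: a construct such as self-efficacy decays toward baseline and is replenished by intervention inputs. The sketch below is far simpler than the paper's multi-construct model; the time constant, gain, and input schedule are illustrative assumptions.

        import numpy as np

        dt, days = 1.0, 60
        tau = 10.0            # time constant: how fast the construct decays
        gain = 0.8            # how strongly an input "fills the tank"

        u = np.zeros(days)
        u[10:15] = 1.0        # a 5-day intervention pulse
        x = np.zeros(days)    # construct level (e.g., self-efficacy)
        for k in range(1, days):
            # forward-Euler step of dx/dt = -x/tau + gain * u(t)
            x[k] = x[k - 1] + dt * (-x[k - 1] / tau + gain * u[k - 1])
        print(f"peak level {x.max():.2f} on day {x.argmax()}")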

  6. Choice theories: What are they good for?☆

    PubMed Central

    Johnson, Eric J.

    2013-01-01

    Simonson et al. present an ambitious sketch of an integrative theory of context. Provoked by this thoughtful proposal, I discuss what the function of theories of choice should be in the coming decades. Traditionally, choice models and theory have attempted to predict choices as a function of the attributes of options. I argue that to be truly useful, they need to generate specific and quantitative predictions of the effect of the choice environment upon choice probability. To do this, we need to focus on rigorously modeling and measuring the underlying processes causing these effects, and I use the Simonson et al. proposal to provide some examples. I also present some examples from research in decision-making and decision neuroscience, and argue that models that fail, and fail spectacularly, are particularly useful. I close with a challenge: How would consumer researchers aid the design of real-world choice environments such as the health exchanges under the Patient Protection and Affordable Care Act? PMID:23794793

  7. Hard Photodisintegration of 3He

    NASA Astrophysics Data System (ADS)

    Granados, Carlos

    2011-02-01

    Large angle photodisintegration of two nucleons from the 3He nucleus is studied within the framework of the hard rescattering model (HRM). In the HRM the incoming photon is absorbed by one nucleon's valence quark, which then undergoes a hard rescattering reaction with a valence quark from the second nucleon, producing two nucleons emerging at large transverse momentum. Parameter-free cross sections for pp and pn breakup channels are calculated through the input of experimental cross sections on pp and pn elastic scattering. The calculated cross section for pp breakup and its predicted energy dependence are in good agreement with recent experimental data. Predictions for spectator momentum distributions and helicity transfer are also presented.

  8. A two-dimensional model of water: Theory and computer simulations

    NASA Astrophysics Data System (ADS)

    Urbič, T.; Vlachy, V.; Kalyuzhnyi, Yu. V.; Southall, N. T.; Dill, K. A.

    2000-02-01

    We develop an analytical theory for a simple model of liquid water. We apply Wertheim's thermodynamic perturbation theory (TPT) and integral equation theory (IET) for associative liquids to the MB model, which is among the simplest models of water. Water molecules are modeled as 2-dimensional Lennard-Jones disks with three hydrogen bonding arms arranged symmetrically, resembling the Mercedes-Benz (MB) logo. The MB model qualitatively predicts both the anomalous properties of pure water and the anomalous solvation thermodynamics of nonpolar molecules. IET is based on the orientationally averaged version of the Ornstein-Zernike equation. This is one of the main approximations in the present work. IET correctly predicts the pair correlation function of the model water at high temperatures. Both TPT and IET are in semi-quantitative agreement with the Monte Carlo values of the molar volume, isothermal compressibility, thermal expansion coefficient, and heat capacity. A major advantage of these theories is that they require orders of magnitude less computer time than the Monte Carlo simulations.

  9. Impact of pore space topology on permeability, cut-off frequencies and validity of wave propagation theories

    NASA Astrophysics Data System (ADS)

    Sarout, Joël.

    2012-04-01

    For the first time, a comprehensive and quantitative analysis of the domains of validity of popular wave propagation theories for porous/cracked media is provided. The case of a simple, yet versatile rock microstructure is detailed. The microstructural parameters controlling the applicability of the scattering theories, the effective medium theories, and quasi-static (Gassmann limit) and dynamic (inertial) poroelasticity are analysed in terms of pore/crack characteristic size, geometry and connectivity. To this end, a new permeability model is devised combining the hydraulic radius and percolation concepts. The predictions of this model are compared to published micromechanical models of permeability for the limiting cases of capillary tubes and penny-shaped cracks. It is also compared to published experimental data on natural rocks in these limiting cases. It explicitly accounts for pore space topology around the percolation threshold and far above it. Thanks to this permeability model, the scattering, squirt-flow and Biot cut-off frequencies are quantitatively compared. This comparison leads to an explicit mapping of the domains of validity of these wave propagation theories as a function of the rock's actual microstructure. How this mapping impacts seismic, geophysical and ultrasonic wave velocity data interpretation is discussed. The methodology demonstrated here and the outcomes of this analysis are meant to constitute a quantitative guide for the selection of the most suitable modelling strategy to be employed for prediction and/or interpretation of rocks' elastic properties in laboratory- or field-scale applications when information regarding the rock's microstructure is available.
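
    A generic way to combine the hydraulic-radius and percolation ingredients mentioned above is to let permeability scale with the square of a hydraulic radius while vanishing at a percolation threshold. The functional form and constants below are assumptions for illustration, not the paper's calibrated model.

        def permeability(phi, r_h, phi_c=0.03, t_exp=2.0, shape=8.0):
            """phi: connected porosity; r_h: hydraulic radius (m);
            phi_c: percolation threshold; t_exp: percolation exponent."""
            if phi <= phi_c:
                return 0.0          # below the percolation threshold
            return (r_h ** 2 / shape) * ((phi - phi_c) / (1.0 - phi_c)) ** t_exp

        for phi in (0.02, 0.05, 0.15, 0.30):
            print(phi, f"{permeability(phi, r_h=1e-6):.3e} m^2")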

  10. A unified theory of impact crises and mass extinctions: quantitative tests.

    PubMed

    Rampino, M R; Haggerty, B M; Pagano, T C

    1997-05-30

    Several quantitative tests of a general hypothesis linking impacts of large asteroids and comets with mass extinctions of life are possible based on astronomical data, impact dynamics, and geological information. The waiting times of large-body impacts on the Earth, derived from the flux of Earth-crossing asteroids and comets, and the estimated size of impacts capable of causing large-scale environmental disasters, predict that impacts of objects > or = 5 km in diameter (> or = 10(7) Mt TNT equivalent) could be sufficient to explain the record of approximately 25 extinction pulses in the last 540 Myr, with the 5 recorded major mass extinctions related to impacts of the largest objects of > or = 10 km in diameter (> or = 10(8) Mt events). Smaller impacts (approximately 10(6) Mt), with significant regional environmental effects, could be responsible for the lesser boundaries in the geologic record. Tests of the "kill curve" relationship for impact-induced extinctions, based on new data on extinction intensities and several well-dated large impact craters, also suggest that major mass extinctions require large impacts, and that a step may exist in the kill curve at impacts that produce craters of approximately 100 km diameter, with smaller impacts capable of only relatively weak extinction pulses. Single impact craters less than approximately 60 km in diameter should not be associated with detectable global extinction pulses (although they may explain stage and zone boundaries marked by lesser faunal turnover), but multiple impacts in that size range may produce significant stepped extinction pulses. Statistical tests of the last occurrences of species at mass-extinction boundaries are generally consistent with predictions for abrupt or stepped extinctions, and several boundaries are known to show "catastrophic" signatures of environmental disasters and biomass crash, impoverished postextinction fauna and flora dominated by stress-tolerant and opportunistic species

  11. Continuum theory of lipid bilayer electrostatics.

    PubMed

    Gerami, R; Bruinsma, R F

    2009-10-01

    In order to address the concerns about the applicability of the continuum theory of lipid bilayers, we generalize it by including a film with uniaxial dielectric properties representing the polar head groups of the lipid molecules. As a function of the in-plane dielectric constant κ|| of this film, we encounter a sequence of different phases. For low values of κ||, transmembrane pores have aqueous cores, ions are repelled by the bilayer, and the ion permeability of the bilayer is independent of the ion radius as in the existing theory. For increasing κ||, a threshold is reached, of the order of the dielectric constant of water, beyond which ions are attracted to the lipid bilayer by generic polarization attraction, transmembrane pores collapse, and the ion permeability becomes sensitively dependent on the ion radius, results that are more consistent with experimental and numerical studies of the interaction of ions with neutral lipid bilayers. At even higher values of κ||, the ion/pore complexes are predicted to condense in the form of extended arrays. The generalized continuum theory can be tested quantitatively by studies of the ion permeability as a function of salt concentration and co-surfactant concentration.

  12. Rate-distortion theory and human perception.

    PubMed

    Sims, Chris R

    2016-07-01

    The fundamental goal of perception is to aid in the achievement of behavioral objectives. This requires extracting and communicating useful information from noisy and uncertain sensory signals. At the same time, given the complexity of sensory information and the limitations of biological information processing, it is necessary that some information must be lost or discarded in the act of perception. Under these circumstances, what constitutes an 'optimal' perceptual system? This paper describes the mathematical framework of rate-distortion theory as the optimal solution to the problem of minimizing the costs of perceptual error subject to strong constraints on the ability to communicate or transmit information. Rate-distortion theory offers a general and principled theoretical framework for developing computational-level models of human perception (Marr, 1982). Models developed in this framework are capable of producing quantitatively precise explanations for human perceptual performance, while yielding new insights regarding the nature and goals of perception. This paper demonstrates the application of rate-distortion theory to two benchmark domains where capacity limits are especially salient in human perception: discrete categorization of stimuli (also known as absolute identification) and visual working memory. A software package written for the R statistical programming language is described that aids in the development of models based on rate-distortion theory. Copyright © 2016 The Author. Published by Elsevier B.V. All rights reserved.
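
    The record mentions an R software package; as a language-neutral illustration of the underlying computation, the sketch below runs the standard Blahut-Arimoto iteration for the rate-distortion trade-off of a discrete source. This is a generic textbook algorithm, not the package's API.

    ```python
    import numpy as np

    def blahut_arimoto(p_x, d, beta, n_iter=200):
        """Blahut-Arimoto iteration for the rate-distortion trade-off.
        p_x: source distribution; d[i, j]: distortion of reproducing x_i as
        x_hat_j; beta: trade-off weight (larger beta -> lower distortion)."""
        q = np.full(d.shape[1], 1.0 / d.shape[1])       # reproduction marginal
        for _ in range(n_iter):
            w = q * np.exp(-beta * d)                   # q(x_hat | x) update
            w /= w.sum(axis=1, keepdims=True)
            q = p_x @ w                                 # marginal update
        rate = np.sum(p_x[:, None] * w * np.log2(w / q))
        distortion = np.sum(p_x[:, None] * w * d)
        return rate, distortion

    p = np.array([0.5, 0.5])
    d = 1.0 - np.eye(2)                                 # Hamming distortion
    print(blahut_arimoto(p, d, beta=3.0))               # ~ (0.73 bits, 0.047)
    ```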

  13. Qualitative and Quantitative Distinctions in Personality Disorder

    PubMed Central

    Wright, Aidan G. C.

    2011-01-01

    The “categorical-dimensional debate” has catalyzed a wealth of empirical advances in the study of personality pathology. However, this debate is merely one articulation of a broader conceptual question regarding whether to define and describe psychopathology as a quantitatively extreme expression of normal functioning or as qualitatively distinct in its process. In this paper I argue that dynamic models of personality (e.g., object-relations, cognitive-affective processing system) offer the conceptual scaffolding to reconcile these seemingly incompatible approaches to characterizing the relationship between normal and pathological personality. I propose that advances in personality assessment that sample behavior and experiences intensively provide the empirical techniques, whereas interpersonal theory offers an integrative theoretical framework, for accomplishing this goal. PMID:22804676

  14. PhD Thesis: String theory in the early universe

    NASA Astrophysics Data System (ADS)

    Gwyn, Rhiannon

    2009-11-01

    The intersection of string theory with cosmology is unavoidable in the early universe, and its exploration may shed light on both fields. In this thesis, three papers at this intersection are presented and reviewed, with the aim of providing a thorough and pedagogical guide to their results. First, we address the longstanding problem of finding a string theory realisation of the axion. Using warped compactifications in heterotic string theory, we show that the axion decay constant can be lowered to acceptable values by the warp factor. Next, we move to the subject of cosmic strings, whose network evolution could have important consequences for astrophysics and cosmology. In particular, there are quantitative differences between cosmic superstring networks and GUT cosmic string networks. We investigate the properties of cosmic superstring networks in warped backgrounds, giving the tension and properties of three-string junctions in these backgrounds. Finally, we examine the possibility that cosmic strings in heterotic string theory could be responsible for generating the galactic magnetic fields that seeded those observed today.

  15. Some Recent Developments in the Endochronic Theory with Application to Cyclic Histories

    NASA Technical Reports Server (NTRS)

    Valanis, K. C.; Lee, C. F.

    1983-01-01

    Constitutive equations with only two easily determined material constants predict the stress (strain) response of normalized mild steel to a variety of general strain (stress) histories, without a need for special unloading-reloading rules. The equations are derived from the endochronic theory of plasticity of isotropic materials with an intrinsic time scale defined in the plastic strain space. Agreement between theoretical predictions and experiments is quantitatively excellent in cases of various uniaxial constant amplitude histories, variable uniaxial strain amplitude histories and cyclic relaxation. The cyclic ratcheting phenomenon is predicted by the present theory.

  16. Perceptions of Challenge: The Role of Catastrophe Theory in Piano Learning

    ERIC Educational Resources Information Center

    Bugos, Jennifer; Lee, William

    2015-01-01

    The purpose of this study is to evaluate the perceptions of private piano instructors on the role of challenge in teaching and learning the piano and to examine the potential application of catastrophe theory in understanding the role and outcomes of such challenges. A 23-item electronic questionnaire was administered to collect quantitative and…

  17. Closing in on Chemical Bonds by Opening up Relativity Theory

    PubMed Central

    Whitney, Cynthia Kolb

    2008-01-01

    This paper develops a connection between the phenomenology of chemical bonding and the theory of relativity. Empirical correlations between electron numbers in atoms and chemical bond stabilities in molecules are first reviewed and extended. Quantitative chemical bond strengths are then related to ionization potentials in elements. Striking patterns in ionization potentials are revealed when the data are viewed in an element-independent way, where element-specific details are removed via an appropriate scaling law. The scale factor involved is not explained by quantum mechanics; it is revealed only when one goes back further, to the development of Einstein’s special relativity theory. PMID:19325749

  18. The Development of Mathematical Knowledge for Teaching for Quantitative Reasoning Using Video-Based Instruction

    NASA Astrophysics Data System (ADS)

    Walters, Charles David

    Quantitative reasoning (P. W. Thompson, 1990, 1994) is a powerful mathematical tool that enables students to engage in rich problem solving across the curriculum. One way to support students' quantitative reasoning is to develop prospective secondary teachers' (PSTs) mathematical knowledge for teaching (MKT; Ball, Thames, & Phelps, 2008) related to quantitative reasoning. However, this may prove challenging, as prior to entering the classroom, PSTs often have few opportunities to develop MKT by examining and reflecting on students' thinking. Videos offer one avenue through which such opportunities are possible. In this study, I report on the design of a mini-course for PSTs that featured a series of videos created as part of a proof-of-concept NSF-funded project. These MathTalk videos highlight the ways in which the quantitative reasoning of two high school students developed over time. Using a mixed approach to grounded theory, I analyzed pre- and postinterviews using an extant coding scheme based on the Silverman and Thompson (2008) framework for the development of MKT. This analysis revealed a shift in participants' affect as well as three distinct shifts in their MKT around quantitative reasoning with distances, including shifts in: (a) quantitative reasoning; (b) point of view (decentering); and (c) orientation toward problem solving. Using the four-part focusing framework (Lobato, Hohensee, & Rhodehamel, 2013), I analyzed classroom data to account for how participants' noticing was linked with the shifts in MKT. Notably, their increased noticing of aspects of MKT around quantitative reasoning with distances, which features prominently in the MathTalk videos, seemed to contribute to the emergence of the shifts in MKT. Results from this study link elements of the learning environment to the development of specific facets of MKT around quantitative reasoning with distances. These connections suggest that vicarious experiences with two students' quantitative

  19. A Qualitative Analysis Framework Using Natural Language Processing and Graph Theory

    ERIC Educational Resources Information Center

    Tierney, Patrick J.

    2012-01-01

    This paper introduces a method of extending natural language-based processing of qualitative data analysis with the use of a very quantitative tool: graph theory. It is not an attempt to convert qualitative research to a positivist approach with a mathematical black box, nor is it a "graphical solution". Rather, it is a method to help qualitative…

  20. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    NASA Astrophysics Data System (ADS)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    The formation of cognitive schemas for plant anatomy concepts relies on processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task to analyze quantitative data produced by quantitative measurement of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with the quantitative literacy test using the rubric from the Association of American Colleges and Universities, and complex thinking in plant anatomy was tested according to Marzano. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of biology education students is better than that of biology students.

  1. Multiconfiguration Pair-Density Functional Theory.

    PubMed

    Li Manni, Giovanni; Carlson, Rebecca K; Luo, Sijie; Ma, Dongxia; Olsen, Jeppe; Truhlar, Donald G; Gagliardi, Laura

    2014-09-09

    We present a new theoretical framework, called Multiconfiguration Pair-Density Functional Theory (MC-PDFT), which combines multiconfigurational wave functions with a generalization of density functional theory (DFT). A multiconfigurational self-consistent-field (MCSCF) wave function with correct spin and space symmetry is used to compute the total electronic density, its gradient, the on-top pair density, and the kinetic and Coulomb contributions to the total electronic energy. We then use a functional of the total density, its gradient, and the on-top pair density to calculate the remaining part of the energy, which we call the on-top-density-functional energy in contrast to the exchange-correlation energy of Kohn-Sham DFT. Because the on-top pair density is an element of the two-particle density matrix, this goes beyond the Hohenberg-Kohn theorem that refers only to the one-particle density. To illustrate the theory, we obtain first approximations to the required new type of density functionals by translating conventional density functionals of the spin densities using a simple prescription, and we perform post-SCF density functional calculations using the total density, density gradient, and on-top pair density from the MCSCF calculations. Double counting of dynamic correlation or exchange does not occur because the MCSCF energy is not used. The theory is illustrated by applications to the bond energies and potential energy curves of H2, N2, F2, CaO, Cr2, and NiCl and the electronic excitation energies of Be, C, N, N+, O, O+, Sc+, Mn, Co, Mo, Ru, N2, HCHO, C4H6, c-C5H6, and pyrazine. The method presented has a computational cost and scaling similar to MCSCF, but a quantitative accuracy, even with the present first approximations to the new types of density functionals, that is comparable to much more expensive multireference perturbation theory methods.

  2. Assessment of Scientific Literacy: Development and Validation of the Quantitative Assessment of Socio-Scientific Reasoning (QuASSR)

    ERIC Educational Resources Information Center

    Romine, William L.; Sadler, Troy D.; Kinslow, Andrew T.

    2017-01-01

    We describe the development and validation of the Quantitative Assessment of Socio-scientific Reasoning (QuASSR) in a college context. The QuASSR contains 10 polytomous, two-tiered items crossed between two scenarios, and is based on theory suggesting a four-pronged structure for SSR (complexity, perspective taking, inquiry, and skepticism). In…

  3. Data-Driven Decisions: Using Equity Theory to Highlight Implications for Underserved Students

    ERIC Educational Resources Information Center

    Fowler, Denver J.; Brown, Kelly

    2018-01-01

    By using equity theory through a social justice lens, the authors intend to highlight how data are currently being used to solve the "what" and not the "why" as it relates to achievement gaps for marginalized students in urban settings. School practitioners have been utilizing quantitative data, such as district and state…

  4. Bayesian parameter estimation in spectral quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Pulkkinen, Aki; Cox, Ben T.; Arridge, Simon R.; Kaipio, Jari P.; Tarvainen, Tanja

    2016-03-01

    Photoacoustic tomography (PAT) is an imaging technique combining the strong contrast of optical imaging with the high spatial resolution of ultrasound imaging. These strengths are achieved via the photoacoustic effect, where the spatial absorption of a light pulse is converted into a measurable propagating ultrasound wave. The method is seen as a potential tool for small animal imaging, pre-clinical investigations, the study of blood vessels and vasculature, as well as for cancer imaging. The goal in PAT is to form an image of the absorbed optical energy density field via acoustic inverse problem approaches from the measured ultrasound data. Quantitative PAT (QPAT) proceeds from these images and forms quantitative estimates of the optical properties of the target. This optical inverse problem of QPAT is ill-posed. To alleviate the issue, spectral QPAT (SQPAT) utilizes PAT data formed at multiple optical wavelengths simultaneously with optical parameter models of tissue to form quantitative estimates of the parameters of interest. In this work, the inverse problem of SQPAT is investigated. Light propagation is modelled using the diffusion equation. Optical absorption is described by a chromophore-concentration-weighted sum of known chromophore absorption spectra. Scattering is described by Mie scattering theory with an exponential power law. In the inverse problem, the spatially varying unknown parameters of interest are the chromophore concentrations, the Mie scattering parameters (power law factor and exponent), and the Grüneisen parameter. The inverse problem is approached with a Bayesian method. It is numerically demonstrated that estimation of all parameters of interest is possible with this approach.
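
    A minimal sketch of the two spectral parameter models named above (concentration-weighted chromophore absorption and a Mie-type scattering power law); the chromophore spectra and all numerical values are invented placeholders, not the paper's data.

    ```python
    import numpy as np

    wavelengths = np.array([700.0, 750.0, 800.0, 850.0])    # nm
    eps_hbo2 = np.array([0.29, 0.52, 0.80, 1.06])           # assumed spectra
    eps_hb   = np.array([1.69, 1.41, 0.76, 0.69])

    def mu_a(c_hbo2, c_hb):
        """Absorption = concentration-weighted sum of chromophore spectra."""
        return c_hbo2 * eps_hbo2 + c_hb * eps_hb

    def mu_s_prime(a, b, lam0=800.0):
        """Mie-type reduced scattering described by a power law in wavelength."""
        return a * (wavelengths / lam0) ** (-b)

    print(mu_a(0.6, 0.4))
    print(mu_s_prime(a=1.2, b=1.5))
    ```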

  5. Successful Language Learning in a Corporate Setting: The Role of Attribution Theory and Its Relation to Intrinsic and Extrinsic Motivation

    ERIC Educational Resources Information Center

    Kálmán, Csaba; Eugenio, Esther Gutierrez

    2015-01-01

    Attribution theory (Weiner, 1985) and self-determination theory (Deci & Ryan, 1985) have been explored as contributors to L2 motivation (cf. Dörnyei, 2001) but have never been studied quantitatively in concert. In addition, students' attributions for success in learning a foreign language have never been measured through the use of a…

  6. From Theory to Practice: A Quantitative Content Analysis of Adult Education's Language on Meaning Making

    ERIC Educational Resources Information Center

    Roessger, Kevin M.

    2017-01-01

    Translating theory to practice has been a historical concern of adult education. It remains unclear, though, if adult education's theoretical and epistemological focus on meaning making transcends the academy. A manifest content analysis was conducted to determine if the frequency of meaning making language differed between the field's U.S.…

  7. Quantitative interpretations of Visible-NIR reflectance spectra of blood.

    PubMed

    Serebrennikova, Yulia M; Smith, Jennifer M; Huffman, Debra E; Leparc, German F; García-Rubio, Luis H

    2008-10-27

    This paper illustrates the implementation of a new theoretical model for rapid quantitative analysis of the Vis-NIR diffuse reflectance spectra of blood cultures. This new model is based on photon diffusion theory and Mie scattering theory, formulated to account for multiple scattering populations and absorptive components. This study stresses the significance of a thorough solution of the scattering and absorption problem in order to accurately resolve the optically relevant parameters of blood culture components. With the advantages of being calibration-free and computationally fast, the new model has two basic requirements. First, wavelength-dependent refractive indices of the basic chemical constituents of the blood culture components are needed. Second, multi-wavelength measurements, or at least measurements at a number of characteristic wavelengths equal to the degrees of freedom (i.e., the number of optically relevant parameters) of the blood culture system, are required. The blood culture analysis model was tested with a large number of diffuse reflectance spectra of blood culture samples characterized by an extensive range of the relevant parameters.

  8. Light-propagation management in coupled waveguide arrays: Quantitative experimental and theoretical assessment from band structures to functional patterns

    NASA Astrophysics Data System (ADS)

    Moison, Jean-Marie; Belabas, Nadia; Levenson, Juan Ariel; Minot, Christophe

    2012-09-01

    We assess the band structure of arrays of coupled optical waveguides both by ab initio calculations and by experiments, with an excellent quantitative agreement without any adjustable physical parameter. The band structures we obtain can deviate strongly from the expectations of the standard coupled mode theory approximation, but we describe them efficiently by a few parameters within an extended coupled mode theory. We also demonstrate that this description is in turn a firm and simple basis for accurate beam management in functional patterns of coupled waveguides, in full accordance with their design.
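
    As a reminder of what the standard coupled mode theory mentioned above predicts, the sketch below propagates a single-guide excitation through an array of identical, nearest-neighbour-coupled waveguides (the textbook discrete-diffraction model; N, C and z are arbitrary illustration values, not the parameters of this experiment).

    ```python
    import numpy as np
    from numpy.linalg import eigh

    # Coupled-mode model: i dA_n/dz = -C (A_{n-1} + A_{n+1}),
    # so A(z) = exp(i H z) A(0) with H the nearest-neighbour coupling matrix.
    N, C, z = 21, 1.0, 3.0
    H = C * (np.eye(N, k=1) + np.eye(N, k=-1))      # coupling matrix
    A0 = np.zeros(N, complex)
    A0[N // 2] = 1.0                                # excite the central guide

    w, V = eigh(H)                                  # spectral decomposition
    A = V @ (np.exp(1j * w * z) * (V.conj().T @ A0))
    print(np.round(np.abs(A) ** 2, 3))              # discrete-diffraction pattern
    ```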

  9. Quantitative reliability of the Migdal-Eliashberg theory for strong coupling superconductors

    NASA Astrophysics Data System (ADS)

    Bauer, Johannes; Han, Jong; Gunnarsson, Olle

    2012-02-01

    The Migdal-Eliashberg (ME) theory for strong electron-phonon coupling and retardation effects of the Morel-Anderson type form the basis for the quantitative understanding of conventional superconductors. The validity of the ME theory for values of the electron-phonon coupling strength λ>1 has been questioned by model studies. By distinguishing bare and effective parameters, and by comparing the ME theory with the dynamical mean field theory (DMFT), we clarify the range of applicability of the ME theory. Specifically, we show that ME theory is very accurate as long as the product of effective parameters, λω_ph/D, where ω_ph is an appropriate phonon scale and D an electronic scale, is small enough [1]. The effectiveness of retardation effects is usually considered based on the lowest order diagram in perturbation theory. We analyze these effects to higher order and find modifications to the usual result for the Coulomb pseudopotential μ*. Retardation effects are weakened due to a reduced effective bandwidth. Comparison with the non-perturbative DMFT corroborates our findings [2]. [1] J. Bauer, J. E. Han, and O. Gunnarsson, Phys. Rev. B 84, 184531 (2011). [2] J. Bauer, J. E. Han, and O. Gunnarsson, in preparation (2011).
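
    For orientation, the standard McMillan/Allen-Dynes expression shows how λ and μ* enter a strong-coupling estimate of Tc; this is a generic textbook formula quoted for illustration, not the calculation performed in this record.

    ```python
    import math

    def allen_dynes_tc(lam, mu_star, omega_log):
        """McMillan/Allen-Dynes estimate of Tc (K) from the electron-phonon
        coupling lam, Coulomb pseudopotential mu*, and log-averaged phonon
        frequency omega_log (in K). Standard strong-coupling formula."""
        return (omega_log / 1.2) * math.exp(
            -1.04 * (1 + lam) / (lam - mu_star * (1 + 0.62 * lam))
        )

    print(allen_dynes_tc(lam=1.0, mu_star=0.1, omega_log=300.0))  # ~21 K
    ```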

  10. Theory of reasoned action and theory of planned behavior-based dietary interventions in adolescents and young adults: a systematic review

    PubMed Central

    Hackman, Christine L; Knowlden, Adam P

    2014-01-01

    Background: Childhood obesity has reached epidemic proportions in many nations around the world. The theory of planned behavior (TPB) and the theory of reasoned action (TRA) have been used to successfully plan and evaluate numerous interventions for many different behaviors. The aim of this study was to systematically review and synthesize TPB and TRA-based dietary behavior interventions targeting adolescents and young adults. Methods: The following databases were systematically searched to find articles for this review: Academic Search Premier; Cumulative Index to Nursing and Allied Health (CINAHL); Education Resources Information Center (ERIC); Health Source: Nursing/Academic Edition; Cochrane Central Register of Controlled Trials (CENTRAL); and MEDLINE. Inclusion criteria for articles were: 1) primary or secondary interventions, 2) with any quantitative design, 3) published in the English language, 4) between January 2003 and March 2014, 5) that targeted adolescents or young adults, 6) which included dietary change behavior as the outcome, and 7) utilized TPB or TRA. Results: Of the eleven intervention studies evaluated, nine resulted in dietary behavior change that was attributed to the treatment. Additionally, all but one study found there to be a change in at least one construct of TRA or TPB, while one study did not measure constructs. All of the studies utilized some type of quantitative design, with two employing quasi-experimental, and eight employing randomized control trial design. Among the studies, four utilized technology including emails, social media posts, information on school websites, web-based activities, audio messages in classrooms, interactive DVDs, and health-related websites. Two studies incorporated goal setting and four employed persuasive communication. Conclusion: Interventions directed toward changing dietary behaviors in adolescents should aim to incorporate multi-faceted, theory-based approaches. Future studies should consider

  11. Theory of reasoned action and theory of planned behavior-based dietary interventions in adolescents and young adults: a systematic review.

    PubMed

    Hackman, Christine L; Knowlden, Adam P

    2014-01-01

    Childhood obesity has reached epidemic proportions in many nations around the world. The theory of planned behavior (TPB) and the theory of reasoned action (TRA) have been used to successfully plan and evaluate numerous interventions for many different behaviors. The aim of this study was to systematically review and synthesize TPB and TRA-based dietary behavior interventions targeting adolescents and young adults. The following databases were systematically searched to find articles for this review: Academic Search Premier; Cumulative Index to Nursing and Allied Health (CINAHL); Education Resources Information Center (ERIC); Health Source: Nursing/Academic Edition; Cochrane Central Register of Controlled Trials (CENTRAL); and MEDLINE. Inclusion criteria for articles were: 1) primary or secondary interventions, 2) with any quantitative design, 3) published in the English language, 4) between January 2003 and March 2014, 5) that targeted adolescents or young adults, 6) which included dietary change behavior as the outcome, and 7) utilized TPB or TRA. Of the eleven intervention studies evaluated, nine resulted in dietary behavior change that was attributed to the treatment. Additionally, all but one study found there to be a change in at least one construct of TRA or TPB, while one study did not measure constructs. All of the studies utilized some type of quantitative design, with two employing quasi-experimental, and eight employing randomized control trial design. Among the studies, four utilized technology including emails, social media posts, information on school websites, web-based activities, audio messages in classrooms, interactive DVDs, and health-related websites. Two studies incorporated goal setting and four employed persuasive communication. Interventions directed toward changing dietary behaviors in adolescents should aim to incorporate multi-faceted, theory-based approaches. Future studies should consider utilizing randomized control trial design and

  12. K*(892) and ϕ(1020) production and their decay into the hadronic medium at the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Shapoval, V. M.; Braun-Munzinger, P.; Sinyukov, Yu. M.

    2017-12-01

    The production of the K*(892) strange resonance in Pb+Pb collisions at √s_NN = 2.76 TeV LHC energy is analyzed within the integrated hydrokinetic model (iHKM) at different equations of state of superdense matter. A similar analysis is also carried out for the top RHIC energy √s_NN = 200 GeV for comparison purposes. The modification of the experimental K*(892) identification is studied for different centralities in view of possible re-scattering of the decay products at the afterburner stage of the fireball evolution. We see quite intensive rescattering of the decay products as well as recombination processes for K*(892). In addition, the production of the much longer-lived ϕ(1020) resonance with hidden strange quark content is investigated.

  13. Steering continuum electron dynamics by low-energy attosecond streaking

    NASA Astrophysics Data System (ADS)

    Geng, Ji-Wei; Xiong, Wei-Hao; Xiao, Xiang-Ru; Gong, Qihuang; Peng, Liang-You

    2016-08-01

    A semiclassical model is developed to understand the electronic dynamics in low-energy attosecond streaking. Under a relatively strong infrared (IR) pulse, the low-energy part of the photoelectrons initialized by a single attosecond pulse (SAP) can either rescatter with the ionic core, inducing interference structures in the momentum spectra of the ionized electrons, or be recaptured into the Rydberg states. The Coulomb potential plays an essential role in both the electron rescattering and recapturing processes. We find that by changing the time delay between the SAP and the IR pulse, the photoelectron yield or the population of the Rydberg states can be effectively controlled. The present study demonstrates a fascinating way to steer electron motion in the continuum.

  14. Statistical theory of combinatorial libraries of folding proteins: energetic discrimination of a target structure.

    PubMed

    Zou, J; Saven, J G

    2000-02-11

    A self-consistent theory is presented that can be used to estimate the number and composition of sequences satisfying a predetermined set of constraints. The theory is formulated so as to examine the features of sequences having a particular value of Δ = E_f − ⟨E_u⟩, where E_f is the energy of sequences when in a target structure and ⟨E_u⟩ is an average energy of non-target structures. The theory yields the probabilities w_i(α) that each position i in the sequence is occupied by a particular monomer type α. The theory is applied to a simple lattice model of proteins. Excellent agreement is observed between the theory and the results of exact enumerations. The theory provides a quantitative framework for the design and interpretation of combinatorial experiments involving proteins, where a library of amino acid sequences is searched for sequences that fold to a desired structure. Copyright 2000 Academic Press.
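
    A minimal sketch of Boltzmann-weighted site probabilities of the kind w_i(α) denotes is given below; the effective-energy matrix and inverse temperature are invented placeholders, not the paper's self-consistent equations.

    ```python
    import numpy as np

    def site_probabilities(energies, beta=1.0):
        """Boltzmann-weighted identity probabilities w_i(alpha) per sequence
        position. energies[i, a] is an assumed effective energy of monomer
        type a at position i; rows are normalized to sum to one."""
        w = np.exp(-beta * energies)
        return w / w.sum(axis=1, keepdims=True)

    # Toy example: 3 positions, 4 monomer types (placeholder energies)
    rng = np.random.default_rng(0)
    E = rng.normal(size=(3, 4))
    print(site_probabilities(E))
    ```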

  15. Information Theory Applied to Dolphin Whistle Vocalizations with Possible Application to SETI Signals

    NASA Astrophysics Data System (ADS)

    Doyle, Laurance R.; McCowan, Brenda; Hanser, Sean F.

    2002-01-01

    Information theory allows a quantification of the complexity of a given signaling system. We are applying information theory to dolphin whistle vocalizations, humpback whale songs, squirrel monkey chuck calls, and several other animal communication systems in order to develop a quantitative and objective way to compare the complexity of interspecies communication systems. Once signaling units have been correctly classified, the communication system must obey certain statistical distributions in order to contain complexity, whether it is human languages, dolphin whistle vocalizations, or even a system of communication signals received from an extraterrestrial source.
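
    One such quantification is the first-order Shannon entropy of the classified signal units; the sketch below computes it for a toy label sequence (the labels are invented for illustration).

    ```python
    import math
    from collections import Counter

    def shannon_entropy(units):
        """First-order Shannon entropy (bits per unit) of a unit sequence."""
        counts = Counter(units)
        n = sum(counts.values())
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    # Toy sequence of classified whistle types (labels are illustrative)
    seq = list("AABABCAABBACADAB")
    print(f"H1 = {shannon_entropy(seq):.3f} bits/unit")
    ```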

  16. Revealing Long-Range Interconnected Hubs in Human Chromatin Interaction Data Using Graph Theory

    NASA Astrophysics Data System (ADS)

    Boulos, R. E.; Arneodo, A.; Jensen, P.; Audit, B.

    2013-09-01

    We use graph theory to analyze chromatin interaction (Hi-C) data in the human genome. We show that a key functional feature of the genome—“master” replication origins—corresponds to DNA loci of maximal network centrality. These loci form a set of interconnected hubs both within chromosomes and between different chromosomes. Our results open the way to a fruitful use of graph theory concepts to decipher DNA structural organization in relation to genome functions such as replication and transcription. This quantitative information should prove useful to discriminate between possible polymer models of nuclear organization.
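
    A minimal illustration of the graph-theoretic step, assuming networkx and a toy contact list in place of real Hi-C data: build a weighted contact graph and rank loci by betweenness centrality.

    ```python
    import networkx as nx

    # Toy stand-in for Hi-C contacts: (locus, locus, contact count)
    contacts = [("locus1", "locus2", 8), ("locus1", "locus3", 5),
                ("locus2", "locus3", 2), ("locus3", "locus4", 7),
                ("locus1", "locus4", 1)]

    G = nx.Graph()
    G.add_weighted_edges_from(contacts)

    # Shortest paths weighted by inverse contact count (more contacts = closer)
    for u, v, d in G.edges(data=True):
        d["dist"] = 1.0 / d["weight"]
    centrality = nx.betweenness_centrality(G, weight="dist")
    print(max(centrality, key=centrality.get))   # most central ("hub") locus
    ```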

  17. Quantitative metal magnetic memory reliability modeling for welded joints

    NASA Astrophysics Data System (ADS)

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng

    2016-03-01

    Metal magnetic memory (MMM) testing has been widely used to detect welded joints. However, load levels, environmental magnetic fields, and measurement noise make the MMM data dispersive and bring difficulty to quantitative evaluation. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens are tested along the longitudinal and horizontal lines by a TSC-2M-8 instrument in tensile fatigue experiments. X-ray testing is carried out synchronously to verify the MMM results. It is found that MMM testing can detect hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs is investigated, which shows that K_vs obeys a Gaussian distribution. K_vs is therefore a suitable MMM parameter for establishing a reliability model of welded joints. Finally, a quantitative MMM reliability model is presented for the first time, based on improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases as the residual life ratio T decreases, and the maximal error between the predicted reliability degree R_1 and the verified reliability degree R_2 is 9.15%. The presented method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
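
    The classical stress-strength interference form with independent Gaussian variables, consistent with the Gaussian K_vs found above, reduces to a single normal CDF evaluation; the sketch below is a generic textbook version with invented numbers, not the paper's improved model.

    ```python
    from scipy.stats import norm

    def reliability(mu_strength, sd_strength, mu_stress, sd_stress):
        """Stress-strength interference reliability R = P(strength > stress)
        for independent Gaussian stress and strength."""
        z = (mu_strength - mu_stress) / (sd_strength**2 + sd_stress**2) ** 0.5
        return norm.cdf(z)

    # Placeholder values for illustration only
    print(reliability(mu_strength=500, sd_strength=40,
                      mu_stress=380, sd_stress=30))   # ~0.992
    ```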

  18. AN EDUCATIONAL THEORY MODEL--(SIGGS), AN INTEGRATION OF SET THEORY, INFORMATION THEORY, AND GRAPH THEORY WITH GENERAL SYSTEMS THEORY.

    ERIC Educational Resources Information Center

    MACCIA, ELIZABETH S.; AND OTHERS

    AN ANNOTATED BIBLIOGRAPHY OF 20 ITEMS AND A DISCUSSION OF ITS SIGNIFICANCE WAS PRESENTED TO DESCRIBE CURRENT UTILIZATION OF SUBJECT THEORIES IN THE CONSTRUCTION OF AN EDUCATIONAL THEORY. ALSO, A THEORY MODEL WAS USED TO DEMONSTRATE CONSTRUCTION OF A SCIENTIFIC EDUCATIONAL THEORY. THE THEORY MODEL INCORPORATED SET THEORY (S), INFORMATION THEORY…

  19. Quantitative research.

    PubMed

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  20. Nuclear medicine and imaging research (quantitative studies in radiopharmaceutical science)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, M.D.; Beck, R.N.

    1990-09-01

    This is a report of progress in Year Two (January 1, 1990--December 31, 1990) of Grant FG02-86ER60438, "Quantitative Studies in Radiopharmaceutical Science," awarded for the three-year period January 1, 1989--December 31, 1991 as a competitive renewal following a site visit in the fall of 1988. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 25 refs., 13 figs., 1 tab.

  1. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.
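
    A generic sketch of the information-entropy step in such an evaluation (the classical entropy-weight method applied to an index matrix); the normalization convention and the toy scores are assumptions, not the paper's case data.

    ```python
    import numpy as np

    def entropy_weights(X):
        """Entropy-weight method for an index matrix X (schemes x indexes,
        higher score = better). Indexes whose scores differ more across
        schemes receive larger weights."""
        P = X / X.sum(axis=0)                                 # column-normalize
        n = X.shape[0]
        E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(n)  # entropy per index
        d = 1.0 - E                                           # diversification
        return d / d.sum()

    # Toy matrix: 3 schemes scored on cost, progress, quality, safety
    X = np.array([[0.8, 0.6, 0.9, 0.7],
                  [0.6, 0.9, 0.7, 0.8],
                  [0.9, 0.7, 0.6, 0.9]])
    w = entropy_weights(X)
    print(w, X @ w)          # index weights and per-scheme synthesis scores
    ```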

  2. Kinetic theory for DNA melting with vibrational entropy

    NASA Astrophysics Data System (ADS)

    Sensale, Sebastian; Peng, Zhangli; Chang, Hsueh-Chia

    2017-10-01

    By treating DNA as a vibrating nonlinear lattice, an activated kinetic theory for DNA melting is developed to capture the breakage of the hydrogen bonds and subsequent softening of torsional and bending vibration modes. With a coarse-grained lattice model, we identify a key bending mode with GHz frequency that replaces the hydrogen vibration modes as the dominant out-of-phase phonon vibration at the transition state. By associating its bending modulus to a universal in-phase bending vibration modulus at equilibrium, we can hence estimate the entropic change in the out-of-phase vibration from near-equilibrium all-atom simulations. This and estimates of torsional and bending entropy changes lead to the first predictive and sequence-dependent theory with good quantitative agreement with experimental data for the activation energy of melting of short DNA molecules without intermediate hairpin structures.
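
    A generic activated-rate sketch shows how an entropic contribution of the kind estimated above enters a melting rate; the attempt frequency and activation parameters below are invented placeholders, not the paper's values.

    ```python
    import math

    # Activated rate: k = nu * exp(-(dH - T*dS) / (kB*T))
    kB = 1.380649e-23      # J/K
    nu = 1e9               # s^-1, attempt frequency (GHz bending-mode scale)
    dH = 1.2e-19           # J, activation enthalpy (assumed)
    dS = 1.5e-22           # J/K, activation entropy gain (assumed)
    for T in (300.0, 330.0, 360.0):
        k = nu * math.exp(-(dH - T * dS) / (kB * T))
        print(f"T = {T:.0f} K, k = {k:.3e} s^-1")
    ```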

  3. Using Innovation Diffusion Theory and the Technology Acceptance Model to Evaluate the Security of Wireless Mobile Devices at a Post Secondary Institution

    ERIC Educational Resources Information Center

    Feliciano-Torres, Hector L.

    2017-01-01

    The purpose of this quantitative, descriptive, nonexperimental study was to investigate the use of wireless mobile network devices at a post-secondary institution using the innovation diffusion theory (IDT) and the technology acceptance model (TAM) as background theories. The researcher intended to explore how students and personnel of the institution…

  4. Relative Proximity Theory: Measuring the Gap between Actual and Ideal Online Course Delivery

    ERIC Educational Resources Information Center

    Swart, William; MacLeod, Kenneth; Paul, Ravi; Zhang, Aixiu; Gagulic, Mario

    2014-01-01

    Based on the Theory of Transactional Distance and Needs Assessment, this article reports a procedure for quantitatively measuring how close the actual delivery of a course was to ideal, as perceived by students. It extends Zhang's instrument and prescribes the computational steps to calculate relative proximity at the element and construct…

  5. Atomic Theory and Multiple Combining Proportions: The Search for Whole Number Ratios.

    PubMed

    Usselman, Melvyn C; Brown, Todd A

    2015-04-01

    John Dalton's atomic theory, with its postulate of compound formation through atom-to-atom combination, brought a new perspective to weight relationships in chemical reactions. A presumed one-to-one combination of atoms A and B to form a simple compound AB allowed Dalton to construct his first table of relative atomic weights from literature analyses of appropriate binary compounds. For such simple binary compounds, the atomic theory had little advantage over affinity theory as an explanation of fixed proportions by weight. For ternary compounds of the form AB2, however, atomic theory made quantitative predictions that were not deducible from affinity theory. Atomic theory required that the weight of B in the compound AB2 be exactly twice that in the compound AB. Dalton, Thomas Thomson, and William Hyde Wollaston all published within a few years of each other experimental data that claimed to give the predicted results with the required accuracy. There are nonetheless several experimental barriers to obtaining the desired integral multiple proportions. In this paper I discuss replication experiments which demonstrate that only Wollaston's results are experimentally reliable. It is likely that such replicability explains why Wollaston's experiments were so influential.

  6. Sewall Wright, shifting balance theory, and the hardening of the modern synthesis.

    PubMed

    Ishida, Yoichi

    2017-02-01

    The period between the 1940s and 1960s saw the hardening of the modern synthesis in evolutionary biology. Gould and Provine argue that Wright's shifting balance theory of evolution hardened during this period. But their account does not do justice to Wright, who always regarded selection as acting together with drift. This paper presents a more adequate account of the development of Wright's shifting balance theory, paying particular attention to his application of the theory to the geographical distribution of flower color dimorphism in Linanthus parryae. The account shows that even in the heyday of the hardened synthesis, the balance or interaction of evolutionary factors, such as drift, selection, and migration, occupied pride of place in Wright's theory, and that between the 1940s and 1970s, Wright developed the theory of isolation by distance to quantitatively represent the structure of the Linanthus population, which he argued had the kind of structure posited by his shifting balance theory. In the end, Wright arrived at a sophisticated description of the structure of the Linanthus population, where the interaction between drift and selection varied spatially. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Quantitative observations of hydrogen-induced, slow crack growth in a low alloy steel

    NASA Technical Reports Server (NTRS)

    Nelson, H. G.; Williams, D. P.

    1973-01-01

    Hydrogen-induced slow crack growth, da/dt, was studied in AISI-SAE 4130 low alloy steel in gaseous hydrogen and distilled water environments as a function of applied stress intensity, K, at various temperatures, hydrogen pressures, and alloy strength levels. At low values of K, da/dt was found to exhibit a strong exponential K dependence (Stage 1 growth) in both hydrogen and water. At intermediate values of K, da/dt exhibited a small but finite K dependence (Stage 2), with the Stage 2 slope being greater in hydrogen than in water. In hydrogen, at a constant K, (da/dt) sub 2 varied inversely with alloy strength level and varied essentially in the same complex manner with temperature and hydrogen pressure as noted previously. The results of this study provide support for most of the qualitative predictions of the lattice decohesion theory as recently modified by Oriani. The lack of quantitative agreement between data and theory and the inability of theory to explain the observed pressure dependence of slow crack growth are mentioned and possible rationalizations to account for these differences are presented.

  8. Quantitative molecular orbital energies within a G0W0 approximation

    NASA Astrophysics Data System (ADS)

    Sharifzadeh, S.; Tamblyn, I.; Doak, P.; Darancet, P. T.; Neaton, J. B.

    2012-09-01

    Using many-body perturbation theory within a G0W0 approximation, with a plane wave basis set and using a starting point based on density functional theory within the generalized gradient approximation, we explore routes for computing the ionization potential (IP), electron affinity (EA), and fundamental gap of three gas-phase molecules — benzene, thiophene, and (1,4) diamino-benzene — and compare with experiments. We examine the dependence of the IP and fundamental gap on the number of unoccupied states used to represent the dielectric function and the self-energy, as well as the dielectric function plane-wave cutoff. We find that with an effective completion strategy for approximating the unoccupied subspace, and a well converged dielectric function kinetic energy cutoff, the computed IPs and EAs are in excellent quantitative agreement with available experiment (within 0.2 eV), indicating that a one-shot G0W0 approach can be very accurate for calculating addition/removal energies of small organic molecules.

  9. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  10. Rigour in quantitative research.

    PubMed

    Claydon, Leica Sarah

    2015-07-22

    This article which forms part of the research series addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  11. A Review of Trend of Nursing Theories related Caregivers in Korea

    PubMed Central

    Hae Kim, Sung; Choi, Yoona; Lee, Ji-Hye; Jang, Da-El; Kim, Sanghee

    2018-01-01

    Background: The prevalence of chronic diseases has increased rapidly due to population aging. As the duration of care needs increases, caregivers' socioeconomic burdens have also increased. Objective: This review examines the attributes of the caregiving experience and the quality of life of caregivers in Korea, with a focus on the application of nursing theory. Method: We reviewed studies on caregivers caring for adult patients published up to 2016 in 4 bio-medical research portal websites or databases. A total of 1,939 studies were identified through the keyword search. One hundred forty-five studies were selected through a screening process, of which 17 studies applied a theory. Selected studies were analyzed in accordance with a structured analysis format. Results: Quantitative studies accounted for 76.6%, while 22.1% were qualitative studies and 1.3% were triangulation studies. Caregiver-related studies increased after 2000. Most frequently, the caregivers were spouses (28.4%), and most frequently, care was provided to a recipient affected by stroke (22.5%). The 17 theory-based studies described 20 theories (70% psychology theories, 30% nursing theories). The most frequent nursing theory was the theory of stress, appraisal and coping. Conclusion: This study sought to better understand caregiving through the analysis of Korean studies on the caregiving experience and caregivers' QOL, and its findings present empirical data for nursing by identifying the nursing theories applied to the caregiving experience and caregivers' QOL. The results suggest the need for further expansion of nursing theories and their greater utilization in studies of caregiving. PMID:29515682

  12. Investigating an approach to the alliance based on interpersonal defense theory.

    PubMed

    Westerman, Michael A; Muran, J Christopher

    2017-09-01

    Notwithstanding consistent findings of significant relationships between the alliance and outcome, questions remain to be answered about the relatively small magnitude of those correlations, the mechanisms underlying the association, and how to conceptualize the alliance construct. We conducted a preliminary study of an approach to the alliance based on interpersonal defense theory, which is an interpersonal reconceptualization of defense processes, to investigate the promise of this alternative approach as a way to address the outstanding issues. We employed qualitative, theory-building case study methodology, closely examining alliance processes at four time points in the treatment of a case in terms of a case formulation based on interpersonal defense theory. The results suggested that our approach made it possible to recognize key processes in the alliance and that it helps explain how the alliance influences outcome. Our analyses also provided a rich set of concrete illustrations of the alliance phenomena identified by the theory. The findings suggest that an approach to the alliance based on interpersonal defense theory holds promise. However, although the qualitative method we employed has advantages, it also has limitations. We offer suggestions about how future qualitative and quantitative investigations could build on this study.

  13. Simple Z2 lattice gauge theories at finite fermion density

    NASA Astrophysics Data System (ADS)

    Prosko, Christian; Lee, Shu-Ping; Maciejko, Joseph

    2017-11-01

    Lattice gauge theories are a powerful language to theoretically describe a variety of strongly correlated systems, including frustrated magnets, high-Tc superconductors, and topological phases. However, in many cases gauge fields couple to gapless matter degrees of freedom, and such theories become notoriously difficult to analyze quantitatively. In this paper we study several examples of Z2 lattice gauge theories with gapless fermions at finite density, in one and two spatial dimensions, that are either exactly soluble or whose solution reduces to that of a known problem. We consider complex fermions (spinless and spinful) as well as Majorana fermions and study both theories where Gauss' law is strictly imposed and those where all background charge sectors are kept in the physical Hilbert space. We use a combination of duality mappings and the Z2 slave-spin representation to map our gauge theories to models of gauge-invariant fermions that are either free, or with on-site interactions of the Hubbard or Falicov-Kimball type that are amenable to further analysis. In 1D, the phase diagrams of these theories include free-fermion metals, insulators, and superconductors, Luttinger liquids, and correlated insulators. In 2D, we find a variety of gapped and gapless phases, the latter including uniform and spatially modulated flux phases featuring emergent Dirac fermions, some violating Luttinger's theorem.

  14. The emotional coaching model: quantitative and qualitative research into relationships, communication and decisions in physical and sports rehabilitation

    PubMed Central

    RESPIZZI, STEFANO; COVELLI, ELISABETTA

    2015-01-01

    The emotional coaching model uses quantitative and qualitative elements to demonstrate some assumptions relevant to new methods of treatment in physical rehabilitation, considering emotional, cognitive and behavioral aspects in patients, whether or not they are sportsmen. Through quantitative tools (Tampa Kinesiophobia Scale, Emotional Interview Test, Previous Re-Injury Test, and reports on test scores) and qualitative tools (training contracts and relationships of emotional alliance or “contagion”), we investigate initial assumptions regarding: the presence of a cognitive and emotional mental state of impasse in patients at the beginning of the rehabilitation pathway; the curative value of the emotional alliance or “emotional contagion” relationship between healthcare provider and patient; the link between the patient’s pathology and type of contact with his own body and emotions; analysis of the psychosocial variables for the prediction of possible cases of re-injury for patients who have undergone or are afraid to undergo reconstruction of the anterior cruciate ligament (ACL). Although this approach is still in the experimental stage, the scores of the administered tests show the possibility of integrating quantitative and qualitative tools to investigate and develop a patient’s physical, mental and emotional resources during the course of his rehabilitation. Furthermore, it seems possible to identify many elements characterizing patients likely to undergo episodes of re-injury or to withdraw totally from sporting activity. In particular, such patients are competitive athletes, who fear or have previously undergone ACL reconstruction. The theories referred to (the transactional analysis theory, self-determination theory) and the tools used demonstrate the usefulness of continuing this research in order to build a shared coaching model treatment aimed at all patients, sportspeople or otherwise, which is not only physical but also emotional, cognitive

  15. The origin of allometric scaling laws in biology from genomes to ecosystems: towards a quantitative unifying theory of biological structure and organization.

    PubMed

    West, Geoffrey B; Brown, James H

    2005-05-01

    Life is the most complex physical phenomenon in the Universe, manifesting an extraordinary diversity of form and function over an enormous scale from the largest animals and plants to the smallest microbes and subcellular units. Despite this, many of its most fundamental and complex phenomena scale with size in a surprisingly simple fashion. For example, metabolic rate scales as the 3/4-power of mass over 27 orders of magnitude, from molecular and intracellular levels up to the largest organisms. Similarly, time-scales (such as lifespans and growth rates) and sizes (such as bacterial genome lengths, tree heights and mitochondrial densities) scale with exponents that are typically simple powers of 1/4. The universality and simplicity of these relationships suggest that fundamental universal principles underlie much of the coarse-grained generic structure and organisation of living systems. We have proposed a set of principles based on the observation that almost all life is sustained by hierarchical branching networks, which we assume have invariant terminal units, are space-filling and are optimised by the process of natural selection. We show how these general constraints explain quarter-power scaling and lead to a quantitative, predictive theory that captures many of the essential features of diverse biological systems. Examples considered include animal circulatory systems, plant vascular systems, growth, mitochondrial densities, and the concept of a universal molecular clock. Temperature considerations, dimensionality and the role of invariants are discussed. Criticisms and controversies associated with this approach are also addressed.
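
    The quoted allometry can be stated in two lines; only the 3/4 exponent comes from the text, the prefactor below is an arbitrary placeholder.

    ```python
    # Quarter-power allometry: metabolic rate B = B0 * M**0.75 (M in kg)
    B0 = 3.4  # placeholder normalization (W kg^-0.75), assumed for illustration
    for mass in (0.02, 2.0, 200.0):           # mouse-ish to large-mammal-ish
        print(f"M = {mass:7.2f} kg  ->  B ~ {B0 * mass**0.75:8.2f} W")
    # A 10^4-fold increase in mass yields only a 10^3-fold increase in B.
    ```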

  16. The Grammar of the Human Life Process: John Dewey's New Theory of Language

    ERIC Educational Resources Information Center

    Harris, Fred

    2012-01-01

    Dewey proposed a new theory of language, in which the form (such as symbols) and content of language are not separated. The content of language includes the physical aspects of the world, which are purely quantitative: the life process, which involves functional responses to qualities, and the human life process, which involves the conscious…

  17. The Six Core Theories of Modern Physics

    NASA Astrophysics Data System (ADS)

    Stevens, Charles F.

    1996-09-01

    Charles Stevens, a prominent neurobiologist who originally trained as a biophysicist (with George Uhlenbeck and Mark Kac), wrote this book almost by accident. Each summer he found himself reviewing key areas of physics that he had once known and understood well, for use in his present biological research. Since there was no book, he created his own set of notes, which formed the basis for this brief, clear, and self-contained summary of the basic theoretical structures of classical mechanics, electricity and magnetism, quantum mechanics, statistical physics, special relativity, and quantum field theory. The Six Core Theories of Modern Physics can be used by advanced undergraduates or beginning graduate students as a supplement to the standard texts or for an uncluttered, succinct review of the key areas. Professionals in such quantitative sciences as chemistry, engineering, computer science, applied mathematics, and biophysics who need to brush up on the essentials of a particular area will find most of the required background material, including the mathematics.

  18. Generalized framework for testing gravity with gravitational-wave propagation. II. Constraints on Horndeski theory

    NASA Astrophysics Data System (ADS)

    Arai, Shun; Nishizawa, Atsushi

    2018-05-01

    Gravitational waves (GW) are generally affected by modification of a gravity theory during propagation at cosmological distances. We numerically perform a quantitative analysis of Horndeski theory at the cosmological scale to constrain it with GW observations in a model-independent way. We formulate a parametrization for a numerical simulation based on the Monte Carlo method and obtain a classification of the models that agree with cosmic accelerating expansion within observational errors of the Hubble parameter. As a result, we find that a large group of the models in the Horndeski theory that mimic cosmic expansion of the ΛCDM model can be excluded from the simultaneous detection of a GW and its electromagnetic transient counterpart. Based on our result and the latest detection of GW170817 and GRB170817A, we conclude that the subclass of Horndeski theory including arbitrary functions G4 and G5 can hardly explain cosmic accelerating expansion without fine-tuning.

  19. Critical Quantitative Inquiry in Context

    ERIC Educational Resources Information Center

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  20. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  1. Quantitative dispersion microscopy

    PubMed Central

    Fu, Dan; Choi, Wonshik; Sung, Yongjin; Yaqoob, Zahid; Dasari, Ramachandra R.; Feld, Michael

    2010-01-01

    Refractive index dispersion is an intrinsic optical property and a useful source of contrast in biological imaging studies. In this report, we present the first dispersion phase imaging of living eukaryotic cells. We have developed quantitative dispersion microscopy based on the principle of quantitative phase microscopy. The dual-wavelength quantitative phase microscope makes phase measurements at 310 nm and 400 nm wavelengths to quantify dispersion (refractive index increment ratio) of live cells. The measured dispersion of living HeLa cells is found to be around 1.088, which agrees well with that measured directly for protein solutions using total internal reflection. This technique, together with the dry mass and morphology measurements provided by quantitative phase microscopy, could prove to be a useful tool for distinguishing different types of biomaterials and studying spatial inhomogeneities of biological samples. PMID:21113234
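
    Because the measured phase at each wavelength scales as φ(λ) = (2π/λ)·Δn(λ)·h for a sample of thickness h, the thickness cancels in the ratio of the two phase maps, leaving the refractive-index-increment (dispersion) ratio. Below is a minimal sketch under that standard phase-microscopy relation; the phase values are toy numbers, not measured data.

    ```python
    import numpy as np

    # phi(lam) = (2*pi/lam) * dn(lam) * h  =>  dn1/dn2 = (phi1*lam1)/(phi2*lam2)
    lam1, lam2 = 310e-9, 400e-9                      # the two wavelengths, in metres
    phi1 = np.array([[2.10, 2.32], [2.25, 2.18]])    # toy phase map at lam1 (rad)
    phi2 = np.array([[1.50, 1.65], [1.60, 1.55]])    # toy phase map at lam2 (rad)

    dispersion = (phi1 * lam1) / (phi2 * lam2)       # per-pixel increment ratio
    print("mean dispersion ratio:", round(dispersion.mean(), 3))
    ```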

  2. Quantum electronic stress: density-functional-theory formulation and physical manifestation.

    PubMed

    Hu, Hao; Liu, Miao; Wang, Z F; Zhu, Junyi; Wu, Dangxin; Ding, Hepeng; Liu, Zheng; Liu, Feng

    2012-08-03

    The concept of quantum electronic stress (QES) is introduced and formulated within density functional theory to elucidate extrinsic electronic effects on the stress state of solids and thin films in the absence of lattice strain. A formal expression of QES (σ(QE)) is derived in relation to the deformation potential of electronic states (Ξ) and the variation of electron density (Δn), σ(QE) = ΞΔn, as a quantum analog of classical Hooke's law. Two distinct QES manifestations are demonstrated quantitatively by density functional theory calculations: (1) in the form of bulk stress induced by charge carriers and (2) in the form of surface stress induced by quantum confinement. Implications of QES in some physical phenomena are discussed to underline its importance.

  3. Localization and quantitative co-localization of enamelin with amelogenin.

    PubMed

    Gallon, Victoria; Chen, Lisha; Yang, Xiudong; Moradian-Oldak, Janet

    2013-08-01

    Enamelin and amelogenin are vital proteins in enamel formation. The cooperative function of these two proteins controls crystal nucleation and morphology in vitro. We quantitatively analyzed the co-localization between enamelin and amelogenin by confocal microscopy and using two antibodies, one raised against a sequence in the porcine 32 kDa enamelin region and the other raised against full-length recombinant mouse amelogenin. We further investigated the interaction of the porcine 32 kDa enamelin and recombinant amelogenin using immuno-gold labeling. This study reports the quantitative co-localization results for postnatal days 1-8 mandibular mouse molars. We show that amelogenin and enamelin are secreted into the extracellular matrix on the cuspal slopes of the molars at day 1 and that secretion continues to at least day 8. Quantitative co-localization analysis (QCA) was performed in several different configurations using large (45 μm height, 33 μm width) and small (7 μm diameter) regions of interest to elucidate any patterns. Co-localization patterns in day 8 samples revealed that enamelin and amelogenin co-localize near the secretory face of the ameloblasts and appear to be secreted approximately in a 1:1 ratio. The degree of co-localization decreases as the enamel matures, both along the secretory face of ameloblasts and throughout the entire thickness of the enamel. Immuno-reactivity against enamelin is concentrated along the secretory face of ameloblasts, supporting the theory that this protein together with amelogenin is intimately involved in mineral induction at the beginning of enamel formation. Copyright © 2013 Elsevier Inc. All rights reserved.
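
    The abstract does not specify which co-localization statistic the QCA software reports; Pearson's correlation and the Manders overlap fractions are the standard choices, sketched below on synthetic two-channel images as an illustrative assumption.

    ```python
    import numpy as np

    def pearson_coloc(ch1, ch2):
        """Pearson correlation between two fluorescence channels (a common QCA statistic)."""
        a, b = ch1.ravel() - ch1.mean(), ch2.ravel() - ch2.mean()
        return (a @ b) / np.sqrt((a @ a) * (b @ b))

    def manders(ch1, ch2, thr1=0.0, thr2=0.0):
        """Manders M1/M2: fraction of each channel's intensity overlapping the other."""
        m1 = ch1[ch2 > thr2].sum() / ch1.sum()
        m2 = ch2[ch1 > thr1].sum() / ch2.sum()
        return m1, m2

    rng = np.random.default_rng(1)
    enamelin = rng.random((64, 64))
    amelogenin = 0.7 * enamelin + 0.3 * rng.random((64, 64))   # partly co-localized
    print(pearson_coloc(enamelin, amelogenin), manders(enamelin, amelogenin, 0.5, 0.5))
    ```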

  4. Quantitative evaluation of simulated functional brain networks in graph theoretical analysis.

    PubMed

    Lee, Won Hee; Bullmore, Ed; Frangou, Sophia

    2017-02-01

    There is increasing interest in the potential of whole-brain computational models to provide mechanistic insights into resting-state brain networks. It is therefore important to determine the degree to which computational models reproduce the topological features of empirical functional brain networks. We used empirical connectivity data derived from diffusion spectrum and resting-state functional magnetic resonance imaging data from healthy individuals. Empirical and simulated functional networks, constrained by structural connectivity, were defined based on 66 brain anatomical regions (nodes). Simulated functional data were generated using the Kuramoto model in which each anatomical region acts as a phase oscillator. Network topology was studied using graph theory in the empirical and simulated data. The difference (relative error) between graph theory measures derived from empirical and simulated data was then estimated. We found that simulated data can be used with confidence to model graph measures of global network organization at different dynamic states and highlight the sensitive dependence of the solutions obtained in simulated data on the specified connection densities. This study provides a method for the quantitative evaluation and external validation of graph theory metrics derived from simulated data that can be used to inform future study designs. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
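
    A minimal sketch of the simulation pipeline described above: Kuramoto phase oscillators coupled through a structural adjacency matrix, with the simulated functional connectivity taken as the correlation matrix of the node signals. The coupling strength, natural frequencies, and random graph are placeholder assumptions, not the study's empirical connectome.

    ```python
    import numpy as np

    def kuramoto(adj, omega, k=0.5, dt=1e-3, steps=5000, seed=0):
        """Euler integration of d(theta_i)/dt = omega_i + k * sum_j A_ij sin(theta_j - theta_i)."""
        rng = np.random.default_rng(seed)
        theta = rng.uniform(0, 2 * np.pi, adj.shape[0])
        traj = np.empty((steps, adj.shape[0]))
        for t in range(steps):
            coupling = (adj * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
            theta = theta + dt * (omega + k * coupling)
            traj[t] = np.sin(theta)                  # proxy node signal
        return traj

    n = 66                                           # node count used in the study
    rng = np.random.default_rng(1)
    adj = (rng.random((n, n)) < 0.2).astype(float)   # toy structural connectivity
    np.fill_diagonal(adj, 0.0)
    sim = kuramoto(adj, omega=rng.normal(60.0, 1.0, n))
    fc_sim = np.corrcoef(sim.T)                      # simulated functional connectivity
    print(fc_sim[:3, :3].round(2))
    ```

    Graph measures would then be computed from thresholded versions of fc_sim and compared with their empirical counterparts via the relative error.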

  5. Examination of Information Technology (IT) Certification and the Human Resources (HR) Professional Perception of Job Performance: A Quantitative Study

    ERIC Educational Resources Information Center

    O'Horo, Neal O.

    2013-01-01

    The purpose of this quantitative survey study was to test the Leontief input/output theory by relating the input of IT certification to the output of English-speaking U.S. human resource professionals' perceptions of IT professional job performance. Participants (N = 104) rated their perceptions of IT-certified vs. non-IT-certified professionals' job…

  6. Multiwavelength UV/visible spectroscopy for the quantitative investigation of platelet quality

    NASA Astrophysics Data System (ADS)

    Mattley, Yvette D.; Leparc, German F.; Potter, Robert L.; Garcia-Rubio, Luis H.

    1998-04-01

    The quality of platelets transfused is vital to the effectiveness of the transfusion. Freshly prepared, discoid platelets are the most effective treatment for preventing spontaneous hemorrhage or for stopping an abnormal bleeding event. Current methodology for the routine testing of platelet quality involves random pH testing of platelet rich plasma and visual inspection of platelet rich plasma for a swirling pattern indicative of the discoid shape of the cells. The drawback to these methods is that they do not provide a quantitative and objective assay for platelet functionality that can be used on each platelet unit prior to transfusion. As part of a larger project aimed at characterizing whole blood and blood components with multiwavelength UV/vis spectroscopy, isolated platelets and platelets in platelet rich plasma have been investigated. Models based on Mie theory have been developed which allow for the extraction of quantitative information on platelet size, number and quality from multiwavelength UV/vis spectra. These models have been used to quantify changes in platelet rich plasma during storage. The overall goal of this work is to develop a simple, rapid quantitative assay for platelet quality that can be used prior to platelet transfusion to ensure the effectiveness of the treatment. As a result of this work, the optical properties for isolated platelets, platelet rich plasma and leukodepleted platelet rich plasma have been determined.
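
    The paper's models use full Mie theory; as a self-contained stand-in, the van de Hulst anomalous-diffraction approximation below captures the qualitative dependence of extinction on particle size and wavelength. The platelet radius and relative refractive index are illustrative assumptions, not fitted values from this work.

    ```python
    import numpy as np

    def q_ext_adt(m, x):
        """van de Hulst anomalous-diffraction extinction efficiency (proxy for full Mie)."""
        rho = 2.0 * x * (m - 1.0)
        return 2.0 - (4.0 / rho) * np.sin(rho) + (4.0 / rho**2) * (1.0 - np.cos(rho))

    wl = np.linspace(250e-9, 800e-9, 200)           # UV/vis wavelengths, m
    radius = 1.2e-6                                 # nominal platelet radius (assumed)
    m_rel = 1.04                                    # relative refractive index (assumed)
    x = 2 * np.pi * radius / wl                     # size parameter
    sigma_ext = q_ext_adt(m_rel, x) * np.pi * radius**2   # extinction cross-section
    print(sigma_ext[:5])
    ```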

  7. Affective cognition: Exploring lay theories of emotion.

    PubMed

    Ong, Desmond C; Zaki, Jamil; Goodman, Noah D

    2015-10-01

    Humans skillfully reason about others' emotions, a phenomenon we term affective cognition. Despite its importance, few formal, quantitative theories have described the mechanisms supporting this phenomenon. We propose that affective cognition involves applying domain-general reasoning processes to domain-specific content knowledge. Observers' knowledge about emotions is represented in rich and coherent lay theories, which comprise consistent relationships between situations, emotions, and behaviors. Observers utilize this knowledge in deciphering social agents' behavior and signals (e.g., facial expressions), in a manner similar to rational inference in other domains. We construct a computational model of a lay theory of emotion, drawing on tools from Bayesian statistics, and test this model across four experiments in which observers drew inferences about others' emotions in a simple gambling paradigm. This work makes two main contributions. First, the model accurately captures observers' flexible but consistent reasoning about the ways that events and others' emotional responses to those events relate to each other. Second, our work models the problem of emotional cue integration-reasoning about others' emotion from multiple emotional cues-as rational inference via Bayes' rule, and we show that this model tightly tracks human observers' empirical judgments. Our results reveal a deep structural relationship between affective cognition and other forms of inference, and suggest wide-ranging applications to basic psychological theory and psychiatry. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
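
    The core computational move, inferring an emotion from an outcome and a facial cue by Bayes' rule, can be sketched in a few lines. All probabilities below are invented for illustration and are not the paper's fitted parameters.

    ```python
    import numpy as np

    # Toy lay theory: a prior over emotions given the gamble's outcome, and a
    # likelihood for an observed facial cue given each emotion.
    emotions = ["happy", "disappointed"]
    prior = np.array([0.8, 0.2])        # P(emotion | won the gamble), assumed
    like_smile = np.array([0.9, 0.2])   # P(smile | emotion), assumed

    posterior = prior * like_smile      # Bayes' rule: cue integration
    posterior /= posterior.sum()
    print(dict(zip(emotions, posterior.round(3))))
    ```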

  8. Conducting meta-analyses of HIV prevention literatures from a theory-testing perspective.

    PubMed

    Marsh, K L; Johnson, B T; Carey, M P

    2001-09-01

    Using illustrations from HIV prevention research, the current article advocates approaching meta-analysis as a theory-testing scientific method rather than as merely a set of rules for quantitative analysis. Like other scientific methods, meta-analysis has central concerns with internal, external, and construct validity. The focus of a meta-analysis should only rarely be merely to describe the effects of health promotion; rather, it should be to understand and explain phenomena and the processes underlying them. The methodological decisions meta-analysts make in conducting reviews should be guided by a consideration of the underlying goals of the review (e.g., simple effect-size estimation or, preferably, theory testing). From the advocated perspective that a health behavior meta-analyst should test theory, the authors present a number of issues to be considered during the conduct of meta-analyses.
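
    For concreteness, the effect-size estimation step that the authors argue should serve theory testing (rather than be the end in itself) reduces, in the fixed-effect case, to inverse-variance weighting; the study values below are toy numbers.

    ```python
    import numpy as np

    # Fixed-effect (inverse-variance) pooling of per-study standardized effects
    d = np.array([0.30, 0.45, 0.12, 0.50])     # effect sizes (toy data)
    var = np.array([0.02, 0.05, 0.01, 0.04])   # their sampling variances

    w = 1.0 / var
    d_pooled = (w * d).sum() / w.sum()
    se = np.sqrt(1.0 / w.sum())
    print(f"pooled d = {d_pooled:.3f} +/- {1.96 * se:.3f} (95% CI half-width)")
    ```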

  9. Theory of space-charge polarization for determining ionic constants of electrolytic solutions

    NASA Astrophysics Data System (ADS)

    Sawada, Atsushi

    2007-06-01

    A theoretical expression of the complex dielectric constant attributed to space-charge polarization has been derived under an electric field calculated using Poisson's equation considering the effects of bound charges on ions. The frequency dependence of the complex dielectric constant of chlorobenzene solutions doped with tetrabutylammonium tetraphenylborate (TBATPB) has been analyzed using the theoretical expression, and the impact of the bound charges on the complex dielectric constant has been clarified quantitatively in comparison with a theory that does not consider the effect of the bound charges. The Stokes radius of TBA+ (= TPB−) determined by the present theory shows a good agreement with that determined by conductometry in the past; hence, the present theory should be applicable to the direct determination of the mobility of ion species in an electrolytic solution without the need to measure the ionic limiting equivalent conductance and transport number.

  10. Modelling home televisiting services using systems dynamic theory.

    PubMed

    Valero, M A; Arredondo, M T; del Nogal, F; Gallar, P; Insausti, J; Del Pozo, F

    2001-01-01

    A quantitative model was developed to study the provision of a home televisiting service. Systems dynamic theory was used to describe the relationships between quality of care, accessibility and cost-effectiveness. Input information was gathered from the telemedicine literature, as well as from over 75 sessions of a televisiting service provided by the Severo Ochoa Hospital to 18 housebound patients from three different medical specialties. The model allowed the Severo Ochoa Hospital to estimate the equipment needed to support increased medical contacts for intensive cardiac and other patients.

  11. Exploring the reference point in prospect theory: gambles for length of life.

    PubMed

    van Osch, Sylvie M C; van den Hout, Wilbert B; Stiggelbout, Anne M

    2006-01-01

    Attitude toward risk is an important factor determining patient preferences. Risk behavior has been shown to be strongly dependent on the perception of the outcome as either a gain or a loss. According to prospect theory, the reference point determines how an outcome is perceived. However, no theory on the location of the reference point exists, and for the health domain, there is no direct evidence for the location of the reference point. This article combines qualitative with quantitative data to provide evidence of the reference point in life-year certainty equivalent (CE) gambles and to explore the psychology behind the reference point. The authors argue that goals (aspirations) in life influence the reference point. While thinking aloud, 45 healthy respondents gave certainty equivalents for life-year CE gambles with long and short durations of survival. Contrary to suggestions from the literature, the qualitative data indicated that the offered certainty equivalent most frequently served as the reference point. Thus, respondents perceived life-year CE gambles as mixed. Framing of the question and goals set in life appeared to be important factors behind the psychology of the reference point. On the basis of their quantitative and qualitative data, the authors argue that goals alter the perception of outcomes, as described by prospect theory, by influencing the reference point. This relationship is more apparent for the near future as opposed to the remote future, as goals are mostly set for the near future.
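
    The mechanics of the argument are easy to make concrete: with a standard prospect-theory value function (parameters from Tversky and Kahneman's 1992 estimates), moving the reference point to the offered certainty equivalent turns an all-gain gamble into a mixed one and flips its valuation. The gamble and reference values below are illustrative, not the study's stimuli.

    ```python
    def pt_value(x, ref=0.0, alpha=0.88, beta=0.88, lam=2.25):
        """Prospect-theory value of an outcome x relative to a reference point."""
        dx = x - ref
        return dx**alpha if dx >= 0 else -lam * (-dx)**beta

    # 50/50 gamble over survival durations (years)
    gamble = [(0.5, 2.0), (0.5, 20.0)]
    for ref in (0.0, 11.0):   # 11 y plays the role of the offered certainty equivalent
        v = sum(p * pt_value(x, ref) for p, x in gamble)
        print(f"reference {ref:>4} y -> gamble value {v:+.2f}")
    ```

    With ref = 0 both outcomes are gains and the gamble is valued positively; with the certainty equivalent as reference, the loss branch is weighted by λ ≈ 2.25 and the same gamble is valued negatively.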

  12. Who uses nursing theory? A univariate descriptive analysis of five years' research articles.

    PubMed

    Bond, A Elaine; Eshah, Nidal Farid; Bani-Khaled, Mohammed; Hamad, Atef Omar; Habashneh, Samira; Kataua', Hussein; al-Jarrah, Imad; Abu Kamal, Andaleeb; Hamdan, Falastine Rafic; Maabreh, Roqia

    2011-06-01

    Since the early 1950s, nursing leaders have worked diligently to build the Scientific Discipline of Nursing, integrating Theory, Research and Practice. Recently, the role of theory has again come into question, with some scientists claiming nurses are not using theory to guide their research with which to improve practice. The purposes of this descriptive study were to determine: (i) Were nursing scientists' research articles in leading nursing journals based on theory? (ii) If so, were the theories nursing theories or borrowed theories? (iii) Were the theories integrated into the studies, or were they used as organizing frameworks? Research articles from seven top ISI journals were analysed, excluding regularly featured columns, meta-analyses, secondary analyses, case studies and literature reviews. The authors used King's dynamic Interacting system and Goal Attainment Theory as an organizing framework. They developed consensus on how to identify the integration of theory, searching the Title, Abstract, Aims, Methods, Discussion and Conclusion sections of each research article, whether quantitative or qualitative. Of 2857 articles published in the seven journals from 2002 through 2006, 2184 (76%) were research articles. Of the 837 (38%) authors who used theories, 460 (55%) used nursing theories and 377 (45%) used other theories. Of those who used theory, 776 (93%) integrated it into their studies, including qualitative studies, while 51 (7%) reported they used theory as an organizing framework for their studies. Closer analysis revealed theory principles were implicit even in research reports that did not explicitly report theory usage. Increasing numbers of nursing research articles (though not percentagewise) continue to be guided by theory, and not always by nursing theory. Newer nursing research methods may not explicitly state the use of nursing theory, though its use is implicit.

  13. Information Security Analysis Using Game Theory and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlicher, Bob G; Abercrombie, Robert K

    Information security analysis can be performed using game theory implemented in dynamic simulations of Agent Based Models (ABMs). Such simulations can be verified with the results from game theory analysis and further used to explore larger-scale, real-world scenarios involving multiple attackers, defenders, and information assets. Our approach addresses imperfect information and scalability, which allows us to overcome limitations of current stochastic game models. Such models consider only perfect information, assuming that the defender is always able to detect attacks, that the state transition probabilities are fixed before the game, and that the players' actions are always synchronous; moreover, most such models do not scale with the size and complexity of the systems under consideration. Our use of ABMs yields results of selected experiments that demonstrate our proposed approach and provides a quantitative measure for realistic information systems and their related security scenarios.

  14. Relative Effectiveness of Two Approaches to the Teaching of Music Theory on the Achievement and Attitudes of Undergraduate Students Training as Church Musicians

    ERIC Educational Resources Information Center

    Kinchen, John Dawson, III

    2012-01-01

    As a result of a perceived need to improve the music theory curricula for the preparation of church music leaders, this study compared two diverse approaches to the teaching of music theory for church music university students on achievement, attitudes, and self-preparedness. This current study was a quantitative, quasi-experimental research…

  15. Fluctuation-dissipation theory of input-output interindustrial relations

    NASA Astrophysics Data System (ADS)

    Iyetomi, Hiroshi; Nakayama, Yasuhiro; Aoyama, Hideaki; Fujiwara, Yoshi; Ikeda, Yuichi; Souma, Wataru

    2011-01-01

    In this study, the fluctuation-dissipation theory is invoked to shed light on input-output interindustrial relations at a macroscopic level by its application to indices of industrial production (IIP) data for Japan. Statistical noise arising from finiteness of the time series data is carefully removed by making use of the random matrix theory in an eigenvalue analysis of the correlation matrix; as a result, two dominant eigenmodes are detected. Our previous study successfully used these two modes to demonstrate the existence of intrinsic business cycles. Here a correlation matrix constructed from the two modes describes genuine interindustrial correlations in a statistically meaningful way. Furthermore, it enables us to quantitatively discuss the relationship between shipments of final demand goods and production of intermediate goods in a linear response framework. We also investigate distinctive external stimuli for the Japanese economy exerted by the current global economic crisis. These stimuli are derived from residuals of moving-average fluctuations of the IIP remaining after subtracting the long-period components arising from inherent business cycles. The observation reveals that the fluctuation-dissipation theory is applicable to an economic system that is supposed to be far from physical equilibrium.
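
    The noise-removal step rests on comparing the eigenvalues of the empirical correlation matrix with the Marchenko-Pastur bounds λ± = (1 ± √(N/T))²; only eigenvalues above λ+ are candidates for genuine collective modes. Below is a minimal sketch on pure-noise data; the matrix sizes are placeholders, not the IIP data set.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    T, N = 2500, 50                       # time points, number of indices (toy sizes)
    data = rng.standard_normal((T, N))    # pure-noise stand-in for IIP growth rates

    corr = np.corrcoef(data.T)
    eigvals = np.linalg.eigvalsh(corr)

    # Marchenko-Pastur bounds: eigenvalues inside [lam_min, lam_max] match noise
    q = N / T
    lam_max = (1 + np.sqrt(q))**2
    lam_min = (1 - np.sqrt(q))**2
    signal = eigvals[eigvals > lam_max]   # candidates for "dominant eigenmodes"
    print(lam_min, lam_max, signal)
    ```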

  16. Quantitative surface topography determination by Nomarski reflection microscopy. 2: Microscope modification, calibration, and planar sample experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartman, J.S.; Gordon, R.L.; Lessor, D.L.

    1980-09-01

    The application of reflective Nomarski differential interference contrast microscopy for the determination of quantitative sample topography data is presented. The discussion includes a review of key theoretical results presented previously plus the experimental implementation of the concepts using a commercial Nomarski microscope. The experimental work included the modification and characterization of a commercial microscope to allow its use for obtaining quantitative sample topography data. System usage for the measurement of slopes on flat planar samples is also discussed. The discussion has been designed to provide the theoretical basis, a physical insight, and a cookbook procedure for implementation to allow these results to be of value to both those interested in the microscope theory and its practical usage in the metallography laboratory.

  17. Toward Quantitative Small Animal Pinhole SPECT: Assessment of Quantitation Accuracy Prior to Image Compensations

    PubMed Central

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346
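
    The size of the reported attenuation losses can be sanity-checked with a one-line Beer-Lambert estimate, 1 − exp(−μd). The linear attenuation coefficients and source depth below are rough water-equivalent assumptions, not the paper's simulation parameters.

    ```python
    import numpy as np

    # Approximate linear attenuation coefficients in water-like tissue (1/cm)
    mu = {"Tc-99m (140 keV)": 0.155, "I-125 (~28 keV)": 0.38}   # textbook-level values
    depth = 1.3                                                 # cm, mouse-sized phantom

    for nuclide, m in mu.items():
        loss = 1.0 - np.exp(-m * depth)
        print(f"{nuclide}: ~{100 * loss:.0f}% of photons attenuated")
    ```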

  18. Making the Invisible Visible: Advancing Quantitative Methods in Higher Education Using Critical Race Theory and Intersectionality

    ERIC Educational Resources Information Center

    López, Nancy; Erwin, Christopher; Binder, Melissa; Chavez, Mario Javier

    2018-01-01

    We appeal to critical race theory and intersectionality to examine achievement gaps at a large public university in the American southwest from 2000 to 2015. Using white, high-income women as our reference group, we report linear combinations of marginal effects for six-year graduation rates and developmental course taking across 20 distinct…

  19. Quantitative Species Measurements In Microgravity Combustion Flames

    NASA Technical Reports Server (NTRS)

    Chen, Shin-Juh; Pilgrim, Jeffrey S.; Silver, Joel A.; Piltch, Nancy D.

    2003-01-01

    The capability of models and theories to accurately predict and describe the behavior of low gravity flames can only be verified by quantitative measurements. Although video imaging, simple temperature measurements, and velocimetry methods have provided useful information in many cases, there is still a need for quantitative species measurements. Over the past decade, we have been developing high sensitivity optical absorption techniques to permit in situ, non-intrusive, absolute concentration measurements for both major and minor flames species using diode lasers. This work has helped to establish wavelength modulation spectroscopy (WMS) as an important method for species detection within the restrictions of microgravity-based measurements. More recently, in collaboration with Prof. Dahm at the University of Michigan, a new methodology combining computed flame libraries with a single experimental measurement has allowed us to determine the concentration profiles for all species in a flame. This method, termed ITAC (Iterative Temperature with Assumed Chemistry) was demonstrated for a simple laminar nonpremixed methane-air flame at both 1-g and at 0-g in a vortex ring flame. In this paper, we report additional normal and microgravity experiments which further confirm the usefulness of this approach. We also present the development of a new type of laser. This is an external cavity diode laser (ECDL) which has the unique capability of high frequency modulation as well as a very wide tuning range. This will permit the detection of multiple species with one laser while using WMS detection.

  20. Spiraling between qualitative and quantitative data on women's health behaviors: a double helix model for mixed methods.

    PubMed

    Mendlinger, Sheryl; Cwikel, Julie

    2008-02-01

    A double helix spiral model is presented which demonstrates how to combine qualitative and quantitative methods of inquiry in an interactive fashion over time. Using findings on women's health behaviors (e.g., menstruation, breast-feeding, coping strategies), we show how qualitative and quantitative methods highlight the theory of knowledge acquisition in women's health decisions. A rich data set of 48 semistructured, in-depth ethnographic interviews with mother-daughter dyads from six ethnic groups (Israeli, European, North African, Former Soviet Union [FSU], American/Canadian, and Ethiopian), plus seven focus groups, provided the qualitative sources for analysis. This data set formed the basis of research questions used in a quantitative telephone survey of 302 Israeli women from the ages of 25 to 42 from four ethnic groups. We employed multiple cycles of data analysis from both data sets to produce a more detailed and multidimensional picture of women's health behavior decisions through a spiraling process.

  1. Quantitative phase imaging using four interferograms with special phase shifts by dual-wavelength in-line phase-shifting interferometry

    NASA Astrophysics Data System (ADS)

    Xu, Xiaoqing; Wang, Yawei; Ji, Ying; Xu, Yuanyuan; Xie, Ming; Han, Hao

    2018-05-01

    A new approach to quantitative phase imaging using four interferograms with special phase shifts in dual-wavelength in-line phase-shifting interferometry is presented. In this method, positive and negative 2π phase shifts are employed to easily separate the incoherent addition of two single-wavelength interferograms by combining the phase-shifting technique with a subtraction procedure; the quantitative phase at one of the two wavelengths can then be achieved from two intensities without the corresponding dc terms by using the properties of the trigonometric functions. The quantitative phase at the other wavelength can be retrieved from two dc-term-suppressed intensities obtained by employing the two-step phase-shifting technique or the filtering technique in the frequency domain. The proposed method is illustrated with theory, and its effectiveness is demonstrated by simulation experiments on a spherical cap and a HeLa cell, respectively.
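
    For orientation, the classical four-step algorithm below shows how four interferograms with π/2 shifts determine the wrapped phase; the paper's scheme differs in using special positive and negative 2π shifts at two wavelengths to separate the incoherent sum, which is not reproduced here.

    ```python
    import numpy as np

    def four_step_phase(i1, i2, i3, i4):
        """Standard four-step algorithm (0, pi/2, pi, 3pi/2 shifts): a baseline
        for the special-shift dual-wavelength scheme described in the abstract."""
        return np.arctan2(i4 - i2, i1 - i3)

    # Synthetic single-wavelength test
    x = np.linspace(0, 1, 256)
    phi_true = 4 * np.pi * x * (1 - x)       # a smooth "cell-like" phase bump
    frames = [1 + 0.8 * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
    phi = np.unwrap(four_step_phase(*frames))
    print(np.allclose(phi, phi_true, atol=1e-6))
    ```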

  2. Early reactions to Harvey's circulation theory: the impact on medicine.

    PubMed

    Lubitz, Steven A

    2004-09-01

    In early 17th century Europe, scientific concepts were still based largely on ancient philosophical and theological explanations. During this same era, however, experimentation began to take hold as a legitimate component of scientific investigation. In 1628, the English physician William Harvey announced a revolutionary theory stating that blood circulates repeatedly throughout the body. He relied on experimentation, comparative anatomy and calculation to arrive at his conclusions. His theory contrasted sharply with the accepted beliefs of the time, which were based on the 1400-year-old teachings of Galen and denied the presence of circulation. As with many new ideas, Harvey's circulation theory was received with a great deal of controversy among his colleagues. An examination of their motives reveals that many proponents agreed with his theory largely because of the logic of his argument and his use of experimentation and quantitative methods. However, some proponents agreed for religious, mystical and philosophical reasons, while some were convinced only because of the change in public opinion with time. Many opposed the circulation theory because of their rigid commitment to ancient doctrines, the questionable utility of experimentation, the lack of proof that capillaries exist, and a failure to recognize the clinical applications of his theory. Other opponents were motivated by personal resentments and professional "territorialism." Beyond the immediate issues and arguments, however, the controversy is important because it helped establish use of the scientific method.

  3. Using classical test theory, item response theory, and Rasch measurement theory to evaluate patient-reported outcome measures: a comparison of worked examples.

    PubMed

    Petrillo, Jennifer; Cano, Stefan J; McLeod, Lori D; Coon, Cheryl D

    2015-01-01

    To provide comparisons and a worked example of item- and scale-level evaluations based on three psychometric methods used in patient-reported outcome development (classical test theory (CTT), item response theory (IRT), and Rasch measurement theory (RMT)) in an analysis of the National Eye Institute Visual Functioning Questionnaire (VFQ-25). Baseline VFQ-25 data from 240 participants with diabetic macular edema from a randomized, double-masked, multicenter clinical trial were used to evaluate the VFQ at the total score level. CTT, RMT, and IRT evaluations were conducted, and results were assessed in a head-to-head comparison. Results were similar across the three methods, with IRT and RMT providing more detailed diagnostic information on how to improve the scale. CTT led to the identification of two problematic items that threaten the validity of the overall scale score, sets of redundant items, and skewed response categories. IRT and RMT additionally identified poor fit for one item, many locally dependent items, poor targeting, and disordering of over half the response categories. Selection of a psychometric approach depends on many factors. Researchers should justify their evaluation method and consider the intended audience. If the instrument is being developed for descriptive purposes and on a restricted budget, a cursory examination of the CTT-based psychometric properties may be all that is possible. In a high-stakes situation, such as the development of a patient-reported outcome instrument for consideration in pharmaceutical labeling, however, a thorough psychometric evaluation including IRT or RMT should be considered, with final item-level decisions made on the basis of both quantitative and qualitative results. Copyright © 2015. Published by Elsevier Inc.
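
    As a flavor of the CTT side of the comparison, item-level evaluation typically starts from an internal-consistency statistic such as Cronbach's alpha. Below is a minimal sketch on synthetic responses; only the sample size echoes the study (N = 240), everything else is invented.

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Classical-test-theory internal consistency; items is (respondents x items)."""
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    rng = np.random.default_rng(4)
    trait = rng.normal(size=240)                             # latent trait per respondent
    items = trait[:, None] + rng.normal(0, 1.0, (240, 25))   # 25 noisy indicators
    print(f"alpha = {cronbach_alpha(items):.2f}")
    ```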

  4. On flares, substorms, and the theory of impulsive flux transfer events

    NASA Technical Reports Server (NTRS)

    Bratenahl, A.; Baum, P. J.

    1976-01-01

    Solar flares and magnetospheric substorms are discussed in the context of a general theory of impulsive flux transfer events (IFTE). IFTE theory, derived from laboratory observations in the Double Inverse Pinch Device (DIPD), provides a quantitative extension of 'neutral sheet' theories to include nonsteady field line reconnection. Current flow along the reconnection line increases with magnetic flux storage. When flux build-up exceeds the level corresponding to a critical limit on the current, instabilities induce a sudden transition in the mode of conduction. The resulting IFTE, indifferent to the specific modes and instabilities involved, is the more energetic, the lower the initial resistivity. It is the more violent, the greater the resulting resistivity increase and the faster its growth. Violent events can develop very large voltage transients along the reconnection line. Persistent build-up promoting conditions produce relaxation oscillations in the quantity of flux and energy stored (build-up-IFTE cycles). It is difficult to avoid the conclusion: flares and substorms are examples of IFTE.

  5. Electron-density descriptors as predictors in quantitative structure-activity/property relationships and drug design.

    PubMed

    Matta, Chérif F; Arabi, Alya A

    2011-06-01

    The use of electron density-based molecular descriptors in drug research, particularly in quantitative structure-activity relationship/quantitative structure-property relationship studies, is reviewed. The exposition starts with a discussion of molecular similarity and transferability in terms of the underlying electron density, which leads to a qualitative introduction to the quantum theory of atoms in molecules (QTAIM). The starting point of QTAIM is the topological analysis of the molecular electron-density distributions to extract atomic and bond properties that characterize every atom and bond in the molecule. These atomic and bond properties have considerable potential as bases for the construction of robust quantitative structure-activity/property relationship models, as shown by selected examples in this review. QTAIM is applicable to the electron density calculated from quantum-chemical calculations and/or that obtained from ultra-high resolution x-ray diffraction experiments followed by nonspherical refinement. Atomic and bond properties are introduced, followed by examples of application of each of these two families of descriptors. The review ends with a study whereby the molecular electrostatic potential, uniquely determined by the density, is used in conjunction with atomic properties to elucidate the reasons for the biological similarity of bioisosteres.

  6. Studying the varied shapes of gold clusters by an elegant optimization algorithm that hybridizes the density functional tight-binding theory and the density functional theory

    NASA Astrophysics Data System (ADS)

    Yen, Tsung-Wen; Lim, Thong-Leng; Yoon, Tiem-Leong; Lai, S. K.

    2017-11-01

    We combined a new parametrized density functional tight-binding (DFTB) theory (Fihey et al. 2015) with an unbiased modified basin hopping (MBH) optimization algorithm (Yen and Lai 2015) and applied it to calculate the lowest-energy structures of Au clusters. From the calculated topologies and their conformational changes, we find that this DFTB/MBH method is a necessary procedure for a systematic study of the structural development of Au clusters but is somewhat insufficient for a quantitative study. As a result, we propose an extended hybridized algorithm. This improved algorithm proceeds in two steps. In the first step, the DFTB theory is employed to calculate the total energy of the cluster; this step (running DFTB/MBH optimization for a given number of Monte Carlo steps) is meant to efficiently bring the Au cluster near the region of the lowest energy minimum, since the calculation explicitly, albeit semi-quantitatively, accounts for the interactions of valence electrons with ions. Then, in the second step, the energy-minimum search continues with the energy function calculated by the DFTB theory replaced by one calculated in the full density functional theory (DFT). In these subsequent calculations, we couple the DFT energy with the MBH strategy and proceed with the DFT/MBH optimization until the lowest energy value is found. We checked that this extended hybridized algorithm successfully predicts the twisted pyramidal structure for the Au40 cluster and also correctly confirms the linear shape of C8, which our previous DFTB/MBH method failed to do. Perhaps more remarkable is the topological growth of Au_n: it changes from a planar structure (n = 3-11) → an oblate-like cage (n = 12-15) → a hollow-shape cage (n = 16-18) and finally a pyramidal-like cage (n = 19, 20). These varied cluster shapes are consistent with those reported in the literature.
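
    The two-step strategy, a cheap energy surface to reach the low-energy region followed by refinement on an expensive one, can be sketched with SciPy's basin hopping; the two toy energy functions below merely stand in for DFTB and full DFT.

    ```python
    import numpy as np
    from scipy.optimize import basinhopping, minimize

    def cheap_energy(x):      # stand-in for the DFTB total energy
        return np.sum((x**2 - 1.0)**2 + 0.3 * np.sin(5 * x))

    def accurate_energy(x):   # stand-in for the full-DFT total energy
        return cheap_energy(x) + 0.05 * np.sum(np.cos(3 * x))

    # Stage 1: basin hopping on the cheap surface to find the low-energy region
    stage1 = basinhopping(cheap_energy, x0=np.zeros(6), niter=200, seed=5)

    # Stage 2: continue the minimum search with the expensive energy function
    stage2 = minimize(accurate_energy, stage1.x)
    print(stage1.fun, stage2.fun)
    ```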

  7. Unified Theory for Decoding the Signals from X-Ray Fluorescence and X-Ray Diffraction of Mixtures.

    PubMed

    Chung, Frank H

    2017-05-01

    For research and development or for solving technical problems, we often need to know the chemical composition of an unknown mixture, which is coded and stored in the signals of its X-ray fluorescence (XRF) and X-ray diffraction (XRD). X-ray fluorescence gives chemical elements, whereas XRD gives chemical compounds. The major problem in XRF and XRD analyses is the complex matrix effect. The conventional technique to deal with the matrix effect is to construct empirical calibration lines with standards for each element or compound sought, which is tedious and time-consuming. A unified theory of quantitative XRF analysis is presented here. The idea is to cancel the matrix effect mathematically. It turns out that the decoding equation for quantitative XRF analysis is identical to that for quantitative XRD analysis although the physics of XRD and XRF are fundamentally different. The XRD work has been published and practiced worldwide. The unified theory derives a new intensity-concentration equation of XRF, which is free from the matrix effect and valid for a wide range of concentrations. The linear decoding equation establishes a constant slope for each element sought, hence eliminating the work on calibration lines. The simple linear decoding equation has been verified by 18 experiments.
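
    In the XRD version of the theory (Chung's matrix-flushing equation), the matrix effect cancels upon normalization: each weight fraction follows from the measured intensity divided by a fixed per-phase constant, renormalized to sum to one. The sketch below uses invented intensities and constants; the new XRF equation shares this linear, calibration-line-free form but differs in its physical constants.

    ```python
    import numpy as np

    # x_i = (I_i / k_i) / sum_j (I_j / k_j): matrix-effect-free decoding
    I = np.array([1250.0, 640.0, 310.0])   # measured peak intensities (toy)
    k = np.array([3.2, 1.1, 0.9])          # per-phase constant slopes (assumed)

    x = (I / k) / np.sum(I / k)
    print(x, x.sum())                      # weight fractions, summing to 1
    ```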

  8. An integrated theory of attention and decision making in visual signal detection.

    PubMed

    Smith, Philip L; Ratcliff, Roger

    2009-04-01

    The simplest attentional task, detecting a cued stimulus in an otherwise empty visual field, produces complex patterns of performance. Attentional cues interact with backward masks and with spatial uncertainty, and there is a dissociation in the effects of these variables on accuracy and on response time. A computational theory of performance in this task is described. The theory links visual encoding, masking, spatial attention, visual short-term memory (VSTM), and perceptual decision making in an integrated dynamic framework. The theory assumes that decisions are made by a diffusion process driven by a neurally plausible, shunting VSTM. The VSTM trace encodes the transient outputs of early visual filters in a durable form that is preserved for the time needed to make a decision. Attention increases the efficiency of VSTM encoding, either by increasing the rate of trace formation or by reducing the delay before trace formation begins. The theory provides a detailed, quantitative account of attentional effects in spatial cuing tasks at the level of response accuracy and the response time distributions. (c) 2009 APA, all rights reserved
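
    The decision stage can be sketched as a first-passage simulation of a diffusion process. In the full theory the drift is the time-varying shunting VSTM trace and attention acts by speeding or advancing trace formation; the constant drift below is a simplifying assumption.

    ```python
    import numpy as np

    def diffusion_trial(drift, bound=1.0, dt=1e-3, sigma=1.0, rng=None):
        """One diffusion-decision trial; returns (choice, response time)."""
        rng = rng or np.random.default_rng()
        x, t = 0.0, 0.0
        while abs(x) < bound:
            x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            t += dt
        return (x > 0), t

    rng = np.random.default_rng(6)
    trials = [diffusion_trial(0.8, rng=rng) for _ in range(500)]
    acc = np.mean([c for c, _ in trials])
    rt = np.mean([t for _, t in trials])
    print(f"accuracy {acc:.2f}, mean RT {rt:.2f} s")
    ```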

  9. Keldysh meets Lindblad: Correlated Gain and Loss in Higher Order Perturbation Theory

    NASA Astrophysics Data System (ADS)

    Stace, Tom; Mueller, Clemens

    Motivated by correlated decay processes driving gain, loss and lasing in driven artificial quantum systems, we develop a theoretical technique using Keldysh diagrammatic perturbation theory to derive a Lindblad master equation that goes beyond the usual second order perturbation theory. We demonstrate the method on the driven dissipative Rabi model, including terms up to fourth order in the interaction between the qubit and both the resonator and environment. This results in a large class of Lindblad dissipators and associated rates which go beyond the terms that have previously been proposed to describe similar systems. All of the additional terms contribute to the system behaviour at the same order of perturbation theory. We then apply these results to analyse the phonon-assisted steady-state gain of a microwave field driving a double quantum-dot in a resonator. We show that resonator gain and loss are substantially affected by dephasing-assisted dissipative processes in the quantum-dot system. These additional processes, which go beyond recently proposed polaronic theories, are in good quantitative agreement with experimental observations.

  10. B_l4 decays and the extraction of |Vub|

    NASA Astrophysics Data System (ADS)

    Kang, Xian-Wei; Kubis, Bastian; Hanhart, Christoph; Meißner, Ulf-G.

    2014-03-01

    The Cabibbo-Kobayashi-Maskawa matrix element |Vub| is not well determined yet. It can be extracted from both inclusive and exclusive decays, like B → π(ρ)ℓν̄. However, the exclusive determination from B → ρℓν̄, in particular, suffers from a large model dependence. In this paper, we propose to extract |Vub| from the four-body semileptonic decay B → ππℓν̄, where the form factors for the pion-pion system are treated in dispersion theory. This is a model-independent approach that takes into account the ππ rescattering effects, as well as the effect of the ρ meson. We demonstrate that both finite-width effects of the ρ meson and scalar ππ contributions can be considered completely in this way.

  11. Subcycle dynamics of Coulomb asymmetry in strong elliptical laser fields.

    PubMed

    Li, Min; Liu, Yunquan; Liu, Hong; Ning, Qicheng; Fu, Libin; Liu, Jie; Deng, Yongkai; Wu, Chengyin; Peng, Liang-You; Gong, Qihuang

    2013-07-12

    We measure photoelectron angular distributions of noble gases in intense elliptically polarized laser fields, which indicate strong structure-dependent Coulomb asymmetry. Using a dedicated semiclassical model, we have disentangled the contributions of direct ionization and multiple forward scattering to Coulomb asymmetry in elliptical laser fields. Our theory quantifies the roles of the ionic potential and the initial transverse momentum in Coulomb asymmetry, proving that the small lobes of asymmetry are induced by direct ionization while the strong asymmetry is induced by multiple forward scattering in the ionic potential. Both processes are distorted by the Coulomb force acting on the electrons after tunneling. As the ionization potential is lowered, the relative contribution of direct ionization to Coulomb asymmetry decreases substantially and Coulomb focusing on multiple rescattering becomes more important. According to our simulation, there is no evident initial longitudinal momentum spread at the tunnel exit.

  12. Intermodulation in nonlinear SQUID metamaterials: Experiment and theory

    NASA Astrophysics Data System (ADS)

    Zhang, Daimeng; Trepanier, Melissa; Antonsen, Thomas; Ott, Edward; Anlage, Steven M.

    2016-11-01

    The response of nonlinear metamaterials and superconducting electronics to two-tone excitation is critical for understanding their use as low-noise amplifiers and tunable filters. A new setting for such studies is that of metamaterials made of radio frequency superconducting quantum interference devices (rf-SQUIDs). The two-tone response of self-resonant rf-SQUID meta-atoms and metamaterials is studied here via intermodulation (IM) measurement over a broad range of tone frequencies and tone powers. A sharp onset followed by a surprising, strongly suppressed IM region near the resonance is observed. Using a two-time-scale analysis technique, we present an analytical theory that successfully explains our experimental observations. The theory predicts that the IM can be manipulated with tone power, center frequency, frequency difference between the two tones, and temperature. This quantitative understanding potentially allows for the design of rf-SQUID metamaterials with either very low or very high IM response.

  13. Quantitative approaches to information recovery from black holes

    NASA Astrophysics Data System (ADS)

    Balasubramanian, Vijay; Czech, Bartłomiej

    2011-08-01

    The evaporation of black holes into apparently thermal radiation poses a serious conundrum for theoretical physics: at face value, it appears that in the presence of a black hole, quantum evolution is non-unitary and destroys information. This information loss paradox has its seed in the presence of a horizon causally separating the interior and asymptotic regions in a black hole spacetime. A quantitative resolution of the paradox could take several forms: (a) a precise argument that the underlying quantum theory is unitary, and that information loss must be an artifact of approximations in the derivation of black hole evaporation, (b) an explicit construction showing how information can be recovered by the asymptotic observer, (c) a demonstration that the causal disconnection of the black hole interior from infinity is an artifact of the semiclassical approximation. This review summarizes progress on all these fronts.

  14. Quantitative Glycomics Strategies*

    PubMed Central

    Mechref, Yehia; Hu, Yunli; Desantos-Garcia, Janie L.; Hussein, Ahmed; Tang, Haixu

    2013-01-01

    The correlations between protein glycosylation and many biological processes and diseases are increasing the demand for quantitative glycomics strategies enabling sensitive monitoring of changes in the abundance and structure of glycans. This is currently attained through multiple strategies employing several analytical techniques such as capillary electrophoresis, liquid chromatography, and mass spectrometry. The detection and quantification of glycans often involve labeling with ionic and/or hydrophobic reagents. This step is needed in order to enhance detection in spectroscopic and mass spectrometric measurements. Recently, labeling with stable isotopic reagents has also been presented as a very viable strategy enabling relative quantitation. The different strategies available for reliable and sensitive quantitative glycomics are herein described and discussed. PMID:23325767

  15. Researching the Impact of Teacher Professional Development Programmes Based on Action Research, Constructivism, and Systems Theory

    ERIC Educational Resources Information Center

    Zehetmeier, Stefan; Andreitz, Irina; Erlacher, Willibald; Rauch, Franz

    2015-01-01

    This paper deals with the topic of professional development programmes' impact. Concepts and ideas of action research, constructivism, and systems theory are used as a theoretical framework and are combined to describe and analyse an exemplary professional development programme in Austria. Empirical findings from both quantitative and qualitative…

  16. TOPICAL PROBLEMS: The phenomenological theory of world population growth

    NASA Astrophysics Data System (ADS)

    Kapitza, Sergei P.

    1996-01-01

    Of all global problems, world population growth is the most significant. Demographic data describe this process in a concise and quantitative way in its past and present. Analysing this development, it is possible, by applying the concepts of systems analysis and synergetics, to work out a mathematical model for a phenomenological description of the global demographic process and to project its trends into the future. Assuming self-similarity as the dynamic principle of development, growth can be described practically over the whole of human history, assuming the growth rate to be proportional to the square of the number of people. The large parameter of the theory and the effective size of a coherent population group is of the order of 10^5, and the microscopic parameter of the phenomenology is the human lifespan. The demographic transition — a transition to a stabilised world population of some 14 billion in a foreseeable future — is a systemic singularity and is determined by the inherent pattern of growth of an open system, rather than by the lack of resources. The development of a quantitative nonlinear theory of the world population is of interest for interdisciplinary research in anthropology and demography, history and sociology, for population genetics and epidemiology, for studies in evolution of humankind and the origin of man. The model also provides insight into the stability of growth and the present predicament of humankind, and provides a setting for discussing the main global problems.
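
    The growth law at the heart of the model, dN/dt = N²/C, integrates to the hyperbola N(t) = C/(T1 − t), which diverges at the singularity year T1 and must therefore be cut off by the demographic transition. The constants below are close to von Foerster's classic fit and are quoted as illustrative assumptions:

    ```python
    # Hyperbolic growth N(t) = C / (T1 - t), the solution of dN/dt = N**2 / C
    C = 1.79e11      # persons * years, close to von Foerster's fit (assumption)
    T1 = 2026.9      # singularity year from the same fit (assumption)

    for year in (1000, 1800, 1900, 1960, 2000):
        print(year, f"{C / (T1 - year) / 1e9:.2f} billion")
    ```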

  17. The Physics of Earthquakes: In the Quest for a Unified Theory (or Model) That Quantitatively Describes the Entire Process of an Earthquake Rupture, From its Nucleation to the Dynamic Regime and to its Arrest

    NASA Astrophysics Data System (ADS)

    Ohnaka, M.

    2004-12-01

    For the past four decades, great progress has been made in understanding earthquake source processes. In particular, recent progress in the field of the physics of earthquakes has contributed substantially to unraveling the earthquake generation process in quantitative terms. Yet, a fundamental problem remains unresolved in this field. The constitutive law that governs the behavior of earthquake ruptures is the basis of earthquake physics, and the governing law plays a fundamental role in accounting for the entire process of an earthquake rupture, from its nucleation to the dynamic propagation to its arrest, quantitatively in a unified and consistent manner. Therefore, without establishing the rational constitutive law, the physics of earthquakes cannot be a quantitative science in a true sense, and hence it is urgent to establish the rational constitutive law. However, it has been controversial over the past two decades, and it is still controversial, what the constitutive law for earthquake ruptures ought to be, and how it should be formulated. To resolve the controversy is a necessary step towards a more complete, unified theory of earthquake physics, and now the time is ripe to do so. Because of its fundamental importance, we have to discuss thoroughly and rigorously what the constitutive law ought to be from the standpoint of the physics of rock friction and fracture on the basis of solid evidence. There are prerequisites for the constitutive formulation. The brittle, seismogenic layer and individual faults therein are characterized by inhomogeneity, and fault inhomogeneity has profound implications for earthquake ruptures. In addition, rupture phenomena including earthquakes are inherently scale dependent; indeed, some of the physical quantities inherent in rupture exhibit scale dependence. To treat scale-dependent physical quantities inherent in the rupture over a broad scale range quantitatively in a unified and consistent manner, it is critical to

  18. A test for selection employing quantitative trait locus and mutation accumulation data.

    PubMed

    Rice, Daniel P; Townsend, Jeffrey P

    2012-04-01

    Evolutionary biologists attribute much of the phenotypic diversity observed in nature to the action of natural selection. However, for many phenotypic traits, especially quantitative phenotypic traits, it has been challenging to test for the historical action of selection. An important challenge for biologists studying quantitative traits, therefore, is to distinguish between traits that have evolved under the influence of strong selection and those that have evolved neutrally. Most existing tests for selection employ molecular data, but selection also leaves a mark on the genetic architecture underlying a trait. In particular, the distribution of quantitative trait locus (QTL) effect sizes and the distribution of mutational effects together provide information regarding the history of selection. Despite the increasing availability of QTL and mutation accumulation data, such data have not yet been effectively exploited for this purpose. We present a model of the evolution of QTL and employ it to formulate a test for historical selection. To provide a baseline for neutral evolution of the trait, we estimate the distribution of mutational effects from mutation accumulation experiments. We then apply a maximum-likelihood-based method of inference to estimate the range of selection strengths under which such a distribution of mutations could generate the observed QTL. Our test thus represents the first integration of population genetic theory and QTL data to measure the historical influence of selection.

  19. Towards a quantitative description of tunneling conductance of superconductors: Application to LiFeAs

    DOE PAGES

    Kreisel, A.; Nelson, R.; Berlijn, T.; ...

    2016-12-27

    Since the discovery of iron-based superconductors, a number of theories have been put forward to explain the qualitative origin of pairing, but there have been few attempts to make quantitative, material-specific comparisons to experimental results. The spin-fluctuation theory of electronic pairing, based on first-principles electronic structure calculations, makes predictions for the superconducting gap. Within the same framework, the surface wave functions may also be calculated, allowing, e.g., for detailed comparisons between theoretical results and measured scanning tunneling topographs and spectra. We present such a comparison between theory and experiment on the Fe-based superconductor LiFeAs. Our results for the homogeneous surface as well as impurity states are presented as a benchmark test of the theory. For the homogeneous system, we argue that the maxima of topographic image intensity may be located at positions above either the As or Li atoms, depending on tip height and the setpoint current of the measurement. We further report the experimental observation of transitions between As- and Li-registered lattices as functions of both tip height and setpoint bias, in agreement with this prediction. Next, we give a detailed comparison of the simulated scanning tunneling microscopy images of transition-metal defects with experiment. Finally, we discuss possible extensions of the current framework to obtain a theory with true predictive power for scanning tunneling microscopy in Fe-based systems.

  1. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    ERIC Educational Resources Information Center

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known to be less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  2. Using category theory to assess the relationship between consciousness and integrated information theory.

    PubMed

    Tsuchiya, Naotsugu; Taguchi, Shigeru; Saigo, Hayato

    2016-06-01

    One of the most mysterious phenomena in science is the nature of conscious experience. Due to its subjective nature, a reductionist approach has had a hard time addressing some fundamental questions about consciousness. These questions are squarely and quantitatively tackled by a recently developed theoretical framework, called integrated information theory (IIT) of consciousness. In particular, IIT proposes that a maximally irreducible conceptual structure (MICS) is identical to conscious experience. However, there has been no principled way to assess the claimed identity. Here, we propose to apply a mathematical formalism, category theory, to assess the proposed identity and suggest that it is important to consider whether there exists a proper translation between the domain of conscious experience and that of the MICS. If such a translation exists, we postulate that questions in one domain can be answered in the other; very difficult questions in the domain of consciousness may be resolved in the domain of mathematics. We claim that it is possible to empirically test whether such a functor exists, by using a combination of neuroscientific and computational approaches. Our general, principled and empirical framework allows us to assess the relationship between the domain of consciousness and the domain of mathematical structures, including those suggested by IIT. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  3. Quantitative analysis of the effects of vertical magnetic fields on microsegregation in Te-doped LEC GaAs

    NASA Technical Reports Server (NTRS)

    Carlson, D. J.; Witt, A. F.

    1992-01-01

    Using near-IR transmission microscopy with computational absorption analysis, the effects of axial magnetic fields on micro- and macrosegregation during LP-LEC growth of GaAs were quantitatively investigated with a spatial resolution approaching 2 microns. Segregation inhomogeneities exceeding one order of magnitude are found to be related to fluid dynamics of the melt. The applicability of the BPS theory as well as the nonapplicability of the Cochran analysis are established.

  4. Higgs compositeness in Sp(2N) gauge theories - Determining the low-energy constants with lattice calculations

    NASA Astrophysics Data System (ADS)

    Bennett, Ed; Ki Hong, Deog; Lee, Jong-Wan; David Lin, C.-J.; Lucini, Biagio; Piai, Maurizio; Vadacchino, Davide

    2018-03-01

    As a first step towards a quantitative understanding of the SU(4)/Sp(4) composite Higgs model through lattice calculations, we discuss the low energy effective field theory resulting from the SU(4) → Sp(4) global symmetry breaking pattern. We then consider an Sp(4) gauge theory with two Dirac fermion flavours in the fundamental representation on a lattice, which provides a concrete example of the microscopic realisation of the SU(4)/Sp(4) composite Higgs model. For this system, we outline a programme of numerical simulations aiming at the determination of the low-energy constants of the effective field theory and we test the method on the quenched theory. We also report early results from dynamical simulations, focussing on the phase structure of the lattice theory and a calculation of the lowest-lying meson spectrum at coarse lattice spacing. Combined contributions of B. Lucini (e-mail: b.lucini@swansea.ac.uk) and J.-W. Lee (e-mail: wlee823@pusan.ac.kr).

  5. Natural selection. VII. History and interpretation of kin selection theory.

    PubMed

    Frank, S A

    2013-06-01

    Kin selection theory is a kind of causal analysis. The initial form of kin selection ascribed cause to costs, benefits and genetic relatedness. The theory then slowly developed a deeper and more sophisticated approach to partitioning the causes of social evolution. Controversy followed because causal analysis inevitably attracts opposing views. It is always possible to separate total effects into different component causes. Alternative causal schemes emphasize different aspects of a problem, reflecting the distinct goals, interests and biases of different perspectives. For example, group selection is a particular causal scheme with certain advantages and significant limitations. Ultimately, to use kin selection theory to analyse natural patterns and to understand the history of debates over different approaches, one must follow the underlying history of causal analysis. This article describes the history of kin selection theory, with emphasis on how the causal perspective improved through the study of key patterns of natural history, such as dispersal and sex ratio, and through a unified approach to demographic and social processes. Independent historical developments in the multivariate analysis of quantitative traits merged with the causal analysis of social evolution by kin selection. © 2013 The Author. Journal of Evolutionary Biology © 2013 European Society For Evolutionary Biology.

  6. Generalization of Equivalent Crystal Theory to Include Angular Dependence

    NASA Technical Reports Server (NTRS)

    Ferrante, John; Zypman, Fredy R.

    2004-01-01

    In the original Equivalent Crystal Theory, each atomic site in the real crystal is assigned an equivalent lattice constant, in general different from the ground-state one. This parameter corresponds to a local compression or expansion of the lattice. The basic method considers these volumetric transformations and, in addition, introduces the possibility that the reference lattice is anisotropically distorted. These distortions, however, were introduced ad hoc. In this work, we generalize the original Equivalent Crystal Theory by systematically introducing site-dependent directional distortions of the lattice that account for the dependence of the energy on anisotropic local density variations. This is done in the spirit of the original framework, but including a gradient term in the density. This approach is introduced to correct a deficiency in the original Equivalent Crystal Theory and other semiempirical methods in quantitatively obtaining the correct ratios of the surface energies of the low-index planes (100), (110), and (111) of cubic metals. We develop here the basic framework and apply it to the calculation of Fe (110) and Fe (111) surface formation energies. The results, compared with first-principles calculations, show an improvement over previous semiempirical approaches.

  7. Nuclear medicine and quantitative imaging research (quantitative studies in radiopharmaceutical science): Comprehensive progress report, April 1, 1986-December 31, 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, M.D.; Beck, R.N.

    1988-06-01

    This document describes several years' research to improve PET imaging and diagnostic techniques in man. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. The reports in the study were processed separately for the data bases. (TEM)

  8. Using Rasch Measurement Theory to Examine Two Instructional Approaches for Teaching and Learning of French Grammar

    ERIC Educational Resources Information Center

    Vogel, Severine P.; Engelhard, George, Jr.

    2011-01-01

    The authors describe a quantitative approach based on Rasch measurement theory for evaluating classroom assessments within the context of foreign language classes. A secondary purpose was to examine the effects of two instructional approaches to teach grammar, a guided inductive and a deductive approach, through the lens of Rasch measurement…
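
    For reference, the dichotomous Rasch model underlying this kind of analysis gives the probability of a correct response as a logistic function of the difference between person ability θ and item difficulty b. A minimal sketch, with illustrative values only:

        import math

        def rasch_p(theta, b):
            # Dichotomous Rasch model: P(correct) rises with ability theta
            # and falls with item difficulty b (both on the same logit scale).
            return 1.0 / (1.0 + math.exp(-(theta - b)))

        print(rasch_p(theta=1.0, b=0.0))   # ~0.73 for a moderately able student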

  9. Modeling conflict : research methods, quantitative modeling, and lessons learned.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight as to how prone a country (or set of countries) may be for inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict both in the past, and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempt at modeling conflict as a result of system-level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, and recommendations for future attempts at modeling conflict.

  10. ψ(2S) versus J/ψ suppression in proton-nucleus collisions from factorization violating soft color exchanges

    NASA Astrophysics Data System (ADS)

    Ma, Yan-Qing; Venugopalan, Raju; Watanabe, Kazuhiro; Zhang, Hong-Fei

    2018-01-01

    We argue that the large suppression of the ψ(2S) inclusive cross section relative to the J/ψ inclusive cross section in proton-nucleus (p+A) collisions can be attributed to factorization breaking effects in the formation of quarkonium. These factorization breaking effects arise from soft color exchanges between charm-anticharm pairs undergoing hadronization and comoving partons that are long lived on time scales of quarkonium formation. We compute the short distance pair production of heavy quarks in the color glass condensate (CGC) effective field theory and employ an improved color evaporation model (ICEM) to describe their hadronization into quarkonium at large distances. The combined CGC+ICEM model provides a quantitative description of J/ψ and ψ(2S) data in proton-proton (p+p) collisions from both RHIC and the LHC. Factorization breaking effects in hadronization, due to additional parton comovers in the nucleus, are introduced heuristically by imposing a cutoff Λ, representing the average momentum kick from soft color exchanges, in the ICEM. Such soft exchanges have no perceptible effect on J/ψ suppression in p+A collisions. In contrast, the interplay of the physics of these soft exchanges at large distances, with the physics of semihard rescattering at short distances, causes a significant additional suppression of ψ(2S) yields relative to that of the J/ψ. A good fit of all RHIC and LHC J/ψ and ψ(2S) data, for transverse momenta P⊥ ≤ 5 GeV in p+p and p+A collisions, is obtained for Λ ~ 10 MeV.

  11. A quantitative analysis of qualitative studies in clinical journals for the 2000 publishing year

    PubMed Central

    McKibbon, Kathleen Ann; Gadd, Cynthia S

    2004-01-01

    Background Qualitative studies are becoming more recognized as important to understanding health care with all of its richness and complexities. The purpose of this descriptive survey was to provide a quantitative evaluation of the qualitative studies published in 170 core clinical journals for 2000. Methods All identified studies that used qualitative methods were reviewed to ascertain which clinical journals publish qualitative studies and to extract research methods, content (persons and health care issues studied), and whether mixed methods (quantitative and qualitative methods) were used. Results 60 330 articles were reviewed. 355 reports of original qualitative studies and 12 systematic review articles were identified in 48 journals. Most of the journals were in the discipline of nursing. Only 4 of the most highly cited health care journals, based on ISI Science Citation Index (SCI) Impact Factors, published qualitative studies. 37 of the 355 original reports used both qualitative and quantitative (mixed) methods. Patients and non-health care settings were the most common groups of people studied. Diseases and conditions were cancer, mental health, pregnancy and childbirth, and cerebrovascular disease, with many other diseases and conditions represented. Phenomenology and grounded theory were commonly used; substantial ethnography was also present. No substantial differences were noted for content or methods when articles published in all disciplines were compared with articles published in nursing titles or when studies with mixed methods were compared with studies that included only qualitative methods. Conclusions The clinical literature includes many qualitative studies, although they are often published in nursing journals or in journals with low SCI Impact Factors. Many qualitative studies incorporate both qualitative and quantitative methods. PMID:15271221

  12. The Promise of Qualitative Research to Inform Theory to Address Health Equity.

    PubMed

    Shelton, Rachel C; Griffith, Derek M; Kegler, Michelle C

    2017-10-01

    Most public health researchers and practitioners agree that we need to accelerate our efforts to eliminate health disparities and promote health equity. The past two decades of research have provided a wealth of descriptive studies, both qualitative and quantitative, that describe the size, scale, and scope of health disparities, as well as the key determinants that affect disparities. We need, however, to shift more aggressively to action informed by this research and develop deeper understandings of how to shape multilevel interventions, influenced by theories across multiple levels of the social-ecologic framework. In this article, we discuss the promising opportunities for qualitative and health equity scholars to advance research and practice through the refinement, expansion, and application of rigorous, theoretically informed qualitative research. In particular, to advance work in the area of theory to inform health equity, we encourage researchers (a) to move toward thinking about mechanisms and theory-building and refining; (b) to explicitly incorporate theories at the social, organizational, community, and policy levels and consider how factors at these levels interact synergistically with factors at the individual and interpersonal levels; (c) to consider how the social dimensions that have implications for health equity intersect and interact; and (d) to develop and apply more community-engaged, assets-based, and action-oriented theories and frameworks.

  13. A feminist critique of foundational nursing research and theory on transition to motherhood.

    PubMed

    Parratt, Jenny A; Fahy, Kathleen M

    2011-08-01

    Is using 'transition to motherhood theory' the best way to guide midwives in providing woman-centred care? Contemporary research about changes to women's embodied sense of self during childbearing is influenced by foundational research and theory about the transition to motherhood. Rubin and Mercer are two key nursing authors whose work on transition to motherhood theory still shapes the ways in which a woman's experience of change during childbearing is understood in midwifery. Using a feminist post-structural framework, Rubin and Mercer's theory and research is described, critiqued and discussed. Rubin and Mercer used pre-existing theories and concepts that had the effect of finding similarities and discarding differences between women. Rubin and Mercer's theory and research is an expression of humanistic philosophy. This philosophy creates frameworks that have an assumed, disempowered role for childbearing women. Their research used a logico-empirical, quantitative approach. Qualitative interpretive or constructivist approaches offer more appropriate ways to study the highly individualised, embodied, lived experience of a woman's changing self during childbearing. Rubin and Mercer's theory is baby-centred. Transition to motherhood theory privileges the position of experts in directing how a woman should become a mother. This has the effect of making midwives agents for the social control of women. Rubin and Mercer's transition to motherhood theory is a well-intentioned product of its time. The theory is inconsistent with contemporary midwifery philosophy, which promotes a woman-centred partnership between the midwife and the woman. The usefulness of this outdated nursing theory in midwifery teaching, research or practice is debatable. Copyright © 2010 Elsevier Ltd. All rights reserved.

  14. Literacy Research, Theory, and Practice: Views from Many Perspectives. Forty-First Yearbook of the National Reading Conference.

    ERIC Educational Resources Information Center

    Kinzer, Charles K., Ed.; Leu, Donald J., Ed.

    The 43 manuscripts presented in this collection represent qualitative and quantitative studies, as well as papers that present literacy research, theory, and pedagogy. Papers in the collection include: "Family Uses of Literacy: A Critical Voice" (D. Madigan); "Intergenerational Literacy: Impact on the Development of the Storybook Reading Behaviors…

  15. How do Turkish High School Graduates Use the Wave Theory of Light to Explain Optics Phenomena?

    ERIC Educational Resources Information Center

    Sengoren, S. K.

    2010-01-01

    This research was intended to investigate whether Turkish students who had graduated from high school used the wave theory of light properly in explaining optical phenomena. The survey method was used in this research. The data, which were collected from 175 first year university students in Turkey, were analysed quantitatively and qualitatively.…

  16. On the mechanochemical theory of biological pattern formation with application to vasculogenesis.

    PubMed

    Murray, James D

    2003-02-01

    We first describe the Murray-Oster mechanical theory of pattern formation, the biological basis of which is experimentally well documented. The model quantifies the interaction of cells and the extracellular matrix via the cell-generated forces. The model framework is described in quantitative detail. Vascular endothelial cells, when cultured on gelled basement membrane matrix, rapidly aggregate into clusters while deforming the matrix into a network of cord-like structures tessellating the planar culture. We apply the mechanical theory of pattern formation to this culture system and show that neither strain-biased anisotropic cell traction nor cell migration are necessary for pattern formation: isotropic, strain-stimulated cell traction is sufficient to form the observed patterns. Predictions from the model were confirmed experimentally.

  17. Precocious quantitative cognition in monkeys.

    PubMed

    Ferrigno, Stephen; Hughes, Kelly D; Cantlon, Jessica F

    2016-02-01

    Basic quantitative abilities are thought to have an innate basis in humans partly because the ability to discriminate quantities emerges early in child development. If humans and nonhuman primates share this developmentally primitive foundation of quantitative reasoning, then this ability should be present early in development across species and should emerge earlier in monkeys than in humans because monkeys mature faster than humans. We report that monkeys spontaneously make accurate quantity choices by 1 year of age in a task that human children begin to perform only at 2.5 to 3 years of age. Additionally, we report that the quantitative sensitivity of infant monkeys is equal to that of the adult animals in their group and that rates of learning do not differ between infant and adult animals. This novel evidence of precocious quantitative reasoning in infant monkeys suggests that human quantitative reasoning shares its early developing foundation with other primates. The data further suggest that early developing components of primate quantitative reasoning are constrained by maturational factors related to genetic development as opposed to learning experience alone.

  18. Screening hypochromism (sieve effect) in red blood cells: a quantitative analysis

    PubMed Central

    Razi Naqvi, K.

    2014-01-01

    Multiwavelength UV-visible spectroscopy, Kramers-Kronig analysis, and several other experimental and theoretical tools have been applied over the last several decades to fathom absorption and scattering of light by suspensions of micron-sized pigmented particles, including red blood cells, but a satisfactory quantitative analysis of the difference between the absorption spectra of suspension of intact and lysed red blood cells is still lacking. It is stressed that such a comparison is meaningful only if the pertinent spectra are free from, or have been corrected for, scattering losses, and it is shown that Duysens’ theory can, whereas that of Vekshin cannot, account satisfactorily for the observed hypochromism of suspensions of red blood cells. PMID:24761307
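
    One way to see the sieve effect quantitatively is a single-layer flattening calculation of the Duysens type: pigment confined to particles covering a fraction β of the beam absorbs less, at equal total pigment, than the same pigment in solution. The sketch below uses an assumed β and the monolayer form of the flattening expression; it is an illustration, not the paper's analysis.

        import numpy as np

        # Lysed cells: Beer-Lambert absorbance A_solution. Intact cells: the
        # same pigment packed into particles covering a fraction beta of the
        # beam, each with internal absorbance A_solution / beta. A Duysens-type
        # monolayer expression for the suspension absorbance is then
        #   A_susp = -log10(1 - beta * (1 - 10**(-A_solution / beta))).
        beta = 0.4                              # fractional coverage (assumed)
        A_solution = np.linspace(0.0, 2.0, 5)   # lysed-cell absorbance values
        A_susp = -np.log10(1 - beta * (1 - 10.0**(-A_solution / beta)))
        for a_sol, a_sus in zip(A_solution, A_susp):
            print(f"lysed {a_sol:.2f} -> intact {a_sus:.2f}")   # hypochromism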

  20. Leukotriene B4 catabolism: quantitation of leukotriene B4 and its omega-oxidation products by reversed-phase high-performance liquid chromatography.

    PubMed

    Shak, S

    1987-01-01

    LTB4 and its omega-oxidation products may be rapidly, sensitively, and specifically quantitated by the methods of solid-phase extraction and reversed-phase high-performance liquid chromatography (HPLC), which are described in this chapter. Although other techniques, such as radioimmunoassay or gas chromatography-mass spectrometry, may be utilized for quantitative analysis of the lipoxygenase products of arachidonic acid, only the technique of reversed-phase HPLC can quantitate as many as 10 metabolites in a single analysis, without prior derivatization. In this chapter, we also reviewed the chromatographic theory which we utilized in order to optimize reversed-phase HPLC analysis of LTB4 and its omega-oxidation products. With this information and a gradient HPLC system, it is possible for any investigator to develop a powerful assay for the potent inflammatory mediator, LTB4, or for any other lipoxygenase product of arachidonic acid.

  1. Quantitative microbiological risk assessment in food industry: Theory and practical application.

    PubMed

    Membré, Jeanne-Marie; Boué, Géraldine

    2018-04-01

    The objective of this article is to bring scientific background as well as practical hints and tips to guide risk assessors and modelers who want to develop a quantitative Microbiological Risk Assessment (MRA) in an industrial context. MRA aims at determining the public health risk associated with biological hazards in a food. Its implementation in industry makes it possible to compare the efficiency of different risk reduction measures, and more precisely of different operational settings, by predicting their effect on the final model output. The first stage in MRA is to clearly define the purpose and scope with stakeholders, risk assessors and modelers. Then, a probabilistic model is developed; this includes, schematically, three important phases. Firstly, the model structure has to be defined, i.e. the connections between different operational processing steps. An important step in the food industry is thermal processing leading to microbial inactivation. Growth of heat-treated surviving microorganisms and/or post-process contamination during the storage phase is also important to take into account. Secondly, mathematical equations are determined to estimate the change of microbial load after each processing step. This phase includes the construction of model inputs by collecting data or eliciting expert opinion. Finally, the model outputs are obtained by simulation procedures; they have to be interpreted and communicated to targeted stakeholders. In this latter phase, tools such as what-if scenarios provide an essential added value. These different MRA phases are illustrated through two examples covering important issues in industry. The first one covers process optimization in a food safety context, the second one covers shelf-life determination in a food quality context. Although both contexts required the same methodology, they do not have the same endpoint: up to the human health in the foie gras case-study illustrating here a safety application, up to the food portion in the
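
    A minimal Monte Carlo sketch of such a model chain (structure, per-step change of load, simulated output) might look as follows; every distribution and parameter value is an invented placeholder, not taken from the article.

        import numpy as np

        rng = np.random.default_rng(1)
        N = 100_000                                  # Monte Carlo iterations

        # Initial contamination (log10 CFU/g), uncertain across batches.
        load = rng.normal(2.0, 0.5, N)

        # Thermal processing step: stochastic log reduction (illustrative).
        load -= rng.normal(5.0, 0.7, N)

        # Post-process recontamination in a small fraction of units.
        hit = rng.random(N) < 0.02
        load[hit] += rng.normal(1.0, 0.3, hit.sum())

        # Growth during chilled storage over the shelf life (log10 increase).
        load += rng.uniform(0.0, 2.0, N)

        # Model output: probability of exceeding a threshold at consumption.
        print(f"P(load > 2 log10 CFU/g) = {np.mean(load > 2.0):.4f}")
        # A what-if scenario reruns the chain with changed settings, e.g. a
        # hotter treatment shifting the mean log reduction from 5.0 to 6.0.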

  2. Interspecific interactions in phytophagous insects revisited: a quantitative assessment of competition theory.

    PubMed

    Kaplan, Ian; Denno, Robert F

    2007-10-01

    The importance of interspecific competition is a highly controversial and unresolved issue for community ecology in general, and for phytophagous insects in particular. Recent advancements, however, in our understanding of indirect (plant- and enemy-mediated) interactions challenge the historical paradigms of competition. Thus, in the context of this rapidly developing field, we re-evaluate the evidence for interspecific competition in phytophagous insects using a meta-analysis of published studies. Our analysis is specifically designed to test the assumptions underlying traditional competition theory, namely that competitive interactions are symmetrical, necessitate spatial and temporal co-occurrence, and increase in intensity as the density, phylogenetic similarity, and niche overlap of competing species increase. Despite finding frequent evidence for competition, we found very little evidence that plant-feeding insects conform to theoretical predictions for interspecific competition. Interactions were highly asymmetrical, similar in magnitude within vs. between feeding guilds (chewers vs. sap-feeders), and were unaffected by the quantity of resources removed (% defoliation). There was mixed support for the effects of phylogeny, spatial/temporal separation, and the relative strength of intra- vs. interspecific competition. Clearly, a new paradigm that accounts for indirect interactions and facilitation is required to describe how interspecific competition contributes to the organization of phytophagous insect communities, and perhaps to other plant and animal communities as well.

  3. Decoherence estimation in quantum theory and beyond

    NASA Astrophysics Data System (ADS)

    Pfister, Corsin

    The quantum physics literature provides many different characterizations of decoherence. Most of them have in common that they describe decoherence as a kind of influence on a quantum system upon interacting with another system. In the spirit of quantum information theory, we adapt a particular viewpoint on decoherence which describes it as the loss of information into a system that is possibly controlled by an adversary. We use a quantitative framework for decoherence that builds on operational characterizations of the min-entropy that have been developed in the quantum information literature. It characterizes decoherence as an influence on quantum channels that reduces their suitability for a variety of quantifiable tasks such as the distribution of secret cryptographic keys of a certain length or the distribution of a certain number of maximally entangled qubit pairs. This allows for a quantitative and operational characterization of decoherence via operational characterizations of the min-entropy. In this thesis, we present a series of results about the estimation of the min-entropy, subdivided into three parts. The first part concerns the estimation of a quantum adversary's uncertainty about classical information--expressed by the smooth min-entropy--as it is done in protocols for quantum key distribution (QKD). We analyze this form of min-entropy estimation in detail and find that some of the more recently suggested QKD protocols have previously unnoticed security loopholes. We show that the specifics of the sifting subroutine of a QKD protocol are crucial for security by pointing out mistakes in the security analysis in the literature and by presenting eavesdropping attacks on those problematic protocols. We provide solutions to the identified problems and present a formalized analysis of the min-entropy estimate that incorporates the sifting stage of QKD protocols. In the second part, we extend ideas from QKD to a protocol that allows one to estimate an

  4. Focal Point Theory Models for Dissecting Dynamic Duality Problems of Microbial Infections

    PubMed Central

    Huang, S.-H.; Zhou, W.; Jong, A.

    2008-01-01

    Extending along the dynamic continuum from conflict to cooperation, microbial infections always involve symbiosis (Sym) and pathogenesis (Pat). There exists a dynamic Sym-Pat duality (DSPD) in microbial infection that is the most fundamental problem in infectomics. DSPD is encoded by the genomes of both the microbes and their hosts. Three focal point (FP) theory-based game models (pure cooperative, dilemma, and pure conflict) are proposed for resolving those problems. Our health is associated with the dynamic interactions of three microbial communities (nonpathogenic microbiota (NP) (Cooperation), conditional pathogens (CP) (Dilemma), and unconditional pathogens (UP) (Conflict)) with the hosts at different health statuses. Sym and Pat can be quantitated by measuring symbiotic index (SI), which is quantitative fitness for the symbiotic partnership, and pathogenic index (PI), which is quantitative damage to the symbiotic partnership, respectively. Symbiotic point (SP), which bears analogy to FP, is a function of SI and PI. SP-converting and specific pathogen-targeting strategies can be used for the rational control of microbial infections. PMID:18350122

  5. Quantitative and qualitative approaches in educational research — problems and examples of controlled understanding through interpretive methods

    NASA Astrophysics Data System (ADS)

    Neumann, Karl

    1987-06-01

    In the methodological discussion of recent years it has become apparent that many research problems, including problems relating to the theory of educational science, cannot be solved by using quantitative methods. The multifaceted aspects of human behaviour and all its environment-bound subtle nuances, especially the process of education or the development of identity, cannot fully be taken into account within a rigid neopositivist approach. In employing the paradigm of symbolic interactionism as a suitable model for the analysis of processes of education and formation, the research generally has to start out from complex reciprocal social interactions instead of unambiguous connections of causes. In analysing several particular methodological problems, the article demonstrates some weaknesses of quantitative approaches and then shows the advantages in and the necessity for using qualitative research tools.

  6. Quantitative habitability.

    PubMed

    Shock, Everett L; Holland, Melanie E

    2007-12-01

    A framework is proposed for a quantitative approach to studying habitability. Considerations of environmental supply and organismal demand of energy lead to the conclusions that power units are most appropriate and that the units for habitability become watts per organism. Extreme and plush environments are revealed to be on a habitability continuum, and extreme environments can be quantified as those where power supply only barely exceeds demand. Strategies for laboratory and field experiments are outlined that would quantify power supplies, power demands, and habitability. An example involving a comparison of various metabolisms pursued by halophiles is shown to be well on the way to a quantitative habitability analysis.
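
    The proposed units make such comparisons directly computable; a toy sketch with invented numbers, not taken from the article:

        # Habitability compared as power supply versus maintenance power
        # demand, in watts per organism (all values invented placeholders).
        demand = 1e-15                        # maintenance demand (W/organism)
        environments = {                      # available power supply (W/organism)
            "plush": 1e-9,
            "extreme": 1.1e-15,
        }
        for name, supply in environments.items():
            ratio = supply / demand
            verdict = "habitable" if ratio > 1 else "uninhabitable"
            print(f"{name}: supply/demand = {ratio:.2g} ({verdict})")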

  7. [Talk about nomenclature of twelve meridians from quantitative yin-yang theory].

    PubMed

    Zhao, Xi-xin; Wang, Xue-xia; Zhao, Zhao; Ran, Peng-fei; Lü, Xiao-rui

    2009-03-01

    Based on leads provided by Neijing and other literature, the origins of the three-yin and the three-yang and their respective contents of yin and yang are analyzed, indicating the principle that the order of yang-qi from more to less is Yang ming, Tai yang, Shao yang, and the order of yin-qi is Tai yin, Shao yin, Jue yin. According to the location of the five (six) zang-organs, the respective yin-qi content is defined, and according to the principle that more yin-qi matches more and less yin-qi matches less, the five (six) zang-organs match each other. The zang-organs above the diaphragm join with the Hand-Channels and the zang-organs below the diaphragm with the Foot-Channels, completing the nomenclature of the twelve meridians. The names of the six yang-channels correspond to the yin-channels of the exterior-interior relationship, the yin-channels link with hands (feet), the yang-channels also link with hands (feet), and the amount of yin-qi of the zang-organs corresponding to the yin-channels and the amount of yang-qi of the fu-organs corresponding to the yang-channels are in a state of balance. Based on this principle, the nomenclature of the six channels is completed. It is emphasized that the nomenclature of the twelve meridians contains profound TCM theories; in particular, TCM, by means of yin-yang and the three-yin and three-yang, illustrates living phenomena from the whole to the system and organ level in the human body, together with the scientific principle that "yin-yang can be unlimitedly divided" and its significance, which must guide studies on living phenomena with modern life sciences from the whole to the molecular level.

  8. Linking ecosystem services and human-values theory.

    PubMed

    Hicks, Christina C; Cinner, Joshua E; Stoeckl, Natalie; McClanahan, Tim R

    2015-10-01

    Understanding why people make the decisions they do remains a fundamental challenge facing conservation science. Ecosystem service (ES) (a benefit people derive from an ecosystem) approaches to conservation reflect efforts to anticipate people's preferences and influence their environmental behavior. Yet, the design of ES approaches seldom includes psychological theories of human behavior. We sought to alleviate this omission by applying a psychological theory of human values to a cross-cultural ES assessment. We used interviews and focus groups with fish workers from 28 coral reef fishing communities in 4 countries to qualitatively identify the motivations (i.e., human values) underlying preferences for ES; quantitatively evaluate resource user ES priorities; and identify common patterns among ES motivations and ES priorities (i.e., trade-offs and synergies). Three key findings are evident that align with human values theory. First, motivations underlying preferences for individual ESs reflected multiple human values within the same value domain (e.g., self-enhancement). Second, when averaged at community or country scales, the order of ES priorities was consistent. However, the order belied significant variation that existed among individuals. Third, in line with human values theory, ESs related to one another in a consistent pattern; certain service pairs reflected trade-off relationships (e.g., supporting and provisioning), whereas other service pairs reflected synergistic relationships (e.g., supporting and regulating). Together, these findings help improve understanding of when and why convergence and trade-offs in people's preferences for ESs occur, and this knowledge can inform the development of suitable conservation actions. © 2015 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of the Society for Conservation Biology.

  9. Addressing Pre-Service Teachers' Understandings and Difficulties with Some Core Concepts in the Special Theory of Relativity

    ERIC Educational Resources Information Center

    Selcuk, Gamze Sezgin

    2011-01-01

    The aim of this study is to investigate pre-service teachers' understanding of and difficulties with some core concepts in the special theory of relativity. The pre-service teachers (n = 185) from the Departments of Physics Education and Elementary Science Education at Dokuz Eylul University (in Turkey) participated. Both quantitative and…

  10. Coulomb-free and Coulomb-distorted recolliding quantum orbits in photoelectron holography

    NASA Astrophysics Data System (ADS)

    Maxwell, A. S.; Figueira de Morisson Faria, C.

    2018-06-01

    We perform a detailed analysis of the different types of orbits in the Coulomb quantum orbit strong-field approximation (CQSFA), ranging from direct to those undergoing hard collisions. We show that some of them exhibit clear counterparts in the standard formulations of the strong-field approximation for direct and rescattered above-threshold ionization, and show that the standard orbit classification commonly used in Coulomb-corrected models is over-simplified. We identify several types of rescattered orbits, such as those responsible for the low-energy structures reported in the literature, and determine the momentum regions in which they occur. We also find formerly overlooked interference patterns caused by backscattered Coulomb-corrected orbits and assess their effect on photoelectron angular distributions. These orbits improve the agreement of photoelectron angular distributions computed with the CQSFA with the outcome of ab initio methods for high-energy photoelectrons perpendicular to the field polarization axis.

  11. Hysteretic phenomena in GFET: Comprehensive theory and experiment

    NASA Astrophysics Data System (ADS)

    Kurchak, Anatolii I.; Morozovska, Anna N.; Strikha, Maksym V.

    2017-07-01

    We propose a comprehensive analytical theory for the description of versatile hysteretic phenomena in a graphene field effect transistor (GFET). Our theory accounts for the existence of the three most important rival factors, namely external dipoles on the graphene free surface, localized states at the graphene-substrate interface, and the bound polarization charge coming from a ferroelectric substrate. In particular, we demonstrate that adsorbed dipole molecules (e.g., dissociated or highly polarized water molecules) can cause a hysteretic form of the carrier concentration as a function of gate voltage, and a corresponding dependence of graphene conductivity in a GFET on substrates of different types, including the most common SiO2 and ferroelectric ones. It is shown that an increase in the gate voltage sweeping rate leads to the complete vanishing of hysteresis for a GFET on a SiO2 substrate, as well as for a GFET on a ferroelectric substrate for applied electric fields E less than a critical value Ec. For E > Ec, a cross-over from anti-hysteresis to hysteresis takes place. Trapping of carriers from the graphene channel by the interface states describes the "anti-hysteresis" in a GFET on a PZT substrate well enough. These results correlate well with the available experimental data, up to quantitative agreement. The obtained analytical results thus predict new effects and clarify existing ones in GFETs. They describe quantitatively the physical principles of GFET operation and can become the first necessary step in transforming the state of the art from an almost empirical to an analytical level, because they can be directly applied to describe the basic characteristics of advanced non-volatile ultra-fast memory devices using GFETs on versatile substrates.

  12. Measuring teamwork in primary care: Triangulation of qualitative and quantitative data.

    PubMed

    Brown, Judith Belle; Ryan, Bridget L; Thorpe, Cathy; Markle, Emma K R; Hutchison, Brian; Glazier, Richard H

    2015-09-01

    This article describes the triangulation of qualitative dimensions, reflecting high functioning teams, with the results of standardized teamwork measures. The study used a mixed methods design using qualitative and quantitative approaches to assess teamwork in 19 Family Health Teams in Ontario, Canada. This article describes dimensions from the qualitative phase, which used grounded theory to explore the issues and challenges to teamwork. Two quantitative measures were used in the study, the Team Climate Inventory (TCI) and the Providing Effective Resources and Knowledge (PERK) scale. For the triangulation analysis, the mean scores of these measures were compared with the qualitatively derived ratings for the dimensions. The final sample for the qualitative component was 107 participants. The qualitative analysis identified 9 dimensions related to high team functioning, such as common philosophy, scope of practice, conflict resolution, change management, leadership, and team evolution. From these dimensions, teams were categorized numerically as high, moderate, or low functioning. Three hundred seventeen team members completed the survey measures. Mean site scores for the TCI and PERK were 3.87 and 3.88, respectively (out of 5). The TCI was associated with all dimensions except for team location, space allocation, and executive director leadership. The PERK was associated with all dimensions except team location. Data triangulation provided qualitative and quantitative evidence of what constitutes teamwork. Leadership was pivotal in forging a common philosophy and encouraging team collaboration. Teams used conflict resolution strategies and adapted to the changes they encountered. These dimensions advanced the team's evolution toward a high functioning team. (c) 2015 APA, all rights reserved.

  13. The quantitation of buffering action II. Applications of the formal & general approach.

    PubMed

    Schmitt, Bernhard M

    2005-03-16

    The paradigm of "buffering" originated in acid-base physiology, but was subsequently extended to other fields and is now used for a wide and diverse set of phenomena. In the preceding article, we have presented a formal and general approach to the quantitation of buffering action. Here, we use that buffering concept for a systematic treatment of selected classical and other buffering phenomena. H+ buffering by weak acids and "self-buffering" in pure water represent "conservative buffered systems" whose analysis reveals buffering properties that contrast in important aspects from classical textbook descriptions. The buffering of organ perfusion in the face of variable perfusion pressure (also termed "autoregulation") can be treated in terms of "non-conservative buffered systems", the general form of the concept. For the analysis of cytoplasmic Ca++ concentration transients (also termed "muffling"), we develop a related unit that is able to faithfully reflect the time-dependent quantitative aspect of buffering during the pre-steady state period. Steady-state buffering is shown to represent the limiting case of time-dependent muffling, namely for infinitely long time intervals and infinitely small perturbations. Finally, our buffering concept provides a stringent definition of "buffering" on the level of systems and control theory, resulting in four absolute ratio scales for control performance that are suited to measure disturbance rejection and setpoint tracking, and both their static and dynamic aspects. Our concept of buffering provides a powerful mathematical tool for the quantitation of buffering action in all its appearances.

  14. Hadron diffractive production at ultrahigh energies and shadow effects

    NASA Astrophysics Data System (ADS)

    Anisovich, V. V.; Matveev, M. A.; Nikonov, V. A.

    2016-10-01

    Shadow effects in collisions of hadrons with light nuclei at high energies were a subject of scientific interest of V.N. Gribov; we mean, first of all, his study of hadron-deuteron scattering, see Sov. Phys. JETP 29, 483 (1969) [Zh. Eksp. Teor. Fiz. 56, 892 (1969)], and the discovery of the reinforcement of shadowing due to inelastic diffractive rescatterings. It turns out that a similar effect exists on the hadron level, though at ultrahigh energies. Diffractive production is considered in the ultrahigh-energy region where pomeron exchange amplitudes are transformed into black disk ones due to rescattering corrections. The corresponding corrections in hadron reactions h1 + h3 → h1 + h2 + h3 with small momenta transferred (q²(1→1) ~ m²/ln²s, q²(3→3) ~ m²/ln²s) are calculated in terms of the K-matrix technique modified for ultrahigh energies. Small values of the momenta transferred are crucial for introducing equations for amplitudes. The three-body equation for the hadron diffractive production reaction h1 + h3 → h1 + h2 + h3 is written and solved precisely in the eikonal approach. In the black disk regime final-state scattering processes do not change the shapes of the amplitudes in principle, but damp the amplitudes by a factor ~1/4; initial-state rescatterings result in an additional factor ~1/2. In the resonant disk regime initial- and final-state scatterings strongly damp the production amplitude, which corresponds to σ_inel/σ_tot → 0 as √s → ∞ in this mode.

  16. When do particle ratios freeze out in relativistic heavy ion collisions?

    NASA Astrophysics Data System (ADS)

    Humanic, Thomas; Bellwied, Rene

    1999-10-01

    The systematics of CERN SPS data for transverse mass distributions have been shown to imply that thermal equilibrium is achieved at freeze out in these collisions. This conclusion is based on the observation that for p+p, S+S, and Pb+Pb collisions freeze out occurs at a single temperature for all particle species measured if one assumes a certain uniform expansion velocity after hadronization for each colliding system [1]. A recent final-state rescattering calculation for SPS Pb+Pb collisions has shown that these systematics can be described as a consequence of particle rescattering where the system is assumed initially (i.e. at hadronization) to have a common temperature for all particles and no initial expansion velocity [2]. In addition to kinetic observables, it is equally interesting to investigate the time dependence of particle abundances through particle ratios in such a calculation. Two questions immediately arise: 1) is chemical equilibrium established in these collisions, and 2) when does chemical freeze out occur with respect to thermal freeze out for different particle ratios? How rescattering influences particle ratios is clearly of interest if one would like to deduce information about the hadronization stage of the collision from particle ratios measured at freeze out. For the present work we will show results for strange and non-strange particle ratios within the context of a version of the dynamic transport code used in Ref. [2]. [1] NA44 collaboration, I.G. Bearden et al., Phys. Rev. Lett. 78, 2080 (1997); [2] T. J. Humanic, Phys. Rev. C 57, 866 (1998)

  17. Building a Middle-Range Theory of Adaptive Spirituality.

    PubMed

    Dobratz, Marjorie C

    2016-04-01

    The purpose of this article is to describe a Roy adaptation model-based research abstraction, the findings of which were synthesized into a middle-range theory (MRT) of adaptive spirituality. The published literature yielded 21 empirical studies that investigated religion/spirituality. Quantitative results supported the influence of spirituality on quality of life, psychosocial adjustment, well-being, adaptive coping, and the self-concept mode. Qualitative findings showed the importance of spiritual expressions, values, and beliefs in adapting to chronic illness, bereavement, death, and other life transitions. These findings were abstracted into six theoretical statements, a conceptual definition of adaptive spirituality, and three hypotheses for future testing. © The Author(s) 2016.

  18. A theory of ring formation around Be stars

    NASA Technical Reports Server (NTRS)

    Huang, S.-S.

    1976-01-01

    A theory for the formation of gaseous rings around Be stars is developed which involves the combined effect of stellar rotation and radiation pressure. A qualitative scenario of ring formation is outlined in which the envelope formed about a star from ejected material is in the form of a disk in the equatorial plane, collisions between ejected gas blobs are inevitable, and particles with high angular momenta form a rotating ring around the star. A quantitative description of this process is then formulated by considering the angular momentum and dynamical energy of the ejected matter as well as those of the ring alone, without introducing any other assumptions.

  19. Quantitative Estimation of the Amount of Fibrosis in the Rat Liver Using Fractal Dimension of the Shape of Power Spectrum

    NASA Astrophysics Data System (ADS)

    Kikuchi, Tsuneo; Nakazawa, Toshihiro; Furukawa, Tetsuo; Higuchi, Toshiyuki; Maruyama, Yukio; Sato, Sojun

    1995-05-01

    This paper describes the quantitative measurement of the amount of fibrosis in the rat liver using the fractal dimension of the shape of the power spectrum. The shape of the power spectrum of the scattered echo from biotissues is strongly affected by their internal structure. The fractal dimension, which is one of the important parameters of fractal theory, is useful for expressing the complexity of the shape of figures such as the power spectrum. From in vitro experiments using rat liver, it was found that this method can be used to quantitatively measure the amount of fibrosis in the liver, and that it shows promise for use in the diagnosis of human liver cirrhosis.
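
    The core quantity is the fractal dimension of the spectrum viewed as a curve. A box-counting estimate over synthetic data gives the flavor of the computation; the paper's exact estimator and echo data are not reproduced here.

        import numpy as np

        def box_count_dimension(y, scales=(2, 4, 8, 16, 32, 64)):
            # Box-counting estimate of the fractal dimension of a curve y(x),
            # normalized to the unit square (one common estimator; the paper's
            # exact procedure may differ).
            y = (y - y.min()) / (np.ptp(y) + 1e-12)
            n = len(y)
            counts = []
            for s in scales:
                box = n // s                    # samples per box edge
                occupied = {(i // box, int(y[i] * s)) for i in range(n)}
                counts.append(len(occupied))
            slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
            return slope

        rng = np.random.default_rng(0)
        echo = rng.normal(size=4096)            # stand-in for a scattered echo
        spectrum = np.abs(np.fft.rfft(echo))**2
        print(f"estimated fractal dimension: {box_count_dimension(spectrum):.2f}")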

  20. Polymer on Top: Current Limits and Future Perspectives of Quantitatively Evaluating Surface Grafting.

    PubMed

    Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher

    2018-03-07

    Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determining quantitative grafting densities are assessed. Based on a brief introduction to the key theories describing polymer brush regimes, a user's guide is provided to estimating maximum chain coverage and, importantly, to examining the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. An estimation of the reliability of these determination methods is provided via carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical access guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid the reporting of grafting densities that fall outside the physically realistic parameter space. The assessment is concluded with a perspective on the development of advanced approaches for the determination of grafting density, in particular on single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
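
    For the first of those approaches, the standard relation converts a measured dry layer thickness h into a grafting density via σ = h·ρ·N_A/M_n. A small sketch with illustrative values (not from the article):

        N_A = 6.022e23   # Avogadro's number (1/mol)

        def grafting_density(h_nm, rho_g_cm3, Mn_g_mol):
            # sigma = h * rho * N_A / M_n, converted to chains per nm^2.
            h_cm = h_nm * 1e-7
            sigma_per_cm2 = h_cm * rho_g_cm3 * N_A / Mn_g_mol
            return sigma_per_cm2 * 1e-14

        # 10 nm dry film, density 1.05 g/cm^3, Mn = 50 kg/mol (illustrative).
        print(f"{grafting_density(10.0, 1.05, 50_000.0):.2f} chains/nm^2")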

  1. Prospect theory on the brain? Toward a cognitive neuroscience of decision under risk.

    PubMed

    Trepel, Christopher; Fox, Craig R; Poldrack, Russell A

    2005-04-01

    Most decisions must be made without advance knowledge of their consequences. Economists and psychologists have devoted much attention to modeling decisions made under conditions of risk, in which options can be characterized by a known probability distribution over possible outcomes. The descriptive shortcomings of classical economic models motivated the development of prospect theory (D. Kahneman, A. Tversky, Prospect theory: An analysis of decision under risk. Econometrica, 47 (1979) 263-291; A. Tversky, D. Kahneman, Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5 (4) (1992) 297-323), the most successful behavioral model of decision under risk. In prospect theory, subjective value is modeled by a value function that is concave for gains, convex for losses, and steeper for losses than for gains; the impact of probabilities is characterized by a weighting function that overweights low probabilities and underweights moderate to high probabilities. We outline the possible neural bases of the components of prospect theory, surveying evidence from human imaging, lesion, and neuropharmacology studies as well as animal neurophysiology studies. These results provide preliminary suggestions concerning the neural bases of prospect theory that include a broad set of brain regions and neuromodulatory systems. These data suggest that focused studies of decision making in the context of quantitative models may provide substantial leverage towards a fuller understanding of the cognitive neuroscience of decision making.
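
    The functional forms summarized above are easy to write down explicitly; the sketch below uses the standard parametric forms and median parameter estimates from Tversky and Kahneman (1992).

        import numpy as np

        ALPHA, BETA, LAMBDA, GAMMA = 0.88, 0.88, 2.25, 0.61   # 1992 estimates

        def value(x):
            # Concave for gains, convex and steeper for losses (loss aversion).
            x = np.asarray(x, dtype=float)
            return np.where(x >= 0, np.abs(x)**ALPHA, -LAMBDA * np.abs(x)**BETA)

        def weight(p):
            # Overweights low probabilities, underweights moderate-to-high ones.
            return p**GAMMA / (p**GAMMA + (1 - p)**GAMMA)**(1 / GAMMA)

        # Subjective value of a 10% chance to win 100: w(0.1) * v(100).
        print(float(weight(0.1) * value(100.0)))   # ~10.7, vs expected value 10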

  2. Explore Stochastic Instabilities of Periodic Points by Transition Path Theory

    NASA Astrophysics Data System (ADS)

    Cao, Yu; Lin, Ling; Zhou, Xiang

    2016-06-01

    We consider the noise-induced transitions from a linearly stable periodic orbit consisting of T periodic points in a randomly perturbed discrete logistic map. Traditional large deviation theory and asymptotic analysis in the small-noise limit cannot distinguish the quantitative difference in noise-induced stochastic instabilities among the T periodic points. To attack this problem, we generalize transition path theory to the discrete-time, continuous-space stochastic process. In our first criterion to quantify the relative instability among the T periodic points, we use the distribution of the last passage location related to the transitions from the whole periodic orbit to a prescribed disjoint set. This distribution is related to the individual contributions to the transition rate from each periodic point. The second criterion is based on the competency of the transition paths associated with each periodic point. Both criteria utilize the reactive probability current in transition path theory. Our numerical results for the logistic map reveal the transition mechanism of escaping from the stable periodic orbit and identify which periodic point is more prone to lose stability so as to make successful transitions under random perturbations.
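
    A minimal simulation in the spirit of the first criterion above: perturb the logistic map in its period-2 regime and tally the last periodic point visited before escape. The noise level, neighbourhood radius, and escape set are illustrative choices, not the paper's settings.

    ```python
    # Hedged sketch: last-passage distribution for noise-induced escape from
    # the stable period-2 orbit of the logistic map.
    import numpy as np

    r, sigma = 3.3, 0.08                 # period-2 regime; additive Gaussian noise
    rng = np.random.default_rng(1)

    # Periodic points of the deterministic 2-cycle: roots of f(f(x)) = x.
    disc = np.sqrt((r - 3.0) * (r + 1.0))
    p = np.array([(1 + r - disc) / (2 * r), (1 + r + disc) / (2 * r)])

    counts = np.zeros(2)                 # last-passage tally per periodic point
    for _ in range(500):
        x, last = p[0], 0
        for _ in range(20_000):
            x = r * x * (1 - x) + sigma * rng.standard_normal()
            d = np.abs(p - x)
            if d.min() < 0.05:           # still near the orbit: update location
                last = int(d.argmin())
            elif x < 0.0 or x > 1.0:     # left the unit interval: escaped
                counts[last] += 1
                break
    print("last-passage distribution over the 2 periodic points:",
          counts / max(counts.sum(), 1))
    ```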

  3. Sensory conflict in motion sickness: An observer theory approach

    NASA Technical Reports Server (NTRS)

    Oman, Charles M.

    1989-01-01

    Motion sickness is the general term describing a group of common nausea syndromes originally attributed to motion-induced cerebral ischemia, stimulation of abdominal organ afferents, or overstimulation of the vestibular organs of the inner ear. Sea-, car-, and airsickness are the most commonly experienced examples. However, the discovery of other variants such as Cinerama-, flight simulator-, spectacle-, and space sickness, in which the physical motion of the head and body is normal or absent, has led to a succession of sensory conflict theories which offer a more comprehensive etiologic perspective. Implicit in the conflict theory is the hypothesis that neural and/or humoral signals originate in regions of the brain subserving spatial orientation, and that these signals somehow traverse to other centers mediating sickness symptoms. Unfortunately, the present understanding of the neurophysiological basis of motion sickness is far from complete: no sensory conflict neuron or process has yet been physiologically identified. To what extent can the existing theory be reconciled with current knowledge of the physiology and pharmacology of nausea and vomiting? The stimuli which cause sickness are reviewed, a contemporary Observer Theory view of the Sensory Conflict hypothesis is synthesized, and a revised model for the dynamic coupling between the putative conflict signals and nausea magnitude estimates is presented. The use of quantitative models for sensory conflict offers a possible new approach to improving the design of visual and motion systems for flight simulators and other virtual environment display systems.

  4. Quantitative Finance

    NASA Astrophysics Data System (ADS)

    James, Jessica

    2017-01-01

    Quantitative finance is a field that has risen to prominence over the last few decades. It encompasses the complex models and calculations that value financial contracts, particularly those which reference events in the future, and apply probabilities to these events. While adding greatly to the flexibility of the market available to corporations and investors, it has also been blamed for worsening the impact of financial crises. But what exactly does quantitative finance encompass, and where did these ideas and models originate? We show that the mathematics behind finance and behind games of chance have tracked each other closely over the centuries and that many well-known physicists and mathematicians have contributed to the field.

  5. Spatial Direct Numerical Simulation of Boundary-Layer Transition Mechanisms: Validation of PSE Theory

    NASA Technical Reports Server (NTRS)

    Joslin, R. D.; Streett, C. L.; Chang, C.-L.

    1991-01-01

    A study of instabilities in incompressible boundary-layer flow on a flat plate is conducted by spatial direct numerical simulation (DNS) of the Navier-Stokes equations. Here, the DNS results are used to critically evaluate the results obtained using parabolized stability equations (PSE) theory and to study mechanisms associated with breakdown from laminar to turbulent flow. Three test cases are considered: two-dimensional Tollmien-Schlichting wave propagation, subharmonic instability breakdown, and oblique-wave breakdown. The instability modes predicted by PSE theory are in good quantitative agreement with the DNS results, except that a small discrepancy is evident in the mean-flow distortion component of the 2-D test problem. This discrepancy is attributed to far-field boundary-condition differences. Both DNS and PSE theory results show several modal discrepancies when compared with the experiments of subharmonic breakdown. Computations that allow for a small adverse pressure gradient in the basic flow and a variation of the disturbance frequency result in better agreement with the experiments.

  6. Ross, Macdonald, and a theory for the dynamics and control of mosquito-transmitted pathogens.

    PubMed

    Smith, David L; Battle, Katherine E; Hay, Simon I; Barker, Christopher M; Scott, Thomas W; McKenzie, F Ellis

    2012-01-01

    Ronald Ross and George Macdonald are credited with developing a mathematical model of mosquito-borne pathogen transmission. A systematic historical review suggests that several mathematicians and scientists contributed to development of the Ross-Macdonald model over a period of 70 years. Ross developed two different mathematical models, Macdonald a third, and various "Ross-Macdonald" mathematical models exist. Ross-Macdonald models are best defined by a consensus set of assumptions. The mathematical model is just one part of a theory for the dynamics and control of mosquito-transmitted pathogens that also includes epidemiological and entomological concepts and metrics for measuring transmission. All the basic elements of the theory had fallen into place by the end of the Global Malaria Eradication Programme (GMEP, 1955-1969) with the concept of vectorial capacity, methods for measuring key components of transmission by mosquitoes, and a quantitative theory of vector control. The Ross-Macdonald theory has since played a central role in development of research on mosquito-borne pathogen transmission and the development of strategies for mosquito-borne disease prevention.

  7. Ross, Macdonald, and a Theory for the Dynamics and Control of Mosquito-Transmitted Pathogens

    PubMed Central

    Smith, David L.; Battle, Katherine E.; Hay, Simon I.; Barker, Christopher M.; Scott, Thomas W.; McKenzie, F. Ellis

    2012-01-01

    Ronald Ross and George Macdonald are credited with developing a mathematical model of mosquito-borne pathogen transmission. A systematic historical review suggests that several mathematicians and scientists contributed to development of the Ross-Macdonald model over a period of 70 years. Ross developed two different mathematical models, Macdonald a third, and various “Ross-Macdonald” mathematical models exist. Ross-Macdonald models are best defined by a consensus set of assumptions. The mathematical model is just one part of a theory for the dynamics and control of mosquito-transmitted pathogens that also includes epidemiological and entomological concepts and metrics for measuring transmission. All the basic elements of the theory had fallen into place by the end of the Global Malaria Eradication Programme (GMEP, 1955–1969) with the concept of vectorial capacity, methods for measuring key components of transmission by mosquitoes, and a quantitative theory of vector control. The Ross-Macdonald theory has since played a central role in development of research on mosquito-borne pathogen transmission and the development of strategies for mosquito-borne disease prevention. PMID:22496640

  8. Classical nucleation theory in the phase-field crystal model

    NASA Astrophysics Data System (ADS)

    Jreidini, Paul; Kocher, Gabriel; Provatas, Nikolas

    2018-04-01

    A full understanding of polycrystalline materials requires studying the process of nucleation, a thermally activated phase transition that typically occurs at atomistic scales. The numerical modeling of this process is problematic for traditional numerical techniques: commonly used phase-field methods' resolution does not extend to the atomic scales at which nucleation takes place, while atomistic methods such as molecular dynamics are incapable of scaling to the mesoscale regime where late-stage growth and structure formation take place following earlier nucleation. Consequently, it is of interest to examine nucleation in the more recently proposed phase-field crystal (PFC) model, which attempts to bridge the atomic and mesoscale regimes in microstructure simulations. In this work, we numerically calculate homogeneous liquid-to-solid nucleation rates and incubation times in the simplest version of the PFC model, for various parameter choices. We show that the model naturally exhibits qualitative agreement with the predictions of classical nucleation theory (CNT) despite a lack of some explicit atomistic features presumed in CNT. We also examine the early appearance of lattice structure in nucleating grains, finding disagreement with some basic assumptions of CNT. We then argue that a quantitatively correct nucleation theory for the PFC model would require extending CNT to a multivariable theory.
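
    For reference, the CNT prediction that such simulations are compared against takes a standard Arrhenius form. The sketch below evaluates it for a spherical nucleus; all numerical values are illustrative, since the record supplies no parameters.

    ```python
    # Hedged sketch of the classical-nucleation-theory (CNT) rate:
    # J = J0 * exp(-dG*/kT), with barrier dG* = 16*pi*gamma^3 / (3*dmu^2)
    # for a spherical nucleus (gamma: interfacial energy, dmu: driving force).
    import math

    def cnt_barrier(gamma, dmu):
        """Critical work of formation of a spherical nucleus."""
        return 16.0 * math.pi * gamma ** 3 / (3.0 * dmu ** 2)

    def cnt_rate(J0, gamma, dmu, kT):
        return J0 * math.exp(-cnt_barrier(gamma, dmu) / kT)

    # Toy numbers in reduced units:
    print(cnt_rate(J0=1.0, gamma=0.1, dmu=0.5, kT=0.05))
    ```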

  9. Classical nucleation theory in the phase-field crystal model.

    PubMed

    Jreidini, Paul; Kocher, Gabriel; Provatas, Nikolas

    2018-04-01

    A full understanding of polycrystalline materials requires studying the process of nucleation, a thermally activated phase transition that typically occurs at atomistic scales. The numerical modeling of this process is problematic for traditional numerical techniques: commonly used phase-field methods' resolution does not extend to the atomic scales at which nucleation takes place, while atomistic methods such as molecular dynamics are incapable of scaling to the mesoscale regime where late-stage growth and structure formation take place following earlier nucleation. Consequently, it is of interest to examine nucleation in the more recently proposed phase-field crystal (PFC) model, which attempts to bridge the atomic and mesoscale regimes in microstructure simulations. In this work, we numerically calculate homogeneous liquid-to-solid nucleation rates and incubation times in the simplest version of the PFC model, for various parameter choices. We show that the model naturally exhibits qualitative agreement with the predictions of classical nucleation theory (CNT) despite a lack of some explicit atomistic features presumed in CNT. We also examine the early appearance of lattice structure in nucleating grains, finding disagreement with some basic assumptions of CNT. We then argue that a quantitatively correct nucleation theory for the PFC model would require extending CNT to a multivariable theory.

  10. Teaching quantitative biology: goals, assessments, and resources

    PubMed Central

    Aikens, Melissa L.; Dolan, Erin L.

    2014-01-01

    More than a decade has passed since the publication of BIO2010, calling for an increased emphasis on quantitative skills in the undergraduate biology curriculum. In that time, relatively few papers have been published that describe educational innovations in quantitative biology or provide evidence of their effects on students. Using a “backward design” framework, we lay out quantitative skill and attitude goals, assessment strategies, and teaching resources to help biologists teach more quantitatively. Collaborations between quantitative biologists and education researchers are necessary to develop a broader and more appropriate suite of assessment tools, and to provide much-needed evidence on how particular teaching strategies affect biology students' quantitative skill development and attitudes toward quantitative work. PMID:25368425

  11. Quantitative analysis of spatial variability of geotechnical parameters

    NASA Astrophysics Data System (ADS)

    Fang, Xing

    2018-04-01

    Geotechnical parameters are the basic parameters of geotechnical engineering design, and they have strong regional characteristics. The spatial variability of geotechnical parameters is now widely recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on the statistical theory of geostatistical spatial information, the spatial variability of geotechnical parameters is quantitatively analyzed, and the correlation coefficients between geotechnical parameters are calculated. A residential district surveyed by the Tianjin Survey Institute was selected as the research object; the area contains 68 boreholes and 9 mechanically stratified layers. The parameters are water content, natural unit weight, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compression modulus, internal friction angle, cohesion, and the SP index. The correlation coefficients of the geotechnical parameters are calculated according to the principle of statistical correlation, and from these coefficients the regularities of the geotechnical parameters are obtained.
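
    The correlation step described above is a routine pairwise computation. A minimal sketch, with illustrative column names and synthetic values standing in for the borehole data (only the borehole count mirrors the record):

    ```python
    # Hedged sketch: correlation-coefficient matrix among borehole parameters.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2)
    n = 68  # one row per borehole, mirroring the 68 boreholes in the record
    df = pd.DataFrame({
        "water_content": rng.normal(25, 3, n),
        "void_ratio": rng.normal(0.75, 0.08, n),
        "liquid_limit": rng.normal(35, 4, n),
        "compression_modulus": rng.normal(6, 1, n),
    })
    print(df.corr(method="pearson").round(2))  # pairwise Pearson coefficients
    ```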

  12. Teaching Theory X and Theory Y in Organizational Communication

    ERIC Educational Resources Information Center

    Noland, Carey

    2014-01-01

    The purpose of the activity described here is to integrate McGregor's Theory X and Theory Y into a group application: design a syllabus that embodies either Theory X or Theory Y tenets. Students should be able to differentiate between Theory X and Theory Y, create a syllabus based on Theory X or Theory Y tenets, evaluate the different syllabi…

  13. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    PubMed

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.

  14. Effective theories of universal theories

    DOE PAGES

    Wells, James D.; Zhang, Zhengkang

    2016-01-20

    It is well-known but sometimes overlooked that constraints on the oblique parameters (most notably the S and T parameters) are, generally speaking, only applicable to a special class of new physics scenarios known as universal theories. The oblique parameters should not be associated with Wilson coefficients in a particular operator basis in the effective field theory (EFT) framework, unless restrictions have been imposed on the EFT so that it describes universal theories. Here, we work out these restrictions and present a detailed EFT analysis of universal theories. We find that at the dimension-6 level, universal theories are completely characterized by 16 parameters. They are conveniently chosen to be: 5 oblique parameters that agree with the commonly adopted ones, 4 anomalous triple-gauge couplings, 3 rescaling factors for the h^3, hff, and hVV vertices, 3 parameters for hVV vertices absent in the Standard Model, and 1 four-fermion coupling of order y_f^2. Furthermore, all these parameters are defined in an unambiguous and basis-independent way, allowing for consistent constraints on the universal theories parameter space from precision electroweak and Higgs data.

  15. Effective theories of universal theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wells, James D.; Zhang, Zhengkang

    It is well-known but sometimes overlooked that constraints on the oblique parameters (most notably the S and T parameters) are, generally speaking, only applicable to a special class of new physics scenarios known as universal theories. The oblique parameters should not be associated with Wilson coefficients in a particular operator basis in the effective field theory (EFT) framework, unless restrictions have been imposed on the EFT so that it describes universal theories. Here, we work out these restrictions and present a detailed EFT analysis of universal theories. We find that at the dimension-6 level, universal theories are completely characterized by 16 parameters. They are conveniently chosen to be: 5 oblique parameters that agree with the commonly adopted ones, 4 anomalous triple-gauge couplings, 3 rescaling factors for the h^3, hff, and hVV vertices, 3 parameters for hVV vertices absent in the Standard Model, and 1 four-fermion coupling of order y_f^2. Furthermore, all these parameters are defined in an unambiguous and basis-independent way, allowing for consistent constraints on the universal theories parameter space from precision electroweak and Higgs data.

  16. Theories of Career Development. A Comparison of the Theories.

    ERIC Educational Resources Information Center

    Osipow, Samuel H.

    These seven theories of career development are examined in previous chapters: (1) Roe's personality theory, (2) Holland's career typology theory, (3) the Ginzberg, Ginsburg, Axelrod, and Herma Theory, (4) psychoanalytic conceptions, (5) Super's developmental self-concept theory, (6) other personality theories, and (7) social systems theories.…

  17. Quantum Theory of Rare-Earth Magnets

    NASA Astrophysics Data System (ADS)

    Miyake, Takashi; Akai, Hisazumi

    2018-04-01

    Strong permanent magnets mainly consist of rare earths (R) and transition metals (T). The main phase of the neodymium magnet, which is the strongest magnet, is Nd2Fe14B. Sm2Fe17N3 is another magnet compound having excellent magnetic properties comparable to those of Nd2Fe14B. Their large saturation magnetization, strong magnetocrystalline anisotropy, and high Curie temperature originate from the interaction between the T-3d electrons and R-4f electrons. This article discusses the magnetism of rare-earth magnet compounds. The basic theory and first-principles calculation approaches for quantitative description of the magnetic properties are presented, together with applications to typical compounds such as Nd2Fe14B, Sm2Fe17N3, and the recently synthesized NdFe12N.

  18. What Can We Learn from Hadronic and Radiative Decays of Light Mesons?

    NASA Astrophysics Data System (ADS)

    Kubis, Bastian

    2013-04-01

    Chiral perturbation theory offers a powerful tool for the investigation of light pseudoscalar mesons. It incorporates the fundamental symmetries of QCD, interrelates various processes, and allows one to link these to the light quark masses. Its shortcomings lie in a limited energy range: the radius of convergence of the chiral expansion is confined to below resonance scales. Furthermore, the strongest consequences of chiral symmetry are manifest only for the pseudoscalars (pions, kaons, eta): vector mesons, e.g., have a severe impact, in particular for reactions involving photons. In this talk, I advocate dispersion relations as another model-independent tool to extend the applicability range of chiral perturbation theory. They even make it possible to tackle the physics of vector mesons in a rigorous way. It will be shown how dispersive methods can be used to resum large rescattering effects and to provide model-independent links between hadronic and radiative decay modes. Examples to be discussed will include decays of the eta meson, giving access to light-quark-mass ratios or allowing tests of the chiral anomaly, and meson transition form factors, which have an important impact on the hadronic light-by-light-scattering contribution to the anomalous magnetic moment of the muon.

  19. Laboratory evolution of the migratory polymorphism in the sand cricket: combining physiology with quantitative genetics.

    PubMed

    Roff, Derek A; Fairbairn, Daphne J

    2007-01-01

    Predicting evolutionary change is the central goal of evolutionary biology because it is the primary means by which we can test evolutionary hypotheses. In this article, we analyze the pattern of evolutionary change in a laboratory population of the wing-dimorphic sand cricket Gryllus firmus resulting from relaxation of selection favoring the migratory (long-winged) morph. Based on a well-characterized trade-off between fecundity and flight capability, we predict that evolution in the laboratory environment should result in a reduction in the proportion of long-winged morphs. We also predict increased fecundity and reduced functionality and weight of the major flight muscles in long-winged females but little change in short-winged (flightless) females. Based on quantitative genetic theory, we predict that the regression equation describing the trade-off between ovary weight and weight of the major flight muscles will show a change in its intercept but not in its slope. Comparisons across generations verify all of these predictions. Further, using values of genetic parameters estimated from previous studies, we show that a quantitative genetic simulation model can account for not only the qualitative changes but also the evolutionary trajectory. These results demonstrate the power of combining quantitative genetic and physiological approaches for understanding the evolution of complex traits.

  20. Nuclear medicine and imaging research (Instrumentation and quantitative methods of evaluation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1989-09-01

    This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility.

  1. A Future of Communication Theory: Systems Theory.

    ERIC Educational Resources Information Center

    Lindsey, Georg N.

    Concepts of general systems theory, cybernetics and the like may provide the methodology for communication theory to move from a level of technology to a level of pure science. It was the purpose of this paper to (1) demonstrate the necessity of applying systems theory to the construction of communication theory, (2) review relevant systems…

  2. Quantitative in vivo receptor binding. I. Theory and application to the muscarinic cholinergic receptor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frey, K.A.; Ehrenkaufer, R.L.; Beaucage, S.

    1985-02-01

    A novel approach to in vivo receptor binding experiments is presented which allows direct quantitation of binding site densities. The method is based on an equilibrium model of tracer uptake and is designed to produce a static distribution proportional to receptor density and to minimize possible confounding influences of regional blood flow, blood-brain barrier permeability, and nonspecific binding. This technique was applied to the measurement of regional muscarinic cholinergic receptor densities in rat brain using (³H)scopolamine. Specific in vivo binding of scopolamine demonstrated saturability, a pharmacologic profile, and regional densities which are consistent with interaction of the tracer with the muscarinic receptor. Estimates of receptor density obtained with the in vivo method and in vitro measurements in homogenates were highly correlated. Furthermore, reduction in striatal muscarinic receptors following ibotenic acid lesions resulted in a significant decrease in tracer uptake in vivo, indicating that the correlation between scopolamine distribution and receptor density may be used to demonstrate pathologic conditions. We propose that the general method presented here is directly applicable to investigation of high affinity binding sites for a variety of radioligands.

  3. Quantitation of Fine Displacement in Echography

    NASA Astrophysics Data System (ADS)

    Masuda, Kohji; Ishihara, Ken; Yoshii, Ken; Furukawa, Toshiyuki; Kumagai, Sadatoshi; Maeda, Hajime; Kodama, Shinzo

    1993-05-01

    A high-speed digital subtraction echography system was developed to visualize the fine displacement of human internal organs. The method indicates differences in position through time-series images of high-frame-rate echography, so that fine displacement of less than the ultrasonic wavelength can be observed. The method, however, lacked the ability to measure displacement length quantitatively: the subtraction between two successive images was affected by the displacement direction even when the displacement length was the same. To solve this problem, convolution of the echogram with a Gaussian distribution was used. To express displacement length quantitatively as brightness, normalization using the brightness gradient was applied. The quantitation algorithm was applied to successive B-mode images. Compared with the simply subtracted images, the quantitated images express the motion of organs more precisely. Expansion of the carotid artery and fine motion of the ventricular walls can be visualized more easily, and displacement length can be quantitated relative to the wavelength. Under more static conditions, the system quantitates displacement lengths much smaller than the wavelength.
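
    A minimal sketch of the processing chain described above: subtract successive frames, smooth with a Gaussian, and normalize the difference by the local brightness gradient so that brightness expresses displacement length. The kernel width, the first-order normalization, and the toy frames are illustrative assumptions; the record gives the idea, not the exact parameters.

    ```python
    # Hedged sketch: gradient-normalized frame subtraction for displacement.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def displacement_map(frame_a, frame_b, sigma=2.0, eps=1e-6):
        a = gaussian_filter(frame_a.astype(float), sigma)
        b = gaussian_filter(frame_b.astype(float), sigma)
        diff = b - a                              # inter-frame brightness change
        gy, gx = np.gradient(a)
        grad_mag = np.hypot(gx, gy)
        # For small shifts, |diff| ~ |gradient| * |displacement| to first order,
        # so dividing by the gradient magnitude estimates displacement length.
        return np.abs(diff) / (grad_mag + eps)

    rng = np.random.default_rng(3)
    f0 = gaussian_filter(rng.random((128, 128)), 3.0)
    f1 = np.roll(f0, 1, axis=1)                   # toy 1-pixel lateral shift
    print("median estimated displacement:", np.median(displacement_map(f0, f1)))
    ```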

  4. Quantitative Evidence for Lanthanide-Oxygen Orbital Mixing in CeO2, PrO2, and TbO2.

    PubMed

    Minasian, Stefan G; Batista, Enrique R; Booth, Corwin H; Clark, David L; Keith, Jason M; Kozimor, Stosh A; Lukens, Wayne W; Martin, Richard L; Shuh, David K; Stieber, S Chantal E; Tylisczcak, Tolek; Wen, Xiao-Dong

    2017-12-13

    Understanding the nature of covalent (band-like) vs ionic (atomic-like) electrons in metal oxides continues to be at the forefront of research in the physical sciences. In particular, the development of a coherent and quantitative model of bonding and electronic structure for the lanthanide dioxides, LnO2 (Ln = Ce, Pr, and Tb), has remained a considerable challenge for both experiment and theory. Herein, relative changes in mixing between the O 2p orbitals and the Ln 4f and 5d orbitals in LnO2 are evaluated quantitatively using O K-edge X-ray absorption spectroscopy (XAS) obtained with a scanning transmission X-ray microscope and density functional theory (DFT) calculations. For each LnO2, the results reveal significant amounts of Ln 5d and O 2p mixing in the orbitals of t2g (σ-bonding) and eg (π-bonding) symmetry. The remarkable agreement between experiment and theory also shows that significant mixing with the O 2p orbitals occurs in a band derived from the 4f orbitals of a2u symmetry (σ-bonding) for each compound. However, a large increase in orbital mixing is observed for PrO2 that is ascribed to a unique interaction derived from the 4f orbitals of t1u symmetry (σ- and π-bonding). O K-edge XAS and DFT results are compared with complementary L3-edge and M5,4-edge XAS measurements and configuration interaction calculations, which shows that each spectroscopic approach provides evidence for ground-state O 2p and Ln 4f orbital mixing despite inducing very different core-hole potentials in the final state.

  5. Foundations for a theory of gravitation theories

    NASA Technical Reports Server (NTRS)

    Thorne, K. S.; Lee, D. L.; Lightman, A. P.

    1972-01-01

    A foundation is laid for future analyses of gravitation theories. This foundation is applicable to any theory formulated in terms of geometric objects defined on a 4-dimensional spacetime manifold. The foundation consists of (1) a glossary of fundamental concepts; (2) a theorem that delineates the overlap between Lagrangian-based theories and metric theories; (3) a conjecture (due to Schiff) that the Weak Equivalence Principle implies the Einstein Equivalence Principle; and (4) a plausibility argument supporting this conjecture for the special case of relativistic, Lagrangian-based theories.

  6. The Application of the Theory of Planned Behaviour to Diet in Carers of People with an Intellectual Disability

    ERIC Educational Resources Information Center

    Jenkins, Catherine M.; McKenzie, Karen

    2011-01-01

    Background: The utility of the theory of planned behaviour (TPB) in predicting the intentions of care staff to encourage healthy eating behaviour in those they supported was examined. Method: A quantitative, within-participant, questionnaire based design was used with 112 carers to assess the performance of two TPB models. The first contained the…

  7. Situation-specific theories from the middle-range transitions theory.

    PubMed

    Im, Eun-Ok

    2014-01-01

    The purpose of this article was to analyze the theory development process of the situation-specific theories that were derived from the middle-range transitions theory. This analysis aims to provide directions for future development of situation-specific theories. First, transitions theory is concisely described with its history, goal, and major concepts. Then, the approach that was used to retrieve the situation-specific theories derived from transitions theory is described. Next, an analysis of 6 situation-specific theories is presented. Finally, 4 themes reflecting commonalities and variances in the theory development process are discussed with implications for future theoretical development.

  8. Applications of Microfluidics in Quantitative Biology.

    PubMed

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  9. Numerical simulation of damage and progressive failures in composite laminates using the layerwise plate theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reddy, Y.S.

    1992-01-01

    The failure behavior of composite laminates is modeled numerically using the Generalized Layerwise Plate Theory (GLPT) of Reddy and a progressive failure algorithm. The Layerwise Theory of Reddy assumes a piecewise continuous displacement field through the thickness of the laminate and therefore has the ability to capture the interlaminar stress fields near free edges and cutouts more accurately. The progressive failure algorithm is based on the assumption that the material behaves like a stable progressively fracturing solid. A three-dimensional stiffness reduction scheme is developed and implemented to study progressive failures in composite laminates. The effect of various parameters such as out-of-plane material properties, boundary conditions, and stiffness reduction methods on the failure stresses and strains of a quasi-isotropic composite laminate with free edges subjected to tensile loading is studied. The ultimate stresses and strains predicted by the Generalized Layerwise Plate Theory (GLPT) and the more widely used First Order Shear Deformation Theory (FSDT) are compared with experimental results. The predictions of the GLPT are found to be in good agreement with the experimental results both qualitatively and quantitatively, while the predictions of FSDT differ from the experimental results both qualitatively and quantitatively. The predictive ability of various phenomenological failure criteria is evaluated with reference to the experimental results available in the literature. The effect of the geometry of the test specimen and the displacement boundary conditions at the grips on the ultimate stresses and strains of a composite laminate under compressive loading is studied. The ultimate stresses and strains are found to be quite sensitive to the geometry of the test specimen and the displacement boundary conditions at the grips. The degree of sensitivity is observed to depend strongly on the lamination sequence.

  10. A quantitative approach to evaluating caring in nursing simulation.

    PubMed

    Eggenberger, Terry L; Keller, Kathryn B; Chase, Susan K; Payne, Linda

    2012-01-01

    This study was designed to test a quantitative method of measuring caring in the simulated environment. Since competency in caring is central to nursing practice, ways of including caring concepts in designing scenarios and in evaluation of performance need to be developed. Coates' Caring Efficacy scales were adapted for simulation and named the Caring Efficacy Scale-Simulation Student Version (CES-SSV) and Caring Efficacy Scale-Simulation Faculty Version (CES-SFV). A correlational study was designed to compare student self-ratings with faculty ratings on caring efficacy during an adult acute simulation experience with traditional and accelerated baccalaureate students in a nursing program grounded in caring theory. Student self-ratings were significantly correlated with objective ratings (r = 0.345, 0.356). Both the CES-SSV and the CES-SFV were found to have excellent internal consistency and significantly correlated interrater reliability. They were useful in measuring caring in the simulated learning environment.

  11. The Price Equation, Gradient Dynamics, and Continuous Trait Game Theory.

    PubMed

    Lehtonen, Jussi

    2018-01-01

    A recent article convincingly nominated the Price equation as the fundamental theorem of evolution and used it as a foundation to derive several other theorems. A major section of evolutionary theory that was not addressed is that of game theory and gradient dynamics of continuous traits with frequency-dependent fitness. Deriving fundamental results in these fields under the unifying framework of the Price equation illuminates similarities and differences between approaches and allows a simple, unified view of game-theoretical and dynamic concepts. Using Taylor polynomials and the Price equation, I derive a dynamic measure of evolutionary change, a condition for singular points, the convergence stability criterion, and an alternative interpretation of evolutionary stability. Furthermore, by applying the Price equation to a multivariable Taylor polynomial, the direct fitness approach to kin selection emerges. Finally, I compare these results to the mean gradient equation of quantitative genetics and the canonical equation of adaptive dynamics.
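
    For reference, the equation nominated as fundamental above is reproduced below in its standard two-term form; this is the textbook statement of the Price equation, not a formula quoted from the article itself.

    ```latex
    % The Price equation: change in the mean trait value \bar{z} across one
    % generation, split into selection and transmission terms.
    \Delta \bar{z}
      = \underbrace{\frac{\operatorname{Cov}(w_i, z_i)}{\bar{w}}}_{\text{selection}}
      + \underbrace{\frac{\operatorname{E}\left(w_i \,\Delta z_i\right)}{\bar{w}}}_{\text{transmission}}
    ```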

  12. Acoustic Interaction Forces and Torques Acting on Suspended Spheres in an Ideal Fluid.

    PubMed

    Lopes, J Henrique; Azarpeyvand, Mahdi; Silva, Glauber T

    2016-01-01

    In this paper, the acoustic interaction forces and torques exerted by an arbitrary time-harmonic wave on a set of N objects suspended in an inviscid fluid are theoretically analyzed. We utilize the partial-wave expansion method with the translational addition theorem and re-expansion of multipole series to solve the related multiple-scattering problem. We show that the acoustic interaction force and torque can be obtained using the far-field radiation force and torque formulas. To exemplify the method, we calculate the interaction forces exerted by an external traveling and standing plane wave on an arrangement of two and three olive-oil droplets in water. The droplets' radii are comparable to the wavelength (i.e., the Mie scattering regime). The results show that the acoustic interaction forces present an oscillatory spatial distribution which follows the pattern formed by interference between the external and rescattered waves. In addition, acoustic interaction torques arise on the absorbing droplets whenever a nonsymmetric wavefront is formed by the interference of the external and rescattered waves.

  13. Forward Λ production and nuclear stopping power in d+Au collisions at √sNN = 200 GeV

    NASA Astrophysics Data System (ADS)

    Abelev, B. I.; Aggarwal, M. M.; Ahammed, Z.; Anderson, B. D.; Arkhipkin, D.; Averichev, G. S.; Bai, Y.; Balewski, J.; Barannikova, O.; Barnby, L. S.; Baudot, J.; Baumgart, S.; Belaga, V. V.; Bellingeri-Laurikainen, A.; Bellwied, R.; Benedosso, F.; Betts, R. R.; Bhardwaj, S.; Bhasin, A.; Bhati, A. K.; Bichsel, H.; Bielcik, J.; Bielcikova, J.; Bland, L. C.; Blyth, S.-L.; Bombara, M.; Bonner, B. E.; Botje, M.; Bouchet, J.; Brandin, A. V.; Bravar, A.; Burton, T. P.; Bystersky, M.; Cai, X. Z.; Caines, H.; Sánchez, M. Calderón De La Barca; Callner, J.; Catu, O.; Cebra, D.; Cervantes, M. C.; Chajecki, Z.; Chaloupka, P.; Chattopadhyay, S.; Chen, H. F.; Chen, J. H.; Chen, J. Y.; Cheng, J.; Cherney, M.; Chikanian, A.; Christie, W.; Chung, S. U.; Clarke, R. F.; Codrington, M. J. M.; Coffin, J. P.; Cormier, T. M.; Cosentino, M. R.; Cramer, J. G.; Crawford, H. J.; Das, D.; Dash, S.; Daugherity, M.; Moura, M. M. De; Dedovich, T. G.; Dephillips, M.; Derevschikov, A. A.; Didenko, L.; Dietel, T.; Djawotho, P.; Dogra, S. M.; Dong, X.; Drachenberg, J. L.; Draper, J. E.; Du, F.; Dunin, V. B.; Dunlop, J. C.; Mazumdar, M. R. Dutta; Eckardt, V.; Edwards, W. R.; Efimov, L. G.; Emelianov, V.; Engelage, J.; Eppley, G.; Erazmus, B.; Estienne, M.; Fachini, P.; Fatemi, R.; Fedorisin, J.; Feng, A.; Filip, P.; Finch, E.; Fine, V.; Fisyak, Y.; Fu, J.; Gagliardi, C. A.; Gaillard, L.; Ganti, M. S.; Garcia-Solis, E.; Ghazikhanian, V.; Ghosh, P.; Gorbunov, Y. N.; Gos, H.; Grebenyuk, O.; Grosnick, D.; Grube, B.; Guertin, S. M.; Guimaraes, K. S. F. F.; Gupta, N.; Haag, B.; Hallman, T. J.; Hamed, A.; Harris, J. W.; He, W.; Heinz, M.; Henry, T. W.; Heppelmann, S.; Hippolyte, B.; Hirsch, A.; Hjort, E.; Hoffman, A. M.; Hoffmann, G. W.; Hofman, D. J.; Hollis, R. S.; Horner, M. J.; Huang, H. Z.; Hughes, E. W.; Humanic, T. J.; Igo, G.; Iordanova, A.; Jacobs, P.; Jacobs, W. W.; Jakl, P.; Jia, F.; Jones, P. G.; Judd, E. G.; Kabana, S.; Kang, K.; Kapitan, J.; Kaplan, M.; Keane, D.; Kechechyan, A.; Kettler, D.; Khodyrev, V. Yu.; Kiryluk, J.; Kisiel, A.; Kislov, E. M.; Klein, S. R.; Knospe, A. G.; Kocoloski, A.; Koetke, D. D.; Kollegger, T.; Kopytine, M.; Kotchenda, L.; Kouchpil, V.; Kowalik, K. L.; Kravtsov, P.; Kravtsov, V. I.; Krueger, K.; Kuhn, C.; Kulikov, A. I.; Kumar, A.; Kurnadi, P.; Kuznetsov, A. A.; Lamont, M. A. C.; Landgraf, J. M.; Lange, S.; Lapointe, S.; Laue, F.; Lauret, J.; Lebedev, A.; Lednicky, R.; Lee, C.-H.; Lehocka, S.; Levine, M. J.; Li, C.; Li, Q.; Li, Y.; Lin, G.; Lin, X.; Lindenbaum, S. J.; Lisa, M. A.; Liu, F.; Liu, H.; Liu, J.; Liu, L.; Ljubicic, T.; Llope, W. J.; Longacre, R. S.; Love, W. A.; Lu, Y.; Ludlam, T.; Lynn, D.; Ma, G. L.; Ma, J. G.; Ma, Y. G.; Mahapatra, D. P.; Majka, R.; Mangotra, L. K.; Manweiler, R.; Margetis, S.; Markert, C.; Martin, L.; Matis, H. S.; Matulenko, Yu. A.; McClain, C. J.; McShane, T. S.; Melnick, Yu.; Meschanin, A.; Millane, J.; Miller, M. L.; Minaev, N. G.; Mioduszewski, S.; Mischke, A.; Mitchell, J.; Mohanty, B.; Morozov, D. A.; Munhoz, M. G.; Nandi, B. K.; Nattrass, C.; Nayak, T. K.; Nelson, J. M.; Nepali, C.; Netrakanti, P. K.; Nogach, L. V.; Nurushev, S. B.; Odyniec, G.; Ogawa, A.; Okorokov, V.; Oldenburg, M.; Olson, D.; Pachr, M.; Pal, S. K.; Panebratsev, Y.; Pavlinov, A. I.; Pawlak, T.; Peitzmann, T.; Perevoztchikov, V.; Perkins, C.; Peryt, W.; Phatak, S. C.; Planinic, M.; Pluta, J.; Poljak, N.; Porile, N.; Poskanzer, A. M.; Potekhin, M.; Potrebenikova, E.; Potukuchi, B. V. K. S.; Prindle, D.; Pruneau, C.; Pruthi, N. K.; Putschke, J.; Qattan, I. 
A.; Raniwala, R.; Raniwala, S.; Ray, R. L.; Relyea, D.; Ridiger, A.; Ritter, H. G.; Roberts, J. B.; Rogachevskiy, O. V.; Romero, J. L.; Rose, A.; Roy, C.; Ruan, L.; Russcher, M. J.; Sahoo, R.; Sakrejda, I.; Sakuma, T.; Salur, S.; Sandweiss, J.; Sarsour, M.; Sazhin, P. S.; Schambach, J.; Scharenberg, R. P.; Schmitz, N.; Seger, J.; Selyuzhenkov, I.; Seyboth, P.; Shabetai, A.; Shahaliev, E.; Shao, M.; Sharma, M.; Shen, W. Q.; Shimanskiy, S. S.; Sichtermann, E. P.; Simon, F.; Singaraju, R. N.; Smirnov, N.; Snellings, R.; Sorensen, P.; Sowinski, J.; Speltz, J.; Spinka, H. M.; Srivastava, B.; Stadnik, A.; Stanislaus, T. D. S.; Staszak, D.; Stock, R.; Strikhanov, M.; Stringfellow, B.; Suaide, A. A. P.; Suarez, M. C.; Subba, N. L.; Sumbera, M.; Sun, X. M.; Sun, Z.; Surrow, B.; Symons, T. J. M.; Toledo, A. Szanto De; Takahashi, J.; Tang, A. H.; Tarnowsky, T.; Thomas, J. H.; Timmins, A. R.; Timoshenko, S.; Tokarev, M.; Trainor, T. A.; Trentalange, S.; Tribble, R. E.; Tsai, O. D.; Ulery, J.; Ullrich, T.; Underwood, D. G.; Buren, G. Van; Kolk, N. Van Der; Leeuwen, M. Van; Molen, A. M. Vander; Varma, R.; Vasilevski, I. M.; Vasiliev, A. N.; Vernet, R.; Vigdor, S. E.; Viyogi, Y. P.; Vokal, S.; Voloshin, S. A.; Wada, M.; Waggoner, W. T.; Wang, F.; Wang, G.; Wang, J. S.; Wang, X. L.; Wang, Y.; Webb, J. C.; Westfall, G. D.; , C. Whitten, Jr.; Wieman, H.; Wissink, S. W.; Witt, R.; Wu, J.; Wu, Y.; Xu, N.; Xu, Q. H.; Xu, Z.; Yepes, P.; Yoo, I.-K.; Yue, Q.; Yurevich, V. I.; Zawisza, M.; Zhan, W.; Zhang, H.; Zhang, W. M.; Zhang, Y.; Zhang, Z. P.; Zhao, Y.; Zhong, C.; Zhou, J.; Zoulkarneev, R.; Zoulkarneeva, Y.; Zubarev, A. N.; Zuo, J. X.

    2007-12-01

    We report the measurement of Λ and Λ̄ yields and inverse slope parameters in d+Au collisions at √sNN = 200 GeV at forward and backward rapidities (y = ±2.75), using data from the STAR forward time projection chambers. The contributions of different processes to baryon transport and particle production are probed exploiting the inherent asymmetry of the d+Au system. Comparisons to model calculations show that baryon transport on the deuteron side is consistent with multiple collisions of the deuteron nucleons with gold participants. On the gold side, HIJING-based models without a hadronic rescattering phase do not describe the measured particle yields, while models that include target remnants or hadronic rescattering do. The multichain model can provide a good description of the net baryon density in d+Au collisions at energies currently available at the BNL Relativistic Heavy Ion Collider, and the derived parameters of the model agree with those from nuclear collisions at lower energies.

  14. Time-dependence of graph theory metrics in functional connectivity analysis.

    PubMed

    Chiang, Sharon; Cassese, Alberto; Guindani, Michele; Vannucci, Marina; Yeh, Hsiang J; Haneef, Zulfi; Stern, John M

    2016-01-15

    Brain graphs provide a useful way to computationally model the network structure of the connectome, and this has led to increasing interest in the use of graph theory to quantitate and investigate the topological characteristics of the healthy brain and brain disorders on the network level. The majority of graph theory investigations of functional connectivity have relied on the assumption of temporal stationarity. However, recent evidence increasingly suggests that functional connectivity fluctuates over the length of the scan. In this study, we investigate the stationarity of brain network topology using a Bayesian hidden Markov model (HMM) approach that estimates the dynamic structure of graph theoretical measures of whole-brain functional connectivity. In addition to extracting the stationary distribution and transition probabilities of commonly employed graph theory measures, we propose two estimators of temporal stationarity: the S-index and N-index. These indexes can be used to quantify different aspects of the temporal stationarity of graph theory measures. We apply the method and proposed estimators to resting-state functional MRI data from healthy controls and patients with temporal lobe epilepsy. Our analysis shows that several graph theory measures, including small-world index, global integration measures, and betweenness centrality, may exhibit greater stationarity over time and therefore be more robust. Additionally, we demonstrate that accounting for subject-level differences in the level of temporal stationarity of network topology may increase power in discriminating between disease states. Our results confirm and extend findings from other studies regarding the dynamic nature of functional connectivity, and suggest that using statistical models which explicitly account for the dynamic nature of functional connectivity in graph theory analyses may improve the sensitivity of investigations and consistency across investigations.
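
    A minimal sketch of the first stage of such a pipeline: build a time series of graph-theory metrics from sliding-window functional connectivity. The Bayesian HMM and the S-/N-indexes proposed in the paper are defined in the article itself and are not reproduced here; the window length, threshold, and toy data below are illustrative assumptions.

    ```python
    # Hedged sketch: sliding-window connectivity -> graph-metric time series.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(4)
    ts = rng.standard_normal((600, 10))           # toy: 600 time points, 10 regions
    win, step, thresh = 60, 10, 0.3

    metrics = []
    for start in range(0, ts.shape[0] - win + 1, step):
        c = np.corrcoef(ts[start:start + win].T)  # windowed connectivity matrix
        np.fill_diagonal(c, 0.0)
        adj = (np.abs(c) > thresh).astype(int)    # binarised brain graph
        g = nx.from_numpy_array(adj)
        bc = nx.betweenness_centrality(g)
        metrics.append(np.mean(list(bc.values())))  # one metric value per window

    # A stationarity analysis (e.g., an HMM over states) would operate on this
    # series; here we just report its raw variability.
    print("betweenness-centrality series variance:", np.var(metrics))
    ```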

  15. Time-dependence of graph theory metrics in functional connectivity analysis

    PubMed Central

    Chiang, Sharon; Cassese, Alberto; Guindani, Michele; Vannucci, Marina; Yeh, Hsiang J.; Haneef, Zulfi; Stern, John M.

    2016-01-01

    Brain graphs provide a useful way to computationally model the network structure of the connectome, and this has led to increasing interest in the use of graph theory to quantitate and investigate the topological characteristics of the healthy brain and brain disorders on the network level. The majority of graph theory investigations of functional connectivity have relied on the assumption of temporal stationarity. However, recent evidence increasingly suggests that functional connectivity fluctuates over the length of the scan. In this study, we investigate the stationarity of brain network topology using a Bayesian hidden Markov model (HMM) approach that estimates the dynamic structure of graph theoretical measures of whole-brain functional connectivity. In addition to extracting the stationary distribution and transition probabilities of commonly employed graph theory measures, we propose two estimators of temporal stationarity: the S-index and N-index. These indexes can be used to quantify different aspects of the temporal stationarity of graph theory measures. We apply the method and proposed estimators to resting-state functional MRI data from healthy controls and patients with temporal lobe epilepsy. Our analysis shows that several graph theory measures, including small-world index, global integration measures, and betweenness centrality, may exhibit greater stationarity over time and therefore be more robust. Additionally, we demonstrate that accounting for subject-level differences in the level of temporal stationarity of network topology may increase discriminatory power in discriminating between disease states. Our results confirm and extend findings from other studies regarding the dynamic nature of functional connectivity, and suggest that using statistical models which explicitly account for the dynamic nature of functional connectivity in graph theory analyses may improve the sensitivity of investigations and consistency across investigations. PMID

  16. Many-body theory of electrical, thermal and optical response of molecular heterojunctions

    NASA Astrophysics Data System (ADS)

    Bergfield, Justin Phillip

    In this work, we develop a many-body theory of electronic transport through single molecule junctions based on nonequilibrium Green's functions (NEGFs). The central quantity of this theory is the Coulomb self-energy matrix of the junction, Σ_C. Σ_C is evaluated exactly in the sequential-tunneling limit, and the correction due to finite lead-molecule tunneling is evaluated using a conserving approximation based on diagrammatic perturbation theory on the Keldysh contour. In this way, tunneling processes are included to infinite order, meaning that any approximation utilized is a truncation in the physical processes considered rather than in the order of those processes. Our theory reproduces the key features of both the Coulomb blockade and coherent transport regimes simultaneously in a single unified theory. Nonperturbative effects of intramolecular correlations are included, which are necessary to accurately describe the highest occupied molecular orbital (HOMO)-lowest unoccupied molecular orbital (LUMO) gap, essential for a quantitative theory of transport. This work covers four major topics related to transport in single-molecule junctions. First, we use our many-body theory to calculate the nonlinear electrical response of the archetypal Au-1,4-benzenedithiol-Au junction and find irregularly shaped 'molecular diamonds' which have been experimentally observed in some larger molecules but which are inaccessible to existing theoretical approaches. Next, we extend our theory to include heat transport and develop an exact expression for the heat current in an interacting nanostructure. Using this result, we discover that quantum coherence can strongly enhance the thermoelectric response of a device, a result with a number of technological applications. We then develop the formalism to include multi-orbital lead-molecule contacts and multi-channel leads, both of which strongly affect the observable transport. Lastly, we include a dynamic screening correction to

  17. Exploring the Relationship of Organizational Culture and Implicit Leadership Theory to Performance Differences in the Nuclear and Fossil Energy Industry

    NASA Astrophysics Data System (ADS)

    Cravey, Kristopher J.

    Notable performance differences exist between nuclear and fossil power generation plants in areas such as safety, outage duration efficiency, and capacity factor. This study explored the relationship of organizational culture and implicit leadership theory to these performance differences. A mixed methods approach consisting of quantitative instruments, namely the Organizational Culture Assessment Instrument and the GLOBE Leadership Scales, and qualitative interviews were used in this study. Subjects were operations middle managers in a U.S. energy company that serves nuclear or fossil power plants. Results from the quantitative instruments revealed no differences between nuclear and fossil groups in regards to organizational culture types and implicit leadership theories. However, the qualitative results did reveal divergence between the two groups in regards to what is valued in the organization and how that drives behaviors and decision making. These organizational phenomenological differences seem to explain why performance differences exist between nuclear and fossil plants because, ultimately, they affect how the organization functions.

  18. The place of white in a world of grays: a double-anchoring theory of lightness perception.

    PubMed

    Bressan, Paola

    2006-07-01

    The specific gray shades in a visual scene can be derived from relative luminance values only when an anchoring rule is followed. The double-anchoring theory I propose in this article, as a development of the anchoring theory of Gilchrist et al. (1999), assumes that any given region (a) belongs to one or more frameworks, created by Gestalt grouping principles, and (b) is independently anchored, within each framework, to both the highest luminance and the surround luminance. The region's final lightness is a weighted average of the values computed, relative to both anchors, in all frameworks. The new model accounts not only for all lightness illusions that are qualitatively explained by the anchoring theory but also for a number of additional effects, and it does so quantitatively, with the support of mathematical simulations. (© 2006 APA, all rights reserved).
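
    A minimal sketch of the weighted-average computation described above, for a single framework. The highest-luminance anchor mapping to white (~90% reflectance) follows the anchoring literature; mapping the surround anchor to middle gray and the 60/40 weighting are illustrative assumptions of this sketch, not values from the article.

    ```python
    # Hedged sketch: double-anchored lightness within one framework.
    def lightness(region_lum, framework_luminances, surround_lum, w_high=0.6):
        highest = max(framework_luminances)
        l_high = 0.9 * region_lum / highest            # highest-luminance anchor
        l_surround = 0.5 * region_lum / surround_lum   # surround anchor (assumed mid gray)
        # Final lightness: weighted average of the two anchored estimates.
        return w_high * l_high + (1.0 - w_high) * l_surround

    # A patch of luminance 40 in a framework whose maximum is 100, surround 50:
    print(lightness(40.0, [10.0, 40.0, 100.0], 50.0))
    ```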

  19. Simple proof of the impossibility of bit commitment in generalized probabilistic theories using cone programming

    NASA Astrophysics Data System (ADS)

    Sikora, Jamie; Selby, John

    2018-04-01

    Bit commitment is a fundamental cryptographic task, in which Alice commits a bit to Bob such that she cannot later change the value of the bit, while, simultaneously, the bit is hidden from Bob. It is known that ideal bit commitment is impossible within quantum theory. In this work, we show that it is also impossible in generalized probabilistic theories (under a small set of assumptions) by presenting a quantitative trade-off between Alice's and Bob's cheating probabilities. Our proof relies crucially on a formulation of cheating strategies as cone programs, a natural generalization of semidefinite programs. In fact, using the generality of this technique, we prove that this result holds for the more general task of integer commitment.

  20. Theory of superconductivity in oxides. Final technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, P.W.

    1988-05-18

    Progress was made towards a final theory of high-Tc superconductivity. The key elements are the work on normal-state properties and the actual mechanism for Tc. With the understanding (ZA) of the large anisotropy and other transport properties in the normal state, the model is uniquely determined: one must have one version or another of a holon-spinon quantum-fluid state, which is not a normal Fermi liquid. And with the recognition (HWA) of the large-repulsion holon-holon interactions, the author has the first way of thinking quantitatively about the superconducting state. Work on the pure Heisenberg system, which is related but not necessarily crucial to understanding the superconducting properties, is also described.

  1. Sex allocation theory reveals a hidden cost of neonicotinoid exposure in a parasitoid wasp

    PubMed Central

    Whitehorn, Penelope R.; Cook, Nicola; Blackburn, Charlotte V.; Gill, Sophie M.; Green, Jade; Shuker, David M.

    2015-01-01

    Sex allocation theory has proved to be one of the most successful theories in evolutionary ecology. However, its role in more applied aspects of ecology has been limited. Here we show how sex allocation theory helps uncover an otherwise hidden cost of neonicotinoid exposure in the parasitoid wasp Nasonia vitripennis. Female N. vitripennis allocate the sex of their offspring in line with Local Mate Competition (LMC) theory. Neonicotinoids are an economically important class of insecticides, but their deployment remains controversial, with evidence linking them to the decline of beneficial species. We demonstrate, for the first time to our knowledge, that neonicotinoids disrupt the crucial reproductive behaviour of facultative sex allocation at sub-lethal, field-relevant doses in N. vitripennis. The quantitative predictions we can make from LMC theory show that females exposed to neonicotinoids are less able to allocate sex optimally and that this failure imposes a significant fitness cost. Our work highlights that understanding the ecological consequences of neonicotinoid deployment requires not just measures of mortality or even fecundity reduction among non-target species, but also measures that capture broader fitness costs, in this case offspring sex allocation. Our work also highlights new avenues for exploring how females obtain information when allocating sex under LMC. PMID:25925105
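
    The quantitative prediction referred to above builds on Hamilton's classic LMC result: with n foundresses laying eggs on a patch, the unbeatable proportion of sons is s* = (n - 1)/(2n), and a female unable to hit s* pays a fitness cost. The sketch below evaluates the formula; it states the standard theory only, not the paper's specific fitness measure.

    ```python
    # Hedged sketch: Hamilton's local-mate-competition (LMC) sex-ratio optimum.
    def lmc_optimal_son_ratio(n_foundresses):
        """Unbeatable proportion of sons under LMC: (n - 1) / (2n)."""
        return (n_foundresses - 1) / (2 * n_foundresses)

    for n in (1, 2, 5, 10):
        print(n, round(lmc_optimal_son_ratio(n), 3))
    # n = 1 gives s* = 0: produce only enough sons to mate all daughters;
    # as n grows, s* approaches the Fisherian 1/2.
    ```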

  2. Realist explanatory theory building method for social epidemiology: a protocol for a mixed method multilevel study of neighbourhood context and postnatal depression.

    PubMed

    Eastwood, John G; Jalaludin, Bin B; Kemp, Lynn A

    2014-01-01

    A recent criticism of social epidemiological studies, and multi-level studies in particular, has been a paucity of theory. We present here the protocol for a study that aims to build a theory of the social epidemiology of maternal depression. We use a critical realist approach which is trans-disciplinary, encompassing both quantitative and qualitative traditions, and that assumes both ontological and hierarchical stratification of reality. We describe a critical realist Explanatory Theory Building Method comprising: 1) an emergent phase, 2) a construction phase, and 3) a confirmatory phase. A concurrent triangulated mixed method multilevel cross-sectional study design is described. The Emergent Phase uses: interviews, focus groups, exploratory data analysis, exploratory factor analysis, regression, and multilevel Bayesian spatial data analysis to detect and describe phenomena. Abductive and retroductive reasoning will be applied to: categorical principal component analysis, exploratory factor analysis, regression, coding of concepts and categories, constant comparative analysis, drawing of conceptual networks, and situational analysis to generate theoretical concepts. The Theory Construction Phase will include: 1) defining stratified levels; 2) analytic resolution; 3) abductive reasoning; 4) comparative analysis (triangulation); 5) retroduction; 6) postulate and proposition development; 7) comparison and assessment of theories; and 8) conceptual frameworks and model development. The strength of the critical realist methodology described is the extent to which this paradigm is able to support the epistemological, ontological, axiological, methodological and rhetorical positions of both quantitative and qualitative research in the field of social epidemiology. The extensive multilevel Bayesian studies, intensive qualitative studies, latent variable theory, abductive triangulation, and Inference to Best Explanation provide a strong foundation for Theory

  3. Bogoliubov theory of acoustic Hawking radiation in Bose-Einstein condensates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Recati, A.; Physik-Department, Technische Universitaet Muenchen, D-85748 Garching; Pavloff, N.

    2009-10-15

    We apply the microscopic Bogoliubov theory of dilute Bose-Einstein condensates to analyze quantum and thermal fluctuations in a flowing atomic condensate in the presence of a sonic horizon. For the simplest case of a step-like horizon, closed-form analytical expressions are found for the spectral distribution of the analog Hawking radiation and for the density correlation function. The peculiar long-distance density correlations that appear as a consequence of the Hawking emission turn out to be reinforced by a finite initial temperature of the condensate. The analytical results are in good quantitative agreement with first-principles numerical calculations.
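
    For orientation, the dispersion relation at the heart of Bogoliubov theory is easy to evaluate. The sketch below uses the standard textbook formula omega(k) = sqrt(c^2 k^2 + (hbar k^2 / 2m)^2) with invented condensate parameters; it is not the paper's full fluctuation calculation.

        # Standard Bogoliubov dispersion for a dilute condensate; a sonic
        # horizon forms where the flow speed crosses the sound speed c.
        import numpy as np

        hbar = 1.0545718e-34          # J s
        m = 1.44e-25                  # mass of 87Rb, kg
        c = 1.0e-3                    # sound speed sqrt(g*n/m); value invented

        k = np.logspace(4, 7, 4)      # wavenumbers, 1/m
        omega = np.sqrt((c * k) ** 2 + (hbar * k ** 2 / (2 * m)) ** 2)
        for ki, wi in zip(k, omega):
            print(f"k = {ki:9.2e} 1/m  omega = {wi:9.2e} rad/s")
        # At small k, omega ~ c*k (phonon-like); at large k the free-particle
        # term hbar*k^2/(2m) dominates.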

  4. Quantitative mass spectrometry: an overview

    NASA Astrophysics Data System (ADS)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry, especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  5. Quantitative imaging methods in osteoporosis.

    PubMed

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.
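
    As a concrete example of how a DXA-derived BMD value becomes the quantitative parameter used in clinical decisions, the sketch below applies the WHO T-score convention; the reference mean and SD are placeholders, not clinical data.

        # T-score = (measured BMD - young-adult mean) / young-adult SD.
        # By WHO convention, T <= -2.5 is classified as osteoporosis.
        def t_score(bmd, young_adult_mean, young_adult_sd):
            return (bmd - young_adult_mean) / young_adult_sd

        t = t_score(bmd=0.70, young_adult_mean=0.94, young_adult_sd=0.12)
        print(f"T-score = {t:.1f}")   # -2.0, in the osteopenia range (-1 to -2.5)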

  6. Quantitative structure-activation barrier relationship modeling for Diels-Alder ligations utilizing quantum chemical structural descriptors.

    PubMed

    Nandi, Sisir; Monesi, Alessandro; Drgan, Viktor; Merzel, Franci; Novič, Marjana

    2013-10-30

    In the present study, we show the correlation of quantum chemical structural descriptors with the activation barriers of the Diels-Alder ligations. A set of 72 non-catalysed Diels-Alder reactions were subjected to quantitative structure-activation barrier relationship (QSABR) modelling under the framework of theoretical quantum chemical descriptors calculated solely from the structures of diene and dienophile reactants. Experimental activation barrier data were obtained from the literature. Descriptors were computed at the Hartree-Fock level of theory with the 6-31G(d) basis set, as implemented in the Gaussian 09 software. Variable selection and model development were carried out by stepwise multiple linear regression methodology. Predictive performance of the QSABR model was assessed using the training and test set concept and by calculating leave-one-out cross-validated Q2 and predictive R2 values. The QSABR model can explain and predict 86.5% and 80% of the variance, respectively, in the activation energy barrier data. Alternatively, a neural network model based on back propagation of errors was developed to assess the nonlinearity of the sought correlations between theoretical descriptors and experimental reaction barriers. A reasonable predictability for the activation barriers of the test set reactions was obtained, which enabled an exploration and interpretation of the significant variables responsible for the Diels-Alder interaction between dienes and dienophiles. Thus, studies in the direction of QSABR modelling that provide efficient and fast prediction of activation barriers of Diels-Alder reactions turn out to be a meaningful alternative to transition-state-theory-based computation.
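
    The leave-one-out Q2 statistic mentioned above is straightforward to compute; here is a sketch with synthetic data standing in for the quantum chemical descriptors and activation barriers.

        # Leave-one-out cross-validated Q^2 for a multiple linear regression.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        rng = np.random.default_rng(0)
        X = rng.normal(size=(72, 4))                  # 72 reactions, 4 descriptors
        y = X @ np.array([1.0, -0.5, 0.3, 0.1]) + rng.normal(scale=0.2, size=72)

        y_loo = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
        press = np.sum((y - y_loo) ** 2)              # predictive error sum of squares
        q2 = 1.0 - press / np.sum((y - y.mean()) ** 2)
        print(f"Q^2(LOO) = {q2:.3f}")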

  7. Geographical Theories.

    ERIC Educational Resources Information Center

    Golledge, Reginald G.

    1996-01-01

    Discusses the origin of theories in geography and particularly the development of location theories. Considers the influence of economic theory on agricultural land use, industrial location, and geographic location theories. Explores a set of interrelated activities that show how the marketing process illustrates process theory. (MJP)

  8. The Analyst's "Use" of Theory or Theories: The Play of Theory.

    PubMed

    Cooper, Steven H

    2017-10-01

    Two clinical vignettes demonstrate a methodological approach that guides the analyst's attention to metaphors and surfaces that are the focus of different theories. Clinically, the use of different theories expands the metaphorical language with which the analyst tries to make contact with the patient's unconscious life. Metaphorical expressions may be said to relate to each other as the syntax of unconscious fantasy (Arlow 1979). The unconscious fantasy itself represents a metaphorical construction of childhood experience that has persisted, dynamically expressive and emergent into adult life. This persistence is evident in how, in some instances, long periods of an analysis focus on translating one or a few metaphors, chiefly because the manifest metaphorical expressions of a central theme regularly lead to better understanding of an unconscious fantasy. At times employing another model or theory assists in a level of self-reflection about clinical understanding and clinical decisions. The analyst's choice of theory or theories is unique to the analyst and is not prescriptive, except as illustrating a way to think about these issues. The use of multiple models in no way suggests or implies that theories may be integrated.

  9. The use of behavioural theories in end-of-life care research: A systematic review.

    PubMed

    Scherrens, Anne-Lore; Beernaert, Kim; Robijn, Lenzo; Deliens, Luc; Pauwels, Nele S; Cohen, Joachim; Deforche, Benedicte

    2018-06-01

    It is necessary to understand behaviours that contribute to improvement in the quality of end-of-life care; use of behavioural theories allows identification of factors underlying end-of-life care behaviour, but little is known about the extent to which, and in what manner, these theories are used in an end-of-life care research context. We aimed to assess the number of end-of-life care studies that have used behavioural theories, which theories were used, to what extent main constructs were explored/measured, and which behavioural outcomes were examined. We conducted a systematic review. The protocol was registered on PROSPERO (CRD42016036009). The MEDLINE (PubMed), PsycINFO, EMBASE, Web of Science and CINAHL databases were searched from inception to June 2017. We included studies aimed at understanding or changing end-of-life care behaviours and that explicitly referred to individual behavioural theories. We screened 2231 records by title and abstract, retrieved 43 full-text articles and included 31 studies - 27 quantitative (of which four were (quasi-)randomised controlled trials) and four qualitative - for data extraction. More than half used the Theory of Planned Behaviour (9), the Theory of Reasoned Action (4) or the Transtheoretical Model (8). In 9 of 31 studies, the theory was fully used, and 16 of the 31 studies focussed on behaviours in advance care planning. In end-of-life care research, the use of behavioural theories is limited. As many behaviours can determine the quality of care, their more extensive use may be warranted if we want to better understand and influence behaviours and improve end-of-life care.

  10. Understanding quantitative research: part 1.

    PubMed

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  11. Decidability of formal theories and hyperincursivity theory

    NASA Astrophysics Data System (ADS)

    Grappone, Arturo G.

    2000-05-01

    This paper shows the limits of the Proof Standard Theory (briefly, PST) and gives some ideas of how to build a proof anticipatory theory (briefly, PAT) that has no such limits. This paper also argues that Gödel's proof of the undecidability of the formal theory of Principia Mathematica is not valid for axiomatic theories that use a PAT to build their proofs, because (hyper)incursive functions are self-representable.

  12. Quantum many-body theory for electron spin decoherence in nanoscale nuclear spin baths.

    PubMed

    Yang, Wen; Ma, Wen-Long; Liu, Ren-Bao

    2017-01-01

    Decoherence of electron spins in nanoscale systems is important to quantum technologies such as quantum information processing and magnetometry. It is also an ideal model problem for studying the crossover between quantum and classical phenomena. At low temperatures or in light-element materials where the spin-orbit coupling is weak, the phonon scattering in nanostructures is less important and the fluctuations of nuclear spins become the dominant decoherence mechanism for electron spins. Since the 1950s, semi-classical noise theories have been developed for understanding electron spin decoherence. In spin-based solid-state quantum technologies, the relevant systems are in the nanometer scale and nuclear spin baths are quantum objects which require a quantum description. Recently, quantum pictures have been established to understand the decoherence and quantum many-body theories have been developed to quantitatively describe this phenomenon. Anomalous quantum effects have been predicted and some have been experimentally confirmed. A systematically truncated cluster-correlation expansion theory has been developed to account for the many-body correlations in nanoscale nuclear spin baths that are built up during electron spin decoherence. The theory has successfully predicted and explained a number of experimental results in a wide range of physical systems. In this review, we will cover this recent progress. The limitations of the present quantum many-body theories and possible directions for future development will also be discussed.
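
    The central structural idea of the cluster-correlation expansion is that the electron coherence factorizes into contributions from bath clusters, L(t) being the product of the cluster terms. The sketch below is a deliberately crude CCE-2-style cartoon with invented pair parameters and a toy oscillatory pair-coherence form, not a physical nuclear flip-flop solution.

        # Toy cluster-correlation expansion truncated at pairs (CCE-2):
        # total coherence = product of independent pair contributions.
        import numpy as np

        rng = np.random.default_rng(3)
        n_pairs = 200
        amp = rng.uniform(0.0, 1e-3, n_pairs)     # toy pair "visibilities"
        omega = rng.uniform(0.1, 2.0, n_pairs)    # toy pair frequencies, rad/ms

        def coherence(t_ms):
            pair_terms = 1.0 - amp * np.sin(omega * t_ms / 2.0) ** 2
            return float(np.prod(pair_terms))

        for t in (0.0, 1.0, 5.0, 20.0):
            print(f"t = {t:5.1f} ms  L(t) = {coherence(t):.4f}")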

  13. Theory of Multiple Intelligences: Is It a Scientific Theory?

    ERIC Educational Resources Information Center

    Chen, Jie-Qi

    2004-01-01

    This essay discusses the status of multiple intelligences (MI) theory as a scientific theory by addressing three issues: the empirical evidence Gardner used to establish MI theory, the methodology he employed to validate MI theory, and the purpose or function of MI theory.

  14. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of ionizing radiation. Semi-quantitative and quantitative analyses of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies, and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
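
    For readers unfamiliar with the summary measures, the sketch below computes sensitivity and specificity from a single, invented 2x2 table; pooling across studies and fitting the summary ROC curve for the AUC requires a bivariate random-effects model, which is omitted here.

        # Per-study diagnostic accuracy from a 2x2 table.
        def diagnostic_accuracy(tp, fp, tn, fn):
            sensitivity = tp / (tp + fn)      # true-positive rate
            specificity = tn / (tn + fp)      # true-negative rate
            return sensitivity, specificity

        sens, spec = diagnostic_accuracy(tp=88, fp=28, tn=72, fn=12)
        print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")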

  15. Semi-quantitative assessment of pulmonary perfusion in children using dynamic contrast-enhanced MRI

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Thong, William E.; Ou, Phalla

    2013-03-01

    This paper addresses the semi-quantitative assessment of pulmonary perfusion acquired from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in a study population mainly composed of children with pulmonary malformations. The automatic analysis approach proposed is based on the indicator-dilution theory introduced in 1954. First, a robust method is developed to segment the pulmonary artery and the lungs from anatomical MRI data, exploiting 2D and 3D mathematical morphology operators. Second, the time-dependent contrast signal of the lung regions is deconvolved by the arterial input function to assess the local hemodynamic parameters, i.e., mean transit time, pulmonary blood volume and pulmonary blood flow. The discrete deconvolution method implemented here is a truncated singular value decomposition (tSVD) method. Parametric images for the entire lungs are generated as additional elements for diagnosis and quantitative follow-up. The preliminary results attest to the feasibility of perfusion quantification in pulmonary DCE-MRI and open an interesting alternative to scintigraphy for this type of evaluation, which is worth considering at least as a preliminary step in diagnosis, given the wide availability of the technique and its non-invasive character.
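
    A schematic version of the deconvolution step, under invented curves rather than real DCE-MRI data: the tissue curve is modelled as the convolution of the arterial input function (AIF) with F*R(t), a truncated SVD inverts the ill-conditioned convolution matrix, and the three hemodynamic parameters follow from the recovered F*R(t) and the central volume theorem.

        # tissue(t) = F * (AIF convolved with R)(t); invert for F*R(t), then
        # flow ~ max(F*R), relative volume from the curve areas, MTT = volume/flow.
        import numpy as np

        dt = 1.0                                    # sampling interval, s
        t = np.arange(0, 60, dt)
        aif = (t / 5.0) * np.exp(-t / 5.0)          # toy arterial input function
        r_true = np.exp(-t / 8.0)                   # true residue function (MTT ~ 8 s)
        flow = 0.05                                 # toy blood-flow scale
        tissue = flow * dt * np.convolve(aif, r_true)[: t.size]

        # Lower-triangular Toeplitz convolution matrix built from the AIF.
        n = t.size
        A = dt * np.array([[aif[i - j] if i >= j else 0.0
                            for j in range(n)] for i in range(n)])

        U, s, Vt = np.linalg.svd(A)
        s_inv = np.zeros_like(s)
        keep = s > 0.15 * s[0]                      # truncation threshold (tunable)
        s_inv[keep] = 1.0 / s[keep]
        fr = Vt.T @ (s_inv * (U.T @ tissue))        # estimate of F * R(t)

        pbf = fr.max()                              # flow ~ peak of F*R(t)
        pbv = tissue.sum() / aif.sum()              # relative blood volume
        print(f"flow ~ {pbf:.3f}, volume ~ {pbv:.3f}, MTT ~ {pbv / pbf:.1f} s")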

  16. Statistical mechanics of ecological systems: Neutral theory and beyond

    NASA Astrophysics Data System (ADS)

    Azaele, Sandro; Suweis, Samir; Grilli, Jacopo; Volkov, Igor; Banavar, Jayanth R.; Maritan, Amos

    2016-07-01

    The simplest theories often have much merit and many limitations, and, in this vein, the value of neutral theory (NT) of biodiversity has been the subject of much debate over the past 15 years. NT was proposed at the turn of the century by Stephen Hubbell to explain several patterns observed in the organization of ecosystems. Among ecologists, it had a polarizing effect: There were a few ecologists who were enthusiastic, and there were a larger number who firmly opposed it. Physicists and mathematicians, instead, welcomed the theory with excitement. Indeed, NT spawned several theoretical studies that attempted to explain empirical data and predicted trends of quantities that had not yet been studied. While there are a few reviews of NT oriented toward ecologists, the goal here is to review the quantitative aspects of NT and its extensions for physicists who are interested in learning what NT is, what its successes are, and what important problems remain unresolved. Furthermore, this review could also be of interest to theoretical ecologists because many potentially interesting results are buried in the vast NT literature. It is proposed to make these more accessible by extracting them and presenting them in a logical fashion. The focus of this review is broader than NT: new, more recent approaches for studying ecological systems and how one might introduce realistic non-neutral models are also discussed.

  17. Electronic excitations in molecular solids: bridging theory and experiment.

    PubMed

    Skelton, Jonathan M; da Silva, E Lora; Crespo-Otero, Rachel; Hatcher, Lauren E; Raithby, Paul R; Parker, Stephen C; Walsh, Aron

    2015-01-01

    As the spatial and temporal resolution accessible to experiment and theory converge, computational chemistry is an increasingly powerful tool for modelling and interpreting spectroscopic data. However, the study of molecular processes, in particular those related to electronic excitations (e.g. photochemistry), frequently pushes quantum-chemical techniques to their limit. The disparity in the level of theory accessible to periodic and molecular calculations presents a significant challenge when modelling molecular crystals, since accurate calculations require a high level of theory to describe the molecular species, but must also take into account the influence of the crystalline environment on their properties. In this article, we briefly review the different classes of quantum-chemical techniques, and present an overview of methods that account for environmental influences with varying levels of approximation. Using a combination of solid-state and molecular calculations, we quantitatively evaluate the performance of implicit-solvent models for the [Ni(Et4dien)(η2-O,ON)(η1-NO2)] linkage-isomer system as a test case. We focus particularly on the accurate reproduction of the energetics of the isomerisation, and on predicting spectroscopic properties to compare with experimental results. This work illustrates how the synergy between periodic and molecular calculations can be exploited for the study of molecular crystals, and forms a basis for the investigation of more challenging phenomena, such as excited-state dynamics, and for further methodological developments.

  18. Multivariate Quantitative Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which it is necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.
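
    A minimal sketch of the underlying computation, assuming reference spectra for the two pure components are available: the measured spectrum is fit as a non-negative linear combination of the references, and the coefficients give the relative proportions. The Gaussian "spectra" below are synthetic.

        # Non-negative least-squares unmixing of a two-component spectrum.
        import numpy as np
        from scipy.optimize import nnls

        x = np.linspace(0, 10, 200)
        comp_a = np.exp(-((x - 3.0) ** 2))          # reference spectrum, component A
        comp_b = np.exp(-((x - 6.0) ** 2))          # reference spectrum, component B
        refs = np.column_stack([comp_a, comp_b])

        mixture = 0.7 * comp_a + 0.3 * comp_b
        measured = mixture + np.random.default_rng(1).normal(scale=0.01, size=x.size)

        coeffs, _ = nnls(refs, measured)
        print("estimated proportions:", (coeffs / coeffs.sum()).round(3))  # ~[0.7 0.3]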

  19. The Data-Ink Ratio and Accuracy of Information Derived from Newspaper Graphs: An Experimental Test of the Theory.

    ERIC Educational Resources Information Center

    Kelly, James D.

    A study tested the data-ink ratio theory, which holds that a reader's recall of quantitative data displayed in a graph containing a substantial amount of non-data-ink will be significantly less than recall from a graph containing little non-data-ink, as it might apply to graphics used in mass circulation newspapers. The experiment employed a…

  20. Infusion of Quantitative and Statistical Concepts into Biology Courses Does Not Improve Quantitative Literacy

    ERIC Educational Resources Information Center

    Beck, Christopher W.

    2018-01-01

    Multiple national reports have pushed for the integration of quantitative concepts into the context of disciplinary science courses. The aim of this study was to evaluate the quantitative and statistical literacy of biology students and explore learning gains when those skills were taught implicitly in the context of biology. I examined gains in…

  1. Integrating psychological theory into the design of an online intervention for sexual health: the sexunzipped website.

    PubMed

    Carswell, Kenneth; McCarthy, Ona; Murray, Elizabeth; Bailey, Julia V

    2012-11-19

    The Internet can provide a confidential and convenient medium for sexual health promotion for young people. This paper describes the development of an interactive, theory-based website (Sexunzipped) aimed at increasing safe sexual behavior of young people, as well as an outline of the evaluation protocol. The website focuses on safer sex, relationships, and sexual pleasure. An overview of the site is provided, including a description of the theoretical constructs which form the basis of the site development. An integrated behavioral model was chosen as the guiding theory for the Sexunzipped intervention. A randomized trial design will be used to evaluate the site quantitatively. The content of the site is described in detail with examples of the main content types: information pages, quizzes, and decision-making activities. We describe the protocol for quantitative evaluation of the website using a randomized trial design and discuss the principal challenges involved in developing the site, including the challenge of balancing the requirements of theory with young people's views on website content and design. Considerations for future interventions are discussed. Developing an online behavior-change intervention is costly and time consuming. Given the large public health potential, the cost involved in developing online interventions, and the need for attractive design, future interventions may benefit from collaborating with established sites that already have a user base, a brand, and a strong Internet presence. It is vital to involve users in decisions about intervention content, design, and features, paying attention to aspects that will attract and retain users' interest. A central challenge in developing effective Internet-based interventions for young people is to find effective ways to operationalize theory in ways that address the views and perspectives of young people.

  2. Integrating Psychological Theory Into the Design of an Online Intervention for Sexual Health: The Sexunzipped Website

    PubMed Central

    2012-01-01

    Background The Internet can provide a confidential and convenient medium for sexual health promotion for young people. Objective This paper describes the development of an interactive, theory-based website (Sexunzipped) aimed at increasing safe sexual behavior of young people, as well as an outline of the evaluation protocol. Methods The website focuses on safer sex, relationships, and sexual pleasure. An overview of the site is provided, including a description of the theoretical constructs which form the basis of the site development. An integrated behavioral model was chosen as the guiding theory for the Sexunzipped intervention. A randomized trial design will be used to evaluate the site quantitatively. Results The content of the site is described in detail with examples of the main content types: information pages, quizzes, and decision-making activities. We describe the protocol for quantitative evaluation of the website using a randomized trial design and discuss the principal challenges involved in developing the site, including the challenge of balancing the requirements of theory with young people’s views on website content and design. Conclusions Considerations for future interventions are discussed. Developing an online behavior-change intervention is costly and time consuming. Given the large public health potential, the cost involved in developing online interventions, and the need for attractive design, future interventions may benefit from collaborating with established sites that already have a user base, a brand, and a strong Internet presence. It is vital to involve users in decisions about intervention content, design, and features, paying attention to aspects that will attract and retain users’ interest. A central challenge in developing effective Internet-based interventions for young people is to find effective ways to operationalize theory in ways that address the views and perspectives of young people. PMID:23612122

  3. An almost general theory of mean size perception.

    PubMed

    Allik, Jüri; Toom, Mai; Raidvee, Aire; Averin, Kristiina; Kreegipuu, Kairi

    2013-05-03

    A general explanation for the observer's ability to judge the mean size of simple geometrical figures, such as circles, was advanced. Results indicated that, contrary to what would be predicted by statistical averaging, the precision of mean size perception decreases with the number of judged elements. Since mean size discrimination was insensitive to how total size differences were distributed among individual elements, this suggests that the observer has a limited cognitive access to the size of individual elements pooled together in a compulsory manner before size information reaches awareness. Confirming the associative law of addition means, observers are indeed sensitive to the mean, not the sizes of individual elements. All existing data can be explained by an almost general theory, namely, the Noise and Selection (N&S) Theory, formulated in exact quantitative terms, implementing two familiar psychophysical principles: the size of an element cannot be measured with absolute accuracy and only a limited number of elements can be taken into account in the computation of the average size. It was concluded that the computation of ensemble characteristics is not necessarily a tool for surpassing the capacity limitations of perceptual processing.

  4. QUANTITATIVE MORPHOLOGY

    EPA Science Inventory

    Abstract: In toxicology, the role of quantitative assessment of brain morphology can be understood in the context of two types of treatment-related alterations. One type of alteration is specifically associated with treatment and is not observed in control animals. Measurement ...

  5. A general theory of multimetric indices and their properties

    USGS Publications Warehouse

    Schoolmaster, Donald R.; Grace, James B.; Schweiger, E. William

    2012-01-01

    1. Stewardship of biological and ecological resources requires the ability to make integrative assessments of ecological integrity. One of the emerging methods for making such integrative assessments is multimetric indices (MMIs). These indices synthesize data, often from multiple levels of biological organization, with the goal of deriving a single index that reflects the overall effects of human disturbance. Despite the widespread use of MMIs, there is uncertainty about why this approach can be effective. An understanding of MMIs requires a quantitative theory that illustrates how the properties of candidate metrics relate to MMIs generated from those metrics. 2. We present the initial basis for such a theory by deriving the general mathematical characteristics of MMIs assembled from metrics. We then use the theory to derive quantitative answers to the following questions: Is there an optimal number of metrics to comprise an index? How does covariance among metrics affect the performance of the index derived from those metrics? And what are the criteria to decide whether a given metric will improve the performance of an index? 3. We find that the optimal number of metrics to be included in an index depends on the theoretical distribution of the signal of the disturbance gradient contained in each metric. For example, if the rank-ordered parameters of a metric-disturbance regression can be described by a monotonically decreasing function, then an optimum number of metrics exists and can often be derived analytically. We derive the conditions under which adding a given metric can be expected to improve an index. 4. We find that the criterion defining such conditions depends nonlinearly on the signal of the disturbance gradient, the noise (error) of the metric and the correlation of the metric errors. Importantly, we find that correlation among metric errors increases the signal required for the metric to improve the index. 5. The theoretical framework presented in this
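
    A toy numerical version of points 3 and 4, under invented assumptions (unit-variance metric errors with a common pairwise correlation rho, and signals that decay geometrically with rank):

        # Signal-to-noise ratio of an index formed as the mean of the first n
        # metrics: signal = mean of the metric signals; the error variance of
        # the mean under equicorrelated unit-variance errors is
        # (1/n) * (1 + (n - 1) * rho).
        import numpy as np

        def index_snr(n, signals, rho):
            signal = signals[:n].mean()
            var = (1.0 / n) * (1.0 + (n - 1) * rho)
            return signal / np.sqrt(var)

        signals = 0.6 ** np.arange(10)              # monotonically decreasing signals
        for rho in (0.0, 0.4):
            snrs = [index_snr(n, signals, rho) for n in range(1, 11)]
            print(f"rho = {rho}: best number of metrics = {int(np.argmax(snrs)) + 1}")
        # Stronger error correlation raises the variance floor, so an added
        # metric must carry more signal to improve the index.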

  6. Electroosmotic flow in a rectangular channel with variable wall zeta-potential: comparison of numerical simulation with asymptotic theory.

    PubMed

    Datta, Subhra; Ghosal, Sandip; Patankar, Neelesh A

    2006-02-01

    Electroosmotic flow in a straight micro-channel of rectangular cross-section is computed numerically for several situations where the wall zeta-potential is not constant but has a specified spatial variation. The results of the computation are compared with an earlier published asymptotic theory based on the lubrication approximation: the assumption that any axial variations take place on a long length scale compared to a characteristic channel width. The computational results are found to be in excellent agreement with the theory even when the scale of axial variations is comparable to the channel width. In the opposite limit when the wavelength of fluctuations is much shorter than the channel width, the lubrication theory fails to describe the solution either qualitatively or quantitatively. In this short wave limit the solution is well described by Ajdari's theory for electroosmotic flow between infinite parallel plates (Ajdari, A., Phys. Rev. E 1996, 53, 4996-5005.) The infinitely thin electric double layer limit is assumed in the theory as well as in the simulation.
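
    The quantity both the simulation and the asymptotic theory build on is the local electroosmotic slip velocity. The sketch below evaluates the standard Helmholtz-Smoluchowski formula u = -eps * zeta * E / mu for a few wall zeta-potentials, using rough room-temperature water properties and an invented field strength.

        # Helmholtz-Smoluchowski slip velocity at the channel wall.
        eps = 80 * 8.854e-12     # permittivity of water, F/m
        mu = 1.0e-3              # dynamic viscosity, Pa s
        E = 1.0e4                # applied axial field, V/m

        def slip_velocity(zeta_volts):
            return -eps * zeta_volts * E / mu

        for zeta in (-0.05, -0.025, 0.0):           # a spatially varying wall zeta
            u = slip_velocity(zeta)
            print(f"zeta = {zeta * 1e3:6.1f} mV -> u = {u * 1e6:7.1f} um/s")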

  7. ψ(2S) versus J/ψ suppression in proton-nucleus collisions from factorization violating soft color exchanges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Yan-Qing; Venugopalan, Raju; Watanabe, Kazuhiro

    Here, we argue that the large suppression of the ψ(2S) inclusive cross-section relative to the J/ψ inclusive cross-section in proton-nucleus (p+A) collisions can be attributed to factorization breaking effects in the formation of quarkonium. These factorization breaking effects arise from soft color exchanges between charm-anticharm pairs undergoing hadronization and comoving partons that are long-lived on time scales of quarkonium formation. We compute the short distance pair production of heavy quarks in the Color Glass Condensate (CGC) effective field theory and employ an improved Color Evaporation Model (ICEM) to describe their hadronization into quarkonium at large distances. The combined CGC+ICEM model provides a quantitative description of J/ψ and ψ(2S) data in proton-proton (p+p) collisions from both RHIC and the LHC. Factorization breaking effects in hadronization, due to additional parton comovers in the nucleus, are introduced heuristically by imposing a cutoff Λ, representing the momentum kick from soft color exchanges, in the ICEM model. Such soft exchanges have no perceptible effect on J/ψ suppression in p+A collisions. In contrast, the interplay of the physics of these soft exchanges at large distances, with the physics of semi-hard rescattering at short distances, causes a significant additional suppression of ψ(2S) yields relative to that of the J/ψ. A good fit of all RHIC and LHC J/ψ and ψ(2S) data, for transverse momenta P⊥ ≤ 5 GeV in p+p and p+A collisions, is obtained for Λ ~ 10 MeV.

  8. ψ(2S) versus J/ψ suppression in proton-nucleus collisions from factorization violating soft color exchanges

    DOE PAGES

    Ma, Yan-Qing; Venugopalan, Raju; Watanabe, Kazuhiro; ...

    2018-01-31

    Here, we argue that the large suppression of the ψ(2S) inclusive cross-section relative to the J/ψ inclusive cross-section in proton-nucleus (p+A) collisions can be attributed to factorization breaking effects in the formation of quarkonium. These factorization breaking effects arise from soft color exchanges between charm-anticharm pairs undergoing hadronization and comoving partons that are long-lived on time scales of quarkonium formation. We compute the short distance pair production of heavy quarks in the Color Glass Condensate (CGC) effective field theory and employ an improved Color Evaporation Model (ICEM) to describe their hadronization into quarkonium at large distances. The combined CGC+ICEM model provides a quantitative description of J/ψ and ψ(2S) data in proton-proton (p+p) collisions from both RHIC and the LHC. Factorization breaking effects in hadronization, due to additional parton comovers in the nucleus, are introduced heuristically by imposing a cutoff Λ, representing the momentum kick from soft color exchanges, in the ICEM model. Such soft exchanges have no perceptible effect on J/ψ suppression in p+A collisions. In contrast, the interplay of the physics of these soft exchanges at large distances, with the physics of semi-hard rescattering at short distances, causes a significant additional suppression of ψ(2S) yields relative to that of the J/ψ. A good fit of all RHIC and LHC J/ψ and ψ(2S) data, for transverse momenta P⊥ ≤ 5 GeV in p+p and p+A collisions, is obtained for Λ ~ 10 MeV.

  9. The Quantitative Preparation of Future Geoscience Graduate Students

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, particular importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways

  10. Testing a lepton quarticity flavor theory of neutrino oscillations with the DUNE experiment

    NASA Astrophysics Data System (ADS)

    Srivastava, Rahul; Ternes, Christoph A.; Tórtola, Mariam; Valle, José W. F.

    2018-03-01

    Oscillation studies play a central role in elucidating at least some aspects of the flavor problem. Here we examine the status of the predictions of a lepton quarticity flavor theory of neutrino oscillations against the existing global sample of oscillation data. By performing quantitative simulations we also determine the potential of the upcoming DUNE experiment in narrowing down the currently ill-measured oscillation parameters θ23 and δCP. We present the expected improved sensitivity on these parameters for different assumptions.
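
    Not the quarticity model itself, but for orientation the standard two-flavor vacuum oscillation probability shows how parameters such as a mixing angle enter what DUNE measures; the numbers below are rough atmospheric-sector values at the 1300 km DUNE baseline, not fit results.

        # Two-flavor vacuum oscillation building block:
        # P = sin^2(2*theta) * sin^2(1.267 * dm2[eV^2] * L[km] / E[GeV])
        import numpy as np

        def p_osc(theta, dm2_ev2, L_km, E_GeV):
            return np.sin(2 * theta) ** 2 * np.sin(1.267 * dm2_ev2 * L_km / E_GeV) ** 2

        print(f"P = {p_osc(theta=0.85, dm2_ev2=2.5e-3, L_km=1300, E_GeV=2.5):.3f}")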

  11. Quantitative Characterization of Tissue Microstructure with Temporal Diffusion Spectroscopy

    PubMed Central

    Xu, Junzhong; Does, Mark D.; Gore, John C.

    2009-01-01

    The signals recorded by diffusion-weighted magnetic resonance imaging (DWI) are dependent on the micro-structural properties of biological tissues, so it is possible to obtain quantitative structural information non-invasively from such measurements. Oscillating gradient spin echo (OGSE) methods have the ability to probe the behavior of water diffusion over different time scales and the potential to detect variations in intracellular structure. To assist in the interpretation of OGSE data, analytical expressions have been derived for diffusion-weighted signals with OGSE methods for restricted diffusion in some typical structures, including parallel planes, cylinders and spheres, using the theory of temporal diffusion spectroscopy. These analytical predictions have been confirmed with computer simulations. These expressions suggest how OGSE signals from biological tissues should be analyzed to characterize tissue microstructure, including how to estimate cell nuclear sizes. This approach provides a model to interpret diffusion data obtained from OGSE measurements that can be used for applications such as monitoring tumor response to treatment in vivo. PMID:19616979

  12. Theory X and Theory Y in the Organizational Structure.

    ERIC Educational Resources Information Center

    Barry, Thomas J.

    This document defines contrasting assumptions about the labor force--theory X and theory Y--and shows how they apply to the pyramid organizational structure, examines the assumptions of the two theories, and finally, based on a survey and individual interviews, proposes a merger of theories X and Y to produce theory Z. Organizational structures…

  13. DD̄ production and their interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu Yanrui; Oka, Makoto; Takizawa, Makoto

    2010-07-01

    S- and P-wave DD̄ scatterings are studied in a meson exchange model with the coupling constants obtained in the heavy quark effective theory. With the extracted P-wave phase shifts and the separable potential approximation, we include the DD̄ rescattering effect and investigate the production process e+e- → DD̄. We find that it is difficult to explain the anomalous line shape observed by the BES Collaboration with this mechanism. Combining our model calculation and the experimental measurement, we estimate the upper limit of the nearly universal cutoff parameter to be around 2 GeV. With this number, the upper limits of the binding energies of the S-wave DD̄ and BB̄ bound states are obtained. Assuming that the S-wave and P-wave interactions rely on the same cutoff, our study provides a way of extracting information about S-wave molecular bound states from P-wave meson pair production.

  14. Hermeneutical Field Theory and the Structural Character of Understanding.

    NASA Astrophysics Data System (ADS)

    Whitehouse, William Leonard

    Through a series of exploratory case studies focusing on hermeneutics, phenomenology, relativity, field theory, quantum mechanics, chronobiology, chaos theory, holographic theory and various aspects of mathematics, a set of hermeneutical constraints and degrees of freedom are generated. There are a set of eight field equations given in the thesis which give qualitative symbolic expression to the aforementioned spectrum of constraints and degrees of freedom that constitute the structural character of understanding. However, as is sometimes the case with their quantitative mathematical counterparts, the hermeneutical field equations are capable of giving a variety of descriptions or solutions for one and the same set of conditions. The task, therefore, is to try to sort out those solutions which have reflective properties with respect to the structural character of reality from those which do not have such properties. The thesis addresses this task by introducing the idea of hermeneutical field theory. In this theory the notion of a semiotic operator or semiotic quantum plays a central role. More specifically, this quantum is considered to be the carrier of hermeneutical force. It arises as a field property at the complex, horizontal membrane-manifold linking human consciousness with different levels of scale of reality. When taken collectively, the aforementioned set of equations gives expression to the structural character of hermeneutical field theory. Therefore, when one begins to run concrete variables through the theory underlying these equations, one encounters various kinds of hermeneutical constraints and degrees of freedom. These constraints and degrees of freedom characterize the dialectical engagement of consciousness and reality as one seeks to acquire understanding concerning the above mentioned variables and the context which gives rise to them. Hermeneutical field theory is really the study of the factors that affect the state of the six internal

  15. Quantitative proteomics in biological research.

    PubMed

    Wilm, Matthias

    2009-10-01

    Proteomics has enabled the direct investigation of biological material, at first through the analysis of individual proteins, then of lysates from cell cultures, and finally of extracts from tissues and biopsies from entire organisms. Its latest manifestation - quantitative proteomics - allows deeper insight into biological systems. This article reviews the different methods used to extract quantitative information from mass spectra. It follows the technical developments aimed toward global proteomics, the attempt to characterize every expressed protein in a cell by at least one peptide. When applications of the technology are discussed, the focus is placed on yeast biology. In particular, differential quantitative proteomics, the comparison between an experiment and its control, is very discriminating for proteins involved in the process being studied. When trying to understand biological processes on a molecular level, differential quantitative proteomics tends to give a clearer picture than global transcription analyses. As a result, MS has become an even more indispensable tool for biochemically motivated biological research.

  16. Challenges and perspectives in quantitative NMR.

    PubMed

    Giraudeau, Patrick

    2017-01-01

    This perspective article summarizes, from the author's point of view at the beginning of 2016, the major challenges and perspectives in the field of quantitative NMR. The key concepts in quantitative NMR are first summarized; then, the most recent evolutions in terms of resolution and sensitivity are discussed, as well as some potential future research directions in this field. A particular focus is placed on methodologies capable of boosting the resolution and sensitivity of quantitative NMR, which could open application perspectives in fields where the sample complexity and the analyte concentrations are particularly challenging. These include multi-dimensional quantitative NMR and hyperpolarization techniques such as para-hydrogen-induced polarization or dynamic nuclear polarization. Because quantitative NMR cannot be dissociated from the key concepts of analytical chemistry, i.e. trueness and precision, the methodological developments are systematically described together with their level of analytical performance.

  17. When is a theory a theory? A case example.

    PubMed

    Alkin, Marvin C

    2017-08-01

    This discussion comments on the approximately 20-year history of writings on the prescriptive theory called Empowerment Evaluation. Doing so involves examining how "Empowerment Evaluation Theory" has been defined at various points in time (particularly 1996 and now in 2015). Defining a theory is different from judging the success of a theory. This latter topic has been addressed elsewhere by Michael Scriven, Michael Patton, and Brad Cousins. I am initially guided by the work of Robin Miller (2010), who has written on the issue of how to judge the success of a theory. In doing so, she provided potential standards for judging the adequacy of theories. My task is not judging the adequacy or success of the Empowerment Evaluation prescriptive theory in practice, but determining how well the theory is delineated. That is, to what extent do the writings qualify as a prescriptive theory.

  18. Contextualized theory-based predictors of intention to practice monogamy among adolescents in Botswana junior secondary schools: Results of focus group sessions and a cross-sectional study.

    PubMed

    Chilisa, Bagele; Mohiemang, Irene; Mpeta, Kolentino Nyamadzapasi; Malinga, Tumane; Ntshwarang, Poloko; Koyabe, Bramwell Walela; Heeren, G Anita

    2016-01-01

    Culture and tradition influence behaviour. Multiple-partner and concurrent relationships are considered responsible for the increase of HIV infection in Sub-Saharan Africa. A contextualized "Theory of Planned Behaviour" was used to identify predictors of intention to practice monogamy. A mixed-method design was used: qualitative data came from focus groups and stories, and survey data were analyzed quantitatively. The qualitative data added a socio-cultural belief domain to the behavioural beliefs; attitudes, subjective norms, and perceived behavioural control predicted the intention to practice monogamy. The adolescents showed a tendency towards having more than one sexual partner. The normative beliefs and the socio-cultural beliefs also predicted intentions, while hedonistic belief and partner reaction did not. In contextualizing theory-based interventions, it is important to draw from the stories and the language that circulate in a community about a given behaviour. More studies are needed on ways to combine qualitative approaches with quantitative approaches to inform the development of theory-based, culturally appropriate and context-specific intervention strategies to reduce the risk of HIV.

  19. Contextualized theory-based predictors of intention to practice monogamy among adolescents in Botswana junior secondary schools: Results of focus group sessions and a cross-sectional study

    PubMed Central

    Chilisa, Bagele; Mohiemang, Irene; Mpeta, Kolentino Nyamadzapasi; Malinga, Tumane; Ntshwarang, Poloko; Koyabe, Bramwell Walela; Heeren, G. Anita

    2016-01-01

    Culture and tradition influence behaviour. Multiple-partner and concurrent relationships are considered responsible for the increase of HIV infection in Sub-Saharan Africa. A contextualized “Theory of Planned Behaviour” was used to identify predictors of intention to practice monogamy. A mixed-method design was used: qualitative data came from focus groups and stories, and survey data were analyzed quantitatively. The qualitative data added a socio-cultural belief domain to the behavioural beliefs; attitudes, subjective norms, and perceived behavioural control predicted the intention to practice monogamy. The adolescents showed a tendency towards having more than one sexual partner. The normative beliefs and the socio-cultural beliefs also predicted intentions, while hedonistic belief and partner reaction did not. In contextualizing theory-based interventions, it is important to draw from the stories and the language that circulate in a community about a given behaviour. More studies are needed on ways to combine qualitative approaches with quantitative approaches to inform the development of theory-based, culturally appropriate and context-specific intervention strategies to reduce the risk of HIV. PMID:28090169

  20. Understanding Pre-Quantitative Risk in Projects

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.

  1. Ice recrystallization kinetics in the presence of synthetic antifreeze glycoprotein analogues using the framework of LSW theory.

    PubMed

    Budke, C; Heggemann, C; Koch, M; Sewald, N; Koop, T

    2009-03-05

    The Ostwald ripening of polycrystalline ice in aqueous sucrose solutions was investigated experimentally. The kinetics of this ice recrystallization process was studied at temperatures between -6 and -10 degrees C and varying ice volume fractions. Using the theory of Lifshitz, Slyozov, and Wagner (LSW), the diffusion-limited rate constant for ice recrystallization was determined. Also, the effects of synthetic analogues of natural antifreeze glycoproteins (AFGP) were studied. These analogues synAFGPmi (i = 3-5) contained monosaccharide side groups instead of the disaccharide side groups that occur in natural AFGP. In order to account for the inhibition effect of the synAFGPmi, we have modified classical LSW theory, allowing for the derivation of inhibition rate constants. It was found that the investigated synAFGPmi inhibit ice recrystallization at concentrations down to approximately 3 μg mL⁻¹ or, equivalently, approximately 1 μmol L⁻¹ for the largest synAFGPmi investigated: synAFGPm5. Hence, our new method is capable of quantitatively assessing the efficiency of very similar AFGPs with a sensitivity that is at least 2 orders of magnitude higher than that typical of quantitative thermal hysteresis measurements.
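
    The diffusion-limited LSW prediction, that the cube of the mean crystal radius grows linearly in time, makes rate-constant extraction a linear fit; the sketch below uses synthetic radii with invented parameters, not the measured data.

        # LSW ripening: r_mean(t)^3 = r0^3 + k*t, so k is the slope of a
        # linear fit of r_mean^3 against time.
        import numpy as np

        t = np.arange(0, 3600, 300)                 # time, s
        k_true, r0 = 2.0, 5.0                       # invented: um^3/s and um
        rng = np.random.default_rng(2)
        r_mean = (r0 ** 3 + k_true * t) ** (1 / 3) + rng.normal(scale=0.05, size=t.size)

        k_fit, _ = np.polyfit(t, r_mean ** 3, deg=1)
        print(f"fitted rate constant k = {k_fit:.2f} um^3/s")
        # Repeating the fit at increasing AFGP concentration and tracking the
        # drop in k is one way an inhibition rate constant can be quantified.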

  2. A gold nanoparticle-based semi-quantitative and quantitative ultrasensitive paper sensor for the detection of twenty mycotoxins

    NASA Astrophysics Data System (ADS)

    Kong, Dezhao; Liu, Liqiang; Song, Shanshan; Suryoprabowo, Steven; Li, Aike; Kuang, Hua; Wang, Libing; Xu, Chuanlai

    2016-02-01

    A semi-quantitative and quantitative multi-immunochromatographic (ICA) strip detection assay was developed for the simultaneous detection of twenty types of mycotoxins from five classes, including zearalenones (ZEAs), deoxynivalenols (DONs), T-2 toxins (T-2s), aflatoxins (AFs), and fumonisins (FBs), in cereal food samples. Sensitive and specific monoclonal antibodies were selected for this assay. The semi-quantitative results were obtained within 20 min by the naked eye, with visual limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.1-0.5, 2.5-250, 0.5-1, 0.25-1 and 2.5-10 μg kg⁻¹, and cut-off values of 0.25-1, 5-500, 1-10, 0.5-2.5 and 5-25 μg kg⁻¹, respectively. The quantitative results were obtained using a hand-held strip scan reader, with calculated limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.04-0.17, 0.06-49, 0.15-0.22, 0.056-0.49 and 0.53-1.05 μg kg⁻¹, respectively. The analytical results of spiked samples were in accordance with the accurate content in the simultaneous detection analysis. This newly developed ICA strip assay is suitable for the on-site detection and rapid initial screening of mycotoxins in cereal samples, facilitating both semi-quantitative and quantitative determination.

  3. Strangeness S =-1 hyperon-nucleon scattering in covariant chiral effective field theory

    NASA Astrophysics Data System (ADS)

    Li, Kai-Wen; Ren, Xiu-Lei; Geng, Li-Sheng; Long, Bingwei

    2016-07-01

    Motivated by the successes of covariant baryon chiral perturbation theory in one-baryon systems and in heavy-light systems, we study the relevance of relativistic effects in hyperon-nucleon interactions with strangeness S = -1. In this exploratory work, we follow the covariant framework developed by Epelbaum and Gegelia to calculate the YN scattering amplitude at leading order. By fitting the five low-energy constants to the experimental data, we find that the cutoff dependence is mitigated compared with the heavy-baryon approach. Nevertheless, the description of the experimental data remains quantitatively similar at leading order.
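
    The mechanics of such a cutoff check amount to refitting the contact couplings at several regulator scales and comparing the resulting description. The toy sketch below illustrates only this procedure; the one-parameter amplitude, Gaussian regulator, and pseudo-data are invented stand-ins, not the YN amplitudes of the paper.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    k = np.linspace(0.05, 0.3, 8)       # momenta (GeV), hypothetical grid
    data = 1.0 / (k**2 + 0.1)           # pseudo-data standing in for observables

    def amplitude(C, L):
        reg = np.exp(-(k / L) ** 2)     # Gaussian regulator with cutoff L
        return C * reg / (k**2 + 0.1)   # toy one-coupling amplitude

    # Refit the "low-energy constant" C at each cutoff and compare residuals.
    for L in (0.4, 0.6, 0.9):           # cutoffs in GeV
        fit = least_squares(lambda C: amplitude(C[0], L) - data, x0=[1.0])
        print(f"cutoff {L:.1f} GeV: C = {fit.x[0]:.3f}, "
              f"residual = {np.sum(fit.fun**2):.2e}")
    ```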

  4. Abelian gauge symmetries in F-theory and dual theories

    NASA Astrophysics Data System (ADS)

    Song, Peng

    In this dissertation, we focus on important physical and mathematical aspects, especially abelian gauge symmetries, of F-theory compactifications and their dual formulations within type IIB and heterotic string theory. F-theory is a non-perturbative formulation of type IIB string theory which enjoys important dualities with other string theories such as M-theory and E8 x E8 heterotic string theory. One of the main strengths of F-theory is its geometrization of many physical problems in the dual string theories. In particular, its study requires many mathematical tools, such as advanced techniques in algebraic geometry. Thus, it has also received a lot of interest among mathematicians and is a vivid area of research within both the physics and the mathematics communities. Although F-theory is a long-standing theory, abelian gauge symmetry in F-theory had rarely been studied until recently. Within the mathematics community, in 2009, Grassi and Perduca first discovered the possibility of constructing elliptically fibered varieties with non-trivial toric Mordell-Weil group. In the physics community, in 2012, Morrison and Park made a major advancement by constructing general F-theory compactifications with U(1) abelian gauge symmetry. They found that in such cases, the elliptically fibered Calabi-Yau manifold on which F-theory is compactified has as its fiber a generic elliptic curve in the blow-up of the weighted projective space P(1,1,2) at one point. Subsequently, Cvetic, Klevers and Piragua extended the work of Morrison and Park and constructed general F-theory compactifications with U(1) x U(1) abelian gauge symmetry. They found that in the U(1) x U(1) case, the fiber is a generic elliptic curve in the del Pezzo surface dP2. In chapter 2 of this dissertation, I bring this a step further by

  5. Using game theory to investigate the epigenetic control mechanisms of embryo development. Comment on: "Epigenetic game theory: How to compute the epigenetic control of maternal-to-zygotic transition" by Qian Wang et al.

    NASA Astrophysics Data System (ADS)

    Zhang, Le; Zhang, Shaoxiang

    2017-03-01

    A body of research [1-7] has already shown that epigenetic reprogramming plays a critical role in maintaining the normal development of embryos. However, the mechanistic quantitation of the epigenetic interactions between sperm and oocytes, and their impact on embryo development, is still not clear [6,7]. In this study, Wang et al. [8] develop a modeling framework that addresses this question by integrating game theory with the latest discoveries on the epigenetic control of embryo development.

  6. Theory of the interface between a classical plasma and a hard wall

    NASA Astrophysics Data System (ADS)

    Ballone, P.; Pastore, G.; Tosi, M. P.

    1983-09-01

    The interfacial density profile of a classical one-component plasma confined by a hard wall is studied in planar and spherical geometries. The approach adapts to interfacial problems a modified hypernetted-chain approximation developed by Lado and by Rosenfeld and Ashcroft for the bulk structure of simple liquids. The specific new aim is to embody self-consistently into the theory a contact theorem, fixing the plasma density at the wall through an equilibrium condition which involves the electrical potential drop across the interface and the bulk pressure. The theory is brought into fully quantitative contact with computer simulation data for a plasma confined in a spherical cavity of large but finite radius. The interfacial potential at the point of zero charge is accurately reproduced by suitably combining the contact theorem with relevant bulk properties in a simple, approximate representation of the interfacial charge density profile.

  7. Theory of the interface between a classical plasma and a hard wall

    NASA Astrophysics Data System (ADS)

    Ballone, P.; Pastore, G.; Tosi, M. P.

    1984-12-01

    The interfacial density profile of a classical one-component plasma confined by a hard wall is studied in planar and spherical geometries. The approach adapts to interfacial problems a modified hypernetted-chain approximation developed by Lado and by Rosenfeld and Ashcroft for the bulk structure of simple liquids. The specific new aim is to embody self-consistently into the theory a “contact theorem”, fixing the plasma density at the wall through an equilibrium condition which involves the electrical potential drop across the interface and the bulk pressure. The theory is brought into fully quantitative contact with computer simulation data for a plasma confined in a spherical cavity of large but finite radius. It is also shown that the interfacial potential at the point of zero charge is accurately reproduced by suitably combining the contact theorem with relevant bulk properties in a simple, approximate representation of the interfacial charge density profile.

  8. A novel multi-walled carbon nanotube-based antibody conjugate for quantitative and semi-quantitative lateral flow assays.

    PubMed

    Sun, Wenjuan; Hu, Xiaolong; Liu, Jia; Zhang, Yurong; Lu, Jianzhong; Zeng, Libo

    2017-10-01

    In this study, multi-walled carbon nanotubes (MWCNTs) were applied in lateral flow strips (LFS) for semi-quantitative and quantitative assays. Firstly, the dispersibility of the MWCNTs was improved using various surfactants to enhance their biocompatibility for practical application. The dispersed MWCNTs were conjugated with a methamphetamine (MET) antibody in a non-covalent manner and then manufactured into the LFS for the quantitative detection of MET. The MWCNT-based lateral flow assay (MWCNTs-LFA) exhibited an excellent linear relationship between the test-line signal and the MET concentration over the range 62.5-1500 ng/mL. The sensitivity of the LFS was further evaluated by conjugating MWCNTs with an HCG antibody; the MWCNT conjugate was 10 times more sensitive than one prepared with classical colloidal gold nanoparticles. Taken together, our data demonstrate that the MWCNTs-LFA is a more sensitive and reliable assay for semi-quantitative and quantitative detection and can be used in forensic analysis.
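
    For a strip of this kind, quantitation over the stated linear range reduces to fitting and inverting a straight-line calibration. A minimal sketch with hypothetical reader values (the concentrations match the reported range; the signals are invented):

    ```python
    import numpy as np

    # Hypothetical test-line reader values over the reported linear
    # range of the MWCNT strip (62.5-1500 ng/mL MET).
    conc   = np.array([62.5, 125, 250, 500, 1000, 1500])   # ng/mL
    signal = np.array([0.95, 0.83, 0.66, 0.48, 0.27, 0.12])

    slope, intercept = np.polyfit(conc, signal, 1)
    r2 = np.corrcoef(conc, signal)[0, 1] ** 2
    print(f"signal = {slope:.2e} * conc + {intercept:.2f}  (R^2 = {r2:.3f})")

    # Unknown sample: invert the calibration line.
    measured = 0.40
    print(f"estimated MET: {(measured - intercept) / slope:.0f} ng/mL")
    ```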

  9. Sedimentation field flow fractionation and optical absorption spectroscopy for a quantitative size characterization of silver nanoparticles.

    PubMed

    Contado, Catia; Argazzi, Roberto; Amendola, Vincenzo

    2016-11-04

    Many advanced industrial and biomedical applications that use silver nanoparticles (AgNPs) require particles that are not only nano-sized but also well dispersed, non-aggregated and non-agglomerated. This study presents two methods able to rapidly determine the sizes of monodisperse AgNP suspensions in the dimensional range of 20-100 nm. The first method, based on the application of Mie's theory, determines particle sizes from the position of the surface plasmon resonance maximum (SPRmax), read from optical absorption spectra recorded between 190 nm and 800 nm. The computed sizes were compared with those determined by transmission electron microscopy (TEM) and dynamic light scattering (DLS), and agreed with the nominal values to within 13% (for 20 nm NPs) to 1% (for 100 nm NPs). The second method combines sedimentation field flow fractionation (SdFFF, now sold as centrifugal FFF, CFFF) with optical absorption spectroscopy (OAS) to obtain sizes and quantitative particle size distributions for monodisperse, non-aggregated AgNP suspensions. The separation ability of SdFFF, well suited to sizing NPs, benefits greatly from the application of Mie's theory to the UV-vis signal elaboration, producing quantitative mass-based particle size distributions from which trusted number-based particle size distributions can be derived. The silver mass distributions were verified and supported by off-line detection of the Ag concentration with graphite furnace atomic absorption spectrometry (GF-AAS).
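
    The first method is essentially a lookup: Mie theory gives a monotonic relation between SPRmax and diameter over 20-100 nm, which can be inverted by interpolation. A sketch with a placeholder calibration table (the wavelengths are illustrative, not the paper's computed values):

    ```python
    import numpy as np

    # Hypothetical Mie-theory calibration: SPR peak position vs AgNP diameter.
    diam_nm = np.array([20, 40, 60, 80, 100])
    spr_nm  = np.array([395, 402, 412, 427, 447])

    def size_from_spr(spr_max):
        """Interpolate particle diameter from the measured SPR maximum."""
        return float(np.interp(spr_max, spr_nm, diam_nm))

    print(f"SPR at 410 nm -> d = {size_from_spr(410):.0f} nm")
    ```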

  10. Quantitative Muscle Ultrasonography in Carpal Tunnel Syndrome.

    PubMed

    Lee, Hyewon; Jee, Sungju; Park, Soo Ho; Ahn, Seung-Chan; Im, Juneho; Sohn, Min Kyun

    2016-12-01

    To assess the reliability of quantitative muscle ultrasonography (US) in healthy subjects and to evaluate the correlation between quantitative muscle US findings and electrodiagnostic study results in patients with carpal tunnel syndrome (CTS). The clinical significance of quantitative muscle US in CTS was also assessed. Twenty patients with CTS and 20 age-matched healthy volunteers were recruited. All control and CTS subjects underwent a bilateral median and ulnar nerve conduction study (NCS) and quantitative muscle US. Transverse US images of the abductor pollicis brevis (APB) and abductor digiti minimi (ADM) were obtained to measure muscle cross-sectional area (CSA), thickness, and echo intensity (EI). EI was determined using computer-assisted, grayscale analysis. Inter-rater and intra-rater reliability for quantitative muscle US in control subjects, and differences in muscle thickness, CSA, and EI between the CTS patient and control groups were analyzed. Relationships between quantitative US parameters and electrodiagnostic study results were evaluated. Quantitative muscle US had high inter-rater and intra-rater reliability in the control group. Muscle thickness and CSA were significantly decreased, and EI was significantly increased in the APB of the CTS group (all p<0.05). EI demonstrated a significant positive correlation with latency of the median motor and sensory NCS in CTS patients (p<0.05). These findings suggest that quantitative muscle US parameters may be useful for detecting muscle changes in CTS. Further study involving patients with other neuromuscular diseases is needed to evaluate peripheral muscle change using quantitative muscle US.
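
    Computer-assisted grayscale EI analysis of this type reduces to averaging pixel gray levels inside a traced muscle region, with CSA following from the pixel count. A minimal sketch with a synthetic image and an assumed rectangular ROI and pixel spacing:

    ```python
    import numpy as np

    # Stand-ins: an 8-bit transverse US frame and a binary ROI mask of the
    # muscle (a real workflow would load the image and trace the outline).
    image = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
    roi   = np.zeros_like(image, dtype=bool)
    roi[200:300, 250:400] = True          # placeholder for the traced muscle

    ei = image[roi].mean()                # echo intensity: mean gray level, 0-255
    pixel_area_mm2 = 0.01                 # assumed pixel spacing -> area per pixel
    csa = roi.sum() * pixel_area_mm2      # cross-sectional area from pixel count
    print(f"EI = {ei:.1f} (0-255), CSA = {csa:.1f} mm^2")
    ```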

  11. Spontaneous Focusing on Quantitative Relations: Towards a Characterization

    ERIC Educational Resources Information Center

    Degrande, Tine; Verschaffel, Lieven; Van Dooren, Wim

    2017-01-01

    In contrast to previous studies on Spontaneous Focusing on Quantitative Relations (SFOR), the present study investigated not only the "extent" to which children focus on (multiplicative) quantitative relations, but also the "nature" of children's quantitative focus (i.e., the types of quantitative relations that children focus…

  12. Quantitative Reasoning in Environmental Science: A Learning Progression

    ERIC Educational Resources Information Center

    Mayes, Robert Lee; Forrester, Jennifer Harris; Christus, Jennifer Schuttlefield; Peterson, Franziska Isabel; Bonilla, Rachel; Yestness, Nissa

    2014-01-01

    The ability of middle and high school students to reason quantitatively within the context of environmental science was investigated. A quantitative reasoning (QR) learning progression was created with three progress variables: quantification act, quantitative interpretation, and quantitative modeling. An iterative research design was used as it…

  13. Sex allocation theory reveals a hidden cost of neonicotinoid exposure in a parasitoid wasp.

    PubMed

    Whitehorn, Penelope R; Cook, Nicola; Blackburn, Charlotte V; Gill, Sophie M; Green, Jade; Shuker, David M

    2015-05-22

    Sex allocation theory has proved to be one of the most successful theories in evolutionary ecology. However, its role in more applied aspects of ecology has been limited. Here we show how sex allocation theory helps uncover an otherwise hidden cost of neonicotinoid exposure in the parasitoid wasp Nasonia vitripennis. Female N. vitripennis allocate the sex of their offspring in line with Local Mate Competition (LMC) theory. Neonicotinoids are an economically important class of insecticides, but their deployment remains controversial, with evidence linking them to the decline of beneficial species. We demonstrate, for the first time to our knowledge, that neonicotinoids disrupt the crucial reproductive behaviour of facultative sex allocation at sub-lethal, field-relevant doses in N. vitripennis. The quantitative predictions we can make from LMC theory show that females exposed to neonicotinoids are less able to allocate sex optimally and that this failure imposes a significant fitness cost. Our work highlights that understanding the ecological consequences of neonicotinoid deployment requires not just measures of mortality or even fecundity reduction among non-target species, but also measures that capture broader fitness costs, in this case offspring sex allocation. Our work also highlights new avenues for exploring how females obtain information when allocating sex under LMC.
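
    The LMC prediction referred to here is Hamilton's classic result: with n foundresses laying eggs on a patch, the unbeatable proportion of sons is (n-1)/(2n). (This is the diploid form; haplodiploidy in Nasonia shifts the optimum slightly further toward daughters.) A short sketch of the prediction:

    ```python
    # Hamilton's Local Mate Competition optimum (diploid form): the
    # predicted proportion of sons rises toward 1/2 as more foundresses
    # share a patch and mate competition among brothers weakens.
    def lmc_sex_ratio(n_foundresses: int) -> float:
        return (n_foundresses - 1) / (2 * n_foundresses)

    for n in (1, 2, 5, 10):
        print(f"n = {n:2d} foundresses -> proportion sons = {lmc_sex_ratio(n):.3f}")
    ```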

  14. Quantum defect theory for the orbital Feshbach resonance

    NASA Astrophysics Data System (ADS)

    Cheng, Yanting; Zhang, Ren; Zhang, Peng

    2017-01-01

    In ultracold gases of alkaline-earth-metal-like atoms, a new type of Feshbach resonance, the orbital Feshbach resonance (OFR), has been proposed and experimentally observed in ultracold 173Yb atoms [R. Zhang et al., Phys. Rev. Lett. 115, 135301 (2015), 10.1103/PhysRevLett.115.135301]. When the OFR of the 173Yb atoms occurs, the energy gap between the open and closed channels is two orders of magnitude smaller than the van der Waals energy. As a result, quantitatively accurate results for low-energy two-body problems can be obtained via multichannel quantum defect theory (MQDT), which is based on the exact solution of the Schrödinger equation with the van der Waals potential. In this paper we use MQDT to calculate the two-atom scattering length, effective range, and binding energy of two-body bound states for systems with an OFR. With these results we further study the clock-transition spectrum of the two-body bound states, which can be used to measure the binding energy experimentally. Our results are helpful for quantitative theoretical and experimental research on ultracold gases of alkaline-earth-metal-like atoms with an OFR.
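
    For context, the standard connection between these low-energy parameters is the effective-range expansion: continuing k*cot(delta) = -1/a + (r_e/2)*k^2 to the bound-state pole k = i*kappa gives kappa = 1/a + (r_e/2)*kappa^2 and E_b = (hbar*kappa)^2/(2*mu). A sketch with illustrative numbers (not the 173Yb OFR parameters):

    ```python
    import numpy as np

    HBAR = 1.054571817e-34   # J s
    AMU  = 1.66053906660e-27 # kg

    def binding_energy(a, r_e, mu):
        """Bound-state energy from the effective-range expansion.

        Solves kappa = 1/a + (r_e/2) kappa^2 for the physical root
        (kappa -> 1/a as r_e -> 0), then E_b = (hbar kappa)^2 / (2 mu).
        """
        kappa = (1.0 - np.sqrt(1.0 - 2.0 * r_e / a)) / r_e
        return (HBAR * kappa) ** 2 / (2.0 * mu)

    # Illustrative only: a = 50 nm, r_e = 5 nm, two 173-amu atoms.
    mu = 173 * AMU / 2
    E_b = binding_energy(50e-9, 5e-9, mu)
    print(f"E_b = {E_b:.3e} J = {E_b / 6.62607015e-34 / 1e3:.1f} kHz * h")
    ```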

  15. Entanglement distillation protocols and number theory

    NASA Astrophysics Data System (ADS)

    Bombin, H.; Martin-Delgado, M. A.

    2005-09-01

    We show that the analysis of entanglement distillation protocols for qudits of arbitrary dimension D benefits from applying basic concepts from number theory, since the set Z_D^n associated with Bell diagonal states is a module rather than a vector space. We find that a partition of Z_D^n into divisor classes characterizes the invariant properties of mixed Bell diagonal states under local permutations. We construct a very general class of recursion protocols by means of unitary operations implementing these local permutations. We study these distillation protocols depending on whether we use twirling operations in the intermediate steps or not, and we study them both analytically and numerically with Monte Carlo methods. In the absence of twirling operations, we construct extensions of the quantum privacy algorithms valid for secure communications with qudits of any dimension D. When D is a prime number, we show that distillation protocols are optimal both qualitatively and quantitatively.
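
    One natural reading of the divisor-class partition is to group each tuple in Z_D^n by the greatest common divisor of its entries together with D, which is always a divisor of D. A minimal sketch under that assumption:

    ```python
    from collections import defaultdict
    from itertools import product
    from math import gcd

    def divisor_classes(D: int, n: int):
        """Partition Z_D^n by gcd(v_1, ..., v_n, D), a divisor of D."""
        classes = defaultdict(list)
        for v in product(range(D), repeat=n):
            g = D
            for x in v:
                g = gcd(g, x)
            classes[g].append(v)
        return classes

    # Example: Z_6^2 splits into classes labelled by the divisors of 6.
    for d, members in sorted(divisor_classes(6, 2).items()):
        print(f"divisor {d}: {len(members)} tuples")
    ```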

  16. System, Subsystem, Hive: Boundary Problems in Computational Theories of Consciousness

    PubMed Central

    Fekete, Tomer; van Leeuwen, Cees; Edelman, Shimon

    2016-01-01

    A computational theory of consciousness should include a quantitative measure of consciousness, or MoC, that (i) would reveal to what extent a given system is conscious, (ii) would make it possible to compare not only different systems, but also the same system at different times, and (iii) would be graded, because so is consciousness. However, unless its design is properly constrained, such an MoC gives rise to what we call the boundary problem: an MoC that labels a system as conscious will do so for some—perhaps most—of its subsystems, as well as for irrelevantly extended systems (e.g., the original system augmented with physical appendages that contribute nothing to the properties supposedly supporting consciousness), and for aggregates of individually conscious systems (e.g., groups of people). This problem suggests that the properties that are being measured are epiphenomenal to consciousness, or else it implies a bizarre proliferation of minds. We propose that a solution to the boundary problem can be found by identifying properties that are intrinsic or systemic: properties that clearly differentiate between systems whose existence is a matter of fact, as opposed to those whose existence is a matter of interpretation (in the eye of the beholder). We argue that if a putative MoC can be shown to be systemic, this ipso facto resolves any associated boundary issues. As test cases, we analyze two recent theories of consciousness in light of our definitions: the Integrated Information Theory and the Geometric Theory of consciousness. PMID:27512377

  17. From 'just the facts' to 'more theory and methods, please': The evolution of the research article in Administrative Science Quarterly, 1956-2008.

    PubMed

    Strang, David; Siler, Kyle

    2017-08-01

    This paper analyzes the surface structure of research articles published in Administrative Science Quarterly between 1956 and 2008. The period is marked by a shift from essays that interweave theory, methods and results to experimental reports that separate them. There is dramatic growth in the size of theory, methods and discussion sections, accompanied by a shrinking results section. Bibliographic references and hypotheses expand in number and become concentrated in theory sections. Article structure varies primarily with historical time and also with research design (broadly, quantitative vs. qualitative) and the author's background. We link trends in article structure to the disciplinary development of organization studies and consider its distinctive trajectory relative to physical science.

  18. Identity theory and personality theory: mutual relevance.

    PubMed

    Stryker, Sheldon

    2007-12-01

    Some personality psychologists have found a structural symbolic interactionist frame and identity theory relevant to their work. This frame and theory, developed in sociology, are first reviewed. Emphasized in the review are a multiple-identity conception of self, identities as internalized expectations derived from roles embedded in organized networks of social interaction, and a view of social structures as facilitators in bringing people into networks or constraints in keeping them out. Subsequently, attention turns to a discussion of the mutual relevance of structural symbolic interactionism/identity theory and personality theory, looking to extensions of the current literature on these topics.

  19. [From the cell theory to the neuron theory].

    PubMed

    Tixier-Vidal, Andrée

    2010-01-01

    The relationship between the cell theory, formulated by Schwann (1839) and by Virchow (1855), and the neuron theory, as formulated by Waldeyer (1891) and by Cajal (1906), is discussed from a historical point of view. Both are the result of technical and conceptual progress. Both had to fight against the dominant dogma before being accepted. The cell theory opposed the school of Bichat, the vitalist philosophy and the positivist philosophy of Auguste Comte. The neuron theory, which is clearly based on the cell theory, was mostly concerned with the mode of interneuronal communication; it opposed the concept of contiguity to Golgi's concept of continuity. At present, the cell theory remains central in every field of biology. By contrast, the neuron theory, which until the middle of the 20th century opened the study of the nervous system to a necessary reductionist approach, is no longer central to recent developments in the neurosciences.

  20. Analytical theory of polymer-network-mediated interaction between colloidal particles

    PubMed Central

    Di Michele, Lorenzo; Zaccone, Alessio; Eiser, Erika

    2012-01-01

    Nanostructured materials based on colloidal particles embedded in a polymer network are used in a variety of applications ranging from nanocomposite rubbers to organic-inorganic hybrid solar cells. Further, polymer-network-mediated colloidal interactions are highly relevant to biological studies whereby polymer hydrogels are commonly employed to probe the mechanical response of living cells, which can determine their biological function in physiological environments. The performance of nanomaterials crucially relies upon the spatial organization of the colloidal particles within the polymer network that depends, in turn, on the effective interactions between the particles in the medium. Existing models based on nonlocal equilibrium thermodynamics fail to clarify the nature of these interactions, precluding the way toward the rational design of polymer-composite materials. In this article, we present a predictive analytical theory of these interactions based on a coarse-grained model for polymer networks. We apply the theory to the case of colloids partially embedded in cross-linked polymer substrates and clarify the origin of attractive interactions recently observed experimentally. Monte Carlo simulation results that quantitatively confirm the theoretical predictions are also presented. PMID:22679289
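
    As a flavor of the kind of Monte Carlo check mentioned at the end, a Metropolis loop sampling a single colloid-separation coordinate under an assumed effective potential is sketched below; the harmonic toy potential is a placeholder, not the paper's coarse-grained result.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def u_eff(r, eps=2.0, r0=1.0):
        """Assumed effective pair potential (toy harmonic well, kT units)."""
        return eps * (r - r0) ** 2

    # Metropolis sampling of the separation r under exp(-u_eff).
    r, samples = 1.0, []
    for _ in range(100_000):
        trial = r + rng.normal(scale=0.1)
        if trial > 0 and rng.random() < np.exp(u_eff(r) - u_eff(trial)):
            r = trial
        samples.append(r)

    # For this toy well the sampled mean should sit near r0 = 1.
    print(f"<r> = {np.mean(samples):.3f}, var = {np.var(samples):.4f}")
    ```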