Sample records for rigorous theoretical analysis

  1. Engineering education as a complex system

    NASA Astrophysics Data System (ADS)

    Gattie, David K.; Kellam, Nadia N.; Schramski, John R.; Walther, Joachim

    2011-12-01

    This paper presents a theoretical basis for cultivating engineering education as a complex system that will prepare students to think critically and make decisions with regard to poorly understood, ill-structured issues. Integral to this theoretical basis is a solution space construct developed and presented as a benchmark for evaluating problem-solving orientations that emerge within students' thinking as they progress through an engineering curriculum. It is proposed that the traditional engineering education model, while analytically rigorous, is characterised by properties that, although necessary, are insufficient for preparing students to address complex issues of the twenty-first century. A Synthesis and Design Studio model for engineering education is proposed, which maintains the necessary rigor of analysis within a uniquely complex yet sufficiently structured learning environment.

  2. The Dynamics of Germinal Centre Selection as Measured by Graph-Theoretical Analysis of Mutational Lineage Trees

    PubMed Central

    Dunn-Walters, Deborah K.; Belelovsky, Alex; Edelman, Hanna; Banerjee, Monica; Mehr, Ramit

    2002-01-01

    We have developed a rigorous graph-theoretical algorithm for quantifying the shape properties of mutational lineage trees. We show that information about the dynamics of hypermutation and antigen-driven clonal selection during the humoral immune response is contained in the shape of mutational lineage trees deduced from the responding clones. Age and tissue related differences in the selection process can be studied using this method. Thus, tree shape analysis can be used as a means of elucidating humoral immune response dynamics in various situations. PMID:15144020
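
    The shape measures in question are easy to compute once a lineage tree is stored as a graph. Below is a minimal sketch of this kind of analysis in Python; the tree, the particular measure set, and all names are illustrative rather than the authors' published algorithm.

    ```python
    # Hypothetical mutational lineage tree, stored as parent -> children.
    from collections import deque

    tree = {
        "germline": ["A"],
        "A": ["B", "C"],
        "B": ["D", "E"],
        "C": [], "D": [], "E": [],
    }

    def shape_measures(tree, root="germline"):
        """Compute a few illustrative graph-theoretic shape statistics."""
        depths, out_degrees = {root: 0}, []
        queue = deque([root])
        while queue:
            node = queue.popleft()
            children = tree.get(node, [])
            if children:                          # internal (branching) node
                out_degrees.append(len(children))
            for child in children:
                depths[child] = depths[node] + 1
                queue.append(child)
        leaves = [n for n in depths if not tree.get(n)]
        return {
            "n_nodes": len(depths),
            "n_leaves": len(leaves),
            "max_root_to_leaf": max(depths[l] for l in leaves),
            "mean_root_to_leaf": sum(depths[l] for l in leaves) / len(leaves),
            "mean_branching_degree": sum(out_degrees) / len(out_degrees),
        }

    print(shape_measures(tree))
    ```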

  3. Culturally Sensitive Risk Behavior Prevention Programs for African American Adolescents: A Systematic Analysis

    ERIC Educational Resources Information Center

    Metzger, Isha; Cooper, Shauna M.; Zarrett, Nicole; Flory, Kate

    2013-01-01

    The current review conducted a systematic assessment of culturally sensitive risk prevention programs for African American adolescents. Prevention programs meeting the inclusion and exclusion criteria were evaluated across several domains: (1) theoretical orientation and foundation; (2) methodological rigor; (3) level of cultural integration; (4)…

  4. Testing Theoretical Models of Magnetic Damping Using an Air Track

    ERIC Educational Resources Information Center

    Vidaurre, Ana; Riera, Jaime; Monsoriu, Juan A.; Gimenez, Marcos H.

    2008-01-01

    Magnetic braking is a long-established application of Lenz's law. A rigorous analysis of the laws governing this problem involves solving Maxwell's equations in a time-dependent situation. Approximate models have been developed to describe different experimental results related to this phenomenon. In this paper we present a new method for the…

  5. Professionalization of the Senior Chinese Officer Corps Trends and Implications

    DTIC Science & Technology

    1997-01-01

    The officers who retired were Ye Jianying, Nie Rongzhen, Xu Xiangqian, Wang Zhen, Song Renqiong, and Li Desheng. Of course, the political impact of...increased education level, functional specialization, and adherence to retirement norms. Li Cheng and Lynn White, in their 1993 Asian Survey article...making rigorous comparative analysis untenable. Second, Li and White do not place their results or analysis in any theoretical context. In

  6. Hollow-cylinder waveguide isolators for use at millimeter wavelengths

    NASA Technical Reports Server (NTRS)

    Kanda, M.; May, W. G.

    1974-01-01

    A semiconductor waveguide isolator consisting of a hollow column of semiconductor mounted coaxially in a circular waveguide in a longitudinal dc magnetic field is considered. An elementary physical analysis based on the excitation of plane waves in the guide and a more rigorous mode-matching analysis are presented. These theoretical predictions are compared with experimental results for an InSb isolator at 94 GHz and 75 K.

  7. Quantitative force measurements using frequency modulation atomic force microscopy—theoretical foundations

    NASA Astrophysics Data System (ADS)

    Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.

    2005-03-01

    Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.
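
    In the small-amplitude limit, the conversion that the paper formalizes reduces to a single integral: Δf/f0 = -(1/2k) dF/dz, so F(z) = 2k ∫_z^∞ (Δf/f0) dz'. A hedged numerical sketch of that limiting case follows; the cantilever parameters and frequency-shift data are synthetic, and the paper's full treatment covers arbitrary amplitudes and dissipative interactions as well.

    ```python
    import numpy as np

    k = 40.0            # cantilever stiffness (N/m), assumed
    f0 = 300e3          # resonance frequency (Hz), assumed
    z = np.linspace(0.3e-9, 10e-9, 2000)      # tip-sample distance (m)
    df = -f0 * 1e-30 / z**3                   # synthetic frequency shift (Hz)

    # F(z) = 2k * integral of df/f0 from z to "infinity" (far end of the data)
    omega = df / f0
    seg = 0.5 * (omega[:-1] + omega[1:]) * np.diff(z)    # trapezoid segments
    tail = np.concatenate([np.cumsum(seg[::-1])[::-1], [0.0]])
    F = 2.0 * k * tail
    print(F[0], "N at closest approach")
    ```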

  8. Rigorous derivation of the effective model describing a non-isothermal fluid flow in a vertical pipe filled with porous medium

    NASA Astrophysics Data System (ADS)

    Beneš, Michal; Pažanin, Igor

    2018-03-01

    This paper reports an analytical investigation of non-isothermal fluid flow in a thin (or long) vertical pipe filled with porous medium via asymptotic analysis. We assume that the fluid inside the pipe is cooled (or heated) by the surrounding medium and that the flow is governed by the prescribed pressure drop between pipe's ends. Starting from the dimensionless Darcy-Brinkman-Boussinesq system, we formally derive a macroscopic model describing the effective flow at small Brinkman-Darcy number. The asymptotic approximation is given by the explicit formulae for the velocity, pressure and temperature clearly acknowledging the effects of the cooling (heating) and porous structure. The theoretical error analysis is carried out to indicate the order of accuracy and to provide a rigorous justification of the effective model.

  9. Hollow-cylinder waveguide isolators for use at millimeter wavelengths

    NASA Technical Reports Server (NTRS)

    Kanda, M.; May, W. G.

    1974-01-01

    The device considered in this study is a semiconductor waveguide isolator consisting of a hollow column of a semiconductor mounted coaxially in a circular waveguide in a longitudinal dc magnetic field. An elementary and physical analysis based on the excitation of plane waves in the guide and a more rigorous mode-matching analysis (MMA) are presented. These theoretical predictions are compared with experimental results for an InSb isolator at 94 GHz and 75 K.

  10. Music-therapy analyzed through conceptual mapping

    NASA Astrophysics Data System (ADS)

    Martinez, Rodolfo; de la Fuente, Rebeca

    2002-11-01

    Conceptual maps have recently been employed as a learning tool, a modern study technique, and a new way of understanding intelligence, one that allows a strong theoretical frame of reference to be developed in order to test research hypotheses. This paper presents a music-therapy analysis based on this tool, producing a conceptual mapping network that ranges from magic to the rigor of the hard sciences.

  11. The response function of modulated grid Faraday cup plasma instruments

    NASA Technical Reports Server (NTRS)

    Barnett, A.; Olbert, S.

    1986-01-01

    Modulated grid Faraday cup plasma analyzers are a very useful tool for making in situ measurements of space plasmas. One of their great attributes is that their simplicity permits their angular response function to be calculated theoretically. An expression is derived for this response function by computing the trajectories of the charged particles inside the cup. The Voyager Plasma Science (PLS) experiment is used as a specific example. Two approximations to the rigorous response function useful for data analysis are discussed. The theoretical formulas were tested by multi-sensor analysis of solar wind data. The tests indicate that the formulas represent the true cup response function for all angles of incidence with a maximum error of only a few percent.

  12. David Crighton, 1942-2000: A Commentary on His Career and His Influence on Aeroacoustic Theory

    NASA Astrophysics Data System (ADS)

    Ffowcs Williams, John E.

    David Crighton, a greatly admired figure in fluid mechanics, Head of the Department of Applied Mathematics and Theoretical Physics at Cambridge, and Master of Jesus College, Cambridge, died at the peak of his career. He had made important contributions to the theory of waves generated by unsteady flow. Crighton's work was always characterized by the application of rigorous mathematical approximations to fluid mechanical idealizations of practically relevant problems. At the time of his death, he was certainly the most influential British applied mathematical figure, and his former collaborators and students form a strong school that continues his special style of mathematical application. Rigorous analysis of well-posed aeroacoustical problems was transformed by David Crighton.

  13. Academic Rigor or Academic Rigor Mortis? Supervising Dissertations Is Serious Business

    ERIC Educational Resources Information Center

    Wright, Robin Redmon

    2017-01-01

    This reflection considers the importance of and responsibility to graduate research supervision through an examination of a published dissertation that has had significant influence on the country's current immigration debate. The author exhorts both graduate students and adult education faculty to insist on clearly stated theoretical and…

  14. A Cluster-Randomized Trial of Restorative Practices: An Illustration to Spur High-Quality Research and Evaluation.

    PubMed

    Acosta, Joie D; Chinman, Matthew; Ebener, Patricia; Phillips, Andrea; Xenakis, Lea; Malone, Patrick S

    2016-01-01

    Restorative Practices in schools lack rigorous evaluation studies. As an example of rigorous school-based research, this paper describes the first randomized control trial of restorative practices to date, the Study of Restorative Practices. It is a 5-year, cluster-randomized controlled trial (RCT) of the Restorative Practices Intervention (RPI) in 14 middle schools in Maine to assess whether RPI impacts both positive developmental outcomes and problem behaviors and whether the effects persist during the transition from middle to high school. The two-year RPI intervention began in the 2014-2015 school year. The study's rationale and theoretical concerns are discussed along with methodological concerns including teacher professional development. The theoretical rationale and description of the methods from this study may be useful to others conducting rigorous research and evaluation in this area.

  15. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

    One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, lacking is a concrete experimental comparison of the hybrid framework with traditional fusion methods, to demonstrate and quantify this benefit. The goal of this research, therefore, is to provide a statistical analysis on the comparison of the accuracy and performance of hybrid network theory, with pure Bayesian and Fuzzy systems and an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo Simulation, in comparison to situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed, to quantify the benefit of hybrid inference to other fusion tools.

  16. Assessing Sensitivity of Early Head Start Study Findings to Manipulated Randomization Threats

    ERIC Educational Resources Information Center

    Green, Sheridan

    2013-01-01

    Increasing demands for design rigor and an emphasis on evidence-based practice on a national level indicated a need for further guidance related to successful implementation of randomized studies in education. Rigorous and meaningful experimental research and its conclusions help establish a valid theoretical and evidence base for educational…

  17. A Cluster-Randomized Trial of Restorative Practices: An Illustration to Spur High-Quality Research and Evaluation

    PubMed Central

    Acosta, Joie D.; Chinman, Matthew; Ebener, Patricia; Phillips, Andrea; Xenakis, Lea; Malone, Patrick S.

    2017-01-01

    Restorative Practices in schools lack rigorous evaluation studies. As an example of rigorous school-based research, this paper describes the first randomized control trial of restorative practices to date, the Study of Restorative Practices. It is a 5-year, cluster-randomized controlled trial (RCT) of the Restorative Practices Intervention (RPI) in 14 middle schools in Maine to assess whether RPI impacts both positive developmental outcomes and problem behaviors and whether the effects persist during the transition from middle to high school. The two-year RPI intervention began in the 2014–2015 school year. The study’s rationale and theoretical concerns are discussed along with methodological concerns including teacher professional development. The theoretical rationale and description of the methods from this study may be useful to others conducting rigorous research and evaluation in this area. PMID:28936104

  18. Providing a Theoretical Basis for Nanotoxicity Risk Analysis Departing from Traditional Physiologically-Based Pharmacokinetic (PBPK) Modeling

    DTIC Science & Technology

    2010-09-01

    estimation of total exposure at any toxicological endpoint in the body. This effort is a significant contribution as it highlights future research needs...rigorous modeling of the nanoparticle transport by including physico-chemical properties of engineered particles. Similarly, toxicological dose-response...exposure risks as compared to larger sized particles of the same material. Although the toxicology of a base material may be thoroughly defined, the

  19. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  20. Theory and applications of structured light single pixel imaging

    NASA Astrophysics Data System (ADS)

    Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.

    2018-02-01

    Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, the methods share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically-motivated reconstruction algorithms, however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework of single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, as well as provide a foundation for developing more-advanced methods of image acquisition and reconstruction. The proposed frame theoretic framework for single-pixel imaging results in improved noise robustness, decrease in acquisition time, and can take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single element detector can be developed to realize the full potential associated with single-pixel imaging.
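
    In the simplest linear setting, the frame view reduces to measuring inner products of the scene with a spanning set of patterns and inverting the resulting linear system. The sketch below uses random ±1 patterns and least-squares reconstruction; it illustrates the principle only, not the specific acquisition schemes analyzed in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 16 * 16                                # 16x16 scene, flattened
    x = np.zeros(n); x[100:110] = 1.0          # simple synthetic scene
    m = 2 * n                                  # oversampled set of patterns
    A = rng.choice([-1.0, 1.0], size=(m, n))   # random +/-1 patterns (a frame)
    y = A @ x + 0.01 * rng.standard_normal(m)  # single-pixel measurements

    x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares recovery
    print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
    ```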

  1. MUSIC-characterization of small scatterers for normal measurement data

    NASA Astrophysics Data System (ADS)

    Griesmaier, Roland; Hanke, Martin

    2009-07-01

    We investigate the reconstruction of the positions of a collection of small metallic objects buried beneath the ground from measurements of the vertical component of scattered fields corresponding to vertically polarized dipole excitations on a horizontal two-dimensional measurement device above the surface of the ground. A MUSIC reconstruction method for this problem has recently been proposed by Iakovleva et al (2007 IEEE Trans. Antennas Propag. 55 2598). In this paper, we give a rigorous theoretical justification of this method. To that end we prove a characterization of the positions of the scatterers in terms of the measurement data, applying an asymptotic analysis of the scattered fields. We present numerical results to illustrate our theoretical findings.
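
    The idea the paper justifies can be sketched schematically: assemble the multistatic response matrix, extract its noise subspace by SVD, and image with the reciprocal distance of a test Green's vector from the signal subspace. Everything below (scalar 2D Born model, receiver line, wavenumber, scatterer positions) is invented for illustration and is not the buried-object configuration of the paper.

    ```python
    import numpy as np
    from scipy.special import hankel1

    k = 2 * np.pi                               # wavenumber, assumed
    rx = np.column_stack([np.linspace(-2, 2, 20), np.full(20, 3.0)])
    scatterers = np.array([[0.5, 0.0], [-0.8, -0.4]])

    def g(points, z):                           # 2D free-space Green's vector
        r = np.linalg.norm(points - z, axis=1)
        return 0.25j * hankel1(0, k * r)

    # Born-approximation multistatic response matrix
    M = sum(np.outer(g(rx, s), g(rx, s)) for s in scatterers)
    U, S, _ = np.linalg.svd(M)
    noise = U[:, len(scatterers):]              # noise subspace

    # MUSIC pseudospectrum: large at true scatterer positions
    for z in (np.array([0.5, 0.0]), np.array([0.0, 1.0])):
        gz = g(rx, z); gz /= np.linalg.norm(gz)
        print(z, 1.0 / np.linalg.norm(noise.conj().T @ gz))
    ```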

  2. Limit analysis of hollow spheres or spheroids with Hill orthotropic matrix

    NASA Astrophysics Data System (ADS)

    Pastor, Franck; Pastor, Joseph; Kondo, Djimedo

    2012-03-01

    Recent theoretical studies of the literature are concerned by the hollow sphere or spheroid (confocal) problems with orthotropic Hill type matrix. They have been developed in the framework of the limit analysis kinematical approach by using very simple trial velocity fields. The present Note provides, through numerical upper and lower bounds, a rigorous assessment of the approximate criteria derived in these theoretical works. To this end, existing static 3D codes for a von Mises matrix have been easily extended to the orthotropic case. Conversely, instead of the non-obvious extension of the existing kinematic codes, a new original mixed approach has been elaborated on the basis of the plane strain structure formulation earlier developed by F. Pastor (2007). Indeed, such a formulation does not need the expressions of the unit dissipated powers. Interestingly, it delivers a numerical code better conditioned and notably more rapid than the previous one, while preserving the rigorous upper bound character of the corresponding numerical results. The efficiency of the whole approach is first demonstrated through comparisons of the results to the analytical upper bounds of Benzerga and Besson (2001) or Monchiet et al. (2008) in the case of spherical voids in the Hill matrix. Moreover, we provide upper and lower bounds results for the hollow spheroid with the Hill matrix which are compared to those of Monchiet et al. (2008).

  3. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    PubMed

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.

  4. Computational strategy for the solution of large strain nonlinear problems using the Wilkins explicit finite-difference approach

    NASA Technical Reports Server (NTRS)

    Hofmann, R.

    1980-01-01

    The STEALTH code system, which solves large strain, nonlinear continuum mechanics problems, was rigorously structured in both overall design and programming standards. The design is based on the theoretical elements of analysis while the programming standards attempt to establish a parallelism between physical theory, programming structure, and documentation. These features have made it easy to maintain, modify, and transport the codes. It has also guaranteed users a high level of quality control and quality assurance.

  5. Combinatorial compatibility as habit-controlling factor in lysozyme crystallization I. Monomeric and tetrameric F faces derived graph-theoretically

    NASA Astrophysics Data System (ADS)

    Strom, C. S.; Bennema, P.

    1997-03-01

    A series of two articles discusses possible morphological evidence for oligomerization of growth units in the crystallization of tetragonal lysozyme, based on a rigorous graph-theoretic derivation of the F faces. In the first study (Part I), the growth layers are derived as valid networks satisfying the conditions of F slices in the context of the PBC theory using the graph-theoretic method implemented in program FFACE [C.S. Strom, Z. Krist. 172 (1985) 11]. The analysis is performed in monomeric and alternative tetrameric and octameric formulations of the unit cell, assuming tetramer formation according to the strongest bonds. F (flat) slices with thickness Rd_hkl (1/2 < R ≤ 1) are predicted theoretically in the forms {1 1 0}, {0 1 1}, {1 1 1}. The relevant energies are established in the broken bond model. The relation between possible oligomeric specifications of the unit cell and combinatorially feasible F slice compositions in these orientations is explored.

  6. Tight finite-key analysis for quantum cryptography

    PubMed Central

    Tomamichel, Marco; Lim, Charles Ci Wen; Gisin, Nicolas; Renner, Renato

    2012-01-01

    Despite enormous theoretical and experimental progress in quantum cryptography, the security of most current implementations of quantum key distribution is still not rigorously established. One significant problem is that the security of the final key strongly depends on the number, M, of signals exchanged between the legitimate parties. Yet, existing security proofs are often only valid asymptotically, for unrealistically large values of M. Another challenge is that most security proofs are very sensitive to small differences between the physical devices used by the protocol and the theoretical model used to describe them. Here we show that these gaps between theory and experiment can be simultaneously overcome by using a recently developed proof technique based on the uncertainty relation for smooth entropies. PMID:22252558

  7. Tight finite-key analysis for quantum cryptography.

    PubMed

    Tomamichel, Marco; Lim, Charles Ci Wen; Gisin, Nicolas; Renner, Renato

    2012-01-17

    Despite enormous theoretical and experimental progress in quantum cryptography, the security of most current implementations of quantum key distribution is still not rigorously established. One significant problem is that the security of the final key strongly depends on the number, M, of signals exchanged between the legitimate parties. Yet, existing security proofs are often only valid asymptotically, for unrealistically large values of M. Another challenge is that most security proofs are very sensitive to small differences between the physical devices used by the protocol and the theoretical model used to describe them. Here we show that these gaps between theory and experiment can be simultaneously overcome by using a recently developed proof technique based on the uncertainty relation for smooth entropies.

  8. Application of THz Vibrational Spectroscopy to Molecular Characterization and the Theoretical Fundamentals: An Illustration Using Saccharide Molecules.

    PubMed

    Zhang, Feng; Wang, Houng-Wei; Tominaga, Keisuke; Hayashi, Michitoshi; Hasunuma, Tomohisa; Kondo, Akihiko

    2017-02-01

    This work illustrates several theoretical fundamentals for the application of THz vibrational spectroscopy to molecular characterization in the solid state using two different types of saccharide systems as examples. Four subjects have been specifically addressed: (1) the qualitative differences in the molecular vibrational signatures monitored by THz and mid-IR vibrational spectroscopy; (2) the selection rules for THz vibrational spectroscopy as applied to crystalline and amorphous systems; (3) a normal mode simulation, using α-l-xylose as an example; and (4) a rigorous mode analysis to quantify the percentage contributions of the intermolecular and intramolecular vibrations to the normal mode of interest.

  9. Crystal Structure and Theoretical Analysis of Green Gold Au30(S-tBu)18 Nanomolecules and Their Relation to Au30S(S-tBu)18

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dass, Amala; Jones, Tanya; Rambukwella, Milan

    We report the complete X-ray crystallographic structure as determined through single crystal X-ray diffraction and a thorough theoretical analysis of the green gold Au30(S-tBu)18. While the structure of Au30S(S-tBu)18 with 19 sulfur atoms has been reported, the crystal structure of Au30(S-tBu)18 without the μ3-sulfur has remained elusive until now, though matrix-assisted laser desorption ionization mass spectrometry (MALDI-MS) and electrospray ionization mass spectrometry (ESI-MS) data unequivocally show its presence in abundance. The Au30(S-tBu)18 nanomolecule is not only distinct in its crystal structure but has unique temperature-dependent optical properties. Structure determination allows a rigorous comparison and an excellent agreement with theoretical predictions of structure, stability, and optical response.

  10. QTest: Quantitative Testing of Theories of Binary Choice

    PubMed Central

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  11. Nucleation and Growth Kinetics from LaMer Burst Data.

    PubMed

    Chu, Daniel B K; Owen, Jonathan S; Peters, Baron

    2017-10-12

    In LaMer burst nucleation, the individual nucleation events happen en masse, quasi-simultaneously, and at nearly identical homogeneous conditions. These properties make LaMer burst nucleation important for applications that require monodispersed particles and also for theoretical analyses. Sugimoto and co-workers predicted that the number of nuclei generated during a LaMer burst depends only on the solute supply rate and the growth rate, independent of the nucleation kinetics. Some experiments confirm that solute supply kinetics control the number of nuclei, but flaws in the original theoretical analysis raise questions about the predicted roles of growth and nucleation kinetics. We provide a rigorous analysis of the coupled equations that govern concentrations of nuclei and solutes. Our analysis confirms that the number of nuclei is largely determined by the solute supply and growth rates, but our predicted relationship differs from that of Sugimoto et al. Moreover, we find that additional nucleus size dependent corrections should emerge in systems with slow growth kinetics. Finally, we show how the nucleation kinetics determine the particle size distribution. We suggest that measured particle size distributions might therefore provide ways to test theoretical models of homogeneous nucleation kinetics.
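
    The coupled equations in question can be caricatured with two ODEs: solute supplied at a constant rate Q, consumed by nucleation (steeply supersaturation-dependent) and by growth on existing particles. The rate constants and exponent below are invented; this is a schematic of the structure of such a model, not the paper's rigorous analysis.

    ```python
    from scipy.integrate import solve_ivp

    Q, k_nuc, k_grow, p = 1.0, 5.0, 50.0, 10    # assumed parameters

    def rhs(t, y):
        s, N = y                                # supersaturation, nuclei count
        nucleation = k_nuc * s**p               # steep dependence on s
        growth = k_grow * s * N                 # solute uptake by particles
        return [Q - nucleation - growth, nucleation]

    sol = solve_ivp(rhs, (0.0, 20.0), [0.0, 0.0], max_step=0.01)
    print("final number of nuclei:", sol.y[1, -1])
    ```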

  12. Approximation Methods for Inverse Problems Governed by Nonlinear Parabolic Systems

    DTIC Science & Technology

    1999-12-17

    We present a rigorous theoretical framework for approximation of nonlinear parabolic systems with delays in the context of inverse least squares...numerical results demonstrating the convergence are given for a model of dioxin uptake and elimination in a distributed liver model that is a special case of the general theoretical framework.

  13. Development of a theoretical framework for analyzing cerebrospinal fluid dynamics

    PubMed Central

    Cohen, Benjamin; Voorhees, Abram; Vedel, Søren; Wei, Timothy

    2009-01-01

    Background To date hydrocephalus researchers acknowledge the need for rigorous but utilitarian fluid mechanics understanding and methodologies in studying normal and hydrocephalic intracranial dynamics. Pressure volume models and electric circuit analogs introduced pressure into volume conservation; but control volume analysis enforces independent conditions on pressure and volume. Previously, utilization of clinical measurements has been limited to understanding of the relative amplitude and timing of flow, volume and pressure waveforms; qualitative approaches without a clear framework for meaningful quantitative comparison. Methods Control volume analysis is presented to introduce the reader to the theoretical background of this foundational fluid mechanics technique for application to general control volumes. This approach is able to directly incorporate the diverse measurements obtained by clinicians to better elucidate intracranial dynamics and progression to disorder. Results Several examples of meaningful intracranial control volumes and the particular measurement sets needed for the analysis are discussed. Conclusion Control volume analysis provides a framework to guide the type and location of measurements and also a way to interpret the resulting data within a fundamental fluid physics analysis. PMID:19772652

  14. Response function of modulated grid Faraday cup plasma instruments

    NASA Technical Reports Server (NTRS)

    Barnett, A.; Olbert, S.

    1986-01-01

    Modulated grid Faraday cup plasma analyzers are a very useful tool for making in situ measurements of space plasmas. One of their great attributes is that their simplicity permits their angular response function to be calculated theoretically. An expression is derived for this response function by computing the trajectories of the charged particles inside the cup. The Voyager plasma science experiment is used as a specific example. Two approximations to the rigorous response function useful for data analysis are discussed. Multisensor analysis of solar wind data indicates that the formulas represent the true cup response function for all angles of incidence with a maximum error of only a few percent.

  15. A Theoretical Framework for Lagrangian Descriptors

    NASA Astrophysics Data System (ADS)

    Lopesino, C.; Balibrea-Iniesta, F.; García-Garrido, V. J.; Wiggins, S.; Mancho, A. M.

    This paper provides a theoretical background for Lagrangian Descriptors (LDs). The goal of achieving rigorous proofs that justify the ability of LDs to detect invariant manifolds is simplified by introducing an alternative definition for LDs. The definition is stated for n-dimensional systems with general time dependence, however we rigorously prove that this method reveals the stable and unstable manifolds of hyperbolic points in four particular 2D cases: a hyperbolic saddle point for linear autonomous systems, a hyperbolic saddle point for nonlinear autonomous systems, a hyperbolic saddle point for linear nonautonomous systems and a hyperbolic saddle point for nonlinear nonautonomous systems. We also discuss further rigorous results which show the ability of LDs to highlight additional invariants sets, such as n-tori. These results are just a simple extension of the ergodic partition theory which we illustrate by applying this methodology to well-known examples, such as the planar field of the harmonic oscillator and the 3D ABC flow. Finally, we provide a thorough discussion on the requirement of the objectivity (frame-invariance) property for tools designed to reveal phase space structures and their implications for Lagrangian descriptors.
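
    The alternative definition amounts to accumulating a positive quantity, here the trajectory speed, along orbits over a finite time horizon. Below is a minimal sketch for the linear saddle x' = x, y' = -y, one of the four cases proved in the paper; the grid, horizon, and use of the exact flow are illustrative choices.

    ```python
    import numpy as np

    tau, dt = 2.0, 0.01                      # horizon and time step, assumed
    xs = np.linspace(-1, 1, 201)
    X0, Y0 = np.meshgrid(xs, xs)

    M = np.zeros_like(X0)
    for t in np.arange(-tau, tau, dt):       # backward and forward in time
        x, y = X0 * np.exp(t), Y0 * np.exp(-t)   # exact flow of the saddle
        M += np.sqrt(x**2 + y**2) * dt       # |v| = |(x, -y)| along orbits

    # Singular features of M align with the stable/unstable manifolds (axes)
    print("LD at origin:", M[100, 100], "| LD at corner:", M[-1, -1])
    ```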

  16. Photons Revisited

    NASA Astrophysics Data System (ADS)

    Batic, Matej; Begalli, Marcia; Han, Min Cheol; Hauf, Steffen; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Han Sung; Grazia Pia, Maria; Saracco, Paolo; Weidenspointner, Georg

    2014-06-01

    A systematic review of methods and data for the Monte Carlo simulation of photon interactions is in progress: it concerns a wide set of theoretical modeling approaches and data libraries available for this purpose. Models and data libraries are assessed quantitatively with respect to an extensive collection of experimental measurements documented in the literature to determine their accuracy; this evaluation exploits rigorous statistical analysis methods. The computational performance of the associated modeling algorithms is evaluated as well. An overview of the assessment of photon interaction models and results of the experimental validation are presented.

  17. Development of a software safety process and a case study of its use

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1993-01-01

    The goal of this research is to continue the development of a comprehensive approach to software safety and to evaluate the approach with a case study. The case study is a major part of the project, and it involves the analysis of a specific safety-critical system from the medical equipment domain. The particular application being used was selected because of the availability of a suitable candidate system. We consider the results to be generally applicable and in no way particularly limited by the domain. The research is concentrating on issues raised by the specification and verification phases of the software lifecycle since they are central to our previously-developed rigorous definitions of software safety. The theoretical research is based on our framework of definitions for software safety. In the area of specification, the main topics being investigated are the development of techniques for building system fault trees that correctly incorporate software issues and the development of rigorous techniques for the preparation of software safety specifications. The research results are documented. Another area of theoretical investigation is the development of verification methods tailored to the characteristics of safety requirements. Verification of the correct implementation of the safety specification is central to the goal of establishing safe software. The empirical component of this research is focusing on a case study in order to provide detailed characterizations of the issues as they appear in practice, and to provide a testbed for the evaluation of various existing and new theoretical results, tools, and techniques. The Magnetic Stereotaxis System is summarized.

  18. Educational Technology: A Theoretical Discussion

    ERIC Educational Resources Information Center

    Andrews, Barbara; Hakken, David

    1977-01-01

    Views educational technology in relation to the pattern of technological change, argues that the new technology must be rigorously evaluated, and suggests it is best understood as a business approach to education. (DD)

  19. [Preliminary application of content analysis to qualitative nursing data].

    PubMed

    Liang, Shu-Yuan; Chuang, Yeu-Hui; Wu, Shu-Fang

    2012-10-01

    Content analysis is a methodology for objectively and systematically studying the content of communication in various formats. Content analysis in nursing research and nursing education is called qualitative content analysis. Qualitative content analysis is frequently applied to nursing research, as it allows researchers to determine categories inductively and deductively. This article examines qualitative content analysis in nursing research from theoretical and practical perspectives. We first describe how content analysis concepts such as unit of analysis, meaning unit, code, category, and theme are used. Next, we describe the basic steps involved in using content analysis, including data preparation, data familiarization, analysis unit identification, creating tentative coding categories, category refinement, and establishing category integrity. Finally, this paper introduces the concept of content analysis rigor, including dependability, confirmability, credibility, and transferability. This article elucidates the content analysis method in order to help professionals conduct systematic research that generates data that are informative and useful in practical application.

  20. Effects of rigor status during high-pressure processing on the physical qualities of farm-raised abalone (Haliotis rufescens).

    PubMed

    Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I

    2015-01-01

    High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P < 0.05) less firm and chewy than pre-rigor processed irrespective of muscle type, processing time, or pressure. L values increased with pressure to 68.9 at 300 MPa for pre-rigor processed foot, 73.8 for post-rigor processed foot, 90.9 for pre-rigor processed adductor, and 89.0 for post-rigor processed adductor. Scanning electron microscopy images showed fraying of collagen fibers in processed adductor, but did not show pressure-induced compaction of the foot myofibrils. Post-rigor processed abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value.

  1. Interaction of surface plasmon polaritons in heavily doped GaN microstructures with terahertz radiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melentev, G. A., E-mail: gamelen@spbstu.ru; Shalygin, V. A.; Vorobjev, L. E.

    2016-03-07

    We present the results of experimental and theoretical studies of the surface plasmon polariton excitations in heavily doped GaN epitaxial layers. Reflection and emission of radiation in the frequency range of 2–20 THz including the Reststrahlen band were investigated for samples with grating etched on the sample surface, as well as for samples with flat surface. The reflectivity spectrum for p-polarized radiation measured for the sample with the surface-relief grating demonstrates a set of resonances associated with excitations of different surface plasmon polariton modes. Spectral peculiarities due to the diffraction effect have been also revealed. The characteristic features of the reflectivity spectrum, namely, frequencies, amplitudes, and widths of the resonance dips, are well described theoretically by a modified technique of rigorous coupled-wave analysis of Maxwell equations. The emissivity spectra of the samples were measured under epilayer temperature modulation by pulsed electric field. The emissivity spectrum of the sample with surface-relief grating shows emission peaks in the frequency ranges corresponding to the decay of the surface plasmon polariton modes. Theoretical analysis based on the blackbody-like radiation theory well describes the main peculiarities of the observed THz emission.

  2. Human Rights and the Excess of Identity: A Legal and Theoretical Inquiry into the Notion of Identity in Strasbourg Case Law.

    PubMed

    Al Tamimi, Yussef

    2018-06-01

    Identity is a central theme in contemporary politics, but legal academia lacks a rigorous analysis of this concept. The aim of this article is twofold: (i) firstly, it aims to reveal presumptions on identity in human rights law by mapping how the European Court of Human Rights approaches identity and (ii) secondly, it seeks to analyse these presumptions using theoretical insights on identity. By merging legal and theoretical analysis, this article contributes a reading of the Court's case law which suggests that the tension between the political and apolitical is visible as a common thread in the Court's use of identity. In case law concerning paternity, the Court appears to hold a specific view of what is presented as an unquestionable part of identity. This ostensibly pre-political notion of identity becomes untenable in cases where the nature of an identity feature, such as the headscarf, is contended or a minority has adopted a national identity that conflicts with the majoritarian national identity. The Court's approach to identity in such cases reflects a paradox that is inherent to identity; identity is personal while simultaneously constituted and shaped by overarching power mechanisms.

  3. Testability of evolutionary game dynamics based on experimental economics data

    NASA Astrophysics Data System (ADS)

    Wang, Yijia; Chen, Xiaojie; Wang, Zhijian

    2017-11-01

    Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.

  4. Theoretical Evaluation of the Transient Response of Constant Head and Constant Flow-Rate Permeability Tests

    USGS Publications Warehouse

    Zhang, M.; Takahashi, M.; Morin, R.H.; Esaki, T.

    1998-01-01

    A theoretical analysis is presented that compares the response characteristics of the constant-head and the constant-flow-rate (flow pump) laboratory techniques for quantifying the hydraulic properties of geologic materials having permeabilities less than 10⁻¹⁰ m/s. Rigorous analytical solutions that describe the transient distributions of hydraulic gradient within a specimen are developed, and equations are derived for each method. Expressions simulating the inflow and outflow rates across the specimen boundaries during a constant-head permeability test are also presented. These solutions illustrate the advantages and disadvantages of each method, including insights into measurement accuracy and the validity of using Darcy's law under certain conditions. The resulting observations offer practical considerations in the selection of an appropriate laboratory test method for the reliable measurement of permeability in low-permeability geologic materials.
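
    The transient both techniques must contend with is 1-D diffusion of hydraulic head through the specimen. Below is a sketch of the classic series solution for a unit step in head at x = L, with h(0,t) = 0 and h(x,0) = 0; the specimen length and hydraulic diffusivity are illustrative, and this is the textbook special case rather than the specific solutions derived in the paper.

    ```python
    import numpy as np

    def head(x, t, L=0.1, c=1e-7, n_terms=200):
        """h(x,t)/h0 for a unit head step at x = L; c is hydraulic diffusivity."""
        h = x / L                               # steady-state linear profile
        for n in range(1, n_terms + 1):
            h += (2.0 / np.pi) * ((-1) ** n / n) * np.sin(n * np.pi * x / L) \
                 * np.exp(-((n * np.pi / L) ** 2) * c * t)
        return h

    x = np.linspace(0.0, 0.1, 5)
    for t in (60.0, 600.0, 6000.0):             # seconds
        print(t, np.round(head(x, t), 3))       # relaxes to the linear profile
    ```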

  5. Performance Analysis of Local Ensemble Kalman Filter

    NASA Astrophysics Data System (ADS)

    Tong, Xin T.

    2018-03-01

    Ensemble Kalman filter (EnKF) is an important data assimilation method for high-dimensional geophysical systems. Efficient implementation of EnKF in practice often involves the localization technique, which updates each component using only information within a local radius. This paper rigorously analyzes the local EnKF (LEnKF) for linear systems and shows that the filter error can be dominated by the ensemble covariance, as long as (1) the sample size exceeds the logarithmic of state dimension and a constant that depends only on the local radius; (2) the forecast covariance matrix admits a stable localized structure. In particular, this indicates that with small system and observation noises, the filter error will be accurate in long time even if the initialization is not. The analysis also reveals an intrinsic inconsistency caused by the localization technique, and a stable localized structure is necessary to control this inconsistency. While this structure is usually taken for granted for the operation of LEnKF, it can also be rigorously proved for linear systems with sparse local observations and weak local interactions. These theoretical results are also validated by numerical implementation of LEnKF on a simple stochastic turbulence in two dynamical regimes.
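
    The localization step being analyzed can be illustrated with a toy analysis update: taper the sample forecast covariance with a distance-based mask (here a hard cutoff) before forming the Kalman gain. Dimensions, radius, and the perturbed-observation update are invented for the sketch and do not reproduce the paper's exact LEnKF setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    d, n_ens, radius = 40, 20, 5
    ens = rng.standard_normal((d, n_ens))        # forecast ensemble (d x Ne)

    H = np.eye(d)[::2]                           # observe every other component
    R = 0.1 * np.eye(H.shape[0])
    y = rng.standard_normal(H.shape[0])          # synthetic observations

    A = ens - ens.mean(axis=1, keepdims=True)
    Pf = A @ A.T / (n_ens - 1)                   # sample forecast covariance
    i = np.arange(d)
    rho = (np.abs(i[:, None] - i[None, :]) <= radius).astype(float)
    Pl = rho * Pf                                # Schur-product localization

    K = Pl @ H.T @ np.linalg.inv(H @ Pl @ H.T + R)   # localized Kalman gain
    for j in range(n_ens):                       # perturbed-observation update
        yj = y + rng.multivariate_normal(np.zeros(len(y)), R)
        ens[:, j] += K @ (yj - H @ ens[:, j])
    print("analysis ensemble spread:", ens.std())
    ```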

  6. Rigorous modal analysis of plasmonic nanoresonators

    NASA Astrophysics Data System (ADS)

    Yan, Wei; Faggiani, Rémi; Lalanne, Philippe

    2018-05-01

    The specificity of modal-expansion formalisms is their capabilities to model the physical properties in the natural resonance-state basis of the system in question, leading to a transparent interpretation of the numerical results. In electromagnetism, modal-expansion formalisms are routinely used for optical waveguides. In contrast, they are much less mature for analyzing open non-Hermitian systems, such as micro- and nanoresonators. Here, by accounting for material dispersion with auxiliary fields, we considerably extend the capabilities of these formalisms, in terms of computational effectiveness, number of states handled, and range of validity. We implement an efficient finite-element solver to compute the resonance states, and derive closed-form expressions of the modal excitation coefficients for reconstructing the scattered fields. Together, these two achievements allow us to perform rigorous modal analysis of complicated plasmonic resonators, being not limited to a few resonance states, with straightforward physical interpretations and remarkable computation speeds. We particularly show that, when the number of states retained in the expansion increases, convergence toward accurate predictions is achieved, offering a solid theoretical foundation for analyzing important issues, e.g., Fano interference, quenching, and coupling with the continuum, which are critical in nanophotonic research.

  7. Aerial photography flight quality assessment with GPS/INS and DEM data

    NASA Astrophysics Data System (ADS)

    Zhao, Haitao; Zhang, Bing; Shang, Jiali; Liu, Jiangui; Li, Dong; Chen, Yanyan; Zuo, Zhengli; Chen, Zhengchao

    2018-01-01

    The flight altitude, ground coverage, photo overlap, and other acquisition specifications of an aerial photography flight mission directly affect the quality and accuracy of the subsequent mapping tasks. To ensure smooth post-flight data processing and fulfill the pre-defined mapping accuracy, flight quality assessments should be carried out in time. This paper presents a novel and rigorous approach for flight quality evaluation of frame cameras with GPS/INS data and DEM, using geometric calculation rather than image analysis as in the conventional methods. This new approach is based mainly on the collinearity equations, in which the accuracy of a set of flight quality indicators is derived through a rigorous error propagation model and validated with scenario data. Theoretical analysis and practical flight test of an aerial photography mission using an UltraCamXp camera showed that the calculated photo overlap is accurate enough for flight quality assessment of 5 cm ground sample distance image, using the SRTMGL3 DEM and the POSAV510 GPS/INS data. An even better overlap accuracy could be achieved for coarser-resolution aerial photography. With this new approach, the flight quality evaluation can be conducted on site right after landing, providing accurate and timely information for decision making.
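
    For flat terrain, the geometric core of the overlap indicator reduces to comparing the exposure base with the ground footprint of a single frame. A back-of-the-envelope sketch under that assumption follows (the approach described above uses the full collinearity equations with GPS/INS attitude and DEM heights; the camera numbers here are invented).

    ```python
    def forward_overlap(height_m, base_m, sensor_mm, focal_mm):
        """Flat-terrain forward overlap from flight and camera geometry."""
        footprint = height_m * (sensor_mm / focal_mm)   # ground coverage (m)
        return 1.0 - base_m / footprint

    # e.g. 1000 m above ground, 400 m between exposures, 68 mm sensor, 100 mm lens
    print(f"{forward_overlap(1000.0, 400.0, 68.0, 100.0):.0%}")
    ```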

  8. Complex dynamics of an SEIR epidemic model with saturated incidence rate and treatment

    NASA Astrophysics Data System (ADS)

    Khan, Muhammad Altaf; Khan, Yasir; Islam, Saeed

    2018-03-01

    In this paper, we describe the dynamics of an SEIR epidemic model with saturated incidence, treatment function, and optimal control. Rigorous mathematical results have been established for the model. The stability analysis of the model is investigated and found that the model is locally asymptotically stable when R0 < 1. The model is locally as well as globally asymptotically stable at endemic equilibrium when R0 > 1. The proposed model may possess a backward bifurcation. The optimal control problem is designed and obtained their necessary results. Numerical results have been presented for justification of theoretical results.
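
    The saturated incidence term βSI/(1 + αI) is straightforward to integrate numerically. A simplified sketch follows, omitting the treatment function and using invented parameter values; the R0 printed is the standard expression for an SEIR model with demography.

    ```python
    from scipy.integrate import solve_ivp

    beta, alpha, sigma, gamma, mu = 0.6, 0.1, 0.2, 0.1, 0.01  # assumed values

    def seir(t, y):
        S, E, I, R = y
        incidence = beta * S * I / (1.0 + alpha * I)   # saturated incidence
        return [mu - incidence - mu * S,
                incidence - (sigma + mu) * E,
                sigma * E - (gamma + mu) * I,
                gamma * I - mu * R]

    sol = solve_ivp(seir, (0.0, 400.0), [0.99, 0.0, 0.01, 0.0], max_step=0.5)
    R0 = beta * sigma / ((sigma + mu) * (gamma + mu))  # basic reproduction number
    print("R0 =", round(R0, 2), "| infectious fraction at t=400:", sol.y[2, -1])
    ```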

  9. Scattering General Analysis; Análisis General de la Dispersión

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tixaire, A.G.

    1962-01-01

    A definition of scattering states is given. It is shown that such states must belong to the absolutely continuous part of the spectrum of the total Hamiltonian whenever scattering systems are considered. Such embedding may be proper unless the quantum system is physically admissible. The Møller wave operators are analyzed using Abel- and Cesàro-limit theoretical arguments. Von Neumann's ergodic theorem is partially generalized. A rigorous derivation of the Gell-Mann and Goldberger and Lippmann and Schwinger equations is obtained by making use of results on spectral theory, wave functions, and eigendifferential concepts.

  10. Three-port beam splitter of a binary fused-silica grating.

    PubMed

    Feng, Jijun; Zhou, Changhe; Wang, Bo; Zheng, Jiangjun; Jia, Wei; Cao, Hongchao; Lv, Peng

    2008-12-10

    A deep-etched polarization-independent binary fused-silica phase grating as a three-port beam splitter is designed and manufactured. The grating profile is optimized by use of the rigorous coupled-wave analysis around the 785 nm wavelength. The physical explanation of the grating is illustrated by the modal method. Simple analytical expressions of the diffraction efficiencies and modal guidelines for the three-port beam splitter grating design are given. Holographic recording technology and inductively coupled plasma etching are used to manufacture the fused-silica grating. Experimental results are in good agreement with the theoretical values.

  11. Reduced and Validated Kinetic Mechanisms for Hydrogen-CO-Air Combustion in Gas Turbines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yiguang Ju; Frederick Dryer

    2009-02-07

    Rigorous experimental, theoretical, and numerical investigation of various issues relevant to the development of reduced, validated kinetic mechanisms for synthetic gas combustion in gas turbines was carried out - including the construction of new radiation models for combusting flows, improvement of flame speed measurement techniques, measurements and chemical kinetic analysis of H2/CO/CO2/O2/diluent mixtures, revision of the H2/O2 kinetic model to improve flame speed prediction capabilities, and development of a multi-time scale algorithm to improve computational efficiency in reacting flow simulations.

  12. Rigor or Reliability and Validity in Qualitative Research: Perspectives, Strategies, Reconceptualization, and Recommendations.

    PubMed

    Cypress, Brigitte S

    Issues are still raised even now in the 21st century by the persistent concern with achieving rigor in qualitative research. There is also a continuing debate about the analogous terms reliability and validity in naturalistic inquiries as opposed to quantitative investigations. This article presents the concept of rigor in qualitative research using a phenomenological study as an exemplar to further illustrate the process. Elaborating on epistemological and theoretical conceptualizations by Lincoln and Guba, strategies congruent with qualitative perspective for ensuring validity to establish the credibility of the study are described. A synthesis of the historical development of validity criteria evident in the literature during the years is explored. Recommendations are made for use of the term rigor instead of trustworthiness and the reconceptualization and renewed use of the concept of reliability and validity in qualitative research, that strategies for ensuring rigor must be built into the qualitative research process rather than evaluated only after the inquiry, and that qualitative researchers and students alike must be proactive and take responsibility in ensuring the rigor of a research study. The insights garnered here will move novice researchers and doctoral students to a better conceptual grasp of the complexity of reliability and validity and its ramifications for qualitative inquiry.

  13. A rigorous and simpler method of image charges

    NASA Astrophysics Data System (ADS)

    Ladera, C. L.; Donoso, G.

    2016-07-01

    The method of image charges relies on the proven uniqueness of the solution of the Laplace differential equation for an electrostatic potential which satisfies some specified boundary conditions. Granted that uniqueness, the method of images is rightly described as nothing but shrewdly guessing which image charges are to be placed, and where, to solve the given electrostatics problem. Here we present an alternative image charges method that is based not on guessing but on rigorous and simpler theoretical grounds, namely the constant potential inside any conductor and the application of powerful geometric symmetries. The aforementioned required uniqueness and, more importantly, the guessing are therefore both altogether dispensed with. Our two new theoretical fundaments also allow the image charges method to be introduced in earlier physics courses for engineering and science students, instead of its present and usual introduction in electromagnetic theory courses that demand familiarity with the Laplace differential equation and its boundary conditions.
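
    For concreteness (an editorial addition, using only textbook results): the two canonical configurations that any image-charge approach must reproduce are the grounded plane and the grounded sphere. For a point charge q at distance d above an infinite grounded plane, the image is -q at the mirror point. For a point charge q at distance d (> R) from the center of a grounded sphere of radius R, the image is

        q' = -q R / d    placed at    d' = R^2 / d

    from the center, which makes the potential vanish on the sphere, consistent with the constant-potential condition emphasized above.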

  14. Systemic Analysis Approaches for Air Transportation

    NASA Technical Reports Server (NTRS)

    Conway, Sheila

    2005-01-01

    Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling of safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, let alone predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.
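
    As a hedged sketch of the network-analysis side mentioned above (illustrative only; the node names and routes below are invented, not real airport data), Python's networkx library can probe the structural fragility of a route network:

        import networkx as nx

        # Toy route network; names are placeholders, not real airports.
        edges = [("HUB", "A"), ("HUB", "B"), ("HUB", "C"),
                 ("A", "B"), ("C", "D"), ("D", "E")]
        G = nx.Graph(edges)

        # Betweenness centrality flags the nodes the network depends on most.
        bc = nx.betweenness_centrality(G)
        print(sorted(bc.items(), key=lambda kv: -kv[1]))

        # Crude robustness probe: does losing the busiest node disconnect the system?
        hub = max(bc, key=bc.get)
        H = G.copy()
        H.remove_node(hub)
        print(f"connected after removing {hub}?", nx.is_connected(H))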

  15. The construction of arbitrary order ERKN methods based on group theory for solving oscillatory Hamiltonian systems with applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mei, Lijie, E-mail: bxhanm@126.com; Wu, Xinyuan, E-mail: xywu@nju.edu.cn

    In general, extended Runge–Kutta–Nyström (ERKN) methods are more effective than traditional Runge–Kutta–Nyström (RKN) methods in dealing with oscillatory Hamiltonian systems. However, the theoretical analysis for ERKN methods, such as the order conditions, the symplectic conditions and the symmetric conditions, becomes much more complicated than that for RKN methods. Therefore, it is a bottleneck to construct high-order ERKN methods efficiently. In this paper, we first establish the ERKN group Ω for ERKN methods and the RKN group G for RKN methods, respectively. We then rigorously show that ERKN methods are a natural extension of RKN methods, that is, there exists an epimorphism η of the ERKN group Ω onto the RKN group G. This epimorphism gives a global insight into the structure of the ERKN group by the analysis of its kernel and the corresponding RKN group G. Meanwhile, we establish a particular mapping φ of G into Ω so that each image element is an ideal representative element of the congruence class in Ω. Furthermore, an elementary theoretical analysis shows that this map φ can preserve many structure-preserving properties, such as the order, the symmetry and the symplecticity. From the epimorphism η together with its section φ, we may gain knowledge about the structure of the ERKN group Ω via the RKN group G. In light of the theoretical analysis of this paper, we obtain high-order structure-preserving ERKN methods in an effective way for solving oscillatory Hamiltonian systems. Numerical experiments are carried out and the results are very promising, which strongly support our theoretical analysis presented in this paper.
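
    For readers unfamiliar with the ERKN setting, a brief editorial note (following the common convention in the ERKN literature, which may differ in detail from the authors'): for the oscillatory system q'' + M q = f(q), ERKN methods are built on the matrix-valued functions

        \phi_j(V) = \sum_{k \ge 0} (-1)^k V^k / (2k + j)! ,    j = 0, 1, ...,

    so that the unperturbed flow is reproduced exactly: q(t) = \phi_0(t^2 M) q(0) + t \phi_1(t^2 M) q'(0). It is this built-in treatment of the oscillation that makes ERKN methods more effective than classical RKN methods on such problems.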

  16. Nuclear magnetic relaxation by the dipolar EMOR mechanism: General theory with applications to two-spin systems.

    PubMed

    Chang, Zhiwei; Halle, Bertil

    2016-02-28

    In aqueous systems with immobilized macromolecules, including biological tissue, the longitudinal spin relaxation of water protons is primarily induced by exchange-mediated orientational randomization (EMOR) of intra- and intermolecular magnetic dipole-dipole couplings. We have embarked on a systematic program to develop, from the stochastic Liouville equation, a general and rigorous theory that can describe relaxation by the dipolar EMOR mechanism over the full range of exchange rates, dipole coupling strengths, and Larmor frequencies. Here, we present a general theoretical framework applicable to spin systems of arbitrary size with symmetric or asymmetric exchange. So far, the dipolar EMOR theory is only available for a two-spin system with symmetric exchange. Asymmetric exchange, when the spin system is fragmented by the exchange, introduces new and unexpected phenomena. Notably, the anisotropic dipole couplings of non-exchanging spins break the axial symmetry in spin Liouville space, thereby opening up new relaxation channels in the locally anisotropic sites, including longitudinal-transverse cross relaxation. Such cross-mode relaxation operates only at low fields; at higher fields it becomes nonsecular, leading to an unusual inverted relaxation dispersion that splits the extreme-narrowing regime into two sub-regimes. The general dipolar EMOR theory is illustrated here by a detailed analysis of the asymmetric two-spin case, for which we present relaxation dispersion profiles over a wide range of conditions as well as analytical results for integral relaxation rates and time-dependent spin modes in the zero-field and motional-narrowing regimes. The general theoretical framework presented here will enable a quantitative analysis of frequency-dependent water-proton longitudinal relaxation in model systems with immobilized macromolecules and, ultimately, will provide a rigorous link between relaxation-based magnetic resonance image contrast and molecular parameters.
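
    Schematically (an editorial gloss, not the authors' exact notation), the stochastic Liouville equation underlying such a treatment has the generic form

        d\sigma(t)/dt = [ -i \hat{L}(\Omega(t)) + \hat{K} ] \sigma(t) ,

    where \sigma is the spin density operator, \hat{L} the orientation-dependent Liouvillian (Zeeman plus dipolar terms), and \hat{K} a superoperator describing the exchange process that randomizes the orientation \Omega.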

  17. Nuclear magnetic relaxation by the dipolar EMOR mechanism: General theory with applications to two-spin systems

    NASA Astrophysics Data System (ADS)

    Chang, Zhiwei; Halle, Bertil

    2016-02-01

    In aqueous systems with immobilized macromolecules, including biological tissue, the longitudinal spin relaxation of water protons is primarily induced by exchange-mediated orientational randomization (EMOR) of intra- and intermolecular magnetic dipole-dipole couplings. We have embarked on a systematic program to develop, from the stochastic Liouville equation, a general and rigorous theory that can describe relaxation by the dipolar EMOR mechanism over the full range of exchange rates, dipole coupling strengths, and Larmor frequencies. Here, we present a general theoretical framework applicable to spin systems of arbitrary size with symmetric or asymmetric exchange. So far, the dipolar EMOR theory is only available for a two-spin system with symmetric exchange. Asymmetric exchange, when the spin system is fragmented by the exchange, introduces new and unexpected phenomena. Notably, the anisotropic dipole couplings of non-exchanging spins break the axial symmetry in spin Liouville space, thereby opening up new relaxation channels in the locally anisotropic sites, including longitudinal-transverse cross relaxation. Such cross-mode relaxation operates only at low fields; at higher fields it becomes nonsecular, leading to an unusual inverted relaxation dispersion that splits the extreme-narrowing regime into two sub-regimes. The general dipolar EMOR theory is illustrated here by a detailed analysis of the asymmetric two-spin case, for which we present relaxation dispersion profiles over a wide range of conditions as well as analytical results for integral relaxation rates and time-dependent spin modes in the zero-field and motional-narrowing regimes. The general theoretical framework presented here will enable a quantitative analysis of frequency-dependent water-proton longitudinal relaxation in model systems with immobilized macromolecules and, ultimately, will provide a rigorous link between relaxation-based magnetic resonance image contrast and molecular parameters.

  18. Plasmon induced modification of silicon nanocrystals photoluminescence in presence of gold nanostripes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyakov, S. A.; Zhigunov, D. M.; Marinins, A.

    Here, we report on the results of theoretical and experimental studies of the photoluminescence of silicon nanocrystals in proximity to plasmonic modes of different types. In the studied samples, the type of plasmonic mode is determined by the filling ratio of a one-dimensional array of gold stripes which covers the thin film with silicon nanocrystals on a quartz substrate. We analyze the extinction spectra, photoluminescence spectra, and decay kinetics of silicon nanocrystals and show that the incident and emitted light is coupled to the corresponding plasmonic mode. We demonstrate the modification of the extinction and photoluminescence spectra under the transition from wide to narrow gold stripes. The experimental extinction and photoluminescence spectra are in good agreement with theoretical calculations performed by the rigorous coupled-wave analysis. Finally, we study the contribution of individual silicon nanocrystals to the overall photoluminescence intensity, depending on their spatial position inside the structure.

  19. Perspective: Quantum Hamiltonians for optical interactions

    NASA Astrophysics Data System (ADS)

    Andrews, David L.; Jones, Garth A.; Salam, A.; Woolley, R. Guy

    2018-01-01

    The multipolar Hamiltonian of quantum electrodynamics is extensively employed in chemical and optical physics to treat rigorously the interaction of electromagnetic fields with matter. It is also widely used to evaluate intermolecular interactions. The multipolar version of the Hamiltonian is commonly obtained by carrying out a unitary transformation of the Coulomb gauge Hamiltonian that goes by the name of Power-Zienau-Woolley (PZW). Not only does the formulation provide excellent agreement with experiment and versatility in its predictive ability, but it also offers superior physical insight. Recently, the foundations and validity of the PZW Hamiltonian have been questioned, raising a concern over issues of gauge transformation and invariance, and whether observable quantities obtained from unitarily equivalent Hamiltonians are identical. Here, an in-depth analysis of the theoretical foundations clarifies the issues and enables misconceptions to be identified. Claims of non-physicality are refuted: the PZW transformation and ensuing Hamiltonian are shown to rest on solid physical principles and secure theoretical ground.

  20. Plasmon induced modification of silicon nanocrystals photoluminescence in presence of gold nanostripes

    DOE PAGES

    Dyakov, S. A.; Zhigunov, D. M.; Marinins, A.; ...

    2018-03-20

    Here, we report on the results of theoretical and experimental studies of the photoluminescence of silicon nanocrystals in proximity to plasmonic modes of different types. In the studied samples, the type of plasmonic mode is determined by the filling ratio of a one-dimensional array of gold stripes which covers the thin film with silicon nanocrystals on a quartz substrate. We analyze the extinction spectra, photoluminescence spectra, and decay kinetics of silicon nanocrystals and show that the incident and emitted light is coupled to the corresponding plasmonic mode. We demonstrate the modification of the extinction and photoluminescence spectra under the transition from wide to narrow gold stripes. The experimental extinction and photoluminescence spectra are in good agreement with theoretical calculations performed by the rigorous coupled-wave analysis. Finally, we study the contribution of individual silicon nanocrystals to the overall photoluminescence intensity, depending on their spatial position inside the structure.

  1. A 2-D numerical simulation study on longitudinal solute transport and longitudinal dispersion coefficient

    NASA Astrophysics Data System (ADS)

    Zhang, Wei

    2011-07-01

    The longitudinal dispersion coefficient, DL, is a fundamental parameter of longitudinal solute transport models: the advection-dispersion (AD) model and various deadzone models. Since DL cannot be measured directly, and since its calibration using tracer test data is quite expensive and such data are not always available, researchers have developed various methods, theoretical or empirical, for estimating DL from more easily available cross-sectional hydraulic measurements (i.e., the transverse velocity profile, etc.). However, for known and unknown reasons, DL cannot be satisfactorily predicted using these theoretical/empirical formulae: either there is very large prediction error for the theoretical methods, or there is a lack of generality for the empirical formulae. Here, numerical experiments using Mike21, a software package that implements one of the most rigorous sets of two-dimensional hydrodynamic and solute transport equations, for longitudinal solute transport in hypothetical streams, are presented. An analysis of the evolution of simulated solute clouds indicates that the two fundamental assumptions in Fischer's longitudinal transport analysis may not be reasonable. The transverse solute concentration distribution, and hence the longitudinal transport, appears to be controlled by a dimensionless number ɛ = Q/(DtW), where Q is the average volumetric flowrate, Dt is a cross-sectional average transverse dispersion coefficient, and W is the channel flow width. A simple empirical ɛ-DL relationship may be established. Analysis and a revision of Fischer's theoretical formula suggest that ɛ influences the efficiency of transverse mixing and hence has a restraining effect on longitudinal spreading. The findings presented here would improve and expand our understanding of longitudinal solute transport in open channel flow.
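
    For reference (an editorial addition using only the standard definitions): the AD model referred to above reads

        \partial C/\partial t + U \partial C/\partial x = D_L \partial^2 C/\partial x^2 ,

    where C is the cross-sectionally averaged concentration and U the mean velocity. Note that ɛ = Q/(DtW) is the only dimensionless combination of the three quantities named in the abstract, which is why it is a natural control parameter for transverse mixing.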

  2. Tactics for mechanized reasoning: a commentary on Milner (1984) ‘The use of machines to assist in rigorous proof’

    PubMed Central

    Gordon, M. J. C.

    2015-01-01

    Robin Milner's paper, ‘The use of machines to assist in rigorous proof’, introduces methods for automating mathematical reasoning that are a milestone in the development of computer-assisted theorem proving. His ideas, particularly his theory of tactics, revolutionized the architecture of proof assistants. His methodology for automating rigorous proof soundly, particularly his theory of type polymorphism in programming, led to major contributions to the theory and design of programming languages. His citation for the 1991 ACM A.M. Turing Award, the most prestigious award in computer science, credits him with, among other achievements, ‘probably the first theoretically based yet practical tool for machine assisted proof construction’. This commentary was written to celebrate the 350th anniversary of the journal Philosophical Transactions of the Royal Society. PMID:25750147

  3. Broadband moth-eye antireflection coatings on silicon

    NASA Astrophysics Data System (ADS)

    Sun, Chih-Hung; Jiang, Peng; Jiang, Bin

    2008-02-01

    We report a bioinspired templating technique for fabricating broadband antireflection coatings that mimic antireflective moth eyes. Wafer-scale, subwavelength-structured nipple arrays are directly patterned on silicon using spin-coated silica colloidal monolayers as etching masks. The templated gratings exhibit excellent broadband antireflection properties, and the normal-incidence specular reflection matches the theoretical prediction using a rigorous coupled-wave analysis (RCWA) model. We further demonstrate that two common simulation methods, RCWA and thin-film multilayer models, generate almost identical predictions for the templated nipple arrays. This simple bottom-up technique is compatible with standard microfabrication, and is promising for reducing the manufacturing cost of crystalline silicon solar cells.

  4. Bioinspired broadband antireflection coatings on GaSb

    NASA Astrophysics Data System (ADS)

    Min, Wei-Lun; Betancourt, Amaury P.; Jiang, Peng; Jiang, Bin

    2008-04-01

    We report an inexpensive yet scalable templating technique for fabricating moth-eye antireflection gratings on gallium antimonide substrates. Non-close-packed colloidal monolayers are utilized as etching masks to pattern subwavelength-structured nipple arrays on GaSb. The resulting gratings exhibit broadband antireflection properties and thermal stability superior to those of conventional multilayer dielectric coatings. The specular reflection of the templated nipple arrays matches the theoretical predictions using a rigorous coupled-wave analysis model. The effect of the nipple shape and size on the antireflection properties has also been investigated with the same model. These biomimetic coatings are of great technological importance in developing efficient thermophotovoltaic cells.

  5. Graph Theory-Based Pinning Synchronization of Stochastic Complex Dynamical Networks.

    PubMed

    Li, Xiao-Jian; Yang, Guang-Hong

    2017-02-01

    This paper is concerned with the adaptive pinning synchronization problem of stochastic complex dynamical networks (CDNs). Based on algebraic graph theory and Lyapunov theory, pinning controller design conditions are derived, and a rigorous convergence analysis of the synchronization errors in the probability sense is also conducted. Compared with existing results, the topology structures of the stochastic CDN are allowed to be unknown due to the use of graph theory. In particular, it is shown that the selection of nodes for pinning depends on the unknown lower bounds of the coupling strengths. Finally, an example of a Chua's circuit network is given to validate the effectiveness of the theoretical results.
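
    A toy numerical illustration of pinning control (an editorial sketch with deterministic scalar nodes, far simpler than the stochastic CDNs analyzed above): only a few nodes receive direct feedback toward the reference trajectory, and the network coupling drags the rest along.

        import numpy as np

        # Ring of N scalar nodes: x_i' = sin(x_i) - c*(L x)_i - d_i*(x_i - s),
        # with reference dynamics s' = sin(s) and only three nodes pinned.
        N, c, dt, steps = 10, 10.0, 0.01, 3000
        A = np.zeros((N, N))
        for i in range(N):
            A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0
        L = np.diag(A.sum(axis=1)) - A
        d = np.zeros(N)
        d[[0, 3, 6]] = 20.0                      # pinning gains on three nodes

        rng = np.random.default_rng(0)
        x = rng.uniform(-2.0, 2.0, N)            # random initial states
        s = 0.5                                  # reference trajectory state
        for _ in range(steps):
            x = x + dt * (np.sin(x) - c * (L @ x) - d * (x - s))
            s = s + dt * np.sin(s)
        print("max sync error:", np.max(np.abs(x - s)))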

  6. A generative, probabilistic model of local protein structure.

    PubMed

    Boomsma, Wouter; Mardia, Kanti V; Taylor, Charles C; Ferkinghoff-Borg, Jesper; Krogh, Anders; Hamelryck, Thomas

    2008-07-01

    Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state. Our method represents a significant theoretical and practical improvement over the widely used fragment assembly technique by avoiding the drawbacks associated with a discrete and nonprobabilistic approach.

  7. Realization of a Quantum Random Generator Certified with the Kochen-Specker Theorem

    NASA Astrophysics Data System (ADS)

    Kulikov, Anatoly; Jerger, Markus; Potočnik, Anton; Wallraff, Andreas; Fedorov, Arkady

    2017-12-01

    Random numbers are required for a variety of applications from secure communications to Monte Carlo simulation. Yet randomness is an asymptotic property, and no output string generated by a physical device can be strictly proven to be random. We report an experimental realization of a quantum random number generator (QRNG) with randomness certified by quantum contextuality and the Kochen-Specker theorem. The certification is not performed in a device-independent way but through a rigorous theoretical proof of each outcome being value indefinite even in the presence of experimental imperfections. The analysis of the generated data confirms the incomputable nature of our QRNG.

  8. Realization of a Quantum Random Generator Certified with the Kochen-Specker Theorem.

    PubMed

    Kulikov, Anatoly; Jerger, Markus; Potočnik, Anton; Wallraff, Andreas; Fedorov, Arkady

    2017-12-15

    Random numbers are required for a variety of applications from secure communications to Monte Carlo simulation. Yet randomness is an asymptotic property, and no output string generated by a physical device can be strictly proven to be random. We report an experimental realization of a quantum random number generator (QRNG) with randomness certified by quantum contextuality and the Kochen-Specker theorem. The certification is not performed in a device-independent way but through a rigorous theoretical proof of each outcome being value indefinite even in the presence of experimental imperfections. The analysis of the generated data confirms the incomputable nature of our QRNG.

  9. Optical simulations of organic light-emitting diodes through a combination of rigorous electromagnetic solvers and Monte Carlo ray-tracing methods

    NASA Astrophysics Data System (ADS)

    Bahl, Mayank; Zhou, Gui-Rong; Heller, Evan; Cassarly, William; Jiang, Mingming; Scarmozzino, Rob; Gregory, G. Groot

    2014-09-01

    Over the last two decades there has been extensive research done to improve the design of organic light-emitting diodes (OLEDs) so as to enhance light extraction efficiency, improve beam shaping, and allow color tuning through techniques such as the use of patterned substrates, photonic crystal (PC) gratings, back reflectors, surface texture, and phosphor down-conversion. Computational simulation has been an important tool for examining these increasingly complex designs. It has provided insights for improving OLED performance as a result of its ability to explore limitations, predict solutions, and demonstrate theoretical results. Depending upon the focus of the design and the scale of the problem, simulations are carried out using rigorous electromagnetic (EM) wave-optics-based techniques, such as the finite-difference time-domain (FDTD) method and rigorous coupled-wave analysis (RCWA), or through ray-optics-based techniques such as Monte Carlo ray-tracing. The former are typically used for modeling nanostructures on the OLED die, and the latter for modeling encapsulating structures, die placement, back-reflection, and phosphor down-conversion. This paper presents a mixed-level simulation approach that unifies EM wave-level and ray-level tools. The approach uses rigorous EM wave-based tools to characterize the nanostructured die and generate both a bidirectional scattering distribution function (BSDF) and a far-field angular intensity distribution. These characteristics are then incorporated into the ray-tracing simulator to obtain the overall performance. Such a mixed-level approach allows for comprehensive modeling of the optical characteristics of OLEDs and can potentially lead to more accurate performance predictions than those from individual modeling tools alone.

  10. Interface Pattern Selection in Directional Solidification

    NASA Technical Reports Server (NTRS)

    Trivedi, Rohit; Tewari, Surendra N.

    2001-01-01

    The central focus of this research is to establish key scientific concepts that govern the selection of cellular and dendritic patterns during the directional solidification of alloys. Ground-based studies have established that the conditions under which cellular and dendritic microstructures form are precisely those where convection effects are dominant in bulk samples. Thus, experimental data cannot be obtained terrestrially under a purely diffusive regime. Furthermore, no reliable theoretical models yet exist that can quantitatively incorporate fluid flow into the pattern selection criterion. Consequently, microgravity experiments on cellular and dendritic growth are designed to obtain benchmark data under diffusive growth conditions that can be quantitatively analyzed and compared with a rigorous theoretical model to establish the fundamental principles that govern the selection of a specific microstructure and its length scales. In the cellular structure, different cells in an array are strongly coupled, so that the cellular pattern evolution is controlled by complex interactions between thermal diffusion, solute diffusion, and interface effects. These interactions give an infinity of solutions, and the system selects only a narrow band of them. The aim of this investigation is to obtain benchmark data and develop a rigorous theoretical model that will allow us to quantitatively establish the physics of this selection process.

  11. Dynamics of essential collective motions in proteins: Theory

    NASA Astrophysics Data System (ADS)

    Stepanova, Maria

    2007-11-01

    A general theoretical background is introduced for characterization of conformational motions in protein molecules, and for building reduced coarse-grained models of proteins, based on the statistical analysis of their phase trajectories. Using the projection operator technique, a system of coupled generalized Langevin equations is derived for essential collective coordinates, which are generated by principal component analysis of molecular dynamics trajectories. The number of essential degrees of freedom is not limited in the theory. An explicit analytic relation is established between the generalized Langevin equation for essential collective coordinates and that for the all-atom phase trajectory projected onto the subspace of essential collective degrees of freedom. The theory introduced is applied to identify correlated dynamic domains in a macromolecule and to construct coarse-grained models representing the conformational motions in a protein through a few interacting domains embedded in a dissipative medium. A rigorous theoretical background is provided for identification of dynamic correlated domains in a macromolecule. Examples of domain identification in protein G are given and employed to interpret NMR experiments. Challenges and potential outcomes of the theory are discussed.
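
    In schematic form (an editorial note; the paper's notation may differ), the coupled generalized Langevin equations for the essential collective coordinates c_i read

        m_i \ddot{c}_i(t) = -\partial W/\partial c_i - \int_0^t \sum_j \Gamma_{ij}(t - \tau) \dot{c}_j(\tau) d\tau + \xi_i(t) ,

    with a potential of mean force W, a memory-kernel matrix \Gamma_{ij}, and a random force \xi_i obeying the fluctuation-dissipation relation <\xi_i(t) \xi_j(t')> = k_B T \Gamma_{ij}(t - t').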

  12. Critical assessment of inverse gas chromatography as means of assessing surface free energy and acid-base interaction of pharmaceutical powders.

    PubMed

    Telko, Martin J; Hickey, Anthony J

    2007-10-01

    Inverse gas chromatography (IGC) has been employed as a research tool for decades. Despite this record of use and proven utility in a variety of applications, the technique is not routinely used in pharmaceutical research. In other fields the technique has flourished. IGC is experimentally relatively straightforward, but analysis requires that certain theoretical assumptions are satisfied. The assumptions made to acquire some of the recently reported data are somewhat modified compared to initial reports. Most publications in the pharmaceutical literature have made use of a simplified equation for the determination of acid/base surface properties, resulting in parameter values that are inconsistent with prior methods. In comparing the surface properties of different batches of alpha-lactose monohydrate, new data have been generated and compared with the literature to allow a critical analysis of the theoretical assumptions and their importance to the interpretation of the data. The commonly used (simplified) approach was compared with the more rigorous approach originally outlined in the surface chemistry literature. (c) 2007 Wiley-Liss, Inc.

  13. Testing theoretical models of magnetic damping using an air track

    NASA Astrophysics Data System (ADS)

    Vidaurre, Ana; Riera, Jaime; Monsoriu, Juan A.; Giménez, Marcos H.

    2008-03-01

    Magnetic braking is a long-established application of Lenz's law. A rigorous analysis of the laws governing this problem involves solving Maxwell's equations in a time-dependent situation. Approximate models have been developed to describe different experimental results related to this phenomenon. In this paper we present a new method for the analysis of magnetic braking using a magnet fixed to the glider of an air track. The forces acting on the glider, a result of the eddy currents, can be easily observed and measured. As a consequence of the air track inclination, the glider accelerates at the beginning, although it asymptotically tends towards a uniform rectilinear movement characterized by a terminal speed. This speed depends on the interaction between the magnetic field and the conductivity properties of the air track. Compared with previous related approaches, in our experimental setup the magnet fixed to the glider produces a magnetic braking force which acts continuously, rather than over a short period of time. The experimental results satisfactorily concur with the theoretical models adapted to this configuration.
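
    The underlying model is simple enough to state here (editorial addition; this is the standard linear-drag approximation for eddy-current braking, F = -b v). The glider on an incline of angle \theta then obeys

        m dv/dt = m g \sin\theta - b v ,

    whose solution v(t) = v_t (1 - e^{-b t / m}) rises asymptotically to the terminal speed v_t = m g \sin\theta / b, exactly the approach to uniform rectilinear motion described above.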

  14. PRO development: rigorous qualitative research as the crucial foundation.

    PubMed

    Lasch, Kathryn Eilene; Marquis, Patrick; Vigneux, Marc; Abetz, Linda; Arnould, Benoit; Bayliss, Martha; Crawford, Bruce; Rosa, Kathleen

    2010-10-01

    Recently published articles have described criteria to assess qualitative research in the health field in general, but very few articles have delineated qualitative methods to be used in the development of Patient-Reported Outcomes (PROs). In fact, how PROs are developed with subject input through focus groups and interviews has been given relatively short shrift in the PRO literature when compared to the plethora of quantitative articles on the psychometric properties of PROs. If documented at all, most PRO validation articles give little for the reader to evaluate the content validity of the measures and the credibility and trustworthiness of the methods used to develop them. Increasingly, however, scientists and authorities want to be assured that PRO items and scales have meaning and relevance to subjects. This article was developed by an international, interdisciplinary group of psychologists, psychometricians, regulatory experts, a physician, and a sociologist. It presents rigorous and appropriate qualitative research methods for developing PROs with content validity. The approach described combines an overarching phenomenological theoretical framework with grounded theory data collection and analysis methods to yield PRO items and scales that have content validity.

  15. PRO development: rigorous qualitative research as the crucial foundation

    PubMed Central

    Marquis, Patrick; Vigneux, Marc; Abetz, Linda; Arnould, Benoit; Bayliss, Martha; Crawford, Bruce; Rosa, Kathleen

    2010-01-01

    Recently published articles have described criteria to assess qualitative research in the health field in general, but very few articles have delineated qualitative methods to be used in the development of Patient-Reported Outcomes (PROs). In fact, how PROs are developed with subject input through focus groups and interviews has been given relatively short shrift in the PRO literature when compared to the plethora of quantitative articles on the psychometric properties of PROs. If documented at all, most PRO validation articles give little for the reader to evaluate the content validity of the measures and the credibility and trustworthiness of the methods used to develop them. Increasingly, however, scientists and authorities want to be assured that PRO items and scales have meaning and relevance to subjects. This article was developed by an international, interdisciplinary group of psychologists, psychometricians, regulatory experts, a physician, and a sociologist. It presents rigorous and appropriate qualitative research methods for developing PROs with content validity. The approach described combines an overarching phenomenological theoretical framework with grounded theory data collection and analysis methods to yield PRO items and scales that have content validity. PMID:20512662

  16. Mixed Criticality Scheduling for Industrial Wireless Sensor Networks

    PubMed Central

    Jin, Xi; Xia, Changqing; Xu, Huiting; Wang, Jintao; Zeng, Peng

    2016-01-01

    Wireless sensor networks (WSNs) have been widely used in industrial systems. Their real-time performance and reliability are fundamental to industrial production. Many works have studied these two aspects, but they focus only on single-criticality WSNs. Mixed-criticality requirements exist in many advanced applications in which different data flows have different levels of importance (or criticality). In this paper, first, we propose a scheduling algorithm which guarantees the real-time performance and reliability requirements of data flows with different levels of criticality. The algorithm supports centralized optimization and adaptive adjustment. It is able to improve both the scheduling performance and flexibility. Then, we provide the schedulability test through rigorous theoretical analysis. We conduct extensive simulations, and the results demonstrate that the proposed scheduling algorithm and analysis significantly outperform existing ones. PMID:27589741

  17. Design and fabrication of a polarization-independent two-port beam splitter.

    PubMed

    Feng, Jijun; Zhou, Changhe; Zheng, Jiangjun; Cao, Hongchao; Lv, Peng

    2009-10-10

    We design and manufacture a fused-silica polarization-independent two-port beam splitter grating. The physical mechanism of this deeply etched grating can be shown clearly by using the simplified modal method with consideration of the accumulated phase difference of the two excited propagating grating modes, which illustrates that the performance of the binary-phase fused-silica grating depends little on the incident wavelength itself, but mainly on the ratio of groove depth to grating period and the ratio of incident wavelength to grating period. These analytic results are also very helpful for wavelength bandwidth analysis. The exact grating profile is optimized by using the rigorous coupled-wave analysis. Holographic recording technology and inductively coupled plasma etching are used to manufacture the fused-silica grating. Experimental results agree well with the theoretical values.
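
    In the simplified modal method, the two-port behavior is often summarized as follows (editorial note; one common form quoted in the modal-method literature, with notation that may differ from the authors'):

        \eta_0 ≈ cos^2(\Delta\phi/2),    \eta_{-1} ≈ sin^2(\Delta\phi/2),    \Delta\phi = 2\pi (n_0^{eff} - n_1^{eff}) h / \lambda ,

    where n_0^{eff} and n_1^{eff} are the effective indices of the two propagating grating modes and h is the groove depth; equal 50/50 splitting then corresponds to \Delta\phi = \pi/2 + k\pi.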

  18. Reconstructing Dewey: Dialectics and Democratic Education

    ERIC Educational Resources Information Center

    Jackson, Jeff

    2012-01-01

    This essay aims to demonstrate the theoretical purchase offered by linking Dewey's educational theory with a rigorous account of dialectical development. Drawing on recent literature which emphasizes the continuing influence of Hegel on Dewey's thought throughout the latter's career, this essay reconstructs Dewey's argument regarding the…

  19. Imaginary-frequency polarizability and van der Waals force constants of two-electron atoms, with rigorous bounds

    NASA Technical Reports Server (NTRS)

    Glover, R. M.; Weinhold, F.

    1977-01-01

    Variational functionals of Braunn and Rebane (1972) for the imaginary-frequency polarizability (IFP) have been generalized by the method of Gramian inequalities to give rigorous upper and lower bounds, valid even when the true (but unknown) unperturbed wavefunction must be represented by a variational approximation. Using these formulas in conjunction with flexible variational trial functions, tight error bounds are computed for the IFP and the associated two- and three-body van der Waals interaction constants of the ground 1(1S) and metastable 2(1,3S) states of He and Li(+). These bounds generally establish the ground-state properties to within a fraction of a per cent and the metastable properties to within a few per cent, permitting a comparative assessment of competing theoretical methods at this level of accuracy. Unlike previous 'error bounds' for these properties, the present results have a completely a priori theoretical character, with no empirical input data.

  20. Enhancing rigor and practice of scoping reviews in social policy research: considerations from a worked example on the Americans with disabilities act.

    PubMed

    Harris, Sarah Parker; Gould, Robert; Fujiura, Glenn

    2015-01-01

    There is increasing theoretical consideration about the use of systematic and scoping reviews of evidence in informing disability and rehabilitation research and practice. Indicative of this trend, this journal published a piece by Rumrill, Fitzgerald and Merchant in 2010 explaining the utility and process for conducting reviews of intervention-based research. There is still need to consider how to apply such rigor when conducting more exploratory reviews of heterogeneous research. This article explores the challenges, benefits, and procedures for conducting rigorous exploratory scoping reviews of diverse evidence. The article expands upon Rumrill, Fitzgerald and Merchant's framework and considers its application to more heterogeneous evidence on the impact of social policy. A worked example of a scoping review of the Americans with Disabilities Act is provided with a procedural framework for conducting scoping reviews on the effects of a social policy. The need for more nuanced techniques for enhancing rigor became apparent during the review process. There are multiple methodological steps that can enhance the utility of exploratory scoping reviews. The potential of systematic consideration during the exploratory review process is shown as a viable method to enhance the rigor in reviewing diverse bodies of evidence.

  1. Global health diplomacy: A critical review of the literature.

    PubMed

    Ruckert, Arne; Labonté, Ronald; Lencucha, Raphael; Runnels, Vivien; Gagnon, Michelle

    2016-04-01

    Global health diplomacy (GHD) describes the practices by which governments and non-state actors attempt to coordinate and orchestrate global policy solutions to improve global health. As an emerging field of practice, there is little academic work that has comprehensively examined and synthesized the theorization of GHD, or looked at why specific health concerns enter into foreign policy discussions and agendas. With the objective of uncovering the driving forces behind, and theoretical explanations of, GHD, we conducted a critical literature review. We searched three English-language scholarly databases using standardized search terms, which yielded 606 articles. After screening of abstracts based on our inclusion/exclusion criteria, we retained 135 articles for importing into NVivo10 and coding. We found a lack of rigorous theorizing about GHD and a fragmentation of the GHD literature, which is not clearly structured around key issues and their theoretical explanations. To address this lack of theoretical grounding, we link the findings from the GHD literature to how theoretical concepts used in International Relations (IR) have been, and could be, invoked in explaining GHD more effectively. To do this, we develop a theoretical taxonomy to explain GHD outcomes based on a popular categorization in IR, identifying three levels of analysis (individual, domestic/national, and global/international) and the driving forces for the integration of health into foreign policy at each level. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. A Rigorous Test of the Fit of the Circumplex Model to Big Five Personality Data: Theoretical and Methodological Issues and Two Large Sample Empirical Tests.

    PubMed

    DeGeest, David Scott; Schmidt, Frank

    2015-01-01

    Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.

  3. Rigorous theoretical framework for particle sizing in turbid colloids using light refraction.

    PubMed

    García-Valenzuela, Augusto; Barrera, Rubén G; Gutierrez-Reyes, Edahí

    2008-11-24

    Using a non-local effective-medium approach, we analyze the refraction of light in a colloidal medium. We discuss the theoretical grounds and all the necessary precautions to design and perform experiments to measure the effective refractive index in dilute colloids. As an application, we show that it is possible to retrieve the size of small dielectric particles in a colloid by measuring the complex effective refractive index and the volume fraction occupied by the particles.

  4. Intersubjectivity in Theoretical and Practical Online Courses

    ERIC Educational Resources Information Center

    Lim, Janine; Hall, Barbara M.

    2015-01-01

    Rigorous interaction between peers has been an elusive goal in online asynchronous discussions. Intersubjectivity, the goal of peer-to-peer interaction, is a representation of a higher quality of synthesis. It is the representation of knowledge construction achieved through a synergistic progression from individual contributions to sequences of…

  5. An Exemplar for Teaching and Learning Qualitative Research

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Leech, Nancy L.; Slate, John R.; Stark, Marcella; Sharma, Bipin; Frels, Rebecca; Harris, Kristin; Combs, Julie P.

    2012-01-01

    In this article, we outline a course wherein the instructors teach students how to conduct rigorous qualitative research. We discuss the four major distinct, but overlapping, phases of the course: conceptual/theoretical, technical, applied, and emergent scholar. Students write several qualitative reports, called qualitative notebooks, which…

  6. A novel equivalent definition of Caputo fractional derivative without singular kernel and superconvergent analysis

    NASA Astrophysics Data System (ADS)

    Liu, Zhengguang; Li, Xiaoli

    2018-05-01

    In this article, we present a new second-order finite difference discrete scheme for a fractal mobile/immobile transport model based on an equivalent transformative Caputo formulation. The new transformative formulation takes the singular kernel away to make the integral calculation more efficient. Furthermore, this definition is also effective when α is a positive integer. Besides, the T-Caputo derivative also helps us to increase the convergence rate of the discretization of the α-order (0 < α < 1) Caputo derivative from O(τ^{2-α}) to O(τ^{3-α}), where τ is the time step. For the numerical analysis, a Crank-Nicolson finite difference scheme to solve the fractal mobile/immobile transport model is introduced and analyzed. The unconditional stability and a priori estimates of the scheme are given rigorously. Moreover, the applicability and accuracy of the scheme are demonstrated by numerical experiments to support our theoretical analysis.
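
    To make the time-stepping structure concrete, here is a minimal Crank-Nicolson sketch in Python for the classical (integer-order) diffusion equation u_t = D u_xx, shown purely as an editorial illustration of the scheme's skeleton, not the fractal mobile/immobile model itself:

        import numpy as np

        D, nx, nt = 1.0, 51, 200
        dx, dt = 1.0 / (nx - 1), 1.0e-3
        r = D * dt / (2.0 * dx**2)

        x = np.linspace(0.0, 1.0, nx)
        u = np.sin(np.pi * x)                  # initial condition, u = 0 at both ends

        # Interior-point Laplacian; CN solves (I - r*Lap) u_new = (I + r*Lap) u_old.
        n = nx - 2
        Lap = (np.diag(np.full(n, -2.0)) + np.diag(np.ones(n - 1), 1)
               + np.diag(np.ones(n - 1), -1))
        Aimp = np.eye(n) - r * Lap
        Aexp = np.eye(n) + r * Lap

        for _ in range(nt):
            u[1:-1] = np.linalg.solve(Aimp, Aexp @ u[1:-1])

        # Compare with the exact solution exp(-pi^2 D t) sin(pi x) of this test problem.
        t = nt * dt
        err = np.max(np.abs(u - np.exp(-np.pi**2 * D * t) * np.sin(np.pi * x)))
        print(f"max error at t = {t:.3f}: {err:.2e}")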

  7. Cell-to-cell signaling through light: just a ghost of chance?

    PubMed Central

    2013-01-01

    Despite the large number of reports attributing signaling between detached cell cultures to electromagnetic phenomena, almost no report so far has included a rigorous analysis of the possibility of such signaling. In this paper, we examine the physical feasibility of electromagnetic communication between cells, especially through light, with regard to the ambient noise illumination. We compare theoretically attainable communication parameters with experimentally obtained data on the photon emission from cells without a specially pronounced ability of bioluminescence. We show that the weak intensity of the emission, together with an unfavorable signal-to-noise ratio typical of natural conditions, represents an important obstacle to signal detection by cells. PMID:24219796

  8. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    PubMed

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
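
    The workflow described above can be miniaturized into a self-contained sketch (editorial illustration; the model, data, and priors below are synthetic placeholders, not nuclear DFT): calibrate a parameter against noisy data by posterior sampling, then propagate the posterior into a prediction.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic "measurements" from a one-parameter model y = exp(-theta * x).
        theta_true, sigma = 1.3, 0.02
        xdata = np.linspace(0.0, 2.0, 15)
        ydata = np.exp(-theta_true * xdata) + rng.normal(0.0, sigma, xdata.size)

        def log_post(theta):
            if theta <= 0.0:
                return -np.inf                 # flat prior restricted to theta > 0
            resid = ydata - np.exp(-theta * xdata)
            return -0.5 * np.sum((resid / sigma) ** 2)

        # Random-walk Metropolis sampling of the posterior.
        samples, theta, lp = [], 1.0, log_post(1.0)
        for _ in range(20000):
            prop = theta + rng.normal(0.0, 0.05)
            lp_prop = log_post(prop)
            if np.log(rng.random()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            samples.append(theta)
        post = np.array(samples[5000:])        # discard burn-in

        # Propagate parameter uncertainty into a model prediction at x = 3.
        pred = np.exp(-post * 3.0)
        print(f"theta = {post.mean():.3f} +/- {post.std():.3f}")
        print(f"prediction at x = 3: {pred.mean():.4f} +/- {pred.std():.4f}")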

  9. Palaeostress perturbations near the El Castillo de las Guardas fault (SW Iberian Massif)

    NASA Astrophysics Data System (ADS)

    García-Navarro, Encarnación; Fernández, Carlos

    2010-05-01

    Use of stress inversion methods on faults measured at 33 sites located in the northwestern part of the South Portuguese Zone (Variscan Iberian Massif), together with analysis of the basic dyke attitudes in this same region, has revealed a prominent perturbation of the stress trajectories around some large, crustal-scale faults, like the El Castillo de las Guardas fault. The results are compared with the predictions of theoretical models of palaeostress deviations near master faults. According to this comparison, the El Castillo de las Guardas fault, an old structure that probably reversed its slip sense several times, can be considered a sinistral strike-slip fault during the Moscovian. These results also point out the main shortcomings that still hinder a rigorous quantitative use of theoretical models of stress perturbations around major faults: the spatial variation in the parameters governing the brittle behaviour of the continental crust, and the possibility of oblique slip along outcrop-scale faults in regions subjected to general, non-plane strain.

  10. Practical secure quantum communications

    NASA Astrophysics Data System (ADS)

    Diamanti, Eleni

    2015-05-01

    We review recent advances in the field of quantum cryptography, focusing in particular on practical implementations of two central protocols for quantum network applications, namely key distribution and coin flipping. The former allows two parties to share secret messages with information-theoretic security, even in the presence of a malicious eavesdropper in the communication channel, which is impossible with classical resources alone. The latter enables two distrustful parties to agree on a random bit, again with information-theoretic security, and with a cheating probability lower than the one that can be reached in a classical scenario. Our implementations rely on continuous-variable technology for quantum key distribution and on a plug and play discrete-variable system for coin flipping, and necessitate a rigorous security analysis adapted to the experimental schemes and their imperfections. In both cases, we demonstrate the protocols with provable security over record long distances in optical fibers and assess the performance of our systems as well as their limitations. The reported advances offer a powerful toolbox for practical applications of secure communications within future quantum networks.

  11. Rigorous Measures of Implementation: A Methodological Framework for Evaluating Innovative STEM Programs

    ERIC Educational Resources Information Center

    Cassata-Widera, Amy; Century, Jeanne; Kim, Dae Y.

    2011-01-01

    The practical need for multidimensional measures of fidelity of implementation (FOI) of reform-based science, technology, engineering, and mathematics (STEM) instructional materials, combined with a theoretical need in the field for a shared conceptual framework that could support accumulating knowledge on specific enacted program elements across…

  12. Help Seeking in Academic Settings: Goals, Groups, and Contexts

    ERIC Educational Resources Information Center

    Karabenick, Stuart A., Ed.; Newman, Richard S., Ed.

    2006-01-01

    Building on Karabenick's earlier volume on this topic and maintaining its high standards of scholarship and intellectual rigor, this book brings together contemporary work that is theoretically as well as practically important. It highlights current trends in the area and gives expanded attention to applications to teaching and learning. The…

  13. Accumulating Knowledge: When Are Reading Intervention Results Meaningful?

    ERIC Educational Resources Information Center

    Fletcher, Jack M.; Wagner, Richard K.

    2014-01-01

    The three target articles provide examples of intervention studies that are excellent models for the field. They rely on rigorous and elegant designs, the interventions are motivated by attention to underlying theoretical mechanisms, and longitudinal designs are used to examine the duration of effects of interventions that occur. When studies are…

  14. Researching the Study Abroad Experience

    ERIC Educational Resources Information Center

    McLeod, Mark; Wainwright, Philip

    2009-01-01

    The authors propose a paradigm for rigorous scientific assessment of study abroad programs, with the focus being on how study abroad experiences affect psychological constructs as opposed to looking solely at study-abroad-related outcomes. Social learning theory is used as a possible theoretical basis for making testable hypotheses and guiding…

  15. Testing Adaptive Toolbox Models: A Bayesian Hierarchical Approach

    ERIC Educational Resources Information Center

    Scheibehenne, Benjamin; Rieskamp, Jorg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox…

  16. Sociomateriality: a theoretical framework for studying distributed medical education.

    PubMed

    MacLeod, Anna; Kits, Olga; Whelan, Emma; Fournier, Cathy; Wilson, Keith; Power, Gregory; Mann, Karen; Tummons, Jonathan; Brown, Peggy Alexiadis

    2015-11-01

    Distributed medical education (DME) is a type of distance learning in which students participate in medical education from diverse geographic locations using Web conferencing, videoconferencing, e-learning, and similar tools. DME is becoming increasingly widespread in North America and around the world. Although relatively new to medical education, distance learning has a long history in the broader field of education and a related body of literature that speaks to the importance of engaging in rigorous and theoretically informed studies of distance learning. The existing DME literature is helpful, but it has been largely descriptive and lacks a critical "lens," that is, a theoretical perspective from which to rigorously conceptualize and interrogate DME's social (relationships, people) and material (technologies, tools) aspects. The authors describe DME and theories about distance learning and show that such theories focus on social, pedagogical, and cognitive considerations without adequately taking into account material factors. They address this gap by proposing sociomateriality as a theoretical framework allowing researchers and educators to study DME and (1) understand and consider previously obscured actors, infrastructure, and other factors that, on the surface, seem unrelated and even unimportant; (2) see clearly how the social and material components of learning are intertwined in fluid, messy, and often uncertain ways; and (3) perhaps think differently, even in ways that disrupt traditional approaches, as they explore DME. The authors conclude that DME brings with it substantial investments of social and material resources and therefore needs careful study, using approaches that embrace its complexity.

  17. Rigorous Science: a How-To Guide.

    PubMed

    Casadevall, Arturo; Fang, Ferric C

    2016-11-08

    Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word "rigor" is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. Copyright © 2016 Casadevall and Fang.

  18. Concepts and Synonymy in the UMLS Metathesaurus

    PubMed Central

    Merrill, Gary H.

    2009-01-01

    This paper advances a detailed exploration of the complex relationships among terms, concepts, and synonymy in the UMLS (Unified Medical Language System) Metathesaurus, and proposes the study and understanding of the Metathesaurus from a model-theoretic perspective. Initial sections provide the background and motivation for such an approach, and a careful informal treatment of these notions is offered as a context and basis for the formal analysis. What emerges from this is a set of puzzles and confusions in the Metathesaurus and its literature pertaining to synonymy and its relation to terms and concepts. A model theory for a segment of the Metathesaurus is then constructed, and its adequacy relative to the informal treatment is demonstrated. Finally, it is shown how this approach clarifies and addresses the puzzles educed from the informal discussion, and how the model-theoretic perspective may be employed to evaluate some fundamental criticisms of the Metathesaurus. For users of the UMLS, two significant results of this analysis are a rigorous clarification of the different senses of synonymy that appear in treatments of the Metathesaurus and an illustration of the dangers in computing inferences involving ambiguous terms. PMID:19838995

  19. Human Rights and the Excess of Identity

    PubMed Central

    Al Tamimi, Yussef

    2017-01-01

    Identity is a central theme in contemporary politics, but legal academia lacks a rigorous analysis of this concept. The aim of this article is twofold: (i) firstly, it aims to reveal presumptions on identity in human rights law by mapping how the European Court of Human Rights approaches identity, and (ii) secondly, it seeks to analyse these presumptions using theoretical insights on identity. By merging legal and theoretical analysis, this article contributes a reading of the Court's case law which suggests that the tension between the political and the apolitical is visible as a common thread in the Court's use of identity. In case law concerning paternity, the Court appears to hold a specific view of what is presented as an unquestionable part of identity. This ostensibly pre-political notion of identity becomes untenable in cases where the nature of an identity feature, such as the headscarf, is contested or a minority has adopted a national identity that conflicts with the majoritarian national identity. The Court's approach to identity in such cases reflects a paradox that is inherent to identity: identity is personal while simultaneously being constituted and shaped by overarching power mechanisms. PMID:29881144

  20. Transmittance enhancement of sapphires with antireflective subwavelength grating patterned UV polymer surface structures by soft lithography.

    PubMed

    Lee, Soo Hyun; Leem, Jung Woo; Yu, Jae Su

    2013-12-02

    We report the total and diffuse transmission enhancement of sapphires with ultraviolet-curable SU8 polymer surface structures consisting of conical subwavelength gratings (SWGs) at one- and both-side surfaces for different periods. The SWG patterns on the silicon templates were transferred into the SU8 polymer film surface on sapphires by a simple and cost-effective soft lithography technique. For the fabricated samples, the surface morphologies, wetting behaviors, and optical characteristics were investigated. For theoretical optical analysis, a rigorous coupled-wave analysis method was used. At a period of 350 nm, the sample with SWGs on SU8 film/sapphire exhibited a hydrophobic surface and higher total transmittance compared to the bare sapphire over a wide wavelength range of 450-1000 nm. As the period of SWGs was increased, the low total transmittance region of < 85% was shifted towards the longer wavelengths and became broader while the diffuse transmittance was increased (i.e., larger haze ratio). For the samples with SWGs at both-side surfaces, the total and diffuse transmittance spectra were further enhanced compared to the samples with SWGs at one-side surface. The theoretical optical calculation results showed a similar trend to the experimentally measured data.
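
    The period dependence of the low-transmittance, hazy band follows from the grating equation; the sketch below (assuming a visible-range sapphire index of roughly 1.77, a value not taken from the paper) estimates the wavelength below which the first transmitted order propagates in the substrate:

      # At normal incidence, transmitted order m propagates in the substrate only
      # when m*lambda < n_sub*period, so diffuse scattering persists up to roughly
      # lambda_c = n_sub*period. n_sub is an assumed sapphire refractive index.
      n_sub = 1.77

      for period_nm in (350, 450, 550):
          print(f"period {period_nm} nm -> first-order cutoff ~{n_sub * period_nm:.0f} nm")
      # Larger periods push the cutoff, and hence the diffuse/low-transmittance
      # region, toward longer wavelengths, consistent with the trend reported above.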

  1. An analysis of the surface-normal coupling efficiency of a metal grating coupler embedded in a Scotch tape optical waveguide

    NASA Astrophysics Data System (ADS)

    Barrios, Carlos Angulo; Canalejas-Tejero, Víctor

    2017-01-01

    The coupling efficiency at normal incidence of recently demonstrated aluminum grating couplers integrated in flexible Scotch tape waveguides has been analyzed theoretically and experimentally. Finite difference time domain (FDTD) and rigorous coupled-wave analysis (RCWA) methods have been used to optimize the dimensions (duty cycle and metal thickness) of Scotch tape-embedded 1D Al gratings for maximum coupling at 635 nm wavelength. Good dimension and tape refractive index tolerances are predicted. FDTD simulations reveal the incident beam width and impinging position (alignment) values that avoid rediffraction and thus maximize the coupling efficiency. A 1D Al diffraction grating integrated into a Scotch tape optical waveguide has been fabricated and characterized. The fabrication process, based on pattern transfer, has been optimized to allow complete Al grating transfer onto the Scotch tape waveguide. A maximum coupling efficiency of 20% for TM-polarized normal incidence has been measured, which is in good agreement with the theoretical predictions. The measured coupling efficiency is further increased up to 28% for TM polarization under oblique incidence. Temperature-dependence measurements have also been performed and related to the simulation results and the fabrication procedure.
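
    For reference, first-order surface-normal coupling is governed by the standard grating phase-matching condition (a textbook relation, not a formula quoted from the paper; n_eff denotes the effective index of the guided mode):

      \beta = \frac{2\pi}{\lambda}\, n_{\mathrm{eff}}
            = \frac{2\pi}{\lambda}\sin\theta + m\,\frac{2\pi}{\Lambda}
      \quad\Longrightarrow\quad
      \Lambda\big|_{\theta = 0,\; m = 1} = \frac{\lambda}{n_{\mathrm{eff}}},

    which at the 635 nm design wavelength fixes the grating period once the tape waveguide's effective index is known.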

  2. A Theoretical Approach to Understanding Population Dynamics with Seasonal Developmental Durations

    NASA Astrophysics Data System (ADS)

    Lou, Yijun; Zhao, Xiao-Qiang

    2017-04-01

    There is a growing body of biological investigations to understand impacts of seasonally changing environmental conditions on population dynamics in various research fields such as single population growth and disease transmission. On the other side, understanding the population dynamics subject to seasonally changing weather conditions plays a fundamental role in predicting the trends of population patterns and disease transmission risks under the scenarios of climate change. With the host-macroparasite interaction as a motivating example, we propose a synthesized approach for investigating the population dynamics subject to seasonal environmental variations from theoretical point of view, where the model development, basic reproduction ratio formulation and computation, and rigorous mathematical analysis are involved. The resultant model with periodic delay presents a novel term related to the rate of change of the developmental duration, bringing new challenges to dynamics analysis. By investigating a periodic semiflow on a suitably chosen phase space, the global dynamics of a threshold type is established: all solutions either go to zero when basic reproduction ratio is less than one, or stabilize at a positive periodic state when the reproduction ratio is greater than one. The synthesized approach developed here is applicable to broader contexts of investigating biological systems with seasonal developmental durations.

  3. Initiation reactions in acetylene pyrolysis

    DOE PAGES

    Zador, Judit; Fellows, Madison D.; Miller, James A.

    2017-05-10

    In gas-phase combustion systems the interest in acetylene stems largely from its role in molecular weight growth processes. The consensus is that above 1500 K acetylene pyrolysis starts mainly with the homolytic fission of the C–H bond creating an ethynyl radical and an H atom. However, below ~1500 K this reaction is too slow to initiate the chain reaction. It has been hypothesized that instead of dissociation, self-reaction initiates this process. Nevertheless, rigorous theoretical or direct experimental evidence is lacking, to an extent that even the molecular mechanism is debated in the literature. In this work we use rigorous ab initio transition-state theory master equation methods to calculate pressure- and temperature-dependent rate coefficients for the association of two acetylene molecules and related reactions. We establish the role of vinylidene, the high-energy isomer of acetylene in this process, compare our results with available experimental data, and assess the competition between the first-order and second-order initiation steps. As a result, we also show the effect of the rapid isomerization among the participating wells and highlight the need for time-scale analysis when phenomenological rate coefficients are compared to observed time scales in certain experiments.
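
    The competition between the two initiation channels can be pictured with a simple scaling sketch (all rate coefficients below are hypothetical placeholders, not values from this work):

      # First-order C-H fission: rate1 = k1*[C2H2]; second-order self-reaction:
      # rate2 = k2*[C2H2]**2. The bimolecular channel dominates once
      # [C2H2] > k1/k2. Both coefficients are hypothetical illustrations.
      k1 = 1.0e-4    # s^-1
      k2 = 1.0e-20   # cm^3 molecule^-1 s^-1

      crossover = k1 / k2   # concentration at which the two rates are equal
      for conc in (1e14, crossover, 1e18):   # molecule/cm^3
          r1, r2 = k1 * conc, k2 * conc**2
          print(f"[C2H2]={conc:.1e}: fission {r1:.2e}, self-reaction {r2:.2e}")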

  4. Theoretical modeling of the uranium 4f XPS for U(VI) and U(IV) oxides

    NASA Astrophysics Data System (ADS)

    Bagus, Paul S.; Nelin, Connie J.; Ilton, Eugene S.

    2013-12-01

    A rigorous study is presented of the physical processes related to X-Ray photoelectron spectroscopy, XPS, in the 4f level of U oxides, which, as well as being of physical interest in themselves, are representative of XPS in heavy metal oxides. In particular, we present compelling evidence for a new view of the screening of core-holes that extends prior understandings. Our analysis of the screening focuses on the covalent mixing of high lying U and O orbitals as opposed to the, more common, use of orbitals that are nominally pure U or pure O. It is shown that this covalent mixing is quite different for the initial and final, core-hole, configurations and that this difference is directly related to the XPS satellite intensity. Furthermore, we show that the high-lying U d orbitals as well as the U(5f) orbital may both contribute to the core-hole screening, in contrast with previous work that has only considered screening through the U(5f) shell. The role of modifying the U-O interaction by changing the U-O distance has been investigated and an unexpected correlation between U-O distance and XPS satellite intensity has been discovered. The role of flourite and octahedral crystal structures for U(IV) oxides has been examined and relationships established between XPS features and the covalent interactions in the different structures. The physical views of XPS satellites as arising from shake processes or as arising from ligand to metal charge transfers are contrasted; our analysis provides strong support that shake processes give a more fundamental physical understanding than charge transfer. Our theoretical studies are based on rigorous, strictly ab initio determinations of the electronic structure of embedded cluster models of U oxides with formal U(VI) and U(IV) oxidation states. Our results provide a foundation that makes it possible to establish quantitative relationships between features of the XPS spectra and materials properties.

  5. Aerodynamic Design of a Propeller for High-Altitude Balloon Trajectory Control

    NASA Technical Reports Server (NTRS)

    Eppler, Richard; Somers, Dan M.

    2012-01-01

    The aerodynamic design of a propeller for the trajectory control of a high-altitude, scientific balloon has been performed using theoretical methods developed especially for such applications. The methods are described. Optimum, nonlinear chord and twist distributions have been developed in conjunction with the design of a family of airfoils, the SE403, SE404, and SE405, for the propeller. The very low Reynolds numbers along the propeller blade fall in a range that has yet to be rigorously investigated, either experimentally or theoretically.
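
    As a rough illustration of how low these Reynolds numbers are, consider a blade section near 35 km altitude (density and viscosity are approximate standard-atmosphere values; speed and chord are assumed for illustration, not taken from the design):

      # Re = rho*V*c/mu for a high-altitude propeller blade section.
      rho = 0.008    # kg/m^3, approximate air density near 35 km
      mu = 1.4e-5    # Pa*s, approximate dynamic viscosity of cold stratospheric air
      V = 60.0       # m/s, assumed section speed
      c = 0.15       # m, assumed blade chord

      print(f"Re ~ {rho * V * c / mu:.0f}")  # on the order of 5,000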

  6. The role of a posteriori mathematics in physics

    NASA Astrophysics Data System (ADS)

    MacKinnon, Edward

    2018-05-01

    The calculus that co-evolved with classical mechanics relied on definitions of functions and differentials that accommodated physical intuitions. In the early nineteenth century mathematicians began the rigorous reformulation of calculus and eventually succeeded in putting almost all of mathematics on a set-theoretic foundation. Physicists traditionally ignore this rigorous mathematics. Physicists often rely on a posteriori math, a practice of using physical considerations to determine mathematical formulations. This is illustrated by examples from classical and quantum physics. A justification of such practice stems from a consideration of the role of phenomenological theories in classical physics and effective theories in contemporary physics. This relates to the larger question of how physical theories should be interpreted.

  7. Topics in Computational Learning Theory and Graph Algorithms.

    ERIC Educational Resources Information Center

    Board, Raymond Acton

    This thesis addresses problems from two areas of theoretical computer science. The first area is that of computational learning theory, which is the study of the phenomenon of concept learning using formal mathematical models. The goal of computational learning theory is to investigate learning in a rigorous manner through the use of techniques…

  8. Comparing an annual and daily time-step model for predicting field-scale P loss

    USDA-ARS?s Scientific Manuscript database

    Several models with varying degrees of complexity are available for describing P movement through the landscape. The complexity of these models is dependent on the amount of data required by the model, the number of model parameters needed to be estimated, the theoretical rigor of the governing equa...

  9. Multiple Imputation of Multilevel Missing Data-Rigor versus Simplicity

    ERIC Educational Resources Information Center

    Drechsler, Jörg

    2015-01-01

    Multiple imputation is widely accepted as the method of choice to address item-nonresponse in surveys. However, research on imputation strategies for the hierarchical structures that are typically found in the data in educational contexts is still limited. While a multilevel imputation model should be preferred from a theoretical point of view if…

  10. The Theoretical and Empirical Basis for Meditation as an Intervention for PTSD

    ERIC Educational Resources Information Center

    Lang, Ariel J.; Strauss, Jennifer L.; Bomyea, Jessica; Bormann, Jill E.; Hickman, Steven D.; Good, Raquel C.; Essex, Michael

    2012-01-01

    In spite of the existence of good empirically supported treatments for posttraumatic stress disorder (PTSD), consumers and providers continue to ask for more options for managing this common and often chronic condition. Meditation-based approaches are being widely implemented, but there is minimal research rigorously assessing their effectiveness.…

  11. Complexity, Representation and Practice: Case Study as Method and Methodology

    ERIC Educational Resources Information Center

    Miles, Rebecca

    2015-01-01

    While case study is considered a common approach to examining specific and particular examples in research disciplines such as law, medicine and psychology, in the social sciences case study is often treated as a lesser, flawed or undemanding methodology which is less valid, reliable or theoretically rigorous than other methodologies. Building on…

  12. Configuration-controlled Au nanocluster arrays on inverse micelle nano-patterns: versatile platforms for SERS and SPR sensors

    NASA Astrophysics Data System (ADS)

    Jang, Yoon Hee; Chung, Kyungwha; Quan, Li Na; Špačková, Barbora; Šípová, Hana; Moon, Seyoung; Cho, Won Joon; Shin, Hae-Young; Jang, Yu Jin; Lee, Ji-Eun; Kochuveedu, Saji Thomas; Yoon, Min Ji; Kim, Jihyeon; Yoon, Seokhyun; Kim, Jin Kon; Kim, Donghyun; Homola, Jiří; Kim, Dong Ha

    2013-11-01

    Nanopatterned 2-dimensional Au nanocluster arrays with controlled configuration are fabricated onto reconstructed nanoporous poly(styrene-block-vinylpyridine) inverse micelle monolayer films. Near-field coupling of localized surface plasmons is studied and compared for disordered and ordered core-centered Au NC arrays. Differences in evolution of the absorption band and field enhancement upon Au nanoparticle adsorption are shown. The experimental results are found to be in good agreement with theoretical studies based on the finite-difference time-domain method and rigorous coupled-wave analysis. The realized Au nanopatterns are exploited as substrates for surface-enhanced Raman scattering and integrated into Kretschmann-type SPR sensors, based on which unprecedented SPR-coupling-type sensors are demonstrated.

  13. An efficient numerical procedure for thermohydrodynamic analysis of cavitating bearings

    NASA Technical Reports Server (NTRS)

    Vijayaraghavan, D.

    1995-01-01

    An efficient and accurate numerical procedure to determine the thermo-hydrodynamic performance of cavitating bearings is described. This procedure is based on the earlier development of Elrod for lubricating films, in which the properties across the film thickness are determined at Lobatto points and their distributions are expressed by collocated polynomials. The cavitated regions and their boundaries are rigorously treated. Thermal boundary conditions at the surfaces, including heat dissipation through the metal to the ambient, are incorporated. Numerical examples are presented comparing the predictions using this procedure with earlier theoretical predictions and experimental data. With a few points across the film thickness and across the journal and the bearing in the radial direction, the temperature profile is very well predicted.

  14. Dynamics of a Chlorophyll Dimer in Collective and Local Thermal Environments

    DOE PAGES

    Merkli, M.; Berman, Gennady Petrovich; Sayre, Richard Thomas; ...

    2016-01-30

    Here we present a theoretical analysis of exciton transfer and decoherence effects in a photosynthetic dimer interacting with collective (correlated) and local (uncorrelated) protein-solvent environments. Our approach is based on the framework of the spin-boson model. We derive explicitly the thermal relaxation and decoherence rates of the exciton transfer process, valid for arbitrary temperatures and for arbitrary (in particular, large) interaction constants between the dimer and the environments. We establish a generalization of the Marcus formula, giving reaction rates for dimer levels possibly individually and asymmetrically coupled to environments. We identify rigorously parameter regimes for the validity of the generalized Marcus formula. The existence of long living quantum coherences at ambient temperatures emerges naturally from our approach.
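
    For orientation, the classical nonadiabatic Marcus rate that this work generalizes reads (standard textbook form, not a formula quoted from the paper; V is the electronic coupling, lambda the reorganization energy, and Delta G the driving force):

      k = \frac{2\pi}{\hbar}\,\lvert V\rvert^{2}\,
          \frac{1}{\sqrt{4\pi\lambda k_{B}T}}\,
          \exp\!\left[-\frac{(\Delta G + \lambda)^{2}}{4\lambda k_{B}T}\right].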

  15. Topological Isomorphisms of Human Brain and Financial Market Networks

    PubMed Central

    Vértes, Petra E.; Nicol, Ruth M.; Chapman, Sandra C.; Watkins, Nicholas W.; Robertson, Duncan A.; Bullmore, Edward T.

    2011-01-01

    Although metaphorical and conceptual connections between the human brain and the financial markets have often been drawn, rigorous physical or mathematical underpinnings of this analogy remain largely unexplored. Here, we apply a statistical and graph theoretic approach to the study of two datasets – the time series of 90 stocks from the New York stock exchange over a 3-year period, and the fMRI-derived time series acquired from 90 brain regions over the course of a 10-min-long functional MRI scan of resting brain function in healthy volunteers. Despite the many obvious substantive differences between these two datasets, graphical analysis demonstrated striking commonalities in terms of global network topological properties. Both the human brain and the market networks were non-random, small-world, modular, hierarchical systems with fat-tailed degree distributions indicating the presence of highly connected hubs. These properties could not be trivially explained by the univariate time series statistics of stock price returns. This degree of topological isomorphism suggests that brains and markets can be regarded broadly as members of the same family of networks. The two systems, however, were not topologically identical. The financial market was more efficient and more modular – more highly optimized for information processing – than the brain networks; but also less robust to systemic disintegration as a result of hub deletion. We conclude that the conceptual connections between brains and markets are not merely metaphorical; rather these two information processing systems can be rigorously compared in the same mathematical language and turn out often to share important topological properties in common to some degree. There will be interesting scientific arbitrage opportunities in further work at the graph-theoretically mediated interface between systems neuroscience and the statistical physics of financial markets. PMID:22007161
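
    The pipeline described above can be sketched in a few lines of Python (random data stands in for the stock and fMRI time series; the correlation threshold is arbitrary):

      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(0)
      series = rng.standard_normal((90, 300))   # stand-in for 90 time series

      corr = np.corrcoef(series)
      rows, cols = np.where(np.triu(np.abs(corr), k=1) > 0.15)

      G = nx.Graph()
      G.add_nodes_from(range(90))
      G.add_edges_from(zip(rows.tolist(), cols.tolist()))

      # Topological properties of the kind compared in the study:
      if nx.is_connected(G):
          print("mean clustering:", nx.average_clustering(G))
          print("mean path length:", nx.average_shortest_path_length(G))
      print("max degree (hub strength):", max(d for _, d in G.degree()))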

  16. Dendritic solidification. I - Analysis of current theories and models. II - A model for dendritic growth under an imposed thermal gradient

    NASA Technical Reports Server (NTRS)

    Laxmanan, V.

    1985-01-01

    A critical review of the present dendritic growth theories and models is presented. Mathematically rigorous solutions to dendritic growth are found to rely on an ad hoc assumption that dendrites grow at the maximum possible growth rate. This hypothesis is found to be in error and is replaced by stability criteria which consider the conditions under which a dendrite tip advances in a stable fashion in a liquid. The important elements of a satisfactory model for dendritic solidification are summarized, and a theoretically consistent model for dendritic growth under an imposed thermal gradient is proposed and described. The model is based on the modification of an analysis due to Burden and Hunt (1974) and predicts correctly, in all respects, the transition from a dendritic to a planar interface at both very low and very large growth rates.

  17. Signal and noise modeling in confocal laser scanning fluorescence microscopy.

    PubMed

    Herberich, Gerlind; Windoffer, Reinhard; Leube, Rudolf E; Aach, Til

    2012-01-01

    Fluorescence confocal laser scanning microscopy (CLSM) has revolutionized imaging of subcellular structures in biomedical research by enabling the acquisition of 3D time-series of fluorescently-tagged proteins in living cells, hence forming the basis for an automated quantification of their morphological and dynamic characteristics. Due to the inherently weak fluorescence, CLSM images exhibit a low SNR. We present a novel model for the transfer of signal and noise in CLSM that is both theoretically sound as well as corroborated by a rigorous analysis of the pixel intensity statistics via measurement of the 3D noise power spectra, signal-dependence and distribution. Our model provides a better fit to the data than previously proposed models. Further, it forms the basis for (i) the simulation of the CLSM imaging process indispensable for the quantitative evaluation of CLSM image analysis algorithms, (ii) the application of Poisson denoising algorithms and (iii) the reconstruction of the fluorescence signal.
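
    A widely used signal-dependent model for photon-limited imaging of the general kind discussed here (the paper's fitted model differs in detail) is scaled Poisson shot noise plus additive Gaussian read noise; the parameters below are illustrative:

      import numpy as np

      rng = np.random.default_rng(1)
      g, sigma_r = 2.0, 1.5                  # detector gain, read-noise std (illustrative)
      x = np.linspace(5, 200, 10_000)        # "true" fluorescence intensities

      y = g * rng.poisson(x / g) + rng.normal(0.0, sigma_r, x.shape)

      # Signature of shot-noise-limited data: Var[y] = g*x + sigma_r**2,
      # i.e. the noise variance grows linearly with the signal.
      lo, hi = x < 50, x > 150
      print(np.var(y[lo] - x[lo]), np.var(y[hi] - x[hi]))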

  18. Well-tempered metadynamics converges asymptotically.

    PubMed

    Dama, James F; Parrinello, Michele; Voth, Gregory A

    2014-06-20

    Metadynamics is a versatile and capable enhanced sampling method for the computational study of soft matter materials and biomolecular systems. However, over a decade of application and several attempts to give this adaptive umbrella sampling method a firm theoretical grounding prove that a rigorous convergence analysis is elusive. This Letter describes such an analysis, demonstrating that well-tempered metadynamics converges to the final state it was designed to reach and, therefore, that the simple formulas currently used to interpret the final converged state of tempered metadynamics are correct and exact. The results do not rely on any assumption that the collective variable dynamics are effectively Brownian or any idealizations of the hill deposition function; instead, they suggest new, more permissive criteria for the method to be well behaved. The results apply to tempered metadynamics with or without adaptive Gaussians or boundary corrections and whether the bias is stored approximately on a grid or exactly.
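
    The tempered hill-deposition rule analyzed in this Letter is simple to state: each new Gaussian's height is scaled by exp(-V/(k_B*DeltaT)), and at convergence the bias approaches -DeltaT/(T+DeltaT) times the free energy. A minimal 1D sketch (illustrative parameters; grid-stored bias, overdamped Langevin dynamics on a double well):

      import numpy as np

      rng = np.random.default_rng(2)
      kT, dT = 1.0, 9.0                       # temperature, tempering DeltaT (kB = 1)
      w0, sigma, dt, gamma = 0.1, 0.2, 1e-3, 1.0
      grid = np.linspace(-2.5, 2.5, 501)
      bias = np.zeros_like(grid)

      dF = lambda s: 4.0 * s * (s**2 - 1.0)   # gradient of double well F = (s^2 - 1)^2

      s, dV = -1.0, np.zeros_like(grid)
      for step in range(200_000):
          f = -dF(s) - np.interp(s, grid, dV)
          s += f * dt / gamma + np.sqrt(2 * kT * dt / gamma) * rng.standard_normal()
          if step % 500 == 0:                 # deposit a well-tempered hill
              h = w0 * np.exp(-np.interp(s, grid, bias) / dT)
              bias += h * np.exp(-(grid - s)**2 / (2 * sigma**2))
              dV = np.gradient(bias, grid)

      # At convergence: bias(s) ~ -dT/(kT + dT) * F(s) + const.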

  19. Well-Tempered Metadynamics Converges Asymptotically

    NASA Astrophysics Data System (ADS)

    Dama, James F.; Parrinello, Michele; Voth, Gregory A.

    2014-06-01

    Metadynamics is a versatile and capable enhanced sampling method for the computational study of soft matter materials and biomolecular systems. However, over a decade of application and several attempts to give this adaptive umbrella sampling method a firm theoretical grounding prove that a rigorous convergence analysis is elusive. This Letter describes such an analysis, demonstrating that well-tempered metadynamics converges to the final state it was designed to reach and, therefore, that the simple formulas currently used to interpret the final converged state of tempered metadynamics are correct and exact. The results do not rely on any assumption that the collective variable dynamics are effectively Brownian or any idealizations of the hill deposition function; instead, they suggest new, more permissive criteria for the method to be well behaved. The results apply to tempered metadynamics with or without adaptive Gaussians or boundary corrections and whether the bias is stored approximately on a grid or exactly.

  20. Information Superiority via Formal Concept Analysis

    NASA Astrophysics Data System (ADS)

    Koester, Bjoern; Schmidt, Stefan E.

    This chapter will show how to get more mileage out of information. To achieve that, we first start with an introduction to the fundamentals of Formal Concept Analysis (FCA). FCA is a highly versatile field of applied lattice theory, which allows hidden relationships to be uncovered in relational data. Moreover, FCA provides a distinguished supporting framework to subsequently find and fill information gaps in a systematic and rigorous way. In addition, we would like to build bridges via a universal approach to other communities which can be related to FCA in order for other research areas to benefit from a theory that has been elaborated for more than twenty years. Last but not least, the essential benefits of FCA will be presented algorithmically as well as theoretically by investigating a real data set from the MIPT Terrorism Knowledge Base and also by demonstrating an application in the field of Web Information Retrieval and Web Intelligence.
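
    At its core, FCA enumerates the formal concepts (extent, intent) of a binary object-attribute context; since the set of intents is closed under intersection, a tiny context can be processed naively, as in the sketch below (toy data, not from the MIPT Terrorism Knowledge Base):

      context = {                       # object -> attribute set (toy example)
          "lion":    {"predator", "mammal"},
          "eagle":   {"predator", "bird"},
          "sparrow": {"bird"},
      }
      all_attrs = set().union(*context.values())

      def extent(intent):
          """All objects possessing every attribute in `intent`."""
          return {g for g, attrs in context.items() if intent <= attrs}

      # Every intent is an intersection of object intents (the empty
      # intersection is the full attribute set), so close under intersection:
      intents = {frozenset(all_attrs)}
      changed = True
      while changed:
          changed = False
          for attrs in list(intents):
              for obj_attrs in context.values():
                  new = frozenset(attrs & obj_attrs)
                  if new not in intents:
                      intents.add(new)
                      changed = True

      for intent in sorted(intents, key=len):
          print(sorted(extent(intent)), "<->", sorted(intent))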

  1. Sequential Stereotype Priming: A Meta-Analysis.

    PubMed

    Kidder, Ciara K; White, Katherine R; Hinojos, Michelle R; Sandoval, Mayra; Crites, Stephen L

    2017-08-01

    Psychological interest in stereotype measurement has spanned nearly a century, with researchers adopting implicit measures in the 1980s to complement explicit measures. One of the most frequently used implicit measures of stereotypes is the sequential priming paradigm. The current meta-analysis examines stereotype priming, focusing specifically on this paradigm. To contribute to ongoing discussions regarding methodological rigor in social psychology, one primary goal was to identify methodological moderators of the stereotype priming effect: whether priming is due to a relation between the prime and target stimuli, the prime and target response, participant task, stereotype dimension, stimulus onset asynchrony (SOA), and stimulus type. Data from 39 studies yielded 87 individual effect sizes from 5,497 participants. Analyses revealed that stereotype priming is significantly moderated by the presence of prime-response relations, participant task, stereotype dimension, target stimulus type, SOA, and prime repetition. These results carry both practical and theoretical implications for future research on stereotype priming.

  2. MUSiC - A general search for deviations from monte carlo predictions in CMS

    NASA Astrophysics Data System (ADS)

    Biallass, Philipp A.; CMS Collaboration

    2009-06-01

    A model independent analysis approach in CMS is presented, systematically scanning the data for deviations from the Monte Carlo expectation. Such an analysis can contribute to the understanding of the detector and the tuning of the event generators. Furthermore, due to the minimal theoretical bias this approach is sensitive to a variety of models of new physics, including those not yet thought of. Events are classified into event classes according to their particle content (muons, electrons, photons, jets and missing transverse energy). A broad scan of various distributions is performed, identifying significant deviations from the Monte Carlo simulation. The importance of systematic uncertainties is outlined, which are taken into account rigorously within the algorithm. Possible detector effects and generator issues, as well as models involving Supersymmetry and new heavy gauge bosons are used as an input to the search algorithm.

  3. MUSiC - A Generic Search for Deviations from Monte Carlo Predictions in CMS

    NASA Astrophysics Data System (ADS)

    Hof, Carsten

    2009-05-01

    We present a model independent analysis approach, systematically scanning the data for deviations from the Standard Model Monte Carlo expectation. Such an analysis can contribute to the understanding of the CMS detector and the tuning of the event generators. Furthermore, due to the minimal theoretical bias this approach is sensitive to a variety of models of new physics, including those not yet thought of. Events are classified into event classes according to their particle content (muons, electrons, photons, jets and missing transverse energy). A broad scan of various distributions is performed, identifying significant deviations from the Monte Carlo simulation. We outline the importance of systematic uncertainties, which are taken into account rigorously within the algorithm. Possible detector effects and generator issues, as well as models involving supersymmetry and new heavy gauge bosons have been used as an input to the search algorithm.

  4. MUSiC - Model-independent search for deviations from Standard Model predictions in CMS

    NASA Astrophysics Data System (ADS)

    Pieta, Holger

    2010-02-01

    We present an approach for a model independent search in CMS. Systematically scanning the data for deviations from the standard model Monte Carlo expectations, such an analysis can help to understand the detector and tune event generators. By minimizing the theoretical bias the analysis is furthermore sensitive to a wide range of models for new physics, including the countless models not yet thought of. After sorting the events into classes defined by their particle content (leptons, photons, jets and missing transverse energy), a minimally prejudiced scan is performed on a number of distributions. Advanced statistical methods are used to determine the significance of the deviating regions, rigorously taking systematic uncertainties into account. A number of benchmark scenarios, including common models of new physics and possible detector effects, have been used to gauge the power of such a method.

  5. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE PAGES

    McDonnell, J. D.; Schunck, N.; Higdon, D.; ...

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  6. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonnell, J. D.; Schunck, N.; Higdon, D.

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  7. Addressing the Wicked Problem of Quality in Higher Education: Theoretical Approaches and Implications

    ERIC Educational Resources Information Center

    Krause, Kerri-Lee

    2012-01-01

    This article explores the wicked problem of quality in higher education, arguing for a more robust theorising of the subject at national, institutional and local department level. The focus of the discussion rests on principles for theorising in more rigorous ways about the multidimensional issue of quality. Quality in higher education is proposed…

  8. Changes in Residents' Self-Efficacy Beliefs in a Clinically Rich Graduate Teacher Education Program

    ERIC Educational Resources Information Center

    Reynolds, Heather M.; Wagle, A. Tina; Mahar, Donna; Yannuzzi, Leigh; Tramonte, Barbara; King, Joseph

    2016-01-01

    Increasing the clinical preparation of teachers in the United States to meet greater rigor in K-12 education has become a goal of institutions of higher education, especially since the publication of the National Council for the Accreditation of Teacher Education Blue Ribbon Panel Report on Clinical Practice. Using a theoretical framework grounded…

  9. Mexican Educational Ethnography and the Work of the DIE: Crossing the Border and Finding the Historical Everyday [book review].

    ERIC Educational Resources Information Center

    Levinson, Bradley A.

    1998-01-01

    The theoretical insight and ethnographic rigor of this collection of essays from participants at Departamento de Investigaciones Educativas (DIE) of the National Polytechnic Institute about the role of the public school in Mexican social and political life promote understanding of educational processes in different contexts, including rural and…

  10. Polytechnic Engineering Mathematics: Assessing Its Relevance to the Productivity of Industries in Uganda

    ERIC Educational Resources Information Center

    Jehopio, Peter J.; Wesonga, Ronald

    2017-01-01

    Background: The main objective of the study was to examine the relevance of engineering mathematics to the emerging industries. The level of abstraction, the standard of rigor, and the depth of theoretical treatment are necessary skills expected of a graduate engineering technician to be derived from mathematical knowledge. The question of whether…

  11. A Very Short, Fairly Interesting and Reasonably Cheap Book about Qualitative Research. Very Short, Fairly Interesting & Reasonably Cheap Books

    ERIC Educational Resources Information Center

    Silverman, David

    2007-01-01

    In this book, the author shows how good research can be methodologically inventive, empirically rigorous, theoretically-alive and practically relevant. Using materials ranging from photographs to novels and newspaper stories this book demonstrates that getting to grips with these issues means asking fundamental questions about how we are…

  12. The Problem of Bio-Concepts: Biopolitics, Bio-Economy and the Political Economy of Nothing

    ERIC Educational Resources Information Center

    Birch, Kean

    2017-01-01

    Scholars in science and technology studies--and no doubt other fields--have increasingly drawn on Michel Foucault's concept of biopolitics to theorize a variety of new "bio-concepts." While there might be some theoretical value in such exercises, many of these bio-concepts have simply replaced more rigorous--and therefore…

  13. Relevance and Rigor in International Business Teaching: Using the CSA-FSA Matrix

    ERIC Educational Resources Information Center

    Collinson, Simon C.; Rugman, Alan M.

    2011-01-01

    We advance three propositions in this paper. First, teaching international business (IB) at any level needs to be theoretically driven, using mainstream frameworks to organize thinking. Second, these frameworks need to be made relevant to the experiences of the students; for example, by using them in case studies. Third, these parameters of rigor…

  14. BOOK REVIEW: Vortex Methods: Theory and Practice

    NASA Astrophysics Data System (ADS)

    Cottet, G.-H.; Koumoutsakos, P. D.

    2001-03-01

    The book Vortex Methods: Theory and Practice presents a comprehensive account of the numerical technique for solving fluid flow problems. It provides a very nice balance between the theoretical development and analysis of the various techniques and their practical implementation. In fact, the presentation of the rigorous mathematical analysis of these methods instills confidence in their implementation. The book goes into some detail on the more recent developments that attempt to account for viscous effects, in particular the presence of viscous boundary layers in some flows of interest. The presentation is very readable, with most points illustrated with well-chosen examples, some quite sophisticated. It is a very worthy reference book that should appeal to a large body of readers, from those interested in the mathematical analysis of the methods to practitioners of computational fluid dynamics. The use of the book as a text is compromised by its lack of exercises for students, but it could form the basis of a graduate special topics course. Juan Lopez

  15. Review of rigorous coupled-wave analysis and of homogeneous effective medium approximations for high spatial-frequency surface-relief gratings

    NASA Technical Reports Server (NTRS)

    Glytsis, Elias N.; Brundrett, David L.; Gaylord, Thomas K.

    1993-01-01

    A review of the rigorous coupled-wave analysis as applied to the diffraction of electro-magnetic waves by gratings is presented. The analysis is valid for any polarization, angle of incidence, and conical diffraction. Cascaded and/or multiplexed gratings as well as material anisotropy can be incorporated under the same formalism. Small-period rectangular-groove gratings can also be modeled using approximately equivalent uniaxial homogeneous layers (effective media). The ordinary and extraordinary refractive indices of these layers depend on the grating's filling factor, the refractive indices of the substrate and superstrate, and the ratio of the free-space wavelength to the grating period. Comparisons of the homogeneous effective medium approximations with the rigorous coupled-wave analysis are presented. Antireflection designs (single-layer or multilayer) using the effective medium models are presented and compared. These ultra-short period antireflection gratings can also be used to produce soft x-rays. Comparisons of the rigorous coupled-wave analysis with experimental results on soft x-ray generation by gratings are also included.
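
    In the long-wavelength limit (grating period much smaller than the wavelength), the zeroth-order effective indices of a 1D lamellar grating with filling factor f between media of indices n_1 and n_2 take the standard textbook forms consistent with the review's description:

      n_{\mathrm{o}}^{2} = f\,n_1^{2} + (1 - f)\,n_2^{2}
      \qquad\text{(E parallel to the grooves)},

      \frac{1}{n_{\mathrm{e}}^{2}} = \frac{f}{n_1^{2}} + \frac{1 - f}{n_2^{2}}
      \qquad\text{(E perpendicular to the grooves)}.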

  16. Rigorous Science: a How-To Guide

    PubMed Central

    Fang, Ferric C.

    2016-01-01

    ABSTRACT Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. PMID:27834205

  17. University Students' Strategies for Constructing Hypothesis when Tackling Paper-and-Pencil Tasks in Physics

    NASA Astrophysics Data System (ADS)

    Guisasola, Jenaro; Ceberio, Mikel; Zubimendi, José Luis

    2006-09-01

    The study presented here explores how first-year engineering students formulate hypotheses in order to construct their own problem-solving structure when confronted with problems in physics. Under the constructivist perspective of the teaching-learning process, the formulation of hypotheses plays a key role in contrasting the coherence of the students' ideas with the theoretical frame. The main research instrument used to identify students' reasoning is the written report by the students on how they attempted four problem-solving tasks in which they were explicitly asked to formulate hypotheses. The protocols used in the assessment of the solutions consisted of a semi-quantitative study based on grids designed for the analysis of written answers. In this paper we have included two of the tasks used and the corresponding scheme for the categorisation of the answers. Details of the other two tasks are also outlined. According to our findings, the majority of students judge a hypothesis to be plausible if it is congruent with their previous knowledge, without rigorously checking it against the theoretical framework explained in class.

  18. Theoretical study of strain-dependent optical absorption in a doped self-assembled InAs/InGaAs/GaAs/AlGaAs quantum dot

    PubMed Central

    Tankasala, Archana; Hsueh, Yuling; Charles, James; Fonseca, Jim; Povolotskyi, Michael; Kim, Jun Oh; Krishna, Sanjay; Allen, Monica S; Allen, Jeffery W; Rahman, Rajib; Klimeck, Gerhard

    2018-01-01

    A detailed theoretical study of the optical absorption in doped self-assembled quantum dots is presented. A rigorous atomistic strain model as well as a sophisticated 20-band tight-binding model are used to ensure accurate prediction of the single particle states in these devices. We also show that for doped quantum dots, many-particle configuration interaction is also critical to accurately capture the optical transitions of the system. The sophisticated models presented in this work reproduce the experimental results for both undoped and doped quantum dot systems. The effects of alloy mole fraction of the strain controlling layer and quantum dot dimensions are discussed. Increasing the mole fraction of the strain controlling layer leads to a lower energy gap and a larger absorption wavelength. Surprisingly, the absorption wavelength is highly sensitive to the changes in the diameter, but almost insensitive to the changes in dot height. This behavior is explained by a detailed sensitivity analysis of different factors affecting the optical transition energy. PMID:29719758

  19. Effect of Profilin on Actin Critical Concentration: A Theoretical Analysis

    PubMed Central

    Yarmola, Elena G.; Dranishnikov, Dmitri A.; Bubb, Michael R.

    2008-01-01

    To explain the effect of profilin on actin critical concentration in a manner consistent with thermodynamic constraints and available experimental data, we built a thermodynamically rigorous model of actin steady-state dynamics in the presence of profilin. We analyzed previously published mechanisms theoretically and experimentally and, based on our analysis, suggest a new explanation for the effect of profilin. It is based on a general principle of indirect energy coupling. The fluctuation-based process of exchange diffusion indirectly couples the energy of ATP hydrolysis to actin polymerization. Profilin modulates this coupling, producing two basic effects. The first is based on the acceleration of exchange diffusion by profilin, which indicates, paradoxically, that a faster rate of actin depolymerization promotes net polymerization. The second is an affinity-based mechanism similar to the one suggested in 1993 by Pantaloni and Carlier although based on indirect rather than direct energy coupling. In the model by Pantaloni and Carlier, transformation of chemical energy of ATP hydrolysis into polymerization energy is regulated by direct association of each step in the hydrolysis reaction with a corresponding step in polymerization. Thus, hydrolysis becomes a time-limiting step in actin polymerization. In contrast, indirect coupling allows ATP hydrolysis to lag behind actin polymerization, consistent with experimental results. PMID:18835900

  20. Solubility advantage of amorphous pharmaceuticals: I. A thermodynamic analysis.

    PubMed

    Murdande, Sharad B; Pikal, Michael J; Shanker, Ravi M; Bogner, Robin H

    2010-03-01

    In recent years there has been growing interest in advancing amorphous pharmaceuticals as an approach for achieving adequate solubility. Due to difficulties in the experimental measurement of solubility, a reliable estimate of the solubility enhancement ratio of an amorphous form of a drug relative to its crystalline counterpart would be highly useful. We have developed a rigorous thermodynamic approach to estimate enhancement in solubility that can be achieved by conversion of a crystalline form to the amorphous form. We rigorously treat the three factors that contribute to differences in solubility between amorphous and crystalline forms. First, we calculate the free energy difference between amorphous and crystalline forms from thermal properties measured by modulated differential scanning calorimetry (MDSC). Secondly, since an amorphous solute can absorb significant amounts of water, which reduces its activity and solubility, a correction is made using water sorption isotherm data and the Gibbs-Duhem equation. Next, a correction is made for differences in the degree of ionization due to differences in solubilities of the two forms. Utilizing this approach the theoretically estimated solubility enhancement ratio of 7.0 for indomethacin (amorphous/gamma-crystal) was found to be in close agreement with the experimentally determined ratio of 4.9. 2009 Wiley-Liss, Inc. and the American Pharmacists Association
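
    The structure of the estimate can be summarized in a few lines (a sketch only; all numbers below are illustrative placeholders, not the paper's measured values):

      import math

      R, T = 8.314, 298.15    # J/(mol K), K
      dG = 5.0e3              # J/mol, amorphous-crystal free energy difference (illustrative)
      f_water = 0.80          # activity reduction from absorbed moisture (illustrative)
      f_ion = 0.95            # correction for differing degrees of ionization (illustrative)

      # Ideal ratio from the free energy difference, reduced by the two corrections:
      ratio = math.exp(dG / (R * T)) * f_water * f_ion
      print(f"estimated amorphous/crystalline solubility ratio ~ {ratio:.1f}")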

  1. Dynamic Characteristics of Micro-Beams Considering the Effect of Flexible Supports

    PubMed Central

    Zhong, Zuo-Yang; Zhang, Wen-Ming; Meng, Guang

    2013-01-01

    For MEMS beams with flexible supports, the boundaries are normally assumed to allow small deflections and moments. These non-ideal boundary conditions have a significant effect on the qualitative dynamical behavior. In this paper, by employing the principle of energy equivalence, rigorous theoretical solutions of the tangential and rotational equivalent stiffness are derived based on Boussinesq's and Cerruti's displacement equations. The non-dimensional partial differential equation of the motion, as well as coupled boundary conditions, are solved analytically using the method of multiple time scales. The closed-form solution provides a direct insight into the relationship between the boundary conditions and vibration characteristics of the dynamic system, in which resonance frequencies increase with the nonlinear mechanical spring effect but decrease with the effect of flexible supports. The obtained results of frequencies and mode shapes are compared with the cases of ideal boundary conditions, and the differences between them are contrasted on frequency response curves. The influences of the support material property on the equivalent stiffness and resonance frequency shift are also discussed. It is demonstrated that the proposed model with flexible-support boundary conditions has a significant effect on the rigorous quantitative dynamical analysis of MEMS beams. Moreover, the proposed analytical solutions are in good agreement with those obtained from finite element analyses.

  2. Retrieval Induces Forgetting, but Only When Nontested Items Compete for Retrieval: Implication for Interference, Inhibition, and Context Reinstatement

    ERIC Educational Resources Information Center

    Chan, Jason C. K.; Erdman, Matthew R.; Davis, Sara D.

    2015-01-01

    The mechanism responsible for retrieval-induced forgetting has been the subject of rigorous theoretical debate, with some researchers postulating that retrieval-induced forgetting can be explained by interference (J. G .W. Raaijmakers & E. Jakab, 2013) or context reinstatement (T. R. Jonker, P. Seli, & C. M. MacLeod, 2013), whereas others…

  3. Implementation of rigorous renormalization group method for ground space and low-energy states of local Hamiltonians

    NASA Astrophysics Data System (ADS)

    Roberts, Brenden; Vidick, Thomas; Motrunich, Olexei I.

    2017-12-01

    The success of polynomial-time tensor network methods for computing ground states of certain quantum local Hamiltonians has recently been given a sound theoretical basis by Arad et al. [Commun. Math. Phys. 356, 65 (2017), 10.1007/s00220-017-2973-z]. The convergence proof, however, relies on "rigorous renormalization group" (RRG) techniques which differ fundamentally from existing algorithms. We introduce a practical adaptation of the RRG procedure which, while no longer theoretically guaranteed to converge, finds matrix product state ansatz approximations to the ground spaces and low-lying excited spectra of local Hamiltonians in realistic situations. In contrast to other schemes, RRG does not utilize variational methods on tensor networks. Rather, it operates on subsets of the system Hilbert space by constructing approximations to the global ground space in a treelike manner. We evaluate the algorithm numerically, finding similar performance to the density matrix renormalization group (DMRG) in the case of a gapped nondegenerate Hamiltonian. Even in challenging situations of criticality, large ground-state degeneracy, or long-range entanglement, RRG remains able to identify candidate states having large overlap with ground and low-energy eigenstates, outperforming DMRG in some cases.

  4. On the probability density function and characteristic function moments of image steganalysis in the log prediction error wavelet subband

    NASA Astrophysics Data System (ADS)

    Bao, Zhenkun; Li, Xiaolong; Luo, Xiangyang

    2017-01-01

    Extracting informative statistical features is the most essential technical issue of steganalysis. Among various steganalysis methods, probability density function (PDF) and characteristic function (CF) moments are two important types of features due to their excellent ability to distinguish cover images from stego ones. The two types of features are quite similar in definition; the only difference is that the PDF moments are computed in the spatial domain, while the CF moments are computed in the Fourier-transformed domain. The comparison between PDF and CF moments is thus an interesting question in steganalysis. Several theoretical results have been derived, and CF moments are proved better than PDF moments in some cases. However, in the log prediction error wavelet subband of wavelet decomposition, some experiments show the opposite result, which has lacked a rigorous explanation. To solve this problem, a comparison result based on rigorous proof is presented: the first-order PDF moment is proved better than the CF moment, while the second-order CF moment is better than the PDF moment. This opens the theoretical discussion on steganalysis and the question of finding suitable statistical features.
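
    The two feature families can be computed side by side from a subband histogram; the sketch below uses synthetic Laplacian coefficients and one common normalization convention (exact definitions vary across the steganalysis literature):

      import numpy as np

      rng = np.random.default_rng(3)
      coeffs = rng.laplace(0.0, 1.0, 100_000)         # stand-in wavelet subband
      h, edges = np.histogram(coeffs, bins=201, range=(-10, 10), density=True)
      x = 0.5 * (edges[:-1] + edges[1:])              # bin centers

      def pdf_moment(n):                              # spatial-domain feature
          return np.sum(np.abs(x) ** n * h) / np.sum(h)

      cf = np.abs(np.fft.rfft(h))[1:]                 # characteristic function samples
      k = np.arange(1, cf.size + 1)

      def cf_moment(n):                               # Fourier-domain feature
          return np.sum(k ** n * cf) / np.sum(cf)

      print(pdf_moment(1), pdf_moment(2), cf_moment(1), cf_moment(2))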

  5. Fast Exact Search in Hamming Space With Multi-Index Hashing.

    PubMed

    Norouzi, Mohammad; Punjani, Ali; Fleet, David J

    2014-06-01

    There is growing interest in representing image data and feature descriptors using compact binary codes for fast near neighbor search. Although binary codes are motivated by their use as direct indices (addresses) into a hash table, codes longer than 32 bits are not being used as such, as it was thought to be ineffective. We introduce a rigorous way to build multiple hash tables on binary code substrings that enables exact k-nearest neighbor search in Hamming space. The approach is storage efficient and straight-forward to implement. Theoretical analysis shows that the algorithm exhibits sub-linear run-time behavior for uniformly distributed codes. Empirical results show dramatic speedups over a linear scan baseline for datasets of up to one billion codes of 64, 128, or 256 bits.
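
    The key idea is the pigeonhole principle: if two codes differ by at most r bits and are split into m disjoint substrings, some substring pair differs by at most floor(r/m), so probing each substring table within that smaller radius recovers all exact r-neighbors. A compact sketch (toy parameters, not the paper's implementation):

      from itertools import combinations

      B, m = 32, 4                   # code length in bits, number of substrings
      sub = B // m

      def substrings(code):
          return [(code >> (i * sub)) & ((1 << sub) - 1) for i in range(m)]

      def build(codes):
          tables = [{} for _ in range(m)]
          for idx, c in enumerate(codes):
              for i, s in enumerate(substrings(c)):
                  tables[i].setdefault(s, []).append(idx)
          return tables

      def flips(s, d):               # all values within Hamming distance d of s
          out = {s}
          for dist in range(1, d + 1):
              for bits in combinations(range(sub), dist):
                  v = s
                  for b in bits:
                      v ^= 1 << b
                  out.add(v)
          return out

      def query(tables, codes, q, r):
          cand = set()
          for i, s in enumerate(substrings(q)):
              for v in flips(s, r // m):
                  cand.update(tables[i].get(v, ()))
          # Verify candidates against the full Hamming distance:
          return {codes[i] for i in cand if bin(codes[i] ^ q).count("1") <= r}

      codes = [0b0, 0b111, 0b10101010, 0xFFFFFFFF]
      print(query(build(codes), codes, 0b110, r=2))   # {0b0, 0b111}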

  6. [Significance of occupational and interpersonal relationships among residents during the specialization training course].

    PubMed

    Cumplido-Hernández, Gustavo; Campos-Arciniega, María Faustina; Chávez-López, Arturo

    2007-01-01

    Medical specialty training courses have peculiar characteristics that probably influence the learning process of the residents. These training courses take place in large hospitals; the residents are subjected to a rigorous selection process, and at the same time they are affiliated employees of the institution. They work long shifts and are immersed in complex academic and occupational relationships. This study aims to ascertain the significance that these future specialists give to the environment where the training course takes place, in relation to their learning process. We used the social anthropology narrative analysis method. A theoretical social perspective was used to emphasize the context in order to explain the reality in which the residents live. Discipline, workload, conflictive relationships and strength of family ties were the most significant elements.

  7. The typological approach in child and family psychology: a review of theory, methods, and research.

    PubMed

    Mandara, Jelani

    2003-06-01

    The purpose of this paper was to review the theoretical underpinnings, major concepts, and methods of the typological approach. It was argued that the typological approach offers a systematic, empirically rigorous and reliable way to synthesize the nomothetic variable-centered approach with the idiographic case-centered approach. Recent advances in cluster analysis validation make it a promising method for uncovering natural typologies. This paper also reviewed findings from personality and family studies that have revealed 3 prototypical personalities and parenting styles: Adjusted/Authoritative, Overcontrolled/Authoritarian, and Undercontrolled/Permissive. These prototypes are theorized to be synonymous with attractor basins in psychological state space. The connection between family types and personality structure as well as future directions of typological research were also discussed.

  8. Leveraging complex understandings of urban education for transformative science pedagogy

    NASA Astrophysics Data System (ADS)

    Davis, Natalie R.; Ingber, Jenny; McLaughlin, Cheryl A.

    2014-12-01

    Despite the abundance of literature that attests to a myriad of complex and entrenched problems within urban education, the authors of this forum maintain that in addition to shedding light on oppressive structures and ideologies, critical pedagogues must remain steadfastly engaged in solution-oriented endeavors. Using Cheryl McLaughlin's analysis as a worthy starting point, we consider both old paradigms and new ideas regarding the transformation of public science education. Topics for discussion include theoretical framing, urban teacher preparation, science education reform approaches, and the role of scholarship. While it is evident that securing rigorous and empowering science education for diverse learners will require arduous, collective effort on many fronts, this text upholds a sense of optimism in the potential of activist-teacher-researchers to overcome barriers to liberatory science teaching and learning.

  9. Modeling non-harmonic behavior of materials from experimental inelastic neutron scattering and thermal expansion measurements

    NASA Astrophysics Data System (ADS)

    Bansal, Dipanshu; Aref, Amjad; Dargush, Gary; Delaire, Olivier

    2016-09-01

    Based on thermodynamic principles, we derive expressions quantifying the non-harmonic vibrational behavior of materials, which are rigorous yet easily evaluated from experimentally available data for the thermal expansion coefficient and the phonon density of states. These experimentally-derived quantities are valuable to benchmark first-principles theoretical predictions of harmonic and non-harmonic thermal behaviors using perturbation theory, ab initio molecular-dynamics, or Monte-Carlo simulations. We illustrate this analysis by computing the harmonic, dilational, and anharmonic contributions to the entropy, internal energy, and free energy of elemental aluminum and the ordered compound FeSi over a wide range of temperature. Results agree well with previous data in the literature and provide an efficient approach to estimate anharmonic effects in materials.
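
    For concreteness, the harmonic baseline of such an analysis can be computed directly from a measured phonon density of states g(ω) via the standard expression S_harm = 3 kB ∫ g(ω) [(n+1) ln(n+1) − n ln n] dω per atom, with n the Bose-Einstein occupation. A minimal numerical sketch (synthetic Debye-like DOS, not the paper's aluminum or FeSi data):

      import numpy as np

      kB = 8.617333e-5      # Boltzmann constant, eV/K
      hbar = 6.582120e-16   # reduced Planck constant, eV s

      def trapz(y, x):
          # Trapezoidal rule (kept explicit for NumPy-version independence).
          return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

      def harmonic_entropy(omega, g, T):
          # Harmonic vibrational entropy per atom, in units of kB, from a
          # phonon DOS g(omega) normalized to unity (omega in rad/s).
          n = 1.0 / np.expm1(hbar * omega / (kB * T))   # Bose-Einstein occupation
          integrand = (n + 1.0) * np.log1p(n) - n * np.log(n)
          return 3.0 * trapz(integrand * g, omega)

      # Synthetic Debye-like DOS, g ~ omega^2 below a cutoff (illustrative only).
      w = np.linspace(1e12, 6e13, 4000)
      g = w**2 / trapz(w**2, w)
      for T in (100, 300, 900):
          print(f"S_harm({T} K) = {harmonic_entropy(w, g, T):.3f} kB/atom")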

  10. The sympathy of two pendulum clocks: beyond Huygens' observations.

    PubMed

    Peña Ramirez, Jonatan; Olvera, Luis Alberto; Nijmeijer, Henk; Alvarez, Joaquin

    2016-03-29

    This paper introduces a modern version of the classical Huygens' experiment on synchronization of pendulum clocks. The version presented here consists of two monumental pendulum clocks--ad hoc designed and fabricated--which are coupled through a wooden structure. It is demonstrated that the coupled clocks exhibit 'sympathetic' motion, i.e. the pendula of the clocks oscillate in consonance and in the same direction. Interestingly, when the clocks are synchronized, the common oscillation frequency decreases, i.e. the clocks become slow and inaccurate. In order to rigorously explain these findings, a mathematical model for the coupled clocks is obtained by using well-established physical and mechanical laws and likewise, a theoretical analysis is conducted. Ultimately, the sympathy of two monumental pendulum clocks, interacting via a flexible coupling structure, is experimentally, numerically, and analytically demonstrated.
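
    The locking of the two clocks at a common frequency can be caricatured with a toy phase model of two weakly coupled oscillators; this Kuramoto-style sketch is only a pedagogical stand-in for the paper's full mechanical model (it reproduces frequency locking, but not the slowing of the clocks, which requires the flexible coupling structure):

      import math

      # Two phase oscillators with slightly different natural frequencies,
      # weakly coupled through a common support (Kuramoto-style caricature).
      w1, w2, K, dt = 2.00, 2.06, 0.08, 1e-3
      th1, th2 = 0.0, 0.7
      f1 = f2 = n = 0
      for step in range(400_000):                 # 400 s of model time
          d1 = w1 + K * math.sin(th2 - th1)
          d2 = w2 + K * math.sin(th1 - th2)
          th1 += d1 * dt
          th2 += d2 * dt
          if step >= 300_000:                     # average after transients
              f1 += d1; f2 += d2; n += 1
      # Locking occurs because |w1 - w2| < 2K; both settle at (w1 + w2) / 2.
      print("locked frequencies:", f1 / n, f2 / n)
      print("phase lag (rad):", (th2 - th1) % (2 * math.pi))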

  11. Error analysis in inverse scatterometry. I. Modeling.

    PubMed

    Al-Assaad, Rayan M; Byrne, Dale M

    2007-02-01

    Scatterometry is an optical technique that has been studied and tested in recent years in semiconductor fabrication metrology for critical dimensions. Previous work presented an iterative linearized method to retrieve surface-relief profile parameters from reflectance measurements upon diffraction. Building on that iterative linear solution model, this work develops rigorous models representing the random and the deterministic (offset) errors in scatterometric measurements. The propagation of each type of error from the measurement data to the profile parameter estimates is then presented, and the improvement in solution accuracy obtained by adjusting for the offset errors is demonstrated with theoretical and experimental data. A companion paper (in process) presents an improved optimization method that accounts for unknown offset errors in the measurements based on this offset error model.

  12. Characterization of anisotropically shaped silver nanoparticle arrays via spectroscopic ellipsometry supported by numerical optical modeling

    NASA Astrophysics Data System (ADS)

    Gkogkou, Dimitra; Shaykhutdinov, Timur; Oates, Thomas W. H.; Gernert, Ulrich; Schreiber, Benjamin; Facsko, Stefan; Hildebrandt, Peter; Weidinger, Inez M.; Esser, Norbert; Hinrichs, Karsten

    2017-11-01

    The present investigation aims to study the optical response of anisotropic Ag nanoparticle arrays deposited on rippled silicon substrates by performing a qualitative comparison between experimental and theoretical results. Spectroscopic ellipsometry was used along with numerical calculations using finite-difference time-domain (FDTD) method and rigorous coupled wave analysis (RCWA) to reveal trends in the optical and geometrical properties of the nanoparticle array. Ellipsometric data show two resonances, in the orthogonal x and y directions, that originate from localized plasmon resonances as demonstrated by the calculated near-fields from FDTD calculations. The far-field calculations by RCWA point to decoupled resonances in x direction and possible coupling effects in y direction, corresponding to the short and long axis of the anisotropic nanoparticles, respectively.

  13. Dual-function beam splitter of a subwavelength fused-silica grating.

    PubMed

    Feng, Jijun; Zhou, Changhe; Zheng, Jiangjun; Cao, Hongchao; Lv, Peng

    2009-05-10

    We present the design and fabrication of a novel dual-function subwavelength fused-silica grating that can be used as a polarization-selective beam splitter. For TM polarization, the grating can be used as a two-port beam splitter at a wavelength of 1550 nm with a total diffraction efficiency of 98%. For TE polarization, the grating can function as a high-efficiency grating, and the diffraction efficiency of the -1st order is 95% under Littrow mounting. This dual-function grating design is based on a simplified modal method. By using the rigorous coupled-wave analysis, the optimum grating parameters can be determined. Holographic recording technology and inductively coupled plasma etching are used to manufacture the fused-silica grating. Experimental results are in agreement with the theoretical values.
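
    The Littrow mounting mentioned above can be checked against the scalar grating equation; in the sketch below the grating period is a hypothetical value chosen for illustration (the fabricated period is not quoted in this record), and only the 0th and -1st transmitted orders propagate, the geometry needed for a two-port splitter:

      import numpy as np

      lam = 1.550e-6   # wavelength, m
      d = 1.010e-6     # grating period, m (hypothetical value for illustration)

      # Littrow mounting for order -1: sin(theta_L) = lam / (2 d).
      sin_i = lam / (2 * d)
      print(f"Littrow angle: {np.degrees(np.arcsin(sin_i)):.2f} deg")

      # Propagating transmitted orders in air: sin(theta_m) = sin_i + m lam / d.
      for m in range(-3, 4):
          s = sin_i + m * lam / d
          if abs(s) <= 1:
              print(f"order {m:+d} propagates at {np.degrees(np.arcsin(s)):7.2f} deg")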

  14. Effect of vorticity flip-over on the premixed flame structure: Experimental observation of type-I inflection flames

    NASA Astrophysics Data System (ADS)

    El-Rabii, Hazem; Kazakov, Kirill A.

    2015-12-01

    Premixed flames propagating in horizontal tubes are observed to take on a convex shape towards the fresh mixture, which is commonly explained as a buoyancy effect. A recent rigorous analysis has shown, on the contrary, that this process is driven by the balance of vorticity generated by a curved flame front with the baroclinic vorticity, and predicted the existence of a regime in which the leading edge of the flame front is concave. We report experimental realization of this regime. Our experiments on ethane and n-butane mixtures with air show that flames with an inflection point on the front are regularly produced in lean mixtures, provided that a sufficiently weak ignition is used. The observed flame shape agrees well with that theoretically predicted.

  15. Identification of dynamic systems, theory and formulation

    NASA Technical Reports Server (NTRS)

    Maine, R. E.; Iliff, K. W.

    1985-01-01

    The problem of estimating parameters of dynamic systems is addressed in order to present the theoretical basis of system identification and parameter estimation in a manner that is complete and rigorous, yet understandable with minimal prerequisites. Maximum likelihood and related estimators are highlighted. The approach used requires familiarity with calculus, linear algebra, and probability, but does not require knowledge of stochastic processes or functional analysis. The treatment emphasizes the unification of the various areas of estimation: estimation in dynamic systems is treated as a direct outgrowth of static-system theory. Topics covered include basic concepts and definitions; numerical optimization methods; probability; statistical estimators; estimation in static systems; stochastic processes; state estimation in dynamic systems; output-error, filter-error, and equation-error methods of parameter estimation in dynamic systems; and the accuracy of the estimates.
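
    As a minimal numerical illustration of the output-error method highlighted in the text (a generic sketch, not the authors' software): simulate a scalar linear system with a known parameter and recover it by minimizing the squared output error, which under Gaussian measurement noise coincides with maximum likelihood.

      import numpy as np
      from scipy.optimize import minimize_scalar

      rng = np.random.default_rng(1)
      a_true, n = 0.85, 200
      u = rng.normal(size=n)                    # known input sequence
      x = np.zeros(n)
      for k in range(n - 1):                    # true system: x[k+1] = a x[k] + u[k]
          x[k + 1] = a_true * x[k] + u[k]
      y = x + 0.05 * rng.normal(size=n)         # noisy output measurements

      def output_error(a):
          # Response of the candidate model, compared with the measured output.
          xs = np.zeros(n)
          for k in range(n - 1):
              xs[k + 1] = a * xs[k] + u[k]
          return np.sum((y - xs) ** 2)

      a_hat = minimize_scalar(output_error, bounds=(0.0, 0.999), method="bounded").x
      print(f"true a = {a_true}, estimated a = {a_hat:.4f}")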

  16. A Historical Survey of the Contributions of Francois-Joseph Servois to the Development of the Rigorous Calculus

    ERIC Educational Resources Information Center

    Petrilli, Salvatore John, Jr.

    2009-01-01

    Historians of mathematics considered the nineteenth century to be the Golden Age of mathematics. During this time period many areas of mathematics, such as algebra and geometry, were being placed on rigorous foundations. Another area of mathematics which experienced fundamental change was analysis. The drive for rigor in calculus began in 1797…

  17. Do behavioral scientists really understand HIV-related sexual risk behavior? A systematic review of longitudinal and experimental studies predicting sexual behavior.

    PubMed

    Huebner, David M; Perry, Nicholas S

    2015-10-01

    Behavioral interventions to reduce sexual risk behavior depend on strong health behavior theory. By identifying the psychosocial variables that lead causally to sexual risk, theories provide interventionists with a guide for how to change behavior. However, empirical research is critical to determining whether a particular theory adequately explains sexual risk behavior. A large body of cross-sectional evidence, which has been reviewed elsewhere, supports the notion that certain theory-based constructs (e.g., self-efficacy) are correlates of sexual behavior. However, given the limitations of inferring causality from correlational research, it is essential that we review the evidence from more methodologically rigorous studies (i.e., longitudinal and experimental designs). This systematic review identified 44 longitudinal studies in which investigators attempted to predict sexual risk from psychosocial variables over time. We also found 134 experimental studies (i.e., randomized controlled trials of HIV interventions), but of these only 9 (6.7 %) report the results of mediation analyses that might provide evidence for the validity of health behavior theories in predicting sexual behavior. Results show little convergent support across both types of studies for most traditional, theoretical predictors of sexual behavior. This suggests that the field must expand the body of empirical work that utilizes the most rigorous study designs to test our theoretical assumptions. The inconsistent results of existing research would indicate that current theoretical models of sexual risk behavior are inadequate, and may require expansion or adaptation.

  18. Cultural Competence in the Treatment of Addictions: Theory, Practice and Evidence.

    PubMed

    Gainsbury, Sally M

    2017-07-01

    Culturally and linguistically diverse (CALD) populations often have high rates of addictive disorders, but lower rates of treatment seeking and completion than the mainstream population. A significant barrier to treatment is the lack of culturally relevant and appropriate treatment. A literature review was conducted on cultural competence in mental health service delivery, and specifically in the treatment of addictive disorders. Several theoretical models of cultural competence in therapy have been developed, but the lack of rigorous research limits the empirical evidence available. Research indicates that culturally competent treatment practices, including providing therapy and materials in the client's language; knowledge, understanding, and appreciation of cultural perspectives and nuances; involving the wider family and community; and training therapists, can enhance client engagement, retention, and treatment outcomes for substance use and gambling. Further methodologically rigorous research is needed to isolate the impact of cultural competence on the treatment of addictions and to guide research determining treatment efficacy within specific CALD populations. Training therapists, and recruiting therapists and researchers from CALD communities, is important to ensure an ongoing focus and improved outcomes for these populations. Key Practitioner Message: The treatment needs of culturally diverse individuals with addictions are often not met. Theoretical models can guide therapists in incorporating cultural competence. Culturally targeted treatments increase recruitment, retention, and treatment outcomes. Cultural competence includes matching clinicians and clients on linguistic and cultural backgrounds, as well as being mindful of the impact of culture on clients' experience of addiction problems. Few methodologically rigorous trials have been conducted to guide treatment practices, and research needs to be incorporated into existing culturally relevant treatment services. Copyright © 2016 John Wiley & Sons, Ltd.

  19. Review of finite fields: Applications to discrete Fourier, transforms and Reed-Solomon coding

    NASA Technical Reports Server (NTRS)

    Wong, J. S. L.; Truong, T. K.; Benjauthrit, B.; Mulhall, B. D. L.; Reed, I. S.

    1977-01-01

    An attempt is made to provide a step-by-step approach to the subject of finite fields. Rigorous proofs and highly theoretical materials are avoided. The simple concepts of groups, rings, and fields are discussed and developed more or less heuristically. Examples are used liberally to illustrate the meaning of definitions and theories. Applications include discrete Fourier transforms and Reed-Solomon coding.
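
    In the same heuristic spirit, a small worked example (ours, not from the report) of a discrete Fourier transform over a finite field, the number-theoretic transform underlying Reed-Solomon coding: in GF(17), ω = 4 is a primitive 4th root of unity (4^2 ≡ 16 ≡ -1 and 4^4 ≡ 1 mod 17), so a length-4 transform and its inverse can be computed exactly with integer arithmetic.

      P = 17   # prime modulus: all arithmetic below is in the finite field GF(17)
      W = 4    # primitive 4th root of unity mod 17 (pow(4, 4, 17) == 1)
      N = 4

      def ntt(a, root):
          # Naive O(N^2) discrete Fourier transform over GF(P).
          return [sum(a[j] * pow(root, i * j, P) for j in range(N)) % P
                  for i in range(N)]

      a = [3, 1, 4, 1]
      A = ntt(a, W)
      # Inverse: transform with the inverse root, then multiply by N^-1 mod P
      # (inverses via Fermat's little theorem, x^(P-2) mod P).
      inv_W, inv_N = pow(W, P - 2, P), pow(N, P - 2, P)
      back = [(inv_N * c) % P for c in ntt(A, inv_W)]
      print(A, back)   # `back` recovers [3, 1, 4, 1] exactly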

  20. Theoretical and Experimental Investigations of Coincidences in Poisson Distributed Pulse Trains and Spectral Distortion Caused by Pulse Pileup.

    NASA Astrophysics Data System (ADS)

    Bristow, Quentin

    1990-01-01

    Part one of this two-part study is concerned with the multiple coincidences in pulse trains from X-ray and gamma radiation detectors which are the cause of pulse pileup. A sequence of pulses with inter-arrival times less than tau, the resolving time of the pulse-height analysis system used to acquire spectra, is called a multiple pulse string. Such strings can be classified on the basis of the number of pulses they contain, or the number of resolving times they cover. The occurrence rates of such strings are derived from theoretical considerations. Logic circuits were devised to make experimental measurements of multiple pulse string occurrence rates in the output from a NaI(Tl) scintillation detector over a wide range of count rates. Markov process theory was used to predict state transition rates in the logic circuits, enabling the experimental data to be checked rigorously for conformity with those predicted for a Poisson distribution. No fundamental discrepancies were observed. Part two of the study is concerned with a theoretical analysis of pulse pileup and the development of a discrete correction algorithm, based on the use of a function to simulate the coincidence spectrum produced by partial sums of pulses. Monte Carlo simulations, incorporating criteria for pulse pileup inherent in the operation of modern ADCs, were used to generate pileup spectra due to coincidences between two pulses (1st order pileup) and three pulses (2nd order pileup), for different semi-Gaussian pulse shapes. Coincidences between pulses in a single channel produced a basic probability density function spectrum which can be regarded as an impulse response for a particular pulse shape. The use of a flat spectrum (identical count rates in all channels) in the simulations, and in a parallel theoretical analysis, showed that 1st order pileup distorts the spectrum into a linear ramp with a pileup tail. The correction algorithm was successfully applied to correct entire spectra for 1st and 2nd order pileup, both spectra generated by Monte Carlo simulations and real spectra acquired with a laboratory multichannel analysis system.
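
    The first-order coincidence statistics follow from the exponential inter-arrival times of a Poisson process: the probability that the next pulse arrives within the resolving time tau is 1 - exp(-lambda*tau). A quick Monte Carlo check of that relation (our sketch, not the thesis's logic-circuit measurements):

      import numpy as np

      rng = np.random.default_rng(2)
      rate, tau, n = 50_000.0, 2e-6, 1_000_000    # counts/s, resolving time s

      gaps = rng.exponential(1.0 / rate, size=n)  # Poisson inter-arrival times
      print(f"simulated pileup fraction : {np.mean(gaps < tau):.4f}")
      print(f"theory, 1 - exp(-rate*tau): {1 - np.exp(-rate * tau):.4f}")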

  1. Harmonic analysis of traction power supply system based on wavelet decomposition

    NASA Astrophysics Data System (ADS)

    Dun, Xiaohong

    2018-05-01

    With the rapid development of high-speed rail and heavy-haul transport, AC-drive electric locomotives and EMUs now operate on a large scale across the country, and the electrified railway has become the main harmonic source in China's power grid. This creates a need for timely monitoring, assessment, and mitigation of the power quality problems of electrified railways. The wavelet transform was developed on the basis of Fourier analysis; its basic idea comes from harmonic analysis and it rests on a rigorous theoretical model. It inherits and develops the localization idea of the Gabor transform while overcoming drawbacks such as the fixed window and the lack of discrete orthogonality, which has made it a much-studied spectral analysis tool. Wavelet analysis takes progressively finer time-domain steps in the high-frequency part of a signal, so it can focus on any detail of the signal being analyzed; it can therefore analyze the harmonics of the traction power supply system comprehensively, while the pyramid algorithm speeds up the wavelet decomposition. MATLAB simulation shows that wavelet decomposition is effective for harmonic spectrum analysis of the traction power supply system.
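
    A minimal sketch of the multilevel decomposition described above, using the PyWavelets package (assumed installed; the 50 Hz fundamental, harmonic amplitudes, and sampling rate are illustrative, not measured traction-network data):

      import numpy as np
      import pywt

      fs = 6400                                   # sampling rate, Hz
      t = np.arange(0, 0.2, 1 / fs)
      # 50 Hz fundamental plus 3rd and 5th harmonics (illustrative signal).
      x = (np.sin(2 * np.pi * 50 * t)
           + 0.3 * np.sin(2 * np.pi * 150 * t)
           + 0.1 * np.sin(2 * np.pi * 250 * t))

      # Mallat pyramid algorithm: multilevel DWT with a Daubechies wavelet.
      coeffs = pywt.wavedec(x, "db8", level=5)
      for i, c in enumerate(coeffs):
          band = "approx " if i == 0 else f"detail{len(coeffs) - i}"
          print(f"{band}  energy = {np.sum(c**2):8.2f}")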

  2. Children's antisocial behavior, mental health, drug use, and educational performance after parental incarceration: a systematic review and meta-analysis.

    PubMed

    Murray, Joseph; Farrington, David P; Sekol, Ivana

    2012-03-01

    Unprecedented numbers of children experience parental incarceration worldwide. Families and children of prisoners can experience multiple difficulties after parental incarceration, including traumatic separation, loneliness, stigma, confused explanations to children, unstable childcare arrangements, strained parenting, reduced income, and home, school, and neighborhood moves. Children of incarcerated parents often have multiple, stressful life events before parental incarceration. Theoretically, children with incarcerated parents may be at risk for a range of adverse behavioral outcomes. A systematic review was conducted to synthesize empirical evidence on associations between parental incarceration and children's later antisocial behavior, mental health problems, drug use, and educational performance. Results from 40 studies (including 7,374 children with incarcerated parents and 37,325 comparison children in 50 samples) were pooled in a meta-analysis. The most rigorous studies showed that parental incarceration is associated with higher risk for children's antisocial behavior, but not for mental health problems, drug use, or poor educational performance. Studies that controlled for parental criminality or children's antisocial behavior before parental incarceration had a pooled effect size of OR = 1.4 (p < .01), corresponding to about 10% increased risk for antisocial behavior among children with incarcerated parents, compared with peers. Effect sizes did not decrease with number of covariates controlled. However, the methodological quality of many studies was poor. More rigorous tests of the causal effects of parental incarceration are needed, using randomized designs and prospective longitudinal studies. Criminal justice reforms and national support systems might be needed to prevent harmful consequences of parental incarceration for children.

  3. Communication: Rigorous quantum dynamics of O + O2 exchange reactions on an ab initio potential energy surface substantiate the negative temperature dependence of rate coefficients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yaqin; Sun, Zhigang

    2014-08-28

    The kinetics and dynamics of several O + O2 isotope exchange reactions have been investigated on a recently determined accurate global O3 potential energy surface using a time-dependent wave packet method. The agreement between calculated and measured rate coefficients is significantly improved over previous work. More importantly, the experimentally observed negative temperature dependence of the rate coefficients is for the first time rigorously reproduced theoretically. This negative temperature dependence can be attributed to the absence in the new potential energy surface of a submerged "reef" structure, which was present in all previous potential energy surfaces. In addition, contributions of rotationally excited states of the diatomic reactant further accentuate the negative temperature dependence.

  4. Coexistence and local μ-stability of multiple equilibrium points for memristive neural networks with nonmonotonic piecewise linear activation functions and unbounded time-varying delays.

    PubMed

    Nie, Xiaobing; Zheng, Wei Xing; Cao, Jinde

    2016-12-01

    In this paper, the coexistence and dynamical behaviors of multiple equilibrium points are discussed for a class of memristive neural networks (MNNs) with unbounded time-varying delays and nonmonotonic piecewise linear activation functions. By means of the fixed point theorem, nonsmooth analysis theory and rigorous mathematical analysis, it is proven that under some conditions, such n-neuron MNNs can have 5^n equilibrium points located in ℝ^n, and 3^n of them are locally μ-stable. As a direct application, some criteria are also obtained on the multiple exponential stability, multiple power stability, multiple log-stability and multiple log-log-stability. All these results reveal that the addressed neural networks with the activation functions introduced in this paper can generate greater storage capacity than the ones with Mexican-hat-type activation functions. Numerical simulations are presented to substantiate the theoretical results. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Rigorous coupled wave analysis of acousto-optics with relativistic considerations.

    PubMed

    Xia, Guoqiang; Zheng, Weijian; Lei, Zhenggang; Zhang, Ruolan

    2015-09-01

    A relativistic analysis of acousto-optics is presented, and a rigorous coupled wave analysis is generalized for the diffraction of the acousto-optical effect. An acoustic wave generates a grating with temporally and spatially modulated permittivity, hindering direct applications of the rigorous coupled wave analysis for the acousto-optical effect. In a reference frame which moves with the acoustic wave, the grating is static, the medium moves, and the coupled wave equations for the static grating may be derived. Floquet's theorem is then applied to cast these equations into an eigenproblem. Using a Lorentz transformation, the electromagnetic fields in the grating region are transformed to the lab frame where the medium is at rest, and relativistic Doppler frequency shifts are introduced into various diffraction orders. In the lab frame, the boundary conditions are considered and the diffraction efficiencies of various orders are determined. This method is rigorous and general, and the plane waves in the resulting expansion satisfy the dispersion relation of the medium and are propagation modes. Properties of various Bragg diffractions are results, rather than preconditions, of this method. Simulations of an acousto-optical tunable filter made of paratellurite (TeO2) are given as examples.

  6. Revival of oscillations from deaths in diffusively coupled nonlinear systems: Theory and experiment

    NASA Astrophysics Data System (ADS)

    Zou, Wei; Sebek, Michael; Kiss, István Z.; Kurths, Jürgen

    2017-06-01

    Amplitude death (AD) and oscillation death (OD) are two structurally different oscillation quenching phenomena in coupled nonlinear systems. As a reverse issue of AD and OD, the revival of oscillations from death states has recently attracted increasing attention. In this paper, we show that a time delay in the self-feedback component of the coupling destabilizes not only AD but also OD, and even the AD-to-OD transition, in paradigmatic models of coupled Stuart-Landau oscillators under diverse death configurations. Using a rigorous analysis, the effectiveness of this self-feedback delay in revoking AD is theoretically proved to be valid in an arbitrary network of coupled Stuart-Landau oscillators with generally distributed propagation delays. Moreover, the role of the self-feedback delay in reviving oscillations from AD is experimentally verified in two delay-coupled electrochemical reactions.
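
    A rough numerical sketch of the mechanism (fixed-step Euler integration of two Stuart-Landau oscillators with the delay tau placed in the self-feedback part of the coupling; all parameter values are illustrative, not those of the paper): for tau near zero the chosen frequency mismatch and coupling put the system in amplitude death, and sweeping tau upward shows where oscillations revive.

      import numpy as np

      def mean_amplitude(tau, K=2.0, w=(10.0, 15.0), dt=1e-3, T=60.0):
          # dz_j/dt = (1 + i w_j - |z_j|^2) z_j + K (z_k(t) - z_j(t - tau)).
          wv = np.array(w)
          nd = max(1, int(round(tau / dt)))
          steps = int(T / dt)
          hist = np.full((nd + steps, 2), 0.1 + 0.1j)   # constant initial history
          for n in range(nd, nd + steps - 1):
              z, zd = hist[n], hist[n - nd]             # current and delayed states
              f = (1 + 1j * wv - np.abs(z) ** 2) * z + K * (z[::-1] - zd)
              hist[n + 1] = z + dt * f                  # explicit Euler step
          return np.abs(hist[-5000:]).mean()

      for tau in (0.0, 0.05, 0.10, 0.15, 0.20, 0.25):
          print(f"tau = {tau:4.2f}   mean |z| = {mean_amplitude(tau):.3f}")
      # Near tau = 0 the mismatch (|w1 - w2| = 5) and coupling (K = 2) give
      # amplitude death; larger self-feedback delays revive the oscillations.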

  7. Speech coding and compression using wavelets and lateral inhibitory networks

    NASA Astrophysics Data System (ADS)

    Ricart, Richard

    1990-12-01

    The purpose of this thesis is to introduce the concept of lateral inhibition as a generalized technique for compressing time/frequency representations of electromagnetic and acoustical signals, particularly speech. This requires at least a rudimentary treatment of the theory of frames (which generalizes most commonly known time/frequency distributions), the biology of hearing, and digital signal processing. As such, this material, along with the interrelationships of the disparate subjects, is presented in a tutorial style. This may leave the mathematician longing for more rigor, the neurophysiological psychologist longing for more substantive support of the hypotheses presented, and the engineer longing for a reprieve from the theoretical barrage. Despite the problems that arise when trying to appeal to too wide an audience, this thesis should be a cogent analysis of the compression of time/frequency distributions via lateral inhibitory networks.

  9. A mass-energy preserving Galerkin FEM for the coupled nonlinear fractional Schrödinger equations

    NASA Astrophysics Data System (ADS)

    Zhang, Guoyu; Huang, Chengming; Li, Meng

    2018-04-01

    We consider the numerical simulation of the coupled nonlinear space fractional Schrödinger equations. Based on the Galerkin finite element method in space and the Crank-Nicolson (CN) difference method in time, a fully discrete scheme is constructed. Firstly, we focus on a rigorous analysis of conservation laws for the discrete system; the definitions of discrete mass and energy here correspond with the original ones in physics. Then, we prove that the fully discrete system is uniquely solvable. Moreover, we establish unconditional convergence, that is, we complete the error estimates without any mesh-ratio restriction, deriving L2-norm error estimates for the nonlinear equations and L∞-norm error estimates for the linear equations. Finally, some numerical experiments are included showing results in agreement with the theoretical predictions.
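
    A minimal illustration of the discrete mass conservation at issue, using a Crank-Nicolson step for the standard (non-fractional, linear) 1D Schrödinger equation, with finite differences in place of the paper's Galerkin finite elements purely for brevity; the Cayley form of the update is unitary, so the discrete mass is conserved to machine precision:

      import numpy as np

      J, L, dt, steps = 256, 20.0, 1e-3, 2000
      x = np.linspace(-L / 2, L / 2, J, endpoint=False)
      dx = x[1] - x[0]
      psi = np.exp(-x**2) * np.exp(2j * x)        # Gaussian wave packet

      # Periodic second-difference matrix; H approximates -d^2/dx^2.
      D2 = -2 * np.eye(J) + np.eye(J, k=1) + np.eye(J, k=-1)
      D2[0, -1] = D2[-1, 0] = 1
      H = -D2 / dx**2

      # Crank-Nicolson: (I + i dt/2 H) psi_new = (I - i dt/2 H) psi_old.
      A = np.eye(J) + 0.5j * dt * H
      B = np.eye(J) - 0.5j * dt * H
      update = np.linalg.solve(A, B)              # unitary Cayley transform

      mass0 = np.sum(np.abs(psi) ** 2) * dx
      for _ in range(steps):
          psi = update @ psi
      mass1 = np.sum(np.abs(psi) ** 2) * dx
      print(f"discrete mass drift: {abs(mass1 - mass0):.2e}")  # ~ round-off level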

  10. Reliable spacecraft rendezvous without velocity measurement

    NASA Astrophysics Data System (ADS)

    He, Shaoming; Lin, Defu

    2018-03-01

    This paper investigates the problem of finite-time velocity-free autonomous rendezvous for spacecraft in the presence of external disturbances during the terminal phase. First, to address the lack of relative velocity measurements, a robust observer is proposed to estimate the unknown relative velocity information in finite time. It is shown that the effect of external disturbances on the estimation precision can be suppressed to a relatively low level. With the reconstructed velocity information, a finite-time output feedback control law is then formulated to stabilize the rendezvous system. Theoretical analysis and rigorous proof show that the relative position and its rate converge to a small compact region in finite time. Numerical simulations are performed to evaluate the performance of the proposed approach in the presence of external disturbances and actuator faults.

  11. Entropy and complexity analysis of hydrogenic Rydberg atoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopez-Rosa, S.; Toranzo, I. V.

    The internal disorder of hydrogenic Rydberg atoms as contained in their position and momentum probability densities is examined by means of the following information-theoretic spreading quantities: the radial and logarithmic expectation values, the Shannon entropy, and the Fisher information. As well, the complexity measures of Cramer-Rao, Fisher-Shannon, and Lopez Ruiz-Mancini-Calvet types are investigated in both reciprocal spaces. The leading term of these quantities is rigorously calculated by use of the asymptotic properties of the concomitant entropic functionals of the Laguerre and Gegenbauer orthogonal polynomials which control the wavefunctions of the Rydberg states in both position and momentum spaces. The associated generalized Heisenberg-like, logarithmic and entropic uncertainty relations are also given. Finally, application to linear (l = 0), circular (l = n-1), and quasicircular (l = n-2) states is explicitly done.

  15. Pharmacists as agents of change for rational drug therapy.

    PubMed

    Lipton, H L; Byrns, P J; Soumerai, S B; Chrischilles, E A

    1995-01-01

    We analyze what is known and unknown about the contribution of the pharmacist as patient educator, physician consultant, and agent to affect patient outcomes in ambulatory settings. The need for pharmacist services is discussed, as are the theoretical underpinnings and quality of the scientific evidence to support their efficacy. The analysis is conducted in the context of a shift in pharmacists' roles from product to patient orientation as well as recent U.S. legislation mandating enhanced pharmacists' roles via drug utilization review for all Medicaid patients. We conclude with a research and action agenda, calling for stronger research designs in evaluating pharmacists' interventions. The shifting paradigm in the pharmacy profession, coupled with the implementation of the Omnibus Budget Reconciliation Act of 1990, provide unique opportunities for rigorous evaluations of pharmacists as agents of change for rational drug therapy.

  16. Technological characteristics of pre- and post-rigor deboned beef mixtures from Holstein steers and quality attributes of cooked beef sausage.

    PubMed

    Sukumaran, Anuraj T; Holtcamp, Alexander J; Campbell, Yan L; Burnett, Derris; Schilling, Mark W; Dinh, Thu T N

    2018-06-07

    The objective of this study was to determine the effects of deboning time (pre- and post-rigor), processing steps (grinding - GB; salting - SB; batter formulation - BB), and storage time on the quality of raw beef mixtures and vacuum-packaged cooked sausage, produced using a commercial formulation with 0.25% phosphate. The pH was greater in pre-rigor GB and SB than in post-rigor GB and SB (P < .001). However, deboning time had no effect on metmyoglobin reducing activity, cooking loss, and color of raw beef mixtures. Protein solubility of pre-rigor beef mixtures (124.26 mg/kg) was greater than that of post-rigor beef (113.93 mg/kg; P = .071). TBARS were increased in BB but decreased during vacuum storage of cooked sausage (P ≤ .018). Except for chewiness and saltiness being 52.9 N-mm and 0.3 points greater in post-rigor sausage (P = .040 and 0.054, respectively), texture profile analysis and trained panelists detected no difference in texture between pre- and post-rigor sausage. Published by Elsevier Ltd.

  17. The relationship of rain-induced cross-polarization discrimination to attenuation for 10 to 30 GHz earth-space radio links

    NASA Technical Reports Server (NTRS)

    Stutzman, W. L.; Runyon, D. L.

    1984-01-01

    Rain depolarization is quantified through the cross-polarization discrimination (XPD) versus attenuation relationship. Such a relationship is derived by curve fitting to a rigorous theoretical model (the multiple scattering model) to determine the variation of the parameters involved. This simple isolation model (SIM) is compared to data from several earth-space link experiments and to three other models.
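
    XPD-versus-attenuation relationships of this kind are commonly summarized by a log-linear form XPD ≈ U - V·log10(A); the sketch below fits that generic form to synthetic data (the SIM's actual coefficients are not reproduced in this record, so U and V here are invented for illustration):

      import numpy as np

      rng = np.random.default_rng(3)
      A = np.linspace(1.0, 30.0, 60)              # copolar attenuation, dB
      U_true, V_true = 36.0, 20.0                 # hypothetical coefficients
      xpd = U_true - V_true * np.log10(A) + rng.normal(0, 0.5, A.size)

      # Least-squares fit of the log-linear form XPD = U - V log10(A).
      slope, intercept = np.polyfit(np.log10(A), xpd, 1)
      print(f"U = {intercept:.1f} dB, V = {-slope:.1f} dB/decade")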

  18. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    PubMed

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.

  19. Evaluative criteria for qualitative research in health care: controversies and recommendations.

    PubMed

    Cohen, Deborah J; Crabtree, Benjamin F

    2008-01-01

    We wanted to review and synthesize published criteria for good qualitative research and develop a cogent set of evaluative criteria. We identified published journal articles discussing criteria for rigorous research using standard search strategies then examined reference sections of relevant journal articles to identify books and book chapters on this topic. A cross-publication content analysis allowed us to identify criteria and understand the beliefs that shape them. Seven criteria for good qualitative research emerged: (1) carrying out ethical research; (2) importance of the research; (3) clarity and coherence of the research report; (4) use of appropriate and rigorous methods; (5) importance of reflexivity or attending to researcher bias; (6) importance of establishing validity or credibility; and (7) importance of verification or reliability. General agreement was observed across publications on the first 4 quality dimensions. On the last 3, important divergent perspectives were observed in how these criteria should be applied to qualitative research, with differences based on the paradigm embraced by the authors. Qualitative research is not a unified field. Most manuscript and grant reviewers are not qualitative experts and are likely to embrace a generic set of criteria rather than those relevant to the particular qualitative approach proposed or reported. Reviewers and researchers need to be aware of this tendency and educate health care researchers about the criteria appropriate for evaluating qualitative research from within the theoretical and methodological framework from which it emerges.

  20. Clinical studies in orthodontics--an overview of NIDR-sponsored clinical orthodontic studies in the US.

    PubMed

    Baumrind, S

    1998-11-01

    A number of clinical trials sponsored by the National Institutes of Health (NIH) use rigorous methods of data acquisition and analysis previously developed in fundamental biology and the physical sciences. The naive expectation that these trials would lead relatively rapidly to definitive answers concerning the therapeutic strategies and techniques under study is dispelled. This presentation focuses on delineating differences between the study of central tendencies and individual variation, more specifically on the strategy to study this variation: measure additional sources of variance within each patient at more timepoints and perhaps with greater precision. As rigorous orthodontic research is still in its infancy, the problem of defining the proper mix between prospective and retrospective trials is discussed. In view of the high costs of prospective clinical trials, many of the questions germane to orthodontics can be answered by well-conducted retrospective trials, assuming that properly randomized sampling procedures are employed. Definitive clinical trials are likely to require better theoretical constructs, better instrumentation, and better measures than now available. Reasons for concern are the restricted resources available and the fact that current mensurational approaches may not detect many of the individual differences. The task of constructing sharable databases and record bases stored in digital form and available either remotely from servers, or locally from CD-ROMs or optical disks, is crucial to the optimization of future investigations.

  1. Weblog patterns and human dynamics with decreasing interest

    NASA Astrophysics Data System (ADS)

    Guo, J.-L.; Fan, C.; Guo, Z.-H.

    2011-06-01

    In order to describe the phenomenon that people's interest in an activity is typically high at the beginning and then gradually decreases until it reaches a balance, a model describing the attenuation of interest is proposed, reflecting the fact that people's interest becomes more stable after a long time. We give a rigorous analysis of this model using non-homogeneous Poisson processes. Our analysis indicates that the inter-arrival time distribution mixes exponential and power-law features: it is a power law with an exponential cutoff. We then collect blogs from ScienceNet.cn and carry out an empirical study of the inter-arrival time distribution. The empirical results agree well with the theoretical analysis, obeying a power law with an exponential cutoff, that is, a special kind of Gamma distribution. These empirical results verify the model by providing evidence for a new class of phenomena in human dynamics. It can be concluded that, besides power-law distributions, there are other distributions in human dynamics; these findings demonstrate the variety of human behavioral dynamics.
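
    A power law with an exponential cutoff is exactly the shape of a Gamma density with shape parameter below one, p(t) ∝ t^(γ-1) e^(-t/θ); a sketch of the fitting step on synthetic data (SciPy assumed; not the authors' blog dataset):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      # Synthetic inter-arrival times: a Gamma law with shape < 1 behaves as a
      # power law t^(shape - 1) with an exponential cutoff at the scale theta.
      gaps = rng.gamma(shape=0.4, scale=3.0, size=20_000)

      shape, loc, scale = stats.gamma.fit(gaps, floc=0)   # fix location at zero
      print(f"fitted shape = {shape:.3f}, scale = {scale:.3f}")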

  2. Using Velocity Anisotropy to Analyze Magnetohydrodynamic Turbulence in Giant Molecular Clouds

    NASA Astrophysics Data System (ADS)

    Madrid, Alecio; Hernandez, Audra

    2018-01-01

    Structure function (SF) analysis is a strong tool for gauging the Alfvénic properties of magnetohydrodynamic (MHD) simulations, yet there is a lack of literature rigorously investigating its limitations in the context of radio spectroscopy. This study takes an in-depth approach to studying the limitations of SF analysis for analyzing MHD turbulence in giant molecular cloud (GMC) spectroscopy data. MHD turbulence plays a critical role in the structure and evolution of GMCs, as well as in the formation of the sub-structures known to spawn stellar progenitors. Existing methods of detection are neither economical nor robust (e.g. dust polarization), and nowhere is this more clear than in the theoretical-observational divide in the current literature. A significant limitation of GMC spectroscopy results from the large variation in methods used for extracting GMCs from survey data; thus, a robust method for studying MHD turbulence must correctly gauge physical properties regardless of the extraction method used. While SF analysis has demonstrated strong potential across a range of simulated conditions, this study finds significant concerns regarding its feasibility as a robust tool in GMC spectroscopy.
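
    As a concrete sketch of the quantity at stake, the second-order structure function SF2(ℓ) = ⟨|v(x+ℓ) - v(x)|^2⟩ can be computed along each axis of a gridded velocity-centroid map and compared between directions (synthetic anisotropic field below; illustrative only, not survey data):

      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(5)
      # Synthetic velocity map, smoothed more along x than y -> anisotropic.
      v = gaussian_filter(rng.normal(size=(256, 256)), sigma=(2, 8))  # (y, x)

      def sf2(field, lag, axis):
          # Second-order structure function at one lag along one axis
          # (np.roll wraps around, i.e. assumes periodic boundaries).
          dv = np.roll(field, -lag, axis=axis) - field
          return np.mean(dv ** 2)

      for lag in (1, 2, 4, 8, 16):
          print(f"lag {lag:2d}:  SF2_x = {sf2(v, lag, 1):.5f}   "
                f"SF2_y = {sf2(v, lag, 0):.5f}")
      # Unequal growth of SF2 with lag in the two directions quantifies the
      # anisotropy that MHD theory ties to the local magnetic field.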

  3. Children facing a family member's acute illness: a review of intervention studies.

    PubMed

    Spath, Mary L

    2007-07-01

    A review of psycho-educational intervention studies to benefit children adapting to a close (parent, sibling, or grandparent) family member's serious illness was conducted. To review the literature on studies addressing this topic, critique research methods, describe clinical outcomes, and make recommendations for future research efforts. Research citations from 1990 to 2005 from Medline, CINAHL, Health Source: Nursing/Academic Edition, PsycARTICLES, and PsycINFO databases were identified. Citations were reviewed and evaluated for sample, design, theoretical framework, intervention, threats to validity, and outcomes. Reviewed studies were limited to those that included statistical analysis to evaluate interventions and outcomes. Six studies were reviewed. Positive outcomes were reported for all of the interventional strategies used in the studies. Reviewed studies generally lacked a theoretical framework and a control group, were generally composed of small convenience samples, and primarily used non-tested investigator instruments. They were diverse in terms of intervention length and intensity, and measured short-term outcomes related to participant program satisfaction, rather than participant cognitive and behavioral change. The paucity of interventional studies and lack of systematic empirical precision to evaluate intervention effectiveness necessitates future studies that are methodologically rigorous.

  4. Theoretical and experimental investigations of coincidences in Poisson distributed pulse trains and spectral distortion caused by pulse pileup

    NASA Astrophysics Data System (ADS)

    Bristow, Quentin

    1990-03-01

    The occurrence rates of pulse strings, or sequences of pulses with interarrival times less than the resolving time of the pulse-height analysis system used to acquire spectra, are derived from theoretical considerations. Logic circuits were devised to make experimental measurements of multiple pulse string occurrence rates in the output from a scintillation detector over a wide range of count rates. Markov process theory was used to predict state transition rates in the logic circuits, enabling the experimental data to be checked rigorously for conformity with those predicted for a Poisson distribution. No fundamental discrepancies were observed. Monte Carlo simulations, incorporating criteria for pulse pileup inherent in the operation of modern analog to digital converters, were used to generate pileup spectra due to coincidences between two pulses (first order pileup) and three pulses (second order pileup) for different semi-Gaussian pulse shapes. Coincidences between pulses in a single channel produced a basic probability density function spectrum. The use of a flat spectrum showed the first order pileup distorted the spectrum to a linear ramp with a pileup tail. A correction algorithm was successfully applied to correct entire spectra (simulated and real) for first and second order pileups.

  5. Threshold for extinction and survival in stochastic tumor immune system

    NASA Astrophysics Data System (ADS)

    Li, Dongxi; Cheng, Fangjuan

    2017-10-01

    This paper investigates the stochastic character of tumor growth and extinction in the presence of the immune response of a host organism. First, a mathematical model describing the interaction and competition between the tumor cells and the immune system is established based on Michaelis-Menten enzyme kinetics. Then, threshold conditions for extinction, weak persistence, and stochastic persistence of tumor cells are derived by rigorous theoretical proofs. Finally, stochastic simulations are performed to substantiate and illustrate the conclusions. The modeling results are helpful for understanding the concept of immunoediting and for developing cancer immunotherapy; in addition, this simple theoretical model can provide new insight into the complexity of tumor growth.
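
    The flavor of such threshold results can be seen in a one-dimensional caricature, the stochastic logistic equation dX = X(r - X) dt + σX dW, for which extinction is known to occur almost surely when r < σ²/2; an Euler-Maruyama sketch (not the paper's Michaelis-Menten model):

      import numpy as np

      def final_state(r, sigma, x0=0.5, dt=1e-3, T=200.0, seed=6):
          # Euler-Maruyama for dX = X (r - X) dt + sigma X dW.
          rng = np.random.default_rng(seed)
          x = x0
          for _ in range(int(T / dt)):
              x += x * (r - x) * dt + sigma * x * np.sqrt(dt) * rng.normal()
              x = max(x, 0.0)          # population stays non-negative
          return x

      r = 0.5
      for sigma in (0.5, 1.5):         # sigma^2 / 2 = 0.125 vs 1.125
          print(f"sigma = {sigma}:  X(T) = {final_state(r, sigma):.4f}")
      # r > sigma^2/2: persistence near X ~ r;  r < sigma^2/2: decay to extinction.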

  6. Determination of effective mass of heavy hole from phonon-assisted excitonic luminescence spectra in ZnO

    NASA Astrophysics Data System (ADS)

    Shi, S. L.; Xu, S. J.

    2011-03-01

    Longitudinal optical (LO) phonon-assisted luminescence spectra of free excitons in high-quality ZnO crystal were investigated both experimentally and theoretically. By using the rigorous Segall-Mahan model based on the Green's function, good agreement between the experimental emission spectra involving one or two LO phonons and the theoretical spectra can be achieved when only one adjustable parameter (the effective mass of the heavy hole) is adopted. This leads to the determination of the heavy-hole effective masses mh⊥ = 0.8 m0 and mh∥ = 5.0 m0 in ZnO. The influence of the anisotropic effective masses of heavy holes on the phonon sidebands is also discussed.

  7. Polarization sensitivity testing of off-plane reflection gratings

    NASA Astrophysics Data System (ADS)

    Marlowe, Hannah; McEntaffer, Randal L.; DeRoo, Casey T.; Miles, Drew M.; Tutt, James H.; Laubis, Christian; Soltwisch, Victor

    2015-09-01

    Off-Plane reflection gratings were previously predicted to have different efficiencies when the incident light is polarized in the transverse-magnetic (TM) versus transverse-electric (TE) orientations with respect to the grating grooves. However, more recent theoretical calculations which rigorously account for finitely conducting, rather than perfectly conducting, grating materials no longer predict significant polarization sensitivity. We present the first empirical results for radially ruled, laminar groove profile gratings in the off-plane mount which demonstrate no difference in TM versus TE efficiency across our entire 300-1500 eV bandpass. These measurements together with the recent theoretical results confirm that grazing incidence off-plane reflection gratings using real, not perfectly conducting, materials are not polarization sensitive.

  8. Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.

    PubMed

    Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P

    2018-03-03

    Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. Monitoring muscle optical scattering properties during rigor mortis

    NASA Astrophysics Data System (ADS)

    Xia, J.; Ranasinghesagara, J.; Ku, C. W.; Yao, G.

    2007-09-01

    The sarcomere is the fundamental functional unit for force generation in skeletal muscle. Sarcomere structure is also an important factor affecting the eating quality of muscle food, i.e., meat. The sarcomere structure is altered significantly during rigor mortis, the critical stage in transforming muscle to meat. In this paper, we investigated optical scattering changes during the rigor process in Sternomandibularis muscles. The measured optical scattering parameters were analyzed along with simultaneously measured passive tension, pH value, and histology. We found that the temporal changes of optical scattering, passive tension, pH value, and fiber microstructure were closely correlated during the rigor process. These results suggest that sarcomere structure changes during rigor mortis can be monitored and characterized by optical scattering, which may find practical applications in predicting meat quality.

  10. A Theory of Material Spike Formation in Flow Separation

    NASA Astrophysics Data System (ADS)

    Serra, Mattia; Haller, George

    2017-11-01

    We develop a frame-invariant theory of material spike formation during flow separation over a no-slip boundary in two-dimensional flows with arbitrary time dependence. This theory identifies both fixed and moving separation, is effective also over short-time intervals, and admits a rigorous instantaneous limit. Our theory is based on topological properties of material lines, combining objectively stretching- and rotation-based kinematic quantities. The separation profile identified here serves as the theoretical backbone for the material spike from its birth to its fully developed shape, and remains hidden to existing approaches. Finally, our theory can be used to rigorously explain the perception of off-wall separation in unsteady flows, and more importantly, provide the conditions under which such a perception is justified. We illustrate our results in several examples including steady, time-periodic and unsteady analytic velocity fields with flat and curved boundaries, and an experimental dataset.

  11. A case of instantaneous rigor?

    PubMed

    Pirch, J; Schulz, Y; Klintschar, M

    2013-09-01

    The question of whether instantaneous rigor mortis (IR), the hypothetical sudden stiffening of the muscles upon death, actually exists has been controversially debated over the last 150 years. While modern German forensic literature rejects this concept, the contemporary British literature is more willing to embrace it. We present the case of a young woman who suffered from diabetes and who was found dead in an upright standing position with back and shoulders leaned against a punchbag and a cupboard. Rigor mortis was fully established, and livor mortis was strong and consistent with the position in which the body was found. After autopsy and toxicological analysis, it was concluded that death most probably occurred due to a ketoacidotic coma, with markedly increased values of glucose and lactate in the cerebrospinal fluid as well as acetone in blood and urine. Whereas the position of the body is most unusual, a detailed analysis revealed that it is a stable position even without rigor mortis. Therefore, this case does not further support the controversial concept of IR.

  12. The logic of counterfactual analysis in case-study explanation.

    PubMed

    Mahoney, James; Barrenechea, Rodrigo

    2017-12-19

    In this paper, we develop a set-theoretic and possible worlds approach to counterfactual analysis in case-study explanation. Using this approach, we first consider four kinds of counterfactuals: necessary condition counterfactuals, SUIN condition counterfactuals, sufficient condition counterfactuals, and INUS condition counterfactuals. We explore the distinctive causal claims entailed in each, and conclude that necessary condition and SUIN condition counterfactuals are the most useful types for hypothesis assessment in case-study research. We then turn attention to the development of a rigorous understanding of the 'minimal-rewrite' rule, linking this rule to insights from set theory about the relative importance of necessary conditions. We show why, logically speaking, a comparative analysis of two necessary condition counterfactuals will tend to favour small events and contingent happenings. A third section then presents new tools for specifying the level of generality of the events in a counterfactual. We show why and how the goals of formulating empirically important versus empirically plausible counterfactuals stand in tension with one another. Finally, we use our framework to link counterfactual analysis to causal sequences, which in turn provides advantages for conducting counterfactual projections. © London School of Economics and Political Science 2017.

  13. Automated inference procedure for the determination of cell growth parameters

    NASA Astrophysics Data System (ADS)

    Harris, Edouard A.; Koh, Eun Jee; Moffat, Jason; McMillen, David R.

    2016-01-01

    The growth rate and carrying capacity of a cell population are key to the characterization of the population's viability and to the quantification of its responses to perturbations such as drug treatments. Accurate estimation of these parameters necessitates careful analysis. Here, we present a rigorous mathematical approach for the robust analysis of cell count data, in which all the experimental stages of the cell counting process are investigated in detail with the machinery of Bayesian probability theory. We advance a flexible theoretical framework that permits accurate estimates of the growth parameters of cell populations and of the logical correlations between them. Moreover, our approach naturally produces an objective metric of avoidable experimental error, which may be tracked over time in a laboratory to detect instrumentation failures or lapses in protocol. We apply our method to the analysis of cell count data in the context of a logistic growth model by means of a user-friendly computer program that automates this analysis, and present some samples of its output. Finally, we note that a traditional least squares fit can provide misleading estimates of parameter values, because it ignores available information with regard to the way in which the data have actually been collected.
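
    As a point of contrast with the Bayesian procedure described above, the following minimal sketch fits the same logistic growth model by plain least squares (scipy's curve_fit); the synthetic counts and starting values are illustrative assumptions, and the abstract's caveat about least squares applies.

    # Minimal sketch: fitting a logistic growth model to cell counts by
    # plain least squares (for contrast with the paper's Bayesian method).
    # Data and starting values are synthetic/illustrative.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, n0, r, K):
        # n(t) = K / (1 + (K/n0 - 1) * exp(-r t))
        return K / (1.0 + (K / n0 - 1.0) * np.exp(-r * t))

    t = np.linspace(0, 48, 13)                         # hours
    rng = np.random.default_rng(0)
    counts = rng.poisson(logistic(t, 1e4, 0.25, 1e6))  # counting noise

    (n0, r, K), _ = curve_fit(logistic, t, counts, p0=[1e4, 0.1, 1e6])
    print(f"growth rate r = {r:.3f} /h, carrying capacity K = {K:.3g}")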

  14. Surface conservation laws at microscopically diffuse interfaces.

    PubMed

    Chu, Kevin T; Bazant, Martin Z

    2007-11-01

    In studies of interfaces with dynamic chemical composition, bulk and interfacial quantities are often coupled via surface conservation laws of excess surface quantities. While this approach is easily justified for microscopically sharp interfaces, its applicability in the context of microscopically diffuse interfaces is less theoretically well-established. Furthermore, surface conservation laws (and interfacial models in general) are often derived phenomenologically rather than systematically. In this article, we first provide a mathematically rigorous justification for surface conservation laws at diffuse interfaces based on an asymptotic analysis of transport processes in the boundary layer and derive general formulae for the surface and normal fluxes that appear in surface conservation laws. Next, we use nonequilibrium thermodynamics to formulate surface conservation laws in terms of chemical potentials and provide a method for systematically deriving the structure of the interfacial layer. Finally, we derive surface conservation laws for a few examples from diffusive and electrochemical transport.

  15. Proper motion and secular variations of Keplerian orbital elements

    NASA Astrophysics Data System (ADS)

    Butkevich, Alexey G.

    2018-05-01

    High-precision observations require accurate modelling of secular changes in the orbital elements in order to extrapolate measurements over long time intervals, and to detect deviation from pure Keplerian motion caused, for example, by other bodies or relativistic effects. We consider the evolution of the Keplerian elements resulting from the gradual change of the apparent orbit orientation due to proper motion. We present rigorous formulae for the transformation of the orbit inclination, longitude of the ascending node and argument of the pericenter from one epoch to another, assuming uniform stellar motion and taking radial velocity into account. An approximate treatment, accurate to the second-order terms in time, is also given. The proper motion effects may be significant for long-period transiting planets. These theoretical results are applicable to the modelling of planetary transits and precise Doppler measurements as well as analysis of pulsar and eclipsing binary timing observations.
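
    A minimal numerical sketch of the underlying geometric idea: proper motion slowly rotates the local sky frame, so the orbit normal, and hence the inclination and node, drift with time. The frame convention and small-rotation treatment below are assumptions of this sketch, not the paper's rigorous formulae (which also account for radial velocity).

    import numpy as np

    def orbit_normal(inc, node):
        # Unit normal of the orbit in a local (east, north, observer) triad
        # (a convention assumed for this sketch).
        return np.array([np.sin(inc) * np.sin(node),
                         np.sin(inc) * np.cos(node),
                         np.cos(inc)])

    def rotate(v, axis, angle):
        # Rodrigues rotation of vector v about a unit axis.
        axis = axis / np.linalg.norm(axis)
        return (v * np.cos(angle) + np.cross(axis, v) * np.sin(angle)
                + axis * np.dot(axis, v) * (1.0 - np.cos(angle)))

    inc, node = np.radians(60.0), np.radians(40.0)
    mu = np.radians(0.5 / 3600.0)     # proper motion, 0.5 arcsec/yr eastward
    dt = 100.0                        # years

    # Eastward proper motion tilts the line of sight about the north axis.
    n_new = rotate(orbit_normal(inc, node), np.array([0.0, 1.0, 0.0]), mu * dt)
    inc_new = np.degrees(np.arccos(n_new[2]))
    node_new = np.degrees(np.arctan2(n_new[0], n_new[1]))
    print(f"i: 60 -> {inc_new:.4f} deg, node: 40 -> {node_new:.4f} deg")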

  16. Theoretical and experimental analysis of the structural pattern responsible for the iridescence of Morpho butterflies.

    PubMed

    Siddique, Radwanul Hasan; Diewald, Silvia; Leuthold, Juerg; Hölscher, Hendrik

    2013-06-17

Morpho butterflies are well known for their iridescence, which originates from nanostructures in the scales of their wings. These optically active structures integrate three design principles leading to wide-angle reflection: alternating lamellae layers, a "Christmas tree"-like shape, and offsets between neighboring ridges. We study their individual effects rigorously by 2D FEM simulations of the nanostructures of the Morpho sulkowskyi butterfly and show how the reflection spectrum can be controlled by the design of the nanostructures. The width of the spectrum is broad (≈ 90 nm) for alternating lamellae layers (or "branches") of the structure, while the "Christmas tree" pattern together with a height offset between neighboring ridges reduces the directionality of the reflectance. Furthermore, we fabricated the simulated structures by e-beam lithography. The resulting samples mimicked all important optical features of the original Morpho butterfly scales and featured the intense blue iridescence with a wide angular range of reflection.

  17. SCOTCH: Secure Counting Of encrypTed genomiC data using a Hybrid approach.

    PubMed

    Chenghong, Wang; Jiang, Yichen; Mohammed, Noman; Chen, Feng; Jiang, Xiaoqian; Al Aziz, Md Momin; Sadat, Md Nazmus; Wang, Shuang

    2017-01-01

As genomic data are usually large in scale and highly sensitive, it is essential to enable both efficient and secure analysis, by which the data owner can securely delegate both computation and storage to an untrusted public cloud. Counting queries over genotypes are a basic function for many downstream applications in biomedical research (e.g., computing allele frequencies, calculating chi-squared statistics, etc.). Previous solutions show promise for secure counting of outsourced data, but efficiency remains a major limitation for real-world applications. In this paper, we propose a novel hybrid solution that combines a rigorous theoretical model (homomorphic encryption) with the latest hardware-based infrastructure (i.e., Software Guard Extensions) to speed up the computation while preserving the privacy of both data owners and data users. Our results demonstrate the efficiency of the approach using real data from the Personal Genome Project.

  18. Network-based stochastic semisupervised learning.

    PubMed

    Silva, Thiago Christiano; Zhao, Liang

    2012-03-01

    Semisupervised learning is a machine learning approach that is able to employ both labeled and unlabeled samples in the training process. In this paper, we propose a semisupervised data classification model based on a combined random-preferential walk of particles in a network (graph) constructed from the input dataset. The particles of the same class cooperate among themselves, while the particles of different classes compete with each other to propagate class labels to the whole network. A rigorous model definition is provided via a nonlinear stochastic dynamical system and a mathematical analysis of its behavior is carried out. A numerical validation presented in this paper confirms the theoretical predictions. An interesting feature brought by the competitive-cooperative mechanism is that the proposed model can achieve good classification rates while exhibiting low computational complexity order in comparison to other network-based semisupervised algorithms. Computer simulations conducted on synthetic and real-world datasets reveal the effectiveness of the model.
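
    For orientation, a minimal network-based semisupervised baseline is sketched below using scikit-learn's label spreading on a kNN graph; it illustrates the general graph-propagation idea only and is not the authors' competitive-cooperative particle model.

    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.semi_supervised import LabelSpreading

    X, y = make_moons(n_samples=300, noise=0.1, random_state=0)
    rng = np.random.default_rng(0)
    y_train = np.full_like(y, -1)            # -1 marks unlabeled samples
    for c in (0, 1):                         # keep only 5 labels per class
        y_train[rng.choice(np.where(y == c)[0], 5, replace=False)] = c

    model = LabelSpreading(kernel='knn', n_neighbors=7).fit(X, y_train)
    print("transductive accuracy:", (model.transduction_ == y).mean())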

  19. Solar energy enhancement using down-converting particles: A rigorous approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abrams, Ze’ev R.; Niv, Avi; Zhang, Xiang

    2011-06-01

The efficiency of a single band-gap solar cell is specified by the Shockley-Queisser limit, which defines the maximal output power as a function of the solar cell's band-gap. One way to overcome this limit is by using a down-conversion process whereupon a high-energy photon is split into two lower-energy photons, thereby increasing the current of the cell. Here, we provide a full analysis of the possible efficiency increase when placing a down-converting material on top of a pre-existing solar cell. We show that a total efficiency improvement of 7% is possible for a perfectly efficient down-converting material. Our analysis covers both lossless and lossy theoretical limits, as well as a thermodynamic evaluation. Finally, we describe the advantages of nanoparticles as a possible choice for a down-converting material.
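
    A toy estimate of the photon-splitting current gain is sketched below; it approximates the solar spectrum by a 5778 K blackbody and ignores the detailed-balance and geometric losses treated in the paper, so it yields a loose upper bound well above the realistic 7% figure.

    import numpy as np

    kB, T = 1.381e-23, 5778.0                    # Boltzmann constant, sun temp
    Eg = 1.12 * 1.602e-19                        # Si band gap [J]

    E = np.linspace(0.1, 8.0, 20000) * 1.602e-19
    phi = E**2 / np.expm1(E / (kB * T))          # blackbody photon flux density
                                                 # (prefactors cancel in ratios)
    def nphot(lo, hi=np.inf):
        m = (E >= lo) & (E < hi)
        return np.trapz(phi[m], E[m])

    n_bare = nphot(Eg)                            # photons a bare cell can use
    n_dc = nphot(Eg, 2 * Eg) + 2 * nphot(2 * Eg)  # split photons above 2*Eg
    print("ideal photon-splitting gain: %.0f%%" % (100 * (n_dc / n_bare - 1)))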

  20. Unified beam splitter of fused silica grating under the second Bragg incidence.

    PubMed

    Sun, Zhumei; Zhou, Changhe; Cao, Hongchao; Wu, Jun

    2015-11-01

A unified design for a 1×2 beam splitter based on a dielectric rectangular transmission grating under the second Bragg incidence is theoretically investigated for TE- and TM-polarized light. Empirical equations for the relative grating parameters (the ratio of the absolute parameters to the incidence wavelength) of this design are also obtained with the simplified modal method (SMM). The influences of the polarization of the incident light and of the relative grating parameters on the performance of the beam splitter are thoroughly studied based on the SMM and rigorous coupled-wave analysis. Two specific gratings are demonstrated with an even split and high diffraction efficiency (>94% for TE polarization and >97% for the TM counterpart). The unified profiles of the 1×2 beam splitter are independent of the incidence wavelength, since the refractive index of fused silica is roughly constant over a wide range of wavelengths, which should be promising for future applications.

  1. Fabrication of the polarization independent spectral beam combining grating

    NASA Astrophysics Data System (ADS)

    Liu, Quan; Jin, Yunxia; Wu, Jianhong; Guo, Peiliang

    2016-03-01

Owing to damage, thermal issues, and nonlinear optical effects, the output power of a single fiber laser is limited. Beam combining techniques are attractive solutions for achieving high-power, high-brightness fiber laser output. Spectral beam combining (SBC) is a promising method to achieve high average power output without degrading beam quality. A polarization-independent spectral beam combining grating is one of the key elements in SBC. In this paper, the diffraction efficiency of the grating is investigated by rigorous coupled-wave analysis (RCWA). The theoretical -1st order diffraction efficiency of the grating is more than 95% from 1010 nm to 1080 nm for both TE and TM polarizations. The fabrication tolerance is analyzed. A polarization-independent spectral beam combining grating with a period of 1.04 μm has been fabricated by holographic lithography and ion beam etching, with parameters within the fabrication tolerance.
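
    For a quick sanity check of the geometry, the sketch below evaluates the Littrow condition sin(theta) = lambda / (2 * period) for the reported 1.04 μm period across the 1010-1080 nm band; the Littrow mounting is an assumption of this sketch, as the record does not state the mount.

    import numpy as np

    period = 1.04e-6                             # grating period [m]
    for lam in (1010e-9, 1040e-9, 1080e-9):
        theta = np.degrees(np.arcsin(lam / (2 * period)))
        print(f"{lam * 1e9:.0f} nm -> Littrow angle {theta:.1f} deg")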

  2. Standard representation and unified stability analysis for dynamic artificial neural network models.

    PubMed

    Kim, Kwang-Ki K; Patrón, Ernesto Ríos; Braatz, Richard D

    2018-02-01

An overview is provided of dynamic artificial neural network models (DANNs) for nonlinear dynamical system identification and control problems, and convex stability conditions are proposed that are less conservative than past results. The three most popular classes of dynamic artificial neural network models are described, with their mathematical representations and architectures, followed by transformations based on their block diagrams that are convenient for stability and performance analyses. Classes of nonlinear dynamical systems that are universally approximated by such models are characterized, including rigorous upper bounds on the approximation errors. A unified framework and linear matrix inequality-based stability conditions are described for different classes of dynamic artificial neural network models that take additional information into account, such as local slope restrictions and whether the nonlinearities within the DANNs are odd. A theoretical example demonstrates the reduced conservatism obtained with the proposed conditions. Copyright © 2017. Published by Elsevier Ltd.
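
    A minimal LMI feasibility check in the spirit of the paper is sketched below for the linear part of a model: find P > 0 with A'P + PA < 0 via cvxpy. This basic Lyapunov condition is illustrative only; the paper's conditions additionally exploit slope restrictions and oddness of the nonlinearities.

    import numpy as np
    import cvxpy as cp

    A = np.array([[-1.0, 0.5],
                  [-0.3, -2.0]])                 # illustrative system matrix
    n = A.shape[0]
    P = cp.Variable((n, n), symmetric=True)
    eps = 1e-6
    constraints = [P >> eps * np.eye(n),                  # P positive definite
                   A.T @ P + P @ A << -eps * np.eye(n)]   # Lyapunov decrease
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve(solver=cp.SCS)
    print("LMI feasible (stable):", prob.status == cp.OPTIMAL)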

  3. Computational Approaches to the Chemical Equilibrium Constant in Protein-ligand Binding.

    PubMed

    Montalvo-Acosta, Joel José; Cecchini, Marco

    2016-12-01

    The physiological role played by protein-ligand recognition has motivated the development of several computational approaches to the ligand binding affinity. Some of them, termed rigorous, have a strong theoretical foundation but involve too much computation to be generally useful. Some others alleviate the computational burden by introducing strong approximations and/or empirical calibrations, which also limit their general use. Most importantly, there is no straightforward correlation between the predictive power and the level of approximation introduced. Here, we present a general framework for the quantitative interpretation of protein-ligand binding based on statistical mechanics. Within this framework, we re-derive self-consistently the fundamental equations of some popular approaches to the binding constant and pinpoint the inherent approximations. Our analysis represents a first step towards the development of variants with optimum accuracy/efficiency ratio for each stage of the drug discovery pipeline. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Analysis of localized surface plasmon resonances in gold nanoparticles surrounded by copper oxides

    NASA Astrophysics Data System (ADS)

    Stamatelatos, A.; Sousanis, A.; Chronis, A. G.; Sigalas, M. M.; Grammatikopoulos, S.; Poulopoulos, P.

    2018-02-01

Au-doped Cu thin films are produced by co-deposition of Au and Cu via radio-frequency magnetron sputtering in a vacuum chamber with a base pressure of 1 × 10-7 mbar. After post-annealing in a furnace in air, one may obtain either Au-Cu2O or Au-CuO nanocomposite thin films. The presence of Au does not have any considerable influence on the position of the optical band gap of the oxides. Only the Au-CuO system shows well-formed localized surface plasmon resonances with Gaussian shape. We systematically study the plasmonic behavior of the nanocomposites as a function of the gold concentration, annealing time, and film thickness. The intensity of the resonances, their position, and their width are strongly affected by all these parameters. The experimental results are compared with rigorous theoretical calculations, and the similarities and differences between experiment and theory are discussed.

  5. Imaging the Localized Plasmon Resonance Modes in Graphene Nanoribbons

    DOE PAGES

    Hu, F.; Luan, Y.; Fei, Z.; ...

    2017-08-14

Here, we report a nano-infrared (IR) imaging study of the localized plasmon resonance modes of graphene nanoribbons (GNRs) using a scattering-type scanning near-field optical microscope (s-SNOM). By comparing the imaging data of GNRs aligned parallel and perpendicular to the in-plane component of the excitation laser field, we observed symmetric and asymmetric plasmonic interference fringes, respectively. Theoretical analysis indicates that the asymmetric fringes are formed due to the interplay between the localized surface plasmon resonance (SPR) mode excited by the GNRs and the propagating surface plasmon polariton (SPP) mode launched by the s-SNOM tip. With rigorous simulations, we reproduce the observed fringe patterns and quantitatively address the role of the s-SNOM tip in both the SPR and SPP modes. Moreover, we have seen real-space signatures of both the dipole and higher-order SPR modes by varying the ribbon width.

  6. SCOTCH: Secure Counting Of encrypTed genomiC data using a Hybrid approach

    PubMed Central

    Chenghong, Wang; Jiang, Yichen; Mohammed, Noman; Chen, Feng; Jiang, Xiaoqian; Al Aziz, Md Momin; Sadat, Md Nazmus; Wang, Shuang

    2017-01-01

As genomic data are usually large in scale and highly sensitive, it is essential to enable both efficient and secure analysis, by which the data owner can securely delegate both computation and storage to an untrusted public cloud. Counting queries over genotypes are a basic function for many downstream applications in biomedical research (e.g., computing allele frequencies, calculating chi-squared statistics, etc.). Previous solutions show promise for secure counting of outsourced data, but efficiency remains a major limitation for real-world applications. In this paper, we propose a novel hybrid solution that combines a rigorous theoretical model (homomorphic encryption) with the latest hardware-based infrastructure (i.e., Software Guard Extensions) to speed up the computation while preserving the privacy of both data owners and data users. Our results demonstrate the efficiency of the approach using real data from the Personal Genome Project. PMID:29854245

  7. Modeling and empirical characterization of the polarization response of off-plane reflection gratings.

    PubMed

    Marlowe, Hannah; McEntaffer, Randall L; Tutt, James H; DeRoo, Casey T; Miles, Drew M; Goray, Leonid I; Soltwisch, Victor; Scholze, Frank; Herrero, Analia Fernandez; Laubis, Christian

    2016-07-20

    Off-plane reflection gratings were previously predicted to have different efficiencies when the incident light is polarized in the transverse-magnetic (TM) versus transverse-electric (TE) orientations with respect to the grating grooves. However, more recent theoretical calculations which rigorously account for finitely conducting, rather than perfectly conducting, grating materials no longer predict significant polarization sensitivity. We present the first empirical results for radially ruled, laminar groove profile gratings in the off-plane mount, which demonstrate no difference in TM versus TE efficiency across our entire 300-1500 eV bandpass. These measurements together with the recent theoretical results confirm that grazing incidence off-plane reflection gratings using real, not perfectly conducting, materials are not polarization sensitive.

  8. Response to Ridgeway, Dunston, and Qian: On Methodological Rigor: Has Rigor Mortis Set In?

    ERIC Educational Resources Information Center

    Baldwin, R. Scott; Vaughn, Sharon

    1993-01-01

    Responds to an article in the same issue of the journal presenting a meta-analysis of reading research. Expresses concern that the authors' conclusions will promote a slavish adherence to a methodology and a rigidity of thought that reading researchers can ill afford. (RS)

  9. An Assessment of Cost Improvements in the NASA COTS - CRS Program and Implications for Future NASA Missions

    NASA Technical Reports Server (NTRS)

    Zapata, Edgar

    2017-01-01

    This review brings rigorous life cycle cost (LCC) analysis into discussions about COTS program costs. We gather publicly available cost data, review the data for credibility, check for consistency among sources, and rigorously define and analyze specific cost metrics.

  10. Systemic Planning: An Annotated Bibliography and Literature Guide. Exchange Bibliography No. 91.

    ERIC Educational Resources Information Center

    Catanese, Anthony James

    Systemic planning is an operational approach to using scientific rigor and qualitative judgment in a complementary manner. It integrates rigorous techniques and methods from systems analysis, cybernetics, decision theory, and work programing. The annotated reference sources in this bibliography include those works that have been most influential…

  11. Are computational models of any use to psychiatry?

    PubMed

    Huys, Quentin J M; Moutoussis, Michael; Williams, Jonathan

    2011-08-01

    Mathematically rigorous descriptions of key hypotheses and theories are becoming more common in neuroscience and are beginning to be applied to psychiatry. In this article two fictional characters, Dr. Strong and Mr. Micawber, debate the use of such computational models (CMs) in psychiatry. We present four fundamental challenges to the use of CMs in psychiatry: (a) the applicability of mathematical approaches to core concepts in psychiatry such as subjective experiences, conflict and suffering; (b) whether psychiatry is mature enough to allow informative modelling; (c) whether theoretical techniques are powerful enough to approach psychiatric problems; and (d) the issue of communicating clinical concepts to theoreticians and vice versa. We argue that CMs have yet to influence psychiatric practice, but that they help psychiatric research in two fundamental ways: (a) to build better theories integrating psychiatry with neuroscience; and (b) to enforce explicit, global and efficient testing of hypotheses through more powerful analytical methods. CMs allow the complexity of a hypothesis to be rigorously weighed against the complexity of the data. The paper concludes with a discussion of the path ahead. It points to stumbling blocks, like the poor communication between theoretical and medical communities. But it also identifies areas in which the contributions of CMs will likely be pivotal, like an understanding of social influences in psychiatry, and of the co-morbidity structure of psychiatric diseases. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Instrument Selection for Randomized Controlled Trials Why This and Not That?

    PubMed Central

    Records, Kathie; Keller, Colleen; Ainsworth, Barbara; Permana, Paska

    2011-01-01

    A fundamental linchpin for obtaining rigorous findings in quantitative research involves the selection of survey instruments. Psychometric recommendations are available for the processes for scale development and testing and guidance for selection of established scales. These processes are necessary to address the validity link between the phenomena under investigation, the empirical measures and, ultimately, the theoretical ties between these and the world views of the participants. Detailed information is most often provided about study design and protocols, but far less frequently is a detailed theoretical explanation provided for why specific instruments are chosen. Guidance to inform choices is often difficult to find when scales are needed for specific cultural, ethnic, or racial groups. This paper details the rationale underlying instrument selection for measurement of the major processes (intervention, mediator and moderator variables, outcome variables) in an ongoing study of postpartum Latinas, Madres para la Salud [Mothers for Health]. The rationale underpinning our choices includes a discussion of alternatives, when appropriate. These exemplars may provide direction for other intervention researchers who are working with specific cultural, racial, or ethnic groups or for other investigators who are seeking to select the ‘best’ instrument. Thoughtful consideration of measurement and articulation of the rationale underlying our choices facilitates the maintenance of rigor within the study design and improves our ability to assess study outcomes. PMID:21986392

  13. A biomechanical model for fibril recruitment: Evaluation in tendons and arteries.

    PubMed

    Bevan, Tim; Merabet, Nadege; Hornsby, Jack; Watton, Paul N; Thompson, Mark S

    2018-06-06

Simulations of soft tissue mechanobiological behaviour are increasingly important for clinical prediction of aneurysm, tendinopathy and other disorders. Mechanical behaviour at low stretches is governed by fibril straightening, transitioning into load-bearing at the recruitment stretch, resulting in a tissue stiffening effect. Previous investigations have suggested theoretical relationships between stress-stretch measurements and the recruitment probability density function (PDF) but have neither derived these rigorously nor evaluated them experimentally. Other work has proposed image-based methods for measurement of recruitment but made use of arbitrary fibril critical straightness parameters. The aim of this work was to provide a sound theoretical basis for estimating the recruitment PDF from stress-stretch measurements and to evaluate this relationship using image-based methods, clearly motivating the choice of the fibril critical straightness parameter in rat tail tendon and porcine artery. Rigorous derivation showed that the recruitment PDF may be estimated from the second stretch derivative of the first Piola-Kirchhoff tissue stress. Image-based fibril recruitment identified the fibril straightness parameter that maximised Pearson correlation coefficients (PCC) with the estimated PDFs. Using these critical straightness parameters, the new method for estimating the recruitment PDF showed a PCC with image-based measures of 0.915 for tendons and 0.933 for arteries. This method may be used for accurate estimation of the fibril recruitment PDF in mechanobiological simulation, where fibril-level mechanical parameters are important for predicting cell behaviour. Copyright © 2018 Elsevier Ltd. All rights reserved.
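
    The paper's central relation can be sketched numerically: for linear-elastic fibrils recruited with density rho(lr), the stress is P(l) = E * integral of rho(lr) * (l - lr) dlr, so rho(l) = (1/E) * d^2P/dl^2. The synthetic Gaussian recruitment density and modulus below are illustrative assumptions, not data from the study.

    import numpy as np

    E_mod = 100.0                          # fibril modulus (arbitrary units)
    lam = np.linspace(1.00, 1.10, 2001)
    dl = lam[1] - lam[0]
    mu, sig = 1.04, 0.01                   # assumed Gaussian recruitment PDF
    rho = np.exp(-0.5 * ((lam - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

    # Forward model: P(l) = E * integral of rho(lr) * (l - lr) dlr.
    P = np.array([E_mod * np.sum(rho[:i] * (lam[i] - lam[:i])) * dl
                  for i in range(lam.size)])

    # Inverse step from the paper: rho = (1/E) * d2P/dl2.
    rho_est = np.gradient(np.gradient(P, dl), dl) / E_mod
    print("max abs reconstruction error:", np.abs(rho_est - rho).max())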

  14. What Do We Mean By Sensitivity Analysis? The Need For A Comprehensive Characterization Of Sensitivity In Earth System Models

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Gupta, H. V.

    2014-12-01

    Sensitivity analysis (SA) is an important paradigm in the context of Earth System model development and application, and provides a powerful tool that serves several essential functions in modelling practice, including 1) Uncertainty Apportionment - attribution of total uncertainty to different uncertainty sources, 2) Assessment of Similarity - diagnostic testing and evaluation of similarities between the functioning of the model and the real system, 3) Factor and Model Reduction - identification of non-influential factors and/or insensitive components of model structure, and 4) Factor Interdependence - investigation of the nature and strength of interactions between the factors, and the degree to which factors intensify, cancel, or compensate for the effects of each other. A variety of sensitivity analysis approaches have been proposed, each of which formally characterizes a different "intuitive" understanding of what is meant by the "sensitivity" of one or more model responses to its dependent factors (such as model parameters or forcings). These approaches are based on different philosophies and theoretical definitions of sensitivity, and range from simple local derivatives and one-factor-at-a-time procedures to rigorous variance-based (Sobol-type) approaches. In general, each approach focuses on, and identifies, different features and properties of the model response and may therefore lead to different (even conflicting) conclusions about the underlying sensitivity. This presentation revisits the theoretical basis for sensitivity analysis, and critically evaluates existing approaches so as to demonstrate their flaws and shortcomings. With this background, we discuss several important properties of response surfaces that are associated with the understanding and interpretation of sensitivity. Finally, a new approach towards global sensitivity assessment is developed that is consistent with important properties of Earth System model response surfaces.
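
    As a concrete instance of the variance-based (Sobol-type) end of the spectrum discussed above, the sketch below estimates first-order indices for the standard Ishigami test function with a pick-freeze estimator; the sample size and test function are illustrative choices.

    import numpy as np

    def ishigami(X, a=7.0, b=0.1):
        x1, x2, x3 = X.T
        return np.sin(x1) + a * np.sin(x2) ** 2 + b * x3 ** 4 * np.sin(x1)

    rng = np.random.default_rng(1)
    N, d = 100000, 3
    A = rng.uniform(-np.pi, np.pi, (N, d))
    B = rng.uniform(-np.pi, np.pi, (N, d))
    fA, fB = ishigami(A), ishigami(B)
    var = np.var(np.concatenate([fA, fB]))

    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                      # freeze all factors but x_i
        S_i = np.mean(fB * (ishigami(ABi) - fA)) / var
        print(f"first-order index S_{i + 1} ~ {S_i:.2f}")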

  15. Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories

    PubMed Central

    Gallego, Sergi; Neipp, Cristian; Estepa, Luis A.; Ortuño, Manuel; Márquez, Andrés; Francés, Jorge; Pascual, Inmaculada; Beléndez, Augusto

    2012-01-01

There is no doubt that the concept of volume holography has led to an incredibly great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories and, in particular, the use of a photosensitive medium like a photopolymeric material to record information throughout its volume. In this work we analyze the applicability of Kogelnik's Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik's theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik's theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik's and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We demonstrate the agreement between the predictions of CW and RCW, and the validity of Kogelnik's theory only for gratings with spatial frequencies higher than 500 lines/mm for the usual values of the refractive index modulation obtained in photopolymers.
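
    Kogelnik's two-wave prediction for a lossless, unslanted phase transmission grating at Bragg incidence can be evaluated in a few lines, as sketched below; the photopolymer-like parameter values are illustrative assumptions.

    import numpy as np

    lam = 633e-9            # replay wavelength in air [m]
    n0 = 1.5                # average refractive index
    dn = 3e-3               # index modulation (illustrative)
    d = 50e-6               # grating thickness [m]

    for freq in (300e3, 500e3, 1000e3):              # spatial freq [lines/m]
        theta_B = np.arcsin(lam * freq / (2 * n0))   # Bragg angle in medium
        eta = np.sin(np.pi * dn * d / (lam * np.cos(theta_B))) ** 2
        print(f"{freq / 1e3:.0f} lines/mm: eta = {eta:.2f}")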

  16. High and low rigor temperature effects on sheep meat tenderness and ageing.

    PubMed

    Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J

    2002-02-01

Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with polyethylene cling film. One of the paired LTs was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C, and the other side was placed in a water bath at 35°C and achieved rigor at this temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values were obtained from a 1×1 cm cross-section. The shear force values of meat for 18 and 35°C rigor were similar at zero ageing, but as ageing progressed, the 18°C rigor meat aged faster and became more tender than meat that went into rigor at 35°C (P<0.001). The mean sarcomere length values of meat samples for 18 and 35°C rigor at each ageing time were significantly different (P<0.001), the samples at 35°C being shorter. When the short sarcomere length values and corresponding shear force values were removed for further data analysis, the shear force values for the 35°C rigor were still significantly greater. Thus the toughness of 35°C meat was not a consequence of muscle shortening and appears to be due to both a faster rate of tenderisation and the meat tenderising to a greater extent at the lower temperature. The cook loss at 35°C rigor (30.5%) was greater than that at 18°C rigor (28.4%) (P<0.01), and the colour Hunter L values were higher at 35°C (P<0.01) compared with 18°C, but there were no significant differences in a or b values.

  17. An ex post facto evaluation framework for place-based police interventions.

    PubMed

    Braga, Anthony A; Hureau, David M; Papachristos, Andrew V

    2011-12-01

    A small but growing body of research evidence suggests that place-based police interventions generate significant crime control gains. While place-based policing strategies have been adopted by a majority of U.S. police departments, very few agencies make a priori commitments to rigorous evaluations. Recent methodological developments were applied to conduct a rigorous ex post facto evaluation of the Boston Police Department's Safe Street Team (SST) hot spots policing program. A nonrandomized quasi-experimental design was used to evaluate the violent crime control benefits of the SST program at treated street segments and intersections relative to untreated street segments and intersections. Propensity score matching techniques were used to identify comparison places in Boston. Growth curve regression models were used to analyze violent crime trends at treatment places relative to control places. UNITS OF ANALYSIS: Using computerized mapping and database software, a micro-level place database of violent index crimes at all street segments and intersections in Boston was created. Yearly counts of violent index crimes between 2000 and 2009 at the treatment and comparison street segments and intersections served as the key outcome measure. The SST program was associated with a statistically significant reduction in violent index crimes at the treatment places relative to the comparison places without displacing crime into proximate areas. To overcome the challenges of evaluation in real-world settings, evaluators need to continuously develop innovative approaches that take advantage of new theoretical and methodological approaches.
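
    The evaluation logic (propensity scores, matching, outcome comparison) can be sketched on synthetic data as below; the covariates, model, and effect size are fabricated for illustration and do not reproduce the study's specification.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)
    n = 2000
    X = rng.normal(size=(n, 3))                    # place-level covariates
    p_treat = 1 / (1 + np.exp(-(X[:, 0] - 0.5 * X[:, 1])))
    T = rng.binomial(1, p_treat)                   # treatment indicator
    y = 5 + X.sum(axis=1) - 1.0 * T + rng.normal(size=n)  # toy outcome

    ps = LogisticRegression().fit(X, T).predict_proba(X)[:, 1]
    nn = NearestNeighbors(n_neighbors=1).fit(ps[T == 0].reshape(-1, 1))
    _, idx = nn.kneighbors(ps[T == 1].reshape(-1, 1))
    att = (y[T == 1] - y[T == 0][idx.ravel()]).mean()
    print(f"matched treatment-effect estimate: {att:.2f} (true -1.0)")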

  18. Social media and outbreaks of emerging infectious diseases: A systematic review of literature.

    PubMed

    Tang, Lu; Bie, Bijie; Park, Sung-Eun; Zhi, Degui

    2018-04-05

The public often turn to social media for information during emerging infectious disease (EID) outbreaks. This study identified the major approaches and assessed the rigor of published research articles on EIDs and social media. We searched 5 databases for published journal articles on EIDs and social media. We then evaluated these articles in terms of the EIDs studied, the social media examined, theoretical frameworks, methodologic approaches, and research findings. Thirty articles were included in the analysis (published between January 1, 2010, and March 1, 2016). The EIDs that received the most scholarly attention were H1N1 (or swine flu, n = 15), Ebola virus (n = 10), and H7N9 (or avian flu/bird flu, n = 2). Twitter was the most often studied social media platform (n = 17), followed by YouTube (n = 6), Facebook (n = 6), and blogs (n = 6). Three major approaches in this area of inquiry are identified: (1) assessment of the public's interest in and responses to EIDs, (2) examination of organizations' use of social media in communicating about EIDs, and (3) evaluation of the accuracy of EID-related medical information on social media. Although academic studies of EID communication on social media are on the rise, they still suffer from a lack of theorization and a need for more methodologic rigor. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  19. Evaluative Criteria for Qualitative Research in Health Care: Controversies and Recommendations

    PubMed Central

    Cohen, Deborah J.; Crabtree, Benjamin F.

    2008-01-01

    PURPOSE We wanted to review and synthesize published criteria for good qualitative research and develop a cogent set of evaluative criteria. METHODS We identified published journal articles discussing criteria for rigorous research using standard search strategies then examined reference sections of relevant journal articles to identify books and book chapters on this topic. A cross-publication content analysis allowed us to identify criteria and understand the beliefs that shape them. RESULTS Seven criteria for good qualitative research emerged: (1) carrying out ethical research; (2) importance of the research; (3) clarity and coherence of the research report; (4) use of appropriate and rigorous methods; (5) importance of reflexivity or attending to researcher bias; (6) importance of establishing validity or credibility; and (7) importance of verification or reliability. General agreement was observed across publications on the first 4 quality dimensions. On the last 3, important divergent perspectives were observed in how these criteria should be applied to qualitative research, with differences based on the paradigm embraced by the authors. CONCLUSION Qualitative research is not a unified field. Most manuscript and grant reviewers are not qualitative experts and are likely to embrace a generic set of criteria rather than those relevant to the particular qualitative approach proposed or reported. Reviewers and researchers need to be aware of this tendency and educate health care researchers about the criteria appropriate for evaluating qualitative research from within the theoretical and methodological framework from which it emerges. PMID:18626033

  20. Sub-Doppler spectroscopy of the trans-HOCO radical in the OH stretching mode.

    PubMed

    Chang, Chih-Hsuan; Buckingham, Grant T; Nesbitt, David J

    2013-12-19

Rovibrational spectroscopy of the fundamental OH stretching mode of the trans-HOCO radical has been studied via sub-Doppler high-resolution infrared laser absorption in a discharge slit-jet expansion. The trans-HOCO radical is formed by discharge dissociation of H2O to form OH, which then combines with CO and cools in the Ne expansion to a rotational temperature of 13.0(6) K. Rigorous assignment of both a-type and b-type spectral transitions is made possible by two-line combination differences from microwave studies, with full rovibrational analysis of the spectrum based on a Watson asymmetric top Hamiltonian. Additionally, fine structure splittings of each line due to electron spin are completely resolved, thus permitting all three ε(aa), ε(bb), ε(cc) spin-rotation constants to be experimentally determined in the vibrationally excited state. Furthermore, as both a- and b-type transitions for trans-HOCO are observed for the first time, the ratio of transition dipole moment projections along the a and b principal axes is determined to be μ(a)/μ(b) = 1.78(5), in close agreement with density functional theory predictions (B3LYP/6-311++g(3df,3pd), μ(a)/μ(b) = 1.85). Finally, we note the energetic possibility in the excited OH stretch state of predissociation dynamics (i.e., trans-HOCO → H + CO2), with the present sub-Doppler line widths providing a rigorous lower limit of 2.7 ns for the predissociation lifetime.

  1. "Everybody knows psychology is not a real science": Public perceptions of psychology and how we can improve our relationship with policymakers, the scientific community, and the general public.

    PubMed

    Ferguson, Christopher J

    2015-09-01

In a recent seminal article, Lilienfeld (2012) argued that psychological science is experiencing a public perception problem caused both by public misconceptions about psychology and by the psychological science community's failure to distinguish itself from pop psychology and questionable therapeutic practices. Lilienfeld's analysis is an important and cogent synopsis of the external problems that have limited psychological science's penetration into public knowledge. The current article expands upon this by examining internal problems, or problems within psychological science, that have potentially limited its impact with policymakers, other scientists, and the public. These problems range from the replication crisis and defensive reactions to it, to the overuse of politicized policy statements by professional advocacy groups such as the American Psychological Association (APA), to a continued overreliance on mechanistic models of human behavior. It is concluded that considerable problems arise from psychological science's tendency to overcommunicate mechanistic concepts based on weak and often unreplicated (or unreplicable) data that do not resonate with the everyday experiences of the general public or the rigor of other scholarly fields. It is argued that a way forward lies in improving the rigor and transparency of psychological science and in making theoretical innovations that better acknowledge the complexities of the human experience. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  2. Subject order-independent group ICA (SOI-GICA) for functional MRI data analysis.

    PubMed

    Zhang, Han; Zuo, Xi-Nian; Ma, Shuang-Ye; Zang, Yu-Feng; Milham, Michael P; Zhu, Chao-Zhe

    2010-07-15

Independent component analysis (ICA) is a data-driven approach to studying functional magnetic resonance imaging (fMRI) data. In particular, for group analysis of multiple subjects, temporal concatenation group ICA (TC-GICA) is widely used. However, due to usually limited computational capability, data reduction with principal component analysis (PCA; a standard preprocessing step of ICA decomposition) is difficult to achieve for a large dataset. To overcome this, TC-GICA employs multiple-stage PCA data reduction. Such multiple-stage PCA data reduction, however, leads to variable outputs for different subject concatenation orders. Consequently, the ICA algorithm uses the variable multiple-stage PCA outputs and generates variable decompositions. In this study, a rigorous theoretical analysis was conducted to prove the existence of such variability. Simulated and real fMRI experiments were used to demonstrate the subject-order-induced variability of TC-GICA results using multiple PCA data reductions. To solve this problem, we propose a new subject order-independent group ICA (SOI-GICA). Both simulated and real fMRI data experiments demonstrated the high robustness and accuracy of SOI-GICA results compared to those of traditional TC-GICA. Accordingly, we recommend SOI-GICA for group ICA-based fMRI studies, especially those with large datasets. Copyright 2010 Elsevier Inc. All rights reserved.
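
    For orientation, a single-stage temporal-concatenation group ICA is sketched below with scikit-learn; a one-stage PCA reduction of this kind avoids the concatenation-order dependence that the paper attributes to multi-stage reduction. Dimensions and data are illustrative.

    import numpy as np
    from sklearn.decomposition import PCA, FastICA

    rng = np.random.default_rng(0)
    n_subj, n_time, n_vox, n_comp = 5, 120, 400, 10
    data = [rng.normal(size=(n_time, n_vox)) for _ in range(n_subj)]

    X = np.vstack(data)                           # (n_subj * n_time, n_vox)
    X_red = PCA(n_components=n_comp).fit_transform(X.T).T  # reduce time dim
    maps = FastICA(n_components=n_comp, random_state=0).fit_transform(X_red.T)
    print("spatial component maps:", maps.shape)  # (n_vox, n_comp)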

  3. Volume phase holographic grating used for beams combination of RGB primary colors

    NASA Astrophysics Data System (ADS)

    Liu, Hui; Zhang, Xizhao; Tang, Minxue

    2013-12-01

Volume phase holographic gratings (VPHGs) have the characteristics of high diffraction efficiency, high signal-to-noise ratio, high wavelength and angular selectivity, low scattering, low absorption and low cost. They have been widely used in high-resolution spectrometers, wavelength division multiplexing and pulse compression. In this paper, a novel RGB primary colors beam combiner is proposed, which consists of a transmission VPHG and a reflection VPHG as its core components. The design idea of the element is described in detail. Based on the principle of the VPHG, the rigorous coupled-wave analysis (RCWA) and Kogelnik's coupled wave theory, the diffraction properties of the transmission and reflection VPHGs are studied theoretically. As an example, three primary colors at wavelengths of 632.8 nm, 532 nm and 476.5 nm are considered. Dichromated gelatin (DCG) is used as the holographic recording material. The grating parameters are determined by the Bragg conditions. The TE and TM wave diffraction efficiency, the wavelength selectivity and the angular selectivity of the transmission and reflection VPHGs are calculated and optimized by adjusting the amplitude of the index modulation (Δn) and the thickness of the gelatin layer (d), applying Kogelnik's coupled wave theory and the G-Solver software, respectively. The theoretical results provide guidance for further manufacture of the element.

  4. MRF energy minimization and beyond via dual decomposition.

    PubMed

    Komodakis, Nikos; Paragios, Nikos; Tziritas, Georgios

    2011-03-01

    This paper introduces a new rigorous theoretical framework to address discrete MRF-based optimization in computer vision. Such a framework exploits the powerful technique of Dual Decomposition. It is based on a projected subgradient scheme that attempts to solve an MRF optimization problem by first decomposing it into a set of appropriately chosen subproblems, and then combining their solutions in a principled way. In order to determine the limits of this method, we analyze the conditions that these subproblems have to satisfy and demonstrate the extreme generality and flexibility of such an approach. We thus show that by appropriately choosing what subproblems to use, one can design novel and very powerful MRF optimization algorithms. For instance, in this manner we are able to derive algorithms that: 1) generalize and extend state-of-the-art message-passing methods, 2) optimize very tight LP-relaxations to MRF optimization, and 3) take full advantage of the special structure that may exist in particular MRFs, allowing the use of efficient inference techniques such as, e.g., graph-cut-based methods. Theoretical analysis on the bounds related with the different algorithms derived from our framework and experimental results/comparisons using synthetic and real data for a variety of tasks in computer vision demonstrate the extreme potentials of our approach.
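
    The core mechanism, solving subproblems independently and coordinating them with a projected subgradient update on Lagrange multipliers, can be illustrated on a toy problem with two unary energies sharing one discrete variable; the energies below are made up and bear no relation to a real vision MRF.

    import numpy as np

    labels = np.arange(5)
    f1 = np.array([4.0, 2.0, 3.0, 6.0, 8.0])     # made-up unary energies
    f2 = np.array([5.0, 6.0, 1.5, 2.0, 7.0])

    lam = 0.0
    for it in range(1, 101):
        x1 = int(np.argmin(f1 + lam * labels))   # solve subproblem 1
        x2 = int(np.argmin(f2 - lam * labels))   # solve subproblem 2
        lam += (1.0 / it) * (x1 - x2)            # subgradient step on lambda
    print("labels:", x1, x2, "energy:", f1[x1] + f2[x2])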

  5. Management matters: the link between hospital organisation and quality of patient care

    PubMed Central

    West, E.

    2001-01-01

    Some hospital trusts and health authorities consistently outperform others on different dimensions of performance. Why? There is some evidence that "management matters", as well as the combined efforts of individual clinicians and teams. However, studies that have been conducted on the link between the organisation and management of services and quality of patient care can be criticised both theoretically and methodologically. A larger, and arguably more rigorous, body of work exists on the performance of firms in the private sector, often conducted within the disciplines of organisational behaviour or human resource management. Studies in these traditions have focused on the effects of decentralisation, participation, innovative work practices, and "complementarities" on outcome variables such as job satisfaction and performance. The aim of this paper is to identify a number of reviews and research traditions that might bring new ideas into future work on the determinants of hospital performance. Ideally, future research should be more theoretically informed and should use longitudinal rather than cross sectional research designs. The use of statistical methods such as multilevel modelling, which allow for the inclusion of variables at different levels of analysis, would enable estimation of the separate contribution that structure and process make to hospital outcomes. Key Words: hospital organisation; hospital performance; management; quality of care PMID:11239143

  6. Assessing collaborative computing: development of the Collaborative-Computing Observation Instrument (C-COI)

    NASA Astrophysics Data System (ADS)

    Israel, Maya; Wherfel, Quentin M.; Shehab, Saadeddine; Ramos, Evan A.; Metzger, Adam; Reese, George C.

    2016-07-01

    This paper describes the development, validation, and uses of the Collaborative Computing Observation Instrument (C-COI), a web-based analysis instrument that classifies individual and/or collaborative behaviors of students during computing problem-solving (e.g. coding, programming). The C-COI analyzes data gathered through video and audio screen recording software that captures students' computer screens as they program, and their conversations with their peers or adults. The instrument allows researchers to organize and quantify these data to track behavioral patterns that could be further analyzed for deeper understanding of persistence and/or collaborative interactions. The article provides a rationale for the C-COI including the development of a theoretical framework for measuring collaborative interactions in computer-mediated environments. This theoretical framework relied on the computer-supported collaborative learning literature related to adaptive help seeking, the joint problem-solving space in which collaborative computing occurs, and conversations related to outcomes and products of computational activities. Instrument development and validation also included ongoing advisory board feedback from experts in computer science, collaborative learning, and K-12 computing as well as classroom observations to test out the constructs in the C-COI. These processes resulted in an instrument with rigorous validation procedures and a high inter-rater reliability.

  7. Theoretical and experimental analysis of the electromechanical behavior of a compact spherical loudspeaker array for directivity control.

    PubMed

    Pasqual, Alexander Mattioli; Herzog, Philippe; Arruda, José Roberto de França

    2010-12-01

Sound directivity control is made possible by a compact array of independent loudspeakers operating over the same frequency range. The drivers are usually distributed over a sphere-like frame according to a Platonic solid geometry to obtain a highly symmetrical configuration. The radiation pattern of spherical loudspeaker arrays has been predicted from the surface velocity pattern by approximating the drivers' membranes as rigid vibrating spherical caps, although a rigorous assessment of this model has not been provided so far. Many aspects of compact array electromechanics remain unclear, such as the effects on acoustical performance of the drivers' interaction inside the array cavity, or the fact that voltages rather than velocities are controlled in practice. This work presents a detailed investigation of the electromechanical behavior of spherical loudspeaker arrays. Simulation results are shown to agree with laser vibrometer measurements and experimental sound power data obtained for a 12-driver spherical array prototype at low frequencies, whereas non-rigid body motion and the first cavity eigenfrequency yield a discrepancy between theoretical and experimental results at high frequencies. Finally, although the internal acoustic coupling affects the drivers' vibration in the low-frequency range, it does not play an important role in the radiated sound power.

  8. Institutions and national development in Latin America: a comparative study

    PubMed Central

    Portes, Alejandro; Smith, Lori D.

    2013-01-01

We review the theoretical and empirical literatures on the role of institutions in national development as a prelude to presenting a more rigorous and measurable definition of the concept and a methodology to study this relationship at the national and subnational levels. The existing research literature features conflicting definitions of the concept of "institutions" and empirical tests based mostly on reputational indices, with countries as units of analysis. The present study's methodology is based on a set of five strategic organizations studied comparatively in five Latin American countries. These include key federal agencies, public administrative organizations, and stock exchanges. Systematic analysis of the results shows a pattern of differences between economically oriented institutions and those entrusted with providing basic services to the general population. Consistent differences in institutional quality also emerge across countries, despite similar levels of economic development. Using the algebraic methods developed by Ragin, we test six hypotheses about the factors determining the developmental character of particular institutions. Implications of the results for theory and for the methodological practices of future studies in this field are discussed. PMID:26543407

  9. Mobility measurement by analysis of fluorescence photobleaching recovery kinetics.

    PubMed Central

    Axelrod, D; Koppel, D E; Schlessinger, J; Elson, E; Webb, W W

    1976-01-01

Fluorescence photobleaching recovery (FPR) denotes a method for measuring two-dimensional lateral mobility of fluorescent particles, for example, the motion of fluorescently labeled molecules in approximately 10 μm² regions of a single cell surface. A small spot on the fluorescent surface is photobleached by a brief exposure to an intense focused laser beam, and the subsequent recovery of the fluorescence is monitored by the same, but attenuated, laser beam. Recovery occurs by replenishment of intact fluorophore in the bleached spot by lateral transport from the surrounding surface. We present the theoretical basis and some practical guidelines for simple, rigorous analysis of FPR experiments. Information obtainable from FPR experiments includes: (a) identification of transport process type, i.e. the admixture of random diffusion and uniform directed flow; (b) determination of the absolute mobility coefficient, i.e. the diffusion constant and/or flow velocity; and (c) the fraction of total fluorophore which is mobile. To illustrate the experimental method and to verify the theory for diffusion, we describe some model experiments on aqueous solutions of rhodamine 6G. PMID:786399
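
    A simplified recovery-curve fit is sketched below using a single-exponential approximation in place of the full Axelrod series solution; the synthetic data, beam radius, and pre-bleach level are illustrative assumptions.

    import numpy as np
    from scipy.optimize import curve_fit

    def recovery(t, f0, f_inf, tau):
        # Fluorescence rising from post-bleach f0 toward plateau f_inf.
        return f_inf - (f_inf - f0) * np.exp(-t / tau)

    t = np.linspace(0, 30, 120)                  # seconds
    rng = np.random.default_rng(0)
    data = recovery(t, 0.3, 0.8, 5.0) + rng.normal(0, 0.01, t.size)

    (f0, f_inf, tau), _ = curve_fit(recovery, t, data, p0=[0.2, 0.9, 3.0])
    f_pre = 1.0                                  # pre-bleach level (assumed)
    mobile = (f_inf - f0) / (f_pre - f0)
    w = 2e-6                                     # beam radius [m] (assumed)
    D = w ** 2 / (4 * tau)                       # order-of-magnitude diffusivity
    print(f"tau = {tau:.2f} s, mobile fraction = {mobile:.2f}, D ~ {D:.1e} m^2/s")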

  10. Effect of etching parameters on antireflection properties of Si subwavelength grating structures for solar cell applications

    NASA Astrophysics Data System (ADS)

    Leem, J. W.; Song, Y. M.; Lee, Y. T.; Yu, J. S.

    2010-09-01

Silicon (Si) subwavelength grating (SWG) structures were fabricated on Si substrates by holographic lithography and a subsequent inductively coupled plasma (ICP) etching process using SiCl4 with or without Ar addition, for solar cell applications. To ensure good nanosized pattern transfer into the underlying Si layer, the etch selectivity of Si over the photoresist mask is optimized by varying the etching parameters, thus improving antireflection characteristics. For antireflection analysis of Si SWG surfaces, the optical reflectivity is measured experimentally and also calculated theoretically by rigorous coupled-wave analysis. The reflectance depends on the height, period, and shape of the two-dimensional periodic Si subwavelength structures, correlated with the ICP etching parameters. The optimized Si SWG structure exhibits a dramatic decrease in the optical reflection of the Si surface over a wide range of incident light angles (θi), i.e., less than 5% at wavelengths of 300-1100 nm, leading to good wide-angle antireflection characteristics (solar-weighted reflection of 1.7-4.9% at θi < 50°) for Si solar cells.

  11. Broadband and wide-angle distributed Bragg reflectors based on amorphous germanium films by glancing angle deposition.

    PubMed

    Leem, Jung Woo; Yu, Jae Su

    2012-08-27

We fabricated distributed Bragg reflectors (DBRs) consisting of alternating amorphous germanium (a-Ge) films of a single material, designed for a center wavelength (λc) of 1.33 μm, by glancing angle deposition. Their optical reflectance properties were investigated in the infrared wavelength region of 1-1.9 μm at incident light angles (θinc) of 8-70°, together with a theoretical analysis using a rigorous coupled-wave analysis simulation. The two alternating a-Ge films, deposited at incident vapor flux angles of 0 and 75°, served as the high and low refractive index materials, respectively. The a-Ge DBR with only 5 periods exhibited a normalized stop bandwidth (Δλ/λc) of ~24.1% while maintaining high reflectance (R) values of >99%. Even at a high θinc of 70°, the Δλ/λc was ~21.9%, with R values of >85%. The a-Ge DBR showed good uniformity over the area of a 2 inch Si wafer. The calculated reflectance results showed a similar tendency to the measured data.

  12. Effect of Exergames on Depression: A Systematic Review and Meta-Analysis.

    PubMed

    Li, Jinhui; Theng, Yin-Leng; Foo, Schubert

    2016-01-01

Depression is a major public health concern in contemporary society. In recent years, many studies have begun to investigate the potential benefits of exergames for depression. The current study aimed to provide a systematic review synthesizing the existing studies and to estimate the overall effect size of exergames in treating depression. A comprehensive literature search was conducted among major bibliographic databases in computer technology, psychology, and medical science. Key study characteristics of participants, interventions, and experiments were extracted in the systematic review. Studies using both independent groups and matched groups were included in the meta-analysis. The overall effect size (Hedges' g) was calculated, followed by subgroup analyses. Nine studies were included in the review, eight of which applied exergames on Nintendo's Wii or Wii Fit. A random-effects meta-analysis of eight studies yielded an overall significant effect size of g = 0.21. Demographic factors, depression severity, number of sessions, and game type were found to be significant moderators of effectiveness. The study has not only supported the positive effect of exergames on alleviating depression, but also provides theoretical and practical implications for health professionals and policy makers. More rigorous controlled experimental studies are needed in this new research field.
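
    The pooling step behind such a meta-analysis can be sketched as below: Hedges' g with the small-sample correction, then a DerSimonian-Laird random-effects summary; the five "studies" are fabricated numbers for illustration only.

    import numpy as np

    def hedges_g(m1, m2, s1, s2, n1, n2):
        sp = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
        J = 1 - 3 / (4 * (n1 + n2 - 2) - 1)      # small-sample correction
        g = J * (m1 - m2) / sp
        v = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))
        return g, v

    studies = [(0.8, 0.2, 1.0, 1.0, 20, 20), (0.5, 0.1, 1.1, 0.9, 35, 33),
               (0.3, 0.0, 1.0, 1.0, 50, 52), (0.9, 0.4, 1.2, 1.1, 15, 16),
               (0.4, 0.2, 0.9, 1.0, 40, 41)]
    g, v = map(np.array, zip(*[hedges_g(*s) for s in studies]))

    w = 1 / v                                    # fixed-effect weights
    Q = np.sum(w * (g - np.sum(w * g) / w.sum())**2)
    C = w.sum() - np.sum(w**2) / w.sum()
    tau2 = max(0.0, (Q - (len(g) - 1)) / C)      # DerSimonian-Laird tau^2
    w_re = 1 / (v + tau2)
    print("pooled Hedges' g = %.2f" % (np.sum(w_re * g) / w_re.sum()))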

  13. Enhancing the quality and credibility of qualitative analysis.

    PubMed

    Patton, M Q

    1999-12-01

    Varying philosophical and theoretical orientations to qualitative inquiry remind us that issues of quality and credibility intersect with audience and intended research purposes. This overview examines ways of enhancing the quality and credibility of qualitative analysis by dealing with three distinct but related inquiry concerns: rigorous techniques and methods for gathering and analyzing qualitative data, including attention to validity, reliability, and triangulation; the credibility, competence, and perceived trustworthiness of the qualitative researcher; and the philosophical beliefs of evaluation users about such paradigm-based preferences as objectivity versus subjectivity, truth versus perspective, and generalizations versus extrapolations. Although this overview examines some general approaches to issues of credibility and data quality in qualitative analysis, it is important to acknowledge that particular philosophical underpinnings, specific paradigms, and special purposes for qualitative inquiry will typically include additional or substitute criteria for assuring and judging quality, validity, and credibility. Moreover, the context for these considerations has evolved. In early literature on evaluation methods the debate between qualitative and quantitative methodologists was often strident. In recent years the debate has softened. A consensus has gradually emerged that the important challenge is to match appropriately the methods to empirical questions and issues, and not to universally advocate any single methodological approach for all problems.

  14. Space radiator simulation system analysis

    NASA Technical Reports Server (NTRS)

    Black, W. Z.; Wulff, W.

    1972-01-01

    A transient heat transfer analysis was carried out on a space radiator heat rejection system exposed to an arbitrarily prescribed combination of aerodynamic heating, solar, albedo, and planetary radiation. A rigorous analysis was carried out for the radiation panel and tubes lying in one plane and an approximate analysis was used to extend the rigorous analysis to the case of a curved panel. The analysis permits the consideration of both gaseous and liquid coolant fluids, including liquid metals, under prescribed, time dependent inlet conditions. The analysis provided a method for predicting: (1) transient and steady-state, two dimensional temperature profiles, (2) local and total heat rejection rates, (3) coolant flow pressure in the flow channel, and (4) total system weight and protection layer thickness.
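
    For orientation, the transient energy balance behind this kind of radiator analysis can be written, for a thin panel of thickness δ radiating from one face, in the generic textbook form below; this is not the report's exact formulation, which additionally couples the panel to the coolant tubes.

        \rho c_p \delta \frac{\partial T}{\partial t} = k \delta \left( \frac{\partial^2 T}{\partial x^2} + \frac{\partial^2 T}{\partial y^2} \right) + q_{\mathrm{ext}}(x, y, t) - \varepsilon \sigma T^4

    where q_ext collects the absorbed aerodynamic, solar, albedo, and planetary loads, and the εσT⁴ term is the grey-body emission that rejects heat.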

  15. Second-harmonic generation from a positive-negative index material heterostructure.

    PubMed

    Mattiucci, Nadia; D'Aguanno, Giuseppe; Bloemer, Mark J; Scalora, Michael

    2005-12-01

    Resonant cavities have been widely used in the past to enhance material nonlinear response. Traditional mirrors include metallic films and distributed Bragg reflectors. In this paper we propose negative index material mirrors as a third alternative. With the help of a rigorous Green function approach, we investigate second harmonic generation from single and coupled cavities, and theoretically prove that negative index material mirrors can raise the nonlinear conversion efficiency by at least four orders of magnitude compared to a bulk medium.

  16. Optical diffraction by ordered 2D arrays of silica microspheres

    NASA Astrophysics Data System (ADS)

    Shcherbakov, A. A.; Shavdina, O.; Tishchenko, A. V.; Veillas, C.; Verrier, I.; Dellea, O.; Jourlin, Y.

    2017-03-01

    The article presents experimental and theoretical studies of the angle-dependent diffraction properties of 2D monolayer arrays of silica microspheres. High-quality, large-area, defect-free monolayers of 1 μm diameter silica microspheres were deposited by the Langmuir-Blodgett technique under accurate optical control. The measured angular dependences of the zeroth-order and one of the first-order diffraction efficiencies of the deposited samples were simulated with the rigorous Generalized Source Method, taking into account particle size dispersion and lattice non-ideality.

  17. A Theoretically Driven Investigation of the Efficacy of an Immersive Interactive Avatar Rich Virtual Environment in Pre-deployment Nursing Knowledge and Teamwork Skills Training

    DTIC Science & Technology

    2013-05-01

    … pedagogy, and instructional quality. Measures-of-effectiveness data are minimal and often have not been gathered in a rigorous manner. To be clear … instructional pedagogy and instructional quality between the programs offered. Efficacy studies beyond student satisfaction scores have not been done in a …

  18. Consumer Opinion of Emergency/Assault Food Packet under Rigorous Field Conditions in a Cold Weather Environment

    DTIC Science & Technology

    1980-01-01

    Beverage and Pudding Bars: Although the beverage and pudding bars were designed to be eaten either dry, as packaged, or rehydrated, theoretically with hot water … pudding bars being rehydrated with hot water and no cases at all of beverage bars being so rehydrated, data will be presented only for the dry and … rehydrated with cold water, is shown in Table 6, which also shows the order in which the beverage and pudding bars were ranked by the survey respondents

  19. Self-consistent multidimensional electron kinetic model for inductively coupled plasma sources

    NASA Astrophysics Data System (ADS)

    Dai, Fa Foster

    Inductively coupled plasma (ICP) sources have received increasing interest in microelectronics fabrication and the lighting industry. In 2-D configuration space (r, z) and 2-D velocity domain (νθ, νz), a self-consistent electron kinetic analytic model is developed for various ICP sources. The electromagnetic (EM) model is established based on modal analysis, while the kinetic analysis gives the perturbed Maxwellian distribution of electrons by solving the Boltzmann-Vlasov equation. The self-consistent algorithm combines the EM model and the kinetic analysis by updating their results consistently until the solution converges. The closed-form solutions in the analytical model provide rigorous and fast computation of the EM fields and the electron kinetic behavior. The kinetic analysis shows that the RF energy in an ICP source is extracted by a collisionless dissipation mechanism if the electron thermal velocity is close to the RF phase velocity. A criterion for collisionless damping is thus given based on the analytic solutions. To achieve uniformly distributed plasma for plasma processing, we propose a novel discharge structure with both planar and vertical coil excitations. The theoretical results demonstrate improved uniformity of the excited azimuthal E-field in the chamber. Non-monotonic spatial decay in electric field and space current distributions was recently observed in weakly-collisional plasmas. The anomalous skin effect is found to be responsible for this phenomenon. The proposed model successfully captures the non-monotonic spatial decay effect and achieves good agreement with measurements for different applied RF powers. The proposed analytical model is compared with other theoretical models and different experimental measurements. The developed model is also applied to two kinds of ICP discharges used for electrodeless light sources: one uses a vertical internal coil antenna to excite the plasma, and the other has a metal shield to prevent electromagnetic radiation. The theoretical results delivered by the proposed model agree well with experimental measurements in many respects. The proposed self-consistent model therefore provides an efficient and reliable means for designing ICP sources in applications such as VLSI fabrication and electrodeless light sources.

  20. Evolution of sex: Using experimental genomics to select among competing theories.

    PubMed

    Sharp, Nathaniel P; Otto, Sarah P

    2016-08-01

    Few topics have intrigued biologists as much as the evolution of sex. Understanding why sex persists despite its costs requires not just rigorous theoretical study, but also empirical data on related fundamental issues, including the nature of genetic variance for fitness, patterns of genetic interactions, and the dynamics of adaptation. The increasing feasibility of examining genomes in an experimental context is now shedding new light on these problems. Using this approach, McDonald et al. recently demonstrated that sex uncouples beneficial and deleterious mutations, allowing selection to proceed more effectively with sex than without. Here we discuss the insights provided by this study, along with other recent empirical work, in the context of the major theoretical models for the evolution of sex. © 2016 WILEY Periodicals, Inc.

  1. OCT Amplitude and Speckle Statistics of Discrete Random Media.

    PubMed

    Almasian, Mitra; van Leeuwen, Ton G; Faber, Dirk J

    2017-11-01

    Speckle, amplitude fluctuations in optical coherence tomography (OCT) images, contains information on sub-resolution structural properties of the imaged sample. Speckle statistics could therefore be utilized in the characterization of biological tissues. However, a rigorous theoretical framework relating OCT speckle statistics to structural tissue properties has yet to be developed. As a first step, we present a theoretical description of OCT speckle, relating the OCT amplitude variance to size and organization for samples of discrete random media (DRM). Starting the calculations from the size and organization of the scattering particles, we analytically find expressions for the OCT amplitude mean, amplitude variance, the backscattering coefficient and the scattering coefficient. We assume fully developed speckle and verify the validity of this assumption by experiments on controlled samples of silica microspheres suspended in water. We show that the OCT amplitude variance is sensitive to sub-resolution changes in size and organization of the scattering particles. Experimentally determined and theoretically calculated optical properties are compared and in good agreement.
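
    A useful reference point for the fully developed speckle assumption verified above: in that limit the OCT amplitude A follows the Rayleigh distribution, a standard result not specific to this paper,

        p(A) = \frac{A}{\sigma^2} e^{-A^2 / (2\sigma^2)}, \qquad \langle A \rangle = \sigma \sqrt{\pi/2}, \qquad \mathrm{Var}(A) = \left( 2 - \frac{\pi}{2} \right) \sigma^2,

    so the contrast ratio Var(A)/⟨A⟩² = 4/π − 1 ≈ 0.273 is fixed, and the amplitude statistics are governed by the single scale parameter σ, which is set by the backscattered power and hence by the size and organization of the scatterers.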

  2. Crisis in science: in search for new theoretical foundations.

    PubMed

    Schroeder, Marcin J

    2013-09-01

    Recognition of the need for theoretical biology more than half a century ago did not bring substantial progress in this direction. Recently, the need for new methods in science, including physics, became clear. The breakthrough should be sought in answering the question "What is life?", which can help to explain the mechanisms of consciousness and consequently give insight into the way we comprehend reality. This could help in the search for new methods in the study of both physical and biological phenomena. However, to achieve this, a new theoretical discipline will have to be developed, with a very general conceptual framework and the rigor of mathematical reasoning, allowing it to assume the leading role in science. Since its foundations are in the recognition of the role of life and consciousness in the epistemic process, it could be called biomathics. The prime candidates proposed here for the fundamental concepts of biomathics are 'information' and 'information integration', with an appropriately general mathematical formalism. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Information-Theoretic Assessment of Sample Imaging Systems

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Alter-Gartenberg, Rachel; Park, Stephen K.; Rahman, Zia-ur

    1999-01-01

    By rigorously extending modern communication theory to the assessment of sampled imaging systems, we develop the formulations that are required to optimize the performance of these systems within the critical constraints of image gathering, data transmission, and image display. The goal of this optimization is to produce images with the best possible visual quality for the wide range of statistical properties of the radiance field of natural scenes that one normally encounters. Extensive computational results are presented to assess the performance of sampled imaging systems in terms of information rate, theoretical minimum data rate, and fidelity. Comparisons of this assessment with perceptual and measurable performance demonstrate that (1) the information rate that a sampled imaging system conveys from the captured radiance field to the observer is closely correlated with the fidelity, sharpness and clarity with which the observed images can be restored and (2) the associated theoretical minimum data rate is closely correlated with the lowest data rate with which the acquired signal can be encoded for efficient transmission.
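
    The information rate referred to here is, schematically, a Shannon channel-capacity integral over the sampling passband B̂; the expression below is a generic form of the kind used in this line of work (an assumption of this note, not a quotation of the paper's formulas), with τ̂ the image-gathering transfer function, Φ̂_L the radiance-field power spectral density, and Φ̂_N the noise power spectral density.

        \mathcal{H} = \frac{1}{2} \iint_{\hat{B}} \log_2 \left[ 1 + \frac{\hat{\tau}^2(\nu)\, \hat{\Phi}_L(\nu)}{\hat{\Phi}_N(\nu)} \right] \mathrm{d}\nu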

  4. The Alleged Crisis and the Illusion of Exact Replication.

    PubMed

    Stroebe, Wolfgang; Strack, Fritz

    2014-01-01

    There has been increasing criticism of the way psychologists conduct and analyze studies. These critiques as well as failures to replicate several high-profile studies have been used as justification to proclaim a "replication crisis" in psychology. Psychologists are encouraged to conduct more "exact" replications of published studies to assess the reproducibility of psychological research. This article argues that the alleged "crisis of replicability" is primarily due to an epistemological misunderstanding that emphasizes the phenomenon instead of its underlying mechanisms. As a consequence, a replicated phenomenon may not serve as a rigorous test of a theoretical hypothesis because identical operationalizations of variables in studies conducted at different times and with different subject populations might test different theoretical constructs. Therefore, we propose that for meaningful replications, attempts at reinstating the original circumstances are not sufficient. Instead, replicators must ascertain that conditions are realized that reflect the theoretical variable(s) manipulated (and/or measured) in the original study. © The Author(s) 2013.

  5. Towards rigorous analysis of the Levitov-Mirlin-Evers recursion

    NASA Astrophysics Data System (ADS)

    Fyodorov, Y. V.; Kupiainen, A.; Webb, C.

    2016-12-01

    This paper aims to develop a rigorous asymptotic analysis of an approximate renormalization group recursion for the inverse participation ratios P_q of critical power-law random band matrices. The recursion goes back to the work by Mirlin and Evers (2000 Phys. Rev. B 62 7920) and earlier works by Levitov (1990 Phys. Rev. Lett. 64 547, 1999 Ann. Phys. 8 697-706) and aims to describe the ensuing multifractality of the eigenvectors of such matrices. We point out both similarities and dissimilarities between the LME recursion and those appearing in the theory of multiplicative cascades and branching random walks, and show that the methods developed in those fields can be adapted to the present case. In particular, the LME recursion is shown to exhibit a phase transition, which we expect is a freezing transition, where the role of temperature is played by the exponent q. However, the LME recursion has features that make its rigorous analysis considerably harder, and we point out several open problems for further study.
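
    For readers outside this literature: the inverse participation ratios in question are the standard multifractality diagnostics for a normalized eigenvector ψ of an N × N matrix,

        P_q = \sum_{n=1}^{N} |\psi(n)|^{2q}, \qquad P_q \sim N^{-\tau(q)},

    where τ(q) = q − 1 for fully delocalized states and τ(q) = 0 for localized ones; an intermediate, nonlinear τ(q) is the multifractal regime that the LME recursion is meant to capture.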

  6. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    PubMed

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole-genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high-resolution genome-wide methylation profiles. Statistical modeling and analysis are employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied to single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates clear benefits, and the necessity, of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
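
    To make the modeling ingredients concrete: a 1D Ising joint distribution over binary methylation states x_n ∈ {−1, +1} has the generic form below (the paper's actual parameterization of the potentials is richer than this sketch), and the Jensen-Shannon distance between a test distribution P and a reference Q is the standard symmetrized divergence.

        P(x_1, \ldots, x_N) = \frac{1}{Z} \exp\Big( \sum_n a_n x_n + \sum_n c_n x_n x_{n+1} \Big)

        \mathrm{JSD}(P, Q) = H\Big( \frac{P + Q}{2} \Big) - \frac{1}{2} H(P) - \frac{1}{2} H(Q), \qquad H(P) = -\sum_{\mathbf{x}} P(\mathbf{x}) \log_2 P(\mathbf{x})

    The nearest-neighbor coupling carried by the c_n terms is what lets the model capture the correlations between adjacent methylation sites that marginal methods ignore.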

  7. Rigorous evaluation of chemical measurement uncertainty: liquid chromatographic analysis methods using detector response factor calibration

    NASA Astrophysics Data System (ADS)

    Toman, Blaza; Nelson, Michael A.; Bedner, Mary

    2017-06-01

    Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess the accuracy of the measurement procedure and its repeatability by the analyst. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated for the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model, which is intrinsically rigorous, making it a straightforward method for use with complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 Monte Carlo (MC) methods augmented with random-effects meta-analysis yields results similar to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation, using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography with isotope-dilution mass spectrometric detection (LC-IDMS).
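
    A minimal sketch of the GUM Supplement 1 (Monte Carlo) side of such an evaluation, for a single response-factor calibration: all numerical values are hypothetical, and the sketch deliberately omits the between-run random effects and correlations that motivate the Bayesian hierarchical treatment in the article.

        import numpy as np

        rng = np.random.default_rng(1)
        M = 200_000  # Monte Carlo draws

        # Hypothetical inputs with standard uncertainties (illustrative only).
        c_cal = rng.normal(1.002, 0.004, M)    # calibrant concentration
        A_cal = rng.normal(1.543e5, 6.0e2, M)  # calibrant peak area
        A_smp = rng.normal(1.498e5, 6.0e2, M)  # sample peak area

        rf = A_cal / c_cal      # detector response factor
        c_smp = A_smp / rf      # sample concentration via the response factor

        lo, hi = np.percentile(c_smp, [2.5, 97.5])
        print(f"c = {c_smp.mean():.4f}, u(c) = {c_smp.std(ddof=1):.4f}")
        print(f"95 % coverage interval: [{lo:.4f}, {hi:.4f}]")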

  8. Resonant tunneling assisted propagation and amplification of plasmons in high electron mobility transistors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhardwaj, Shubhendu; Sensale-Rodriguez, Berardi; Xing, Huili Grace

    A rigorous theoretical and computational model is developed for plasma-wave propagation in high electron mobility transistor structures with electron injection from a resonant tunneling diode at the gate. We discuss the conditions in which low-loss and sustainable plasmon modes can be supported in such structures. The developed analytical model is used to derive the dispersion relation for these plasmon modes. A non-linear full-wave hydrodynamic numerical solver is also developed using a finite difference time domain algorithm. The developed analytical solutions are validated via the numerical solution. We also verify previous observations that were based on a simplified transmission line model. It is shown that at high levels of negative differential conductance, plasmon amplification is indeed possible. The proposed rigorous models can enable accurate design and optimization of practical resonant tunnel diode-based plasma-wave devices for terahertz sources, mixers, and detectors, by allowing a precise representation of their coupling when integrated with other electromagnetic structures.

  9. Research that Helps Move Us Closer to a World where Each Child Thrives

    PubMed Central

    Diamond, Adele

    2015-01-01

    Schools are curtailing programs in arts, physical exercise, and play so more time and resources can be devoted to academic instruction. Yet indications are that the arts (e.g., music, dance, or theatre) and physical activity (e.g., sports, martial arts, or youth circus) are crucial for all aspects of children’s development – including success in school. Thus in cutting those activities, schools may be impeding academic success, not aiding it. Correlational and retrospective studies have laid the groundwork, as have moving personal accounts, case studies, and theoretical arguments. The time is ripe for rigorous studies to investigate causality (Do arts and physical activities actually produce academic benefits or would kids in those activities have succeeded anyway?) and what characteristics of programs account for the benefits. Instead of simply claiming that the arts and/or physical activities can transform kids’ lives, that needs to be demonstrated, and granting agencies need to be more open to funding rigorous research of real-world arts and physical-activity programs. PMID:26635510

  10. Thermal fatigue and thermal shock in bedrock: An attempt to unravel the geomorphic processes and products

    NASA Astrophysics Data System (ADS)

    Hall, Kevin; Thorn, Colin E.

    2014-02-01

    Widespread acceptance in science at-large notwithstanding, the ability of thermal stresses to produce thermal fatigue (TF) and/or thermal shock (TS) in bedrock and coarse debris in the field is often doubted. Commonly called insolation weathering in geomorphology, the results of questionable laboratory experiments have led many geomorphologists to consider terrestrial temperatures to be inadequate to generate thermally induced stresses leading to rock failure; the exceptions are the action of fire or lightning. We comprehensively survey the general scientific literature on TF and TS while rigorously scrutinizing that relating to geomorphology. Findings indicate theoretical and experimental information is adequate to establish the feasibility of TF and TS in rock stemming from rock temperatures monitored in the field. While TS may exhibit fracture patterns that are uniquely diagnostic, those of TF lack any such attributes. It would appear unlikely that TF can prepare or weaken rock to increase the likelihood of TS. The question of whether widespread polygonal versus rectilinear cracking is diagnostic of TS is presently an open one as possible explanations invoke process(es) and/or host material(s) and, consequently, to assign palaeoenvironmental significance to such fracture patterns is premature at this time. Further geomorphological laboratory research into TF and TS is merited as sufficient theoretical underpinning already exists. However, laboratory experimentation needs to be much more rigorously defined and executed and is faced with significant hurdles if it is to be effectively linked to field observations.

  11. Critical Analysis of Strategies for Determining Rigor in Qualitative Inquiry.

    PubMed

    Morse, Janice M

    2015-09-01

    Criteria for determining the trustworthiness of qualitative research were introduced by Guba and Lincoln in the 1980s when they replaced terminology for achieving rigor, reliability, validity, and generalizability with dependability, credibility, and transferability. Strategies for achieving trustworthiness were also introduced. This landmark contribution to qualitative research remains in use today, with only minor modifications in format. Despite the significance of this contribution over the past four decades, the strategies recommended to achieve trustworthiness have not been critically examined. Recommendations for where, why, and how to use these strategies have not been developed, and how well they achieve their intended goal has not been examined. We do not know, for example, what impact these strategies have on the completed research. In this article, I critique these strategies. I recommend that qualitative researchers return to the terminology of social sciences, using rigor, reliability, validity, and generalizability. I then make recommendations for the appropriate use of the strategies recommended to achieve rigor: prolonged engagement, persistent observation, and thick, rich description; inter-rater reliability, negative case analysis; peer review or debriefing; clarifying researcher bias; member checking; external audits; and triangulation. © The Author(s) 2015.

  12. Music Therapy for Posttraumatic Stress in Adults: A Theoretical Review

    PubMed Central

    Landis-Shack, Nora; Heinz, Adrienne J.; Bonn-Miller, Marcel O.

    2017-01-01

    Music therapy has been employed as a therapeutic intervention to facilitate healing across a variety of clinical populations. There is theoretical and empirical evidence to suggest that individuals with trauma exposure and Posttraumatic Stress Disorder (PTSD), a condition characterized by enduring symptoms of distressing memory intrusions, avoidance, emotional disturbance, and hyperarousal, may derive benefits from music therapy. The current narrative review describes the practice of music therapy and presents a theoretically-informed assessment and model of music therapy as a tool for addressing symptoms of PTSD. The review also presents key empirical studies that support the theoretical assessment. Social, cognitive, and neurobiological mechanisms (e.g., community building, emotion regulation, increased pleasure, anxiety reduction) that promote music therapy’s efficacy as an adjunctive treatment for individuals with posttraumatic stress are discussed. It is concluded that music therapy may be a useful therapeutic tool to reduce symptoms and improve functioning among individuals with trauma exposure and PTSD, though more rigorous empirical study is required. In addition, music therapy may help foster resilience and engage individuals who struggle with stigma associated with seeking professional help. Practical recommendations for incorporating music therapy into clinical practice are offered along with several suggestions for future research. PMID:29290641

  13. Theoretical analyses of resonant frequency shift in anomalous dispersion enhanced resonant optical gyroscopes.

    PubMed

    Lin, Jian; Liu, Jiaming; Zhang, Hao; Li, Wenxiu; Zhao, Lu; Jin, Junjie; Huang, Anping; Zhang, Xiaofu; Xiao, Zhisong

    2016-12-12

    Rigorous expressions for the resonant frequency shift (RFS) in anomalous dispersion enhanced resonant optical gyroscopes (ADEROGs) are deduced without approximation, providing precise theoretical guidance for achieving ultra-sensitive ADEROGs. A refractive-index-related modification factor is introduced when the special theory of relativity (STR) is considered. We demonstrate that the RFS will not become "infinitely large" under critical anomalous dispersion (CAD) and that negative modification does not exist, which makes the mechanism of anomalous dispersion enhancement clear and coherent. Although a step change of the RFS occurs when the anomalous dispersion condition varies, the amplification of the RFS is limited by the attainable variation of refractive index in practice. Moreover, it is shown that the properties of anomalous dispersion influence not only the amplification of the RFS, but also the detection range of ADEROGs.

  14. EOSlib, Version 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, Nathan; Menikoff, Ralph

    2017-02-03

    Equilibrium thermodynamics underpins many of the technologies used throughout theoretical physics, yet verification of the various theoretical models in the open literature remains challenging. EOSlib provides a single, consistent, verifiable implementation of these models in a single, easy-to-use software package. It consists of three parts: a software library implementing various published equation-of-state (EOS) models; a database of fitting parameters for various materials for these models; and a number of useful utility functions for simplifying thermodynamic calculations, such as computing Hugoniot curves or Riemann problem solutions. Ready availability of this library will enable reliable code-to-code testing of equation-of-state implementations, as well as a starting point for more rigorous verification work. EOSlib also provides a single, consistent API for its analytic and tabular EOS models, which simplifies the process of comparing models for a particular application.
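
    As an illustration of the kind of utility such a library provides, the sketch below computes a shock Hugoniot from the Rankine-Hugoniot energy condition e − e0 = ½(p + p0)(v0 − v) for an ideal-gas EOS; it is a generic worked example, not EOSlib's actual API.

        import numpy as np

        def hugoniot_pressure_ideal_gas(v, p0=1.0, v0=1.0, gamma=1.4):
            """Pressure on the Hugoniot centered at (p0, v0), solved from
            e - e0 = 0.5 * (p + p0) * (v0 - v) with e = p * v / (gamma - 1)."""
            num = (gamma + 1.0) * v0 - (gamma - 1.0) * v
            den = (gamma + 1.0) * v - (gamma - 1.0) * v0
            return p0 * num / den

        # Specific volumes above the strong-shock limit (gamma - 1) / (gamma + 1).
        for v in np.linspace(0.45, 1.0, 6):
            print(f"v/v0 = {v:.3f}   p/p0 = {hugoniot_pressure_ideal_gas(v):8.3f}")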

  15. Children's Antisocial Behavior, Mental Health, Drug Use, and Educational Performance After Parental Incarceration

    PubMed Central

    Murray, Joseph; Farrington, David P.; Sekol, Ivana

    2012-01-01

    Unprecedented numbers of children experience parental incarceration worldwide. Families and children of prisoners can experience multiple difficulties after parental incarceration, including traumatic separation, loneliness, stigma, confused explanations to children, unstable childcare arrangements, strained parenting, reduced income, and home, school, and neighborhood moves. Children of incarcerated parents often have multiple, stressful life events before parental incarceration. Theoretically, children with incarcerated parents may be at risk for a range of adverse behavioral outcomes. A systematic review was conducted to synthesize empirical evidence on associations between parental incarceration and children's later antisocial behavior, mental health problems, drug use, and educational performance. Results from 40 studies (including 7,374 children with incarcerated parents and 37,325 comparison children in 50 samples) were pooled in a meta-analysis. The most rigorous studies showed that parental incarceration is associated with higher risk for children's antisocial behavior, but not for mental health problems, drug use, or poor educational performance. Studies that controlled for parental criminality or children's antisocial behavior before parental incarceration had a pooled effect size of OR = 1.4 (p < .01), corresponding to about 10% increased risk for antisocial behavior among children with incarcerated parents, compared with peers. Effect sizes did not decrease with number of covariates controlled. However, the methodological quality of many studies was poor. More rigorous tests of the causal effects of parental incarceration are needed, using randomized designs and prospective longitudinal studies. Criminal justice reforms and national support systems might be needed to prevent harmful consequences of parental incarceration for children. PMID:22229730

  16. Tackling complexities in understanding the social determinants of health: the contribution of ethnographic research.

    PubMed

    Bandyopadhyay, Mridula

    2011-11-25

    The complexities inherent in understanding the social determinants of health are often not well-served by quantitative approaches. My aim is to show that well-designed and well-conducted ethnographic studies have an important contribution to make in this regard. Ethnographic research designs are a difficult but rigorous approach to research questions that require us to understand the complexity of people's social and cultural lives. I draw on an ethnographic study to describe the complexities of studying maternal health in a rural area in India. I then show how the lessons learnt in that setting and context can be applied to studies done in very different settings. I show how ethnographic research depends for rigour on a theoretical framework for sample selection; why immersion in the community under study, and rapport building with research participants, is important to ensure rich and meaningful data; and how flexible approaches to data collection lead to the gradual emergence of an analysis based on intense cross-referencing with community views and thus a conclusion that explains the similarities and differences observed. When using ethnographic research design it can be difficult to specify in advance the exact details of the study design. Researchers can encounter issues in the field that require them to change what they planned on doing. In rigorous ethnographic studies, the researcher in the field is the research instrument and needs to be well trained in the method. Ethnographic research is challenging, but nevertheless provides a rewarding way of researching complex health problems that require an understanding of the social and cultural determinants of health.

  17. Landauer-Büttiker and Thouless Conductance

    NASA Astrophysics Data System (ADS)

    Bruneau, L.; Jakšić, V.; Last, Y.; Pillet, C.-A.

    2015-08-01

    In the independent electron approximation, the average (energy/charge/entropy) current flowing through a finite sample S connected to two electronic reservoirs can be computed by scattering theoretic arguments which lead to the famous Landauer-Büttiker formula. Another well known formula has been proposed by Thouless on the basis of a scaling argument. The Thouless formula relates the conductance of the sample to the width of the spectral bands of the infinite crystal obtained by periodic juxtaposition of S. In this spirit, we define Landauer-Büttiker crystalline currents by extending the Landauer-Büttiker formula to a setup where the sample is replaced by a periodic structure whose unit cell is S. We argue that these crystalline currents are closely related to the Thouless currents. For example, the crystalline heat current is bounded above by the Thouless heat current, and this bound saturates iff the coupling between the reservoirs and the sample is reflectionless. Our analysis leads to a rigorous derivation of the Thouless formula from the first principles of quantum statistical mechanics.
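
    For reference, the scattering-theoretic formula in question has the textbook two-terminal form (per spin channel)

        I = \frac{e}{h} \int \mathcal{T}(E)\, \big[ f_L(E) - f_R(E) \big]\, \mathrm{d}E, \qquad G = \frac{e^2}{h}\, \mathcal{T}(E_F),

    with T(E) the transmittance through the sample and f_L, f_R the Fermi functions of the two reservoirs; the crystalline currents of the paper are obtained by evaluating such expressions with the sample replaced by its periodic repetition.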

  18. Surface plasmon-enhanced optical absorption in monolayer MoS2 with one-dimensional Au grating

    NASA Astrophysics Data System (ADS)

    Song, Jinlin; Lu, Lu; Cheng, Qiang; Luo, Zixue

    2018-05-01

    The optical absorption of a composite photonic structure, namely monolayer molybdenum disulfide (MoS2)-covered Au grating, is theoretically investigated using a rigorous coupled-wave analysis algorithm. The enhancement of the localized electromagnetic field due to surface plasmon polaritons supported by the Au grating can be utilized to enhance the absorption of MoS2. Remarkable enhancement of the absorption due to exciton transitions can also be realized. When the period of the grating is 600 nm, the local absorption of the monolayer MoS2 on the Au grating is nearly 7 times higher than the intrinsic absorption due to the B exciton transition. A further study reveals that the absorption properties of the Au grating can be tailored by altering the number of MoS2 layers, changing to a MoS2 nanoribbon array, or inserting a hafnium dioxide (HfO2) spacer. This work will contribute to the design of MoS2-based optical and optoelectronic devices.

  19. Probabilistic Inference in General Graphical Models through Sampling in Stochastic Networks of Spiking Neurons

    PubMed Central

    Pecevski, Dejan; Buesing, Lars; Maass, Wolfgang

    2011-01-01

    An important open problem of computational neuroscience is the generic organization of computations in networks of neurons in the brain. We show here through rigorous theoretical analysis that inherent stochastic features of spiking neurons, in combination with simple nonlinear computational operations in specific network motifs and dendritic arbors, enable networks of spiking neurons to carry out probabilistic inference through sampling in general graphical models. In particular, it enables them to carry out probabilistic inference in Bayesian networks with converging arrows (“explaining away”) and with undirected loops, that occur in many real-world tasks. Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are necessary ingredients of the underlying computational organization. We demonstrate through computer simulations that this approach can be scaled up to neural emulations of probabilistic inference in fairly large graphical models, yielding some of the most complex computations that have been carried out so far in networks of spiking neurons. PMID:22219717
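
    The "explaining away" motif mentioned above can be made concrete with a conventional Gibbs sampler on a two-cause noisy-OR network; this is a plain software sampler, not the paper's spiking-network implementation, and all probabilities are illustrative.

        import random

        random.seed(0)
        prior = 0.3                   # P(a=1) = P(b=1); causes independent a priori

        def p_c1(a, b):               # noisy-OR likelihood of the observed effect c = 1
            return 1.0 - (1.0 - 0.8 * a) * (1.0 - 0.8 * b) * (1.0 - 0.05)

        def sample_cause(other):
            """Gibbs update: resample one cause given the other cause and c = 1."""
            w1 = prior * p_c1(1, other)
            w0 = (1.0 - prior) * p_c1(0, other)
            return 1 if random.random() * (w0 + w1) < w1 else 0

        a, b, n = 1, 1, 100_000
        sa = sb = sab = 0
        for _ in range(n):
            a = sample_cause(b)
            b = sample_cause(a)
            sa += a; sb += b; sab += a * b

        pa, pb = sa / n, sb / n
        print(f"P(a|c=1) = {pa:.3f}, P(b|c=1) = {pb:.3f}, cov = {sab / n - pa * pb:+.4f}")
        # The negative covariance between the two causes, conditioned on the effect,
        # is the converging-arrow "explaining away" pattern discussed in the paper.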

  20. Theoretical study of surface plasmon resonance sensors based on 2D bimetallic alloy grating

    NASA Astrophysics Data System (ADS)

    Dhibi, Abdelhak; Khemiri, Mehdi; Oumezzine, Mohamed

    2016-11-01

    A surface plasmon resonance (SPR) sensor based on 2D alloy grating with a high performance is proposed. The grating consists of homogeneous alloys of formula MxAg1-x, where M is gold, copper, platinum and palladium. Compared to the SPR sensors based a pure metal, the sensor based on angular interrogation with silver exhibits a sharper (i.e. larger depth-to-width ratio) reflectivity dip, which provides a big detection accuracy, whereas the sensor based on gold exhibits the broadest dips and the highest sensitivity. The detection accuracy of SPR sensor based a metal alloy is enhanced by the increase of silver composition. In addition, the composition of silver which is around 0.8 improves the sensitivity and the quality of SPR sensor of pure metal. Numerical simulations based on rigorous coupled wave analysis (RCWA) show that the sensor based on a metal alloy not only has a high sensitivity and a high detection accuracy, but also exhibits a good linearity and a good quality.

  1. The long tail of a demon drug: The 'bath salts' risk environment.

    PubMed

    Elliott, Luther; Benoit, Ellen; Campos, Stephanie; Dunlap, Eloise

    2018-01-01

    Using the case of synthetic cathinones (commonly referred to as 'bath salts' in the US context), this paper analyses structural factors surrounding novel psychoactive substances (NPS) as contributing to the unique risk environment surrounding their use. Drawing on interviews with 39 people who use bath salts from four U.S. cities and analysis of the infrastructural, social, economic, and policy contexts, we document the unique harms related to changing contexts for illicit drug regulation, manufacture, and consumption. Findings suggest that NPS and designer drug markets, which are highly reliant upon the internet, share characteristics of the entertainment industry which has come to rely more heavily upon profits derived from the 'long tail' of myriad lesser-known products and the diminished centrality of 'superstars' and 'hits'. Findings point toward increased theoretical and policy attention to changing drug market structures, more rigorous evaluations of drug 'analogues' legislation and greater involvement with NPS education and testing by harm reduction agencies. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. The Contribution of Environmental Assessment to Sustainable Development: Toward a Richer Empirical Understanding

    NASA Astrophysics Data System (ADS)

    Cashmore, Matthew; Bond, Alan; Cobb, Dick

    2007-09-01

    It has long been suggested that environmental assessment has the potential to contribute to sustainable development through mechanisms above and beyond informing design and consent decisions, and while theories have been proposed to explain how this might occur, few have been subjected to rigorous empirical validation. This research advances the theoretical debate by building a rich empirical understanding of environmental assessment’s practical outcomes, from which its potential to contribute to sustainable development can be gauged. Three case study environmental assessment processes in England were investigated using a combination of data generated from content analysis, in-depth interviews, and a questionnaire survey. Four categories of outcomes are delineated based on the research data: learning outcomes; governance outcomes; attitudinal and value changes; and developmental outcomes. The data provide a robust critique of mainstream theory, with its focus on design and consent decisions. The article concludes with an examination of the consequences of the context-specific nature of environmental assessment practices in terms of developing theory and focusing future research.

  3. Laser Assisted Micro Wire GMAW and Droplet Welding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FUERSCHBACH, PHILLIP W.; LUCK, D. L.; BERTRAM, LEE A.

    2002-03-01

    Laser beam welding is the principal welding process for the joining of Sandia weapon components because it can provide a small fusion zone with low overall heating. Improved process robustness is desired since laser energy absorption is extremely sensitive to joint variation and filler metal is seldom added. This project investigated the experimental and theoretical advantages of combining a fiber optic delivered Nd:YAG laser with a miniaturized GMAW system. Consistent gas metal arc droplet transfer employing a 0.25 mm diameter wire was only obtained at high currents in the spray transfer mode. Excessive heating of the workpiece in this modemore » was considered an impractical result for most Sandia micro-welding applications. Several additional droplet detachment approaches were investigated and analyzed including pulsed tungsten arc transfer(droplet welding), servo accelerated transfer, servo dip transfer, and electromechanically braked transfer. Experimental observations and rigorous analysis of these approaches indicate that decoupling droplet detachment from the arc melting process is warranted and may someday be practical.« less

  4. Dual RBFNNs-Based Model-Free Adaptive Control With Aspen HYSYS Simulation.

    PubMed

    Zhu, Yuanming; Hou, Zhongsheng; Qian, Feng; Du, Wenli

    2017-03-01

    In this brief, we propose a new data-driven model-free adaptive control (MFAC) method with dual radial basis function neural networks (RBFNNs) for a class of discrete-time nonlinear systems. The main novelty lies in providing a systematic design method for the controller structure through the direct use of I/O data, rather than a first-principles model or an offline-identified plant model. The controller structure is determined by an equivalent-dynamic-linearization representation of the ideal nonlinear controller, and the controller parameters are tuned using the pseudogradient information extracted from the I/O data of the plant, which allows the method to handle unknown nonlinear systems. The stability of the closed-loop control system and of the training process for the RBFNNs is guaranteed by rigorous theoretical analysis. The effectiveness and applicability of the proposed method are further demonstrated by a numerical example and an Aspen HYSYS simulation of a distillation column in a crude styrene production process.
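
    A minimal sketch of the compact-form model-free adaptive control loop that this line of work builds on: the standard pseudo-partial-derivative (PPD) update and control law, without the dual-RBFNN controller-structure design that is the paper's contribution. The plant and tuning values are illustrative.

        import math

        def plant(y, u):
            """Toy nonlinear plant, treated as unknown by the controller."""
            return 0.6 * y + 0.3 * math.tanh(y) + u

        eta, mu, rho, lam = 1.0, 1.0, 0.6, 1.0  # MFAC tuning parameters
        phi, phi0 = 2.0, 2.0                    # PPD estimate and its reset value
        y = u = u_prev = 0.0
        y_star = 0.5                            # set-point

        for k in range(200):
            y_new = plant(y, u)
            du, dy = u - u_prev, y_new - y
            # Projection-type PPD update driven purely by I/O increments.
            phi += eta * du / (mu + du * du) * (dy - phi * du)
            if abs(phi) < 1e-5:                 # reset safeguard
                phi = phi0
            u_prev, y = u, y_new
            # One-step-ahead tracking update with control-effort weight lam.
            u += rho * phi * (y_star - y) / (lam + phi * phi)

        print(f"y after 200 steps: {y:.4f} (set-point {y_star})")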

  5. Neutron-Helium-3 Analyzing Power at 4.05 and 5.54 MeV*

    NASA Astrophysics Data System (ADS)

    Esterline, J. H.; Howell, C. R.; Macri, R. A.; Tajima, S.; Tornow, W.; Crowe, B.; Pedroni, R. S.; Weisel, G. J.

    2004-10-01

    It has been proposed that, to better understand long-standing discrepancies between calculated and measured analyzing powers in the three-nucleon system, an investigation of analyzing powers be undertaken in the four-nucleon system, in which similar discrepancies have recently been observed. To this end, the analyzing power for polarized neutron-helion scattering has been measured at Triangle Universities Nuclear Laboratory (TUNL) at 27 angles for both incident neutron energies of 4.05 and 5.54 MeV. These data were obtained with neutrons generated by the polarization-transfer reaction D(d,n)He-3, with neutron polarizations of approximately 0.4 and 0.5, respectively, for the two energies. Preliminary analysis yields uncertainties in the analyzing powers not exceeding 0.03 at the cross section minima, where the analyzing powers reach values in excess of 0.60. Since rigorous theoretical calculations are presently unavailable for neutron-helion scattering due to complications involving isospin structure, the data compare favorably with previously obtained proton-triton data corrected for the Coulomb barrier.

  6. Theory of chaotic orbital variations confirmed by Cretaceous geological evidence

    NASA Astrophysics Data System (ADS)

    Ma, Chao; Meyers, Stephen R.; Sageman, Bradley B.

    2017-02-01

    Variations in the Earth’s orbit and spin vector are a primary control on insolation and climate; their recognition in the geological record has revolutionized our understanding of palaeoclimate dynamics, and has catalysed improvements in the accuracy and precision of the geological timescale. Yet the secular evolution of the planetary orbits beyond 50 million years ago remains highly uncertain, and the chaotic dynamical nature of the Solar System predicted by theoretical models has yet to be rigorously confirmed by well constrained (radioisotopically calibrated and anchored) geological data. Here we present geological evidence for a chaotic resonance transition associated with interactions between the orbits of Mars and the Earth, using an integrated radioisotopic and astronomical timescale from the Cretaceous Western Interior Basin of what is now North America. This analysis confirms the predicted chaotic dynamical behaviour of the Solar System, and provides a constraint for refining numerical solutions for insolation, which will enable a more precise and accurate geological timescale to be produced.

  7. Random sequential adsorption of straight rigid rods on a simple cubic lattice

    NASA Astrophysics Data System (ADS)

    García, G. D.; Sanchez-Varretti, F. O.; Centres, P. M.; Ramirez-Pastor, A. J.

    2015-10-01

    Random sequential adsorption of straight rigid rods of length k (k-mers) on a simple cubic lattice has been studied by numerical simulations and finite-size scaling analysis. The k-mers were irreversibly and isotropically deposited onto the lattice. The calculations were performed using a new theoretical scheme whose accuracy was verified by comparison with rigorous analytical data. The results, obtained for k ranging from 2 to 64, revealed that (i) the jamming coverage for dimers (k = 2) is θj = 0.918388(16), which corrects the previously reported value of θj = 0.799(2) (Tarasevich and Cherkasova, 2007); (ii) θj is a decreasing function of the k-mer size, with θj(∞) = 0.4045(19) being the limiting coverage for large k; and (iii) the ratio between the percolation threshold and the jamming coverage shows non-universal behavior, monotonically decreasing to zero with increasing k.
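
    The deposition protocol is easy to reproduce at toy scale. The sketch below runs irreversible RSA of dimers (k = 2) on a periodic L × L × L lattice, using the standard equivalence between uniform random attempts and a random permutation of all placements, and should land near the quoted jamming coverage up to finite-size effects.

        import random

        random.seed(42)
        L = 24
        occ = [[[False] * L for _ in range(L)] for _ in range(L)]

        # Every dimer placement: a site plus one positive lattice direction (PBC).
        placements = [(x, y, z, d) for x in range(L) for y in range(L)
                      for z in range(L) for d in range(3)]
        random.shuffle(placements)  # equivalent to random sequential attempts

        filled = 0
        for x, y, z, d in placements:
            dx, dy, dz = ((1, 0, 0), (0, 1, 0), (0, 0, 1))[d]
            x2, y2, z2 = (x + dx) % L, (y + dy) % L, (z + dz) % L
            if not occ[x][y][z] and not occ[x2][y2][z2]:
                occ[x][y][z] = occ[x2][y2][z2] = True
                filled += 2

        print(f"jamming coverage ~ {filled / L**3:.4f} (reported: 0.918388(16))")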

  8. Microscopic optical model potential based on a Dirac Brueckner Hartree Fock approach and the relevant uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Xu, Ruirui; Ma, Zhongyu; Muether, Herbert; van Dalen, E. N. E.; Liu, Tinjin; Zhang, Yue; Zhang, Zhi; Tian, Yuan

    2017-09-01

    A relativistic microscopic optical model potential for nucleon-nucleus scattering, named CTOM, is investigated in the framework of the Dirac-Brueckner-Hartree-Fock (DBHF) approach. The microscopic character of CTOM is guaranteed by rigorously adopting the isospin-dependent DBHF calculation within the subtracted T-matrix scheme. To verify its predictive power, a global study of n, p + A scattering is carried out. The predicted scattering observables coincide with experimental data to good accuracy over a broad range of targets and a large region of energies, with only two free elements: the free-range factor t in the applied improved local density approximation and minor adjustments of the scalar and vector potentials in the low-density region. In addition, to estimate the uncertainty of the theoretical results, the deterministic simple least-squares approach is preliminarily employed to derive the covariance of the predicted angular distributions, which is also briefly discussed in this paper.

  9. Preparation and physical characterization of pure beta-carotene.

    PubMed

    Laughlin, Robert G; Bunke, Gregory M; Eads, Charles D; Laidig, William D; Shelley, John C

    2002-05-01

    Pure all-trans beta-carotene has been prepared on the tens-of-grams scale by isothermal fractional dissolution (FD) of commercial laboratory samples in tetrahydrofuran (THF). beta-Carotene purified in this way is black, with a faint brownish tinge. The electronic spectra of black samples extend into the near infrared, with end-absorption past 750 nm. Black samples react directly with dioxygen under mild conditions to yield the familiar orange or red powders. Pure beta-carotene rigorously obeys Beer's law in octane over the entire UV-Vis spectral range, while commercial laboratory samples and recrystallized samples do not. NMR self-diffusion coefficient data demonstrate that beta-carotene forms simple molecular solutions in octane and toluene. The anomalously high crystallinity of beta-carotene can be attributed (from analysis using molecular mechanics) to two facts: (1) the number of theoretically possible conformers of beta-carotene is extremely small, and (2) only a small fraction of these (ca. 12%, or 127) may actually exist in fluid phases.

  10. Turbine blade-tip clearance excitation forces

    NASA Technical Reports Server (NTRS)

    Martinez-Sanchez, M.; Greitzer, E. M.

    1985-01-01

    The results of an effort to assess the existing knowledge and plan the required experimentation in the area of turbine blade-tip excitation forces are summarized. The work was carried out in three phases. The first was a literature search and evaluation, which served to highlight the state of the art and to expose the need for an articulated theoretical-experimental effort to provide not only design data, but also a rational framework for their extrapolation to new configurations and regimes. The second phase was a start in this direction, in which several of the explicit or implicit assumptions contained in the usual formulations of the Alford force effect were removed and a rigorous linearized flow analysis of the behavior of a nonsymmetric actuator disc was carried out. In the third phase, a preliminary design was conducted for a turbine test facility that would measure both the excitation forces themselves and the flow patterns responsible for them, over a realistic range of dimensionless parameters.

  11. Theory of chaotic orbital variations confirmed by Cretaceous geological evidence.

    PubMed

    Ma, Chao; Meyers, Stephen R; Sageman, Bradley B

    2017-02-22

    Variations in the Earth's orbit and spin vector are a primary control on insolation and climate; their recognition in the geological record has revolutionized our understanding of palaeoclimate dynamics, and has catalysed improvements in the accuracy and precision of the geological timescale. Yet the secular evolution of the planetary orbits beyond 50 million years ago remains highly uncertain, and the chaotic dynamical nature of the Solar System predicted by theoretical models has yet to be rigorously confirmed by well constrained (radioisotopically calibrated and anchored) geological data. Here we present geological evidence for a chaotic resonance transition associated with interactions between the orbits of Mars and the Earth, using an integrated radioisotopic and astronomical timescale from the Cretaceous Western Interior Basin of what is now North America. This analysis confirms the predicted chaotic dynamical behaviour of the Solar System, and provides a constraint for refining numerical solutions for insolation, which will enable a more precise and accurate geological timescale to be produced.

  12. Mean field study of a propagation-turnover lattice model for the dynamics of histone marking

    NASA Astrophysics Data System (ADS)

    Yao, Fan; Li, FangTing; Li, TieJun

    2017-02-01

    We present a mean field study of a propagation-turnover lattice model, which was proposed by Hodges and Crabtree [Proc. Nat. Acad. Sci. 109, 13296 (2012)] for understanding how posttranslational histone marks modulate gene expression in mammalian cells. The kinetics of the lattice model consists of nucleation, propagation and turnover mechanisms, and exhibits a second-order phase transition for the histone marking domain. We show rigorously that the dynamics essentially depends on a non-dimensional parameter κ = k₊/k₋, the ratio between the propagation and turnover rates, which has been observed in the simulations. We then study the lowest-order mean field approximation and observe the phase transition, with an analytically obtained critical parameter. Boundary layer analysis is utilized to investigate the structure of the decay profile of the mark density. We also study a higher-order mean field approximation to achieve a sharper estimate of the critical transition parameter and more detailed features. The comparison between the simulation and theoretical results shows the validity of our theory.

  13. Stability switches of arbitrary high-order consensus in multiagent networks with time delays.

    PubMed

    Yang, Bo

    2013-01-01

    High-order consensus seeking, in which individual high-order dynamic agents share a consistent view of the objectives and the world in a distributed manner, has potentially broad applications in the field of cooperative control. This paper presents a stability-switch analysis of arbitrary high-order consensus in multiagent networks with time delays. By employing a frequency domain method, we explicitly derive analytical equations that establish a rigorous connection between the stability of general high-order consensus and system parameters such as the network topology, communication time delays, and feedback gains. In particular, our results provide a general and fairly precise notion of how increasing communication time delay causes stability switches of consensus. Furthermore, under communication constraints, the stability and robustness of consensus algorithms up to third order are discussed in detail to illustrate our central results. Numerical examples and simulation results for fourth-order consensus are provided to demonstrate the effectiveness of our theoretical results.

  14. Which Interventions Have the Greatest Effect on Student Learning in Sub-Saharan Africa? "A Meta-Analysis of Rigorous Impact Evaluations"

    ERIC Educational Resources Information Center

    Conn, Katharine

    2014-01-01

    In the last three decades, there has been a large increase in the number of rigorous experimental and quasi-experimental evaluations of education programs in developing countries. These impact evaluations have taken place all over the globe, including a large number in Sub-Saharan Africa (SSA). The fact that the developing world is socially and…

  15. Curved fronts in the Belousov-Zhabotinskii reaction-diffusion systems in R2

    NASA Astrophysics Data System (ADS)

    Niu, Hong-Tao; Wang, Zhi-Cheng; Bu, Zhen-Hui

    2018-05-01

    In this paper we consider a diffusion system with the Belousov-Zhabotinskii (BZ for short) chemical reaction. Following Brazhnik and Tyson [4], who predicted V-shaped fronts theoretically, and Pérez-Muñuzuri et al. [45], who observed V-shaped fronts experimentally, we give a rigorous mathematical proof of their results. We establish the existence of V-shaped traveling fronts in R2 by constructing a proper supersolution and subsolution. Furthermore, we establish the stability of the V-shaped front in R2.

  16. Cymatics for the cloaking of flexural vibrations in a structured plate

    PubMed Central

    Misseroni, D.; Colquitt, D. J.; Movchan, A. B.; Movchan, N. V.; Jones, I. S.

    2016-01-01

    Based on rigorous theoretical findings, we present a proof-of-concept design for a structured square cloak enclosing a void in an elastic lattice. We implement high-precision fabrication and experimental testing of an elastic invisibility cloak for flexural waves in a mechanical lattice. This is accompanied by verifications and numerical modelling performed through finite element simulations. The primary advantage of our square lattice cloak, over other designs, is the straightforward implementation and the ease of construction. The elastic lattice cloak, implemented experimentally, shows high efficiency. PMID:27068339

  17. A consortium-driven framework to guide the implementation of ICH M7 Option 4 control strategies.

    PubMed

    Barber, Chris; Antonucci, Vincent; Baumann, Jens-Christoph; Brown, Roland; Covey-Crump, Elizabeth; Elder, David; Elliott, Eric; Fennell, Jared W; Gallou, Fabrice; Ide, Nathan D; Jordine, Guido; Kallemeyn, Jeffrey M; Lauwers, Dirk; Looker, Adam R; Lovelle, Lucie E; McLaughlin, Mark; Molzahn, Robert; Ott, Martin; Schils, Didier; Oestrich, Rolf Schulte; Stevenson, Neil; Talavera, Pere; Teasdale, Andrew; Urquhart, Michael W; Varie, David L; Welch, Dennie

    2017-11-01

    The ICH M7 Option 4 control of (potentially) mutagenic impurities is based on the use of scientific principles in lieu of routine analytical testing. This approach can reduce the burden of analytical testing without compromising patient safety, provided a scientifically rigorous approach is taken which is backed up by sufficient theoretical and/or analytical data. This paper introduces a consortium-led initiative and offers a proposal on the supporting evidence that could be presented in regulatory submissions. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Evolution of biological complexity

    PubMed Central

    Adami, Christoph; Ofria, Charles; Collier, Travis C.

    2000-01-01

    To make a case for or against a trend in the evolution of complexity in biological evolution, complexity needs to be both rigorously defined and measurable. A recent information-theoretic (but intuitively evident) definition identifies genomic complexity with the amount of information a sequence stores about its environment. We investigate the evolution of genomic complexity in populations of digital organisms and monitor in detail the evolutionary transitions that increase complexity. We show that, because natural selection forces genomes to behave as a natural “Maxwell Demon,” within a fixed environment, genomic complexity is forced to increase. PMID:10781045
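
    As a rough, hedged illustration of the information-theoretic definition above: genomic complexity can be estimated as the per-site entropy reduction across an aligned population of sequences. The toy genomes and the plug-in entropy estimator below are assumptions for illustration, not the authors' digital-organism data.

    import numpy as np
    from collections import Counter

    population = ["ACGTAC", "ACGTAA", "ACGTTC", "ACGTAC"]  # toy aligned genomes
    D = 4.0                                                # alphabet size

    def genomic_complexity(pop):
        total = 0.0
        for site in zip(*pop):                             # one column per site
            counts = np.array(list(Counter(site).values()), dtype=float)
            p = counts / counts.sum()
            H = -(p * np.log2(p)).sum()                    # per-site entropy
            total += np.log2(D) - H                        # information stored
        return total

    print(f"complexity = {genomic_complexity(population):.2f} bits")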

  19. Supervised learning of probability distributions by neural networks

    NASA Technical Reports Server (NTRS)

    Baum, Eric B.; Wilczek, Frank

    1988-01-01

    Supervised learning algorithms for feedforward neural networks are investigated analytically. The back-propagation algorithm described by Werbos (1974), Parker (1985), and Rumelhart et al. (1986) is generalized by redefining the values of the input and output neurons as probabilities. The synaptic weights are then varied to follow gradients in the logarithm of likelihood rather than in the error. This modification is shown to provide a more rigorous theoretical basis for the algorithm and to permit more accurate predictions. A typical application involving a medical-diagnosis expert system is discussed.
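
    A minimal sketch of the modification the abstract describes, on assumed toy data: treat the output unit as a probability and ascend the gradient of the log-likelihood (equivalently, descend the cross-entropy) rather than the squared error. A single logistic unit stands in here for the full feedforward network.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(float)  # toy binary labels

    w = np.zeros(3)
    for _ in range(500):
        p = 1.0 / (1.0 + np.exp(-X @ w))   # output interpreted as a probability
        w += 0.01 * (X.T @ (y - p))        # gradient ascent on the log-likelihood
    print("learned weights:", np.round(w, 2))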

  20. Mindfulness in nursing: an evolutionary concept analysis.

    PubMed

    White, Lacie

    2014-02-01

    To report an analysis of the concept of mindfulness. Mindfulness is an emerging concept in health care that has significant implications for a variety of clinical populations. Nursing uses this concept in limited ways, and subsequently requires conceptual clarity to further identify its significance, use and applications in nursing. Mindfulness was explored using Rodgers' evolutionary method of concept analysis. For this analysis, a sample of 59 English theoretical and research-based articles from the Cumulative Index to Nursing and Allied Health Literature database was obtained. The search was conducted between the all-inclusive years of the database, 1981-2012. Data were analysed with particular focus on the attributes, antecedents, consequences, references and related terms that arose in relation to mindfulness in the nursing literature. The analysis found five intricately connected attributes: mindfulness is a transformative process where one develops an increasing ability to 'experience being present', with 'acceptance', 'attention' and 'awareness'. Antecedents, attributes and consequences appeared to inform and strengthen one another over time. Mindfulness is a significant concept for the discipline of nursing, with practical applications for nurse well-being, the development and sustainability of therapeutic nursing qualities and holistic health promotion. It is imperative that nurse well-being and self-care become a more prominent focus in nursing research and education. Further development of the concept of mindfulness could support this focus, particularly through rigorous qualitative methodologies. © 2013 John Wiley & Sons Ltd.

  1. Analysis of Sting Balance Calibration Data Using Optimized Regression Models

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert; Bader, Jon B.

    2009-01-01

    Calibration data of a wind tunnel sting balance were processed using a search algorithm that identifies an optimized regression model for the data analysis. The selected sting balance had two moment gages that were mounted forward and aft of the balance moment center. The difference and the sum of the two gage outputs were fitted in the least-squares sense using the normal force and the pitching moment at the balance moment center as independent variables. The regression model search algorithm predicted that the difference of the gage outputs should be modeled using the intercept and the normal force. The sum of the two gage outputs, on the other hand, should be modeled using the intercept, the pitching moment, and the square of the pitching moment. Equations for the deflection of a cantilever beam are used to show that the search algorithm's two recommended math models can also be obtained after performing a rigorous theoretical analysis of the deflection of the sting balance under load. The analysis of the sting balance calibration data set is a rare example of a situation when regression models of balance calibration data can be derived directly from first principles of physics and engineering. In addition, it is interesting to see that the search algorithm recommended the same regression models for the data analysis using only a set of statistical quality metrics.
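
    The two recommended regression models are straightforward to reproduce with ordinary least squares. The sketch below fits the gage-output difference with {intercept, N} and the sum with {intercept, M, M^2}; the synthetic coefficients and noise are assumptions standing in for real calibration data.

    import numpy as np

    rng = np.random.default_rng(1)
    N = rng.uniform(-100, 100, 50)   # normal force
    M = rng.uniform(-50, 50, 50)     # pitching moment at the balance moment center
    diff = 0.2 + 0.03 * N + rng.normal(0, 1e-3, 50)                 # synthetic gage difference
    summ = -0.1 + 0.05 * M + 2e-4 * M**2 + rng.normal(0, 1e-3, 50)  # synthetic gage sum

    X_diff = np.column_stack([np.ones_like(N), N])            # intercept + N
    X_sum = np.column_stack([np.ones_like(M), M, M**2])       # intercept + M + M^2
    c_diff, *_ = np.linalg.lstsq(X_diff, diff, rcond=None)
    c_sum, *_ = np.linalg.lstsq(X_sum, summ, rcond=None)
    print("difference model coefficients:", np.round(c_diff, 4))
    print("sum model coefficients:      ", np.round(c_sum, 4))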

  2. Space radiator simulation manual for computer code

    NASA Technical Reports Server (NTRS)

    Black, W. Z.; Wulff, W.

    1972-01-01

    A computer program that simulates the performance of a space radiator is presented. The program basically consists of a rigorous analysis, which analyzes a symmetrical fin panel, and an approximate analysis that predicts system characteristics for cases of non-symmetrical operation. The rigorous analysis accounts for both transient and steady-state performance, including aerodynamic and radiant heating of the radiator system. The approximate analysis considers only steady-state operation with no aerodynamic heating. A description of the radiator system and instructions to the user for program operation are included. The input required for the execution of all program options is described. Several examples of program output are also provided; sample output includes the radiator performance during ascent, reentry and orbit.

  3. Optimal design and evaluation of a color separation grating using rigorous coupled wave analysis

    NASA Astrophysics Data System (ADS)

    Nagayoshi, Mayumi; Oka, Keiko; Klaus, Werner; Komai, Yuki; Kodate, Kashiko

    2006-02-01

    In recent years, the development and spread of color visual equipment has required technology that separates white light into the three primary colors of red (R), green (G) and blue (B), adjusts the intensity of each, and recombines them to display various colors. Various color separation devices have been proposed and put to practical use in color visual equipment. We have focused on a small, light grating-type device that lends itself to cost reduction and large-scale production and that generates only the three primary colors of R, G and B, so that a high saturation level can be obtained. To perform a rigorous analysis and design of color separation gratings, our group has developed a program based on the Rigorous Coupled Wave Analysis (RCWA). We then calculated the parameters required to obtain a diffraction efficiency higher than 70% and a color gamut of about 70%. We report on the design, fabrication and evaluation of color separation gratings that have been optimized for fabrication by laser drawing.
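
    The RCWA optimization itself is beyond a short sketch, but the starting point of any such design, where the first-order R, G and B beams land, follows from the grating equation sin(theta_m) = m*lambda/d. The period below is an assumed illustrative value, not the fabricated device's.

    import numpy as np

    d = 3.0e-6                                   # grating period in meters (assumed)
    for name, lam in [("R", 640e-9), ("G", 530e-9), ("B", 450e-9)]:
        theta = np.degrees(np.arcsin(lam / d))   # first-order angle, normal incidence
        print(f"{name}: lambda = {lam * 1e9:.0f} nm -> theta_1 = {theta:.2f} deg")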

  4. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    PubMed Central

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2014-01-01

    Summary Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students’ understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016

  5. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution.

    PubMed

    Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference.
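
    The conditional-probability question posed in the two records above takes only a few lines once bivariate normal parameters are fixed. The means, standard deviations and correlation below are assumptions for illustration, not the paper's adolescent dataset.

    from scipy.stats import norm

    mu_h, mu_w = 65.0, 130.0          # assumed mean height (in) and weight (lb)
    sd_h, sd_w, rho = 3.5, 20.0, 0.6  # assumed spreads and correlation

    h = mu_h                          # condition on average height
    cond_mu = mu_w + rho * sd_w / sd_h * (h - mu_h)   # conditional mean of W | H = h
    cond_sd = sd_w * (1 - rho**2) ** 0.5              # conditional std of W | H = h
    p = norm.cdf(140, cond_mu, cond_sd) - norm.cdf(120, cond_mu, cond_sd)
    print(f"P(120 < W < 140 | H = {h}) = {p:.3f}")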

  6. Howard Brenner's Legacy for Biological Transport Processes

    NASA Astrophysics Data System (ADS)

    Nitsche, Johannes

    2014-11-01

    This talk discusses the manner in which Howard Brenner's theoretical contributions have had, and long will have, strong and direct impact on the understanding of transport processes occurring in biological systems. His early work on low Reynolds number resistance/mobility coefficients of arbitrarily shaped particles, and particles near walls and in pores, is an essential component of models of hindered diffusion through many types of membranes and tissues, and convective transport in microfluidic diagnostic systems. His seminal contributions to macrotransport (coarse-graining, homogenization) theory presaged the growing discipline of multiscale modeling. For biological systems they represent the key to infusing diffusion models of a wide variety of tissues with a sound basis in their microscopic structure and properties, often over a hierarchy of scales. Both scientific currents are illustrated within the concrete context of diffusion models of drug/chemical diffusion through the skin. This area of theory, which is key to transdermal drug development and risk assessment of chemical exposure, has benefitted very directly from Brenner's contributions. In this as in other areas, Brenner's physicochemical insight, mathematical virtuosity, drive for fully justified analysis free of ad hoc assumptions, quest for generality, and impeccable exposition, have consistently elevated the level of theoretical understanding and presentation. We close with anecdotes showing how his personal qualities and warmth helped to impart high standards of rigor to generations of grateful research students. Authors are Johannes M. Nitsche, Ludwig C. Nitsche and Gerald B. Kasting.

  7. Rigorous Electromagnetic Analysis of the Focusing Action of Refractive Cylindrical Microlens

    NASA Astrophysics Data System (ADS)

    Liu, Juan; Gu, Ben-Yuan; Dong, Bi-Zhen; Yang, Guo-Zhen

    The focusing action of refractive cylindrical microlenses is investigated based on rigorous electromagnetic theory with the use of the boundary element method. The focusing behaviors of refractive microlenses with continuous and multilevel surface envelopes are characterized in terms of total electric-field patterns, the electric-field intensity distributions on the focal plane, and the diffraction efficiencies at the focal spots. The obtained results are also compared with those obtained by Kirchhoff's scalar diffraction theory. The present numerical and graphical results may provide useful information for the analysis and design of refractive elements in micro-optics.

  8. Enhancing the quality and credibility of qualitative analysis.

    PubMed Central

    Patton, M Q

    1999-01-01

    Varying philosophical and theoretical orientations to qualitative inquiry remind us that issues of quality and credibility intersect with audience and intended research purposes. This overview examines ways of enhancing the quality and credibility of qualitative analysis by dealing with three distinct but related inquiry concerns: rigorous techniques and methods for gathering and analyzing qualitative data, including attention to validity, reliability, and triangulation; the credibility, competence, and perceived trustworthiness of the qualitative researcher; and the philosophical beliefs of evaluation users about such paradigm-based preferences as objectivity versus subjectivity, truth versus perspective, and generalizations versus extrapolations. Although this overview examines some general approaches to issues of credibility and data quality in qualitative analysis, it is important to acknowledge that particular philosophical underpinnings, specific paradigms, and special purposes for qualitative inquiry will typically include additional or substitute criteria for assuring and judging quality, validity, and credibility. Moreover, the context for these considerations has evolved. In early literature on evaluation methods the debate between qualitative and quantitative methodologists was often strident. In recent years the debate has softened. A consensus has gradually emerged that the important challenge is to match appropriately the methods to empirical questions and issues, and not to universally advocate any single methodological approach for all problems. PMID:10591279

  9. Rigorous diffraction analysis using geometrical theory of diffraction for future mask technology

    NASA Astrophysics Data System (ADS)

    Chua, Gek S.; Tay, Cho J.; Quan, Chenggen; Lin, Qunying

    2004-05-01

    Advanced lithographic techniques such as phase shift masks (PSM) and optical proximity correction (OPC) result in more complex mask designs and technology. In contrast to binary masks, which have only transparent and nontransparent regions, phase shift masks also include transparent features with a different optical thickness and a modified phase of the transmitted light. PSMs are well known to show prominent diffraction effects, which cannot be described under the assumption of an infinitely thin mask (the Kirchhoff approach) that is used in many commercial photolithography simulators. Correct prediction of sidelobe printability, process windows and linearity of OPC masks requires the application of rigorous diffraction theory. The problem of aerial-image intensity imbalance through focus with alternating phase shift masks (altPSMs) is analyzed and compared between a time-domain finite-difference algorithm (TEMPEST) and the geometrical theory of diffraction (GTD). Using GTD, with the solution to the canonical problems, we obtain a relationship between an edge on the mask and the disturbance in image space. The main interest is to develop useful formulations that can be readily applied to solve rigorous diffraction problems for future mask technology. Analysis of rigorous diffraction effects for altPSMs using the GTD approach is discussed.

  10. Analysis of a novel non-contacting waveguide backshort

    NASA Technical Reports Server (NTRS)

    Weller, T. M.; Katehi, L. P. B.; Mcgrath, William R.

    1992-01-01

    A new non-contacting waveguide backshort has been developed for millimeter and submillimeter wave frequencies. The design consists of a metal bar with rectangular or circular holes cut into it, which is covered with a dielectric (mylar) layer to form a snug fit with the walls of a waveguide. Hole geometries are adjusted to obtain a periodic variation of the guide impedance on the correct length scale, in order to produce efficient reflection of RF power. It is a mechanically rugged design which can be easily fabricated for frequencies from 1 to 1000 GHz and is thus a sound alternative to the miniaturization of conventional non-contacting shorts. To aid in high-frequency design, a rigorous full-wave analysis has been completed, which will allow variations of the size, number and spacing of the holes to be easily analyzed. This paper will review the backshort design and the method developed for theoretical characterization, followed by a comparison of the experimental and numerical results. Low frequency models operating from 4-6 GHz are shown to demonstrate return loss of greater than -0.2 dB over a 33 percent bandwidth. The theory is in good agreement with measured data.
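
    A crude stand-in for the full-wave analysis (not the authors' method): cascading alternating-impedance quarter-wave line sections with the standard impedance-transformation formula already shows why a periodic impedance variation on the right length scale reflects nearly all incident power. All impedance values below are assumed.

    import numpy as np

    def transform(Z_load, Z0, beta_l):
        """Input impedance of a line of characteristic impedance Z0 and electrical length beta_l."""
        t = np.tan(beta_l)   # numerically huge at pi/2: an ideal quarter-wave transformer
        return Z0 * (Z_load + 1j * Z0 * t) / (Z0 + 1j * Z_load * t)

    Z_guide, Z_low, Z_high = 500.0, 150.0, 900.0   # assumed guide-section impedances
    Z = 50.0                                       # assumed termination behind the short
    for _ in range(4):                             # four high/low quarter-wave pairs
        Z = transform(Z, Z_high, np.pi / 2)
        Z = transform(Z, Z_low, np.pi / 2)
    gamma = (Z - Z_guide) / (Z + Z_guide)
    print(f"|reflection coefficient| = {abs(gamma):.4f}")   # -> essentially 1.0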

  11. Antireflective hydrophobic si subwavelength structures using thermally dewetted Ni/SiO2 nanomask patterns.

    PubMed

    Joo, Dong Hyuk; Leem, Jung Woo; Yu, Jae Su

    2011-11-01

    We report on disordered silicon (Si) subwavelength structures (SWSs), fabricated by inductively coupled plasma (ICP) etching in SiCl4 gas using nickel/silicon dioxide (Ni/SiO2) nanopatterns as the etch mask, on Si substrates, with the etching parameters varied to obtain broadband antireflective and self-cleaning surfaces. For the fabricated Si SWSs, the antireflection characteristics are investigated experimentally, and a theoretical analysis is made based on the rigorous coupled-wave analysis method. The desired dot-like Ni nanoparticles on SiO2/Si substrates are formed by thermal dewetting of Ni films at 900 degrees C. The truncated-cone-shaped Si SWS with a high average height of 790 +/- 23 nm, fabricated by ICP etching with 5 sccm SiCl4 at 50 W RF power with an additional 200 W ICP power under 10 mTorr process pressure, exhibits a low average reflectance of approximately 5% over a wide wavelength range of 450-1050 nm. A water contact angle of 110 degrees is obtained, indicating a hydrophobic surface. The calculated reflectance results are also reasonably consistent with the experimental data.

  12. Pragmatic critical realism: could this methodological approach expand our understanding of employment relations?

    PubMed

    Mearns, Susan Lesley

    2011-01-01

    This paper seeks to highlight the need for employment relations academics and researchers to expand their use of research methodologies in order to enable the advancement of theoretical debate within their discipline. It focuses on the contribution that pragmatic critical realism has made to the field of perception and argues that it would add value to the subject of employment relations. It is a theoretically centred review of pragmatic critical realism and the possible contribution this methodology would make to the field of employment relations. The paper concludes that the employment relationship does not take place in a vacuum; rather, it is focused on the interaction between imperfect individuals. Their interactions are therefore moulded by emotions, which cannot be explored thoroughly, or even acknowledged, through a positivist's rigorous but limited conception of what constitutes 'knowledge' and of theory development. While not rejecting the contribution that quantitative data and positivism have made to the field, the study concludes that pragmatic critical realism has a lot to offer the development of the area and its theoretical foundations.

  13. Combining qualitative and quantitative research within mixed method research designs: a methodological review.

    PubMed

    Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh

    2011-03-01

    It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be a subject of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified, and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers clarify their theoretical propositions and the basis of their results. This can offer a better understanding of the links between theory and empirical findings, challenge theoretical assumptions and develop new theory. Copyright © 2010 Elsevier Ltd. All rights reserved.

  14. Novel Integral Equation Methods Applied to the Analysis of New Guiding and Radiating Structures and Optically-Inspired Phenomena at Microwaves

    NASA Astrophysics Data System (ADS)

    Gomez-Diaz, Juan Sebastian

    This PhD dissertation presents a multidisciplinary work, which involves the development of different novel formulations applied to the accurate and efficient analysis of a wide variety of new structures, devices, and phenomena in the microwave frequency region. The objectives of the present work can be divided into three main research lines: (1) The first research line is devoted to the Green's function analysis of multilayered enclosures with convex arbitrarily-shaped cross section. For this purpose, three accurate spatial-domain formulations are developed at the Green's function level. These techniques are then efficiently incorporated into a mixed-potential integral equation framework, which allows the fast and accurate analysis of multilayered printed circuits in shielded enclosures. The study of multilayered shielded circuits has led to the development of a novel hybrid waveguide-microstrip filter technology, which is light, compact, low-loss and presents important advantages for the space industry. (2) The second research line is related to the impulse-regime study of metamaterial-based composite right/left-handed (CRLH) structures and the subsequent theoretical and practical demonstration of several novel optically-inspired phenomena and applications at microwaves, in both the guided and the radiative regions. This study allows the development of new devices for ultra-wideband and high data-rate communications systems. This research line also deals with the simple and accurate characterization of CRLH leaky-wave antennas using transmission line theory. (3) The third and last research line presents a novel CRLH parallel-plate waveguide leaky-wave antenna structure, and a rigorous iterative modal-based technique for its fast and complete characterization, including a systematic calculation of the antenna's physical dimensions. It is important to point out that all the theoretical developments and novel structures presented in this work have been numerically confirmed, using both home-made software and commercial full-wave simulations, and experimentally verified, using measurements from fabricated prototypes.

  15. Organizing principles as tools for bridging the gap between system theory and biological experimentation.

    PubMed

    Mekios, Constantinos

    2016-04-01

    Twentieth-century theoretical efforts towards the articulation of general system properties fell short of having the significant impact on biological practice that their proponents envisioned. Although the latter did arrive at preliminary mathematical formulations of such properties, they had little success in showing how these could be productively incorporated into the research agenda of biologists. Consequently, the gap that kept system-theoretic principles cut off from biological experimentation persisted. More recently, however, simple theoretical tools have proved readily applicable within the context of systems biology. In particular, examples reviewed in this paper suggest that rigorous mathematical expressions of design principles, imported primarily from engineering, could produce experimentally confirmable predictions of the regulatory properties of small biological networks. But this is not enough for contemporary systems biologists who adopt the holistic aspirations of early systemologists, seeking high-level organizing principles that could provide insights into problems of biological complexity at the whole-system level. While the presented evidence is not conclusive about whether this strategy could lead to the realization of the lofty goal of a comprehensive explanatory integration, it suggests that the ongoing quest for organizing principles is pragmatically advantageous for systems biologists. The formalisms postulated in the course of this process can serve as bridges between system-theoretic concepts and the results of molecular experimentation: they constitute theoretical tools for generalizing molecular data, thus producing increasingly accurate explanations of system-wide phenomena.

  16. Development and utilization of complementary communication channels for treatment decision making and survivorship issues among cancer patients: The CIS Research Consortium Experience.

    PubMed

    Fleisher, Linda; Wen, Kuang Yi; Miller, Suzanne M; Diefenbach, Michael; Stanton, Annette L; Ropka, Mary; Morra, Marion; Raich, Peter C

    2015-11-01

    Cancer patients and survivors are assuming active roles in decision-making and digital patient support tools are widely used to facilitate patient engagement. As part of Cancer Information Service Research Consortium's randomized controlled trials focused on the efficacy of eHealth interventions to promote informed treatment decision-making for newly diagnosed prostate and breast cancer patients, and post-treatment breast cancer, we conducted a rigorous process evaluation to examine the actual use of and perceived benefits of two complementary communication channels -- print and eHealth interventions. The three Virtual Cancer Information Service (V-CIS) interventions were developed through a rigorous developmental process, guided by self-regulatory theory, informed decision-making frameworks, and health communications best practices. Control arm participants received NCI print materials; experimental arm participants received the additional V-CIS patient support tool. Actual usage data from the web-based V-CIS was also obtained and reported. Print materials were highly used by all groups. About 60% of the experimental group reported using the V-CIS. Those who did use the V-CIS rated it highly on improvements in knowledge, patient-provider communication and decision-making. The findings show that how patients actually use eHealth interventions either singularly or within the context of other communication channels is complex. Integrating rigorous best practices and theoretical foundations is essential and multiple communication approaches should be considered to support patient preferences.

  17. Improved key-rate bounds for practical decoy-state quantum-key-distribution systems

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Zhao, Qi; Razavi, Mohsen; Ma, Xiongfeng

    2017-01-01

    The decoy-state scheme is the most widely implemented quantum-key-distribution protocol in practice. In order to account for the finite-size key effects on the achievable secret key generation rate, a rigorous statistical fluctuation analysis is required. Originally, a heuristic Gaussian-approximation technique was used for this purpose, which, despite its analytical convenience, was not sufficiently rigorous. The fluctuation analysis has recently been made rigorous by using the Chernoff bound. There is a considerable gap, however, between the key-rate bounds obtained from these techniques and that obtained from the Gaussian assumption. Here we develop a tighter bound for the decoy-state method, which yields a smaller failure probability. This improvement results in a higher key rate and increases the maximum distance over which secure key exchange is possible. By optimizing the system parameters, our simulation results show that our method almost closes the gap between the two previously proposed techniques and achieves a performance similar to that of conventional Gaussian approximations.
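
    The gap between the Gaussian approximation and a Chernoff-type bound can be illustrated by comparing the deviation each assigns when estimating a detection probability from N trials. The failure probability, trial count and click probability below are assumptions; the paper's bounds are tighter and protocol-specific.

    import numpy as np
    from scipy.stats import norm

    fail, N, p = 1e-10, 1e8, 0.05
    eps_gauss = norm.isf(fail) * np.sqrt(p * (1 - p) / N)  # Gaussian tail deviation
    eps_hoeff = np.sqrt(np.log(1 / fail) / (2 * N))        # Hoeffding (Chernoff-type) bound
    print(f"Gaussian deviation:  {eps_gauss:.2e}")
    print(f"Hoeffding deviation: {eps_hoeff:.2e}")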

  18. Oscillator strengths of some Ba lines - A treatment including core-valence correlation and relativistic effects

    NASA Technical Reports Server (NTRS)

    Bauschlicher, C. W., Jr.; Jaffe, R. L.; Langhoff, S. R.; Partridge, H.; Mascarello, F. G.

    1985-01-01

    Theoretical calculations of selected excitation energies and oscillator strengths for Ba are presented that overcome the difficulties of previous theoretical treatments. A relativistic effective-core-potential treatment is used to account for the relativistic core contraction, but the outermost ten electrons are treated explicitly. Core-valence correlation can be included in this procedure in a rigorous and systematic way through a configuration-interaction calculation. Insight is gained into the importance of relativistic effects by repeating many of the calculations using an all-electron nonrelativistic treatment employing an extended Slater basis set. It is found that the intensity of the intercombination line ³P₁-¹S₀ is accurately determined by accounting for the deviation from LS coupling through spin-orbit mixing with the ¹P₁ state, and that deviations from the Landé interval rule provide an accurate measure of the degree of mixing.

  19. Final Technical Report for "Reducing tropical precipitation biases in CESM"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, Vincent

    In state-of-the-art climate models, each cloud type is treated using its own separate cloud parameterization and its own separate microphysics parameterization. This use of separate schemes for separate cloud regimes is undesirable because it is theoretically unfounded, it hampers interpretation of results, and it leads to the temptation to overtune parameters. In this grant, we have created a climate model that contains a unified cloud parameterization (“CLUBB”) and a unified microphysics parameterization (“MG2”). In this model, all cloud types --- including marine stratocumulus, shallow cumulus, and deep cumulus --- are represented with a single equation set. This model improves the representation of convection in the Tropics. The model has been compared with ARM observations. The chief benefit of the project is to provide a climate model that is based on a more theoretically rigorous formulation.

  20. Ab Initio and Monte Carlo Approaches for the Magnetocaloric Effect in Co- and In-Doped Ni-Mn-Ga Heusler Alloys

    NASA Astrophysics Data System (ADS)

    Sokolovskiy, Vladimir; Grünebohm, Anna; Buchelnikov, Vasiliy; Entel, Peter

    2014-09-01

    This special issue collects contributions from the participants of the "Information in Dynamical Systems and Complex Systems" workshop, which cover a wide range of important problems and new approaches that lie in the intersection of information theory and dynamical systems. The contributions include theoretical characterization and understanding of the different types of information flow and causality in general stochastic processes, inference and identification of coupling structure and parameters of system dynamics, rigorous coarse-grain modeling of network dynamical systems, and exact statistical testing of fundamental information-theoretic quantities such as the mutual information. The collective efforts reported herein reflect a modern perspective of the intimate connection between dynamical systems and information flow, leading to the promise of better understanding and modeling of natural complex systems and better/optimal design of engineering systems.

  1. Simultaneous determination of effective carrier lifetime and resistivity of Si wafers using the nonlinear nature of photocarrier radiometric signals

    NASA Astrophysics Data System (ADS)

    Sun, Qiming; Melnikov, Alexander; Wang, Jing; Mandelis, Andreas

    2018-04-01

    A rigorous treatment of the nonlinear behavior of photocarrier radiometric (PCR) signals is presented theoretically and experimentally for the quantitative characterization of semiconductor photocarrier recombination and transport properties. A frequency-domain model based on the carrier rate equation and the classical carrier radiative recombination theory was developed. The derived concise expression reveals different functionalities of the PCR amplitude and phase channels: the phase bears direct quantitative correlation with the carrier effective lifetime, while the amplitude versus the estimated photocarrier density dependence can be used to extract the equilibrium majority carrier density and thus, resistivity. An experimental ‘ripple’ optical excitation mode (small modulation depth compared to the dc level) was introduced to bypass the complicated ‘modulated lifetime’ problem so as to simplify theoretical interpretation and guarantee measurement self-consistency and reliability. Two Si wafers with known resistivity values were tested to validate the method.
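
    For the single-pole carrier rate equation underlying such a model, the PCR phase obeys phi = -arctan(2*pi*f*tau), so the effective lifetime can be read directly off a measured phase. The frequency and phase below are assumed numbers; the actual method also exploits the amplitude nonlinearity, which this sketch omits.

    import numpy as np

    f = 1e3                          # modulation frequency in Hz (assumed)
    phi_meas = np.deg2rad(-32.0)     # measured PCR phase (assumed)
    tau = np.tan(-phi_meas) / (2 * np.pi * f)   # invert phi = -arctan(2*pi*f*tau)
    print(f"effective carrier lifetime = {tau * 1e6:.1f} microseconds")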

  2. Multi-Disciplinary Knowledge Synthesis for Human Health Assessment on Earth and in Space

    NASA Astrophysics Data System (ADS)

    Christakos, G.

    We discuss methodological developments in multi-disciplinary knowledge synthesis (KS) of human health assessment. A theoretical KS framework can provide the rational means for the assimilation of various information bases (general, site-specific etc.) that are relevant to the life system of interest. KS-based techniques produce a realistic representation of the system, provide a rigorous assessment of the uncertainty sources, and generate informative health state predictions across space-time. The underlying epistemic cognition methodology is based on teleologic criteria and stochastic logic principles. The mathematics of KS involves a powerful and versatile spatiotemporal random field model that accounts rigorously for the uncertainty features of the life system and imposes no restriction on the shape of the probability distributions or the form of the predictors. KS theory is instrumental in understanding natural heterogeneities, assessing crucial human exposure correlations and laws of physical change, and explaining toxicokinetic mechanisms and dependencies in a spatiotemporal life system domain. It is hoped that a better understanding of KS fundamentals would generate multi-disciplinary models that are useful for the maintenance of human health on Earth and in Space.

  3. Reducible or irreducible? Mathematical reasoning and the ontological method.

    PubMed

    Fisher, William P

    2010-01-01

    Science is often described as nothing but the practice of measurement. This perspective follows from longstanding respect for the roles mathematics and quantification have played as media through which alternative hypotheses are evaluated and experience becomes better managed. Many figures in the history of science and psychology have contributed to what has been called the "quantitative imperative," the demand that fields of study employ number and mathematics even when they do not constitute the language in which investigators think together. But what makes an area of study scientific is, of course, not the mere use of number, but communities of investigators who share common mathematical languages for exchanging qualitative and quantitative value. Such languages require rigorous theoretical underpinning, a basis in data sufficient to the task, and instruments traceable to reference-standard quantitative metrics. The values shared and exchanged by such communities typically involve the application of mathematical models that specify the sufficient and invariant relationships necessary for rigorous theorizing and instrument equating. The mathematical metaphysics of science are explored with the aim of connecting principles of quantitative measurement with the structures of sufficient reason.

  4. An Integrative, Multilevel, and Transdisciplinary Research Approach to Challenges of Work, Family, and Health

    PubMed Central

    Bray, Jeremy W.; Kelly, Erin L.; Hammer, Leslie B.; Almeida, David M.; Dearing, James W.; King, Rosalind B.; Buxton, Orfeu M.

    2013-01-01

    Recognizing a need for rigorous, experimental research to support the efforts of workplaces and policymakers in improving the health and wellbeing of employees and their families, the National Institutes of Health and the Centers for Disease Control and Prevention formed the Work, Family & Health Network (WFHN). The WFHN is implementing an innovative multisite study with a rigorous experimental design (adaptive randomization, control groups), comprehensive multilevel measures, a novel and theoretically based intervention targeting the psychosocial work environment, and translational activities. This paper describes challenges and benefits of designing a multilevel and transdisciplinary research network that includes an effectiveness study to assess intervention effects on employees, families, and managers; a daily diary study to examine effects on family functioning and daily stress; a process study to understand intervention implementation; and translational research to understand and inform diffusion of innovation. Challenges were both conceptual and logistical, spanning all aspects of study design and implementation. In dealing with these challenges, however, the WFHN developed innovative, transdisciplinary, multi-method approaches to conducting workplace research that will benefit both the research and business communities. PMID:24618878

  5. Molecular and Cellular Biophysics

    NASA Astrophysics Data System (ADS)

    Jackson, Meyer B.

    2006-01-01

    Molecular and Cellular Biophysics provides advanced undergraduate and graduate students with a foundation in the basic concepts of biophysics. Students who have taken physical chemistry and calculus courses will find this book an accessible and valuable aid in learning how these concepts can be used in biological research. The text provides a rigorous treatment of the fundamental theories in biophysics and illustrates their application with examples. Conformational transitions of proteins are studied first using thermodynamics, and subsequently with kinetics. Allosteric theory is developed as the synthesis of conformational transitions and association reactions. Basic ideas of thermodynamics and kinetics are applied to topics such as protein folding, enzyme catalysis and ion channel permeation. These concepts are then used as the building blocks in a treatment of membrane excitability. Through these examples, students will gain an understanding of the general importance and broad applicability of biophysical principles to biological problems. Offers a unique synthesis of concepts across a wide range of biophysical topics Provides a rigorous theoretical treatment, alongside applications in biological systems Author has been teaching biophysics for nearly 25 years

  6. Failure-Modes-And-Effects Analysis Of Software Logic

    NASA Technical Reports Server (NTRS)

    Garcia, Danny; Hartline, Thomas; Minor, Terry; Statum, David; Vice, David

    1996-01-01

    Rigorous analysis applied early in the design effort. A method of identifying potential inadequacies and the modes and effects of failures caused by inadequacies (failure-modes-and-effects analysis, or "FMEA" for short) devised for application to software logic.

  7. The DOZZ formula from the path integral

    NASA Astrophysics Data System (ADS)

    Kupiainen, Antti; Rhodes, Rémi; Vargas, Vincent

    2018-05-01

    We present a rigorous proof of the Dorn, Otto, Zamolodchikov, Zamolodchikov formula (the DOZZ formula) for the 3 point structure constants of Liouville Conformal Field Theory (LCFT) starting from a rigorous probabilistic construction of the functional integral defining LCFT given earlier by the authors and David. A crucial ingredient in our argument is a probabilistic derivation of the reflection relation in LCFT based on a refined tail analysis of Gaussian multiplicative chaos measures.

  8. Evaluating the Theoretic Adequacy and Applied Potential of Computational Models of the Spacing Effect.

    PubMed

    Walsh, Matthew M; Gluck, Kevin A; Gunzelmann, Glenn; Jastrzembski, Tiffany; Krusmark, Michael

    2018-06-01

    The spacing effect is among the most widely replicated empirical phenomena in the learning sciences, and its relevance to education and training is readily apparent. Yet successful applications of spacing effect research to education and training is rare. Computational modeling can provide the crucial link between a century of accumulated experimental data on the spacing effect and the emerging interest in using that research to enable adaptive instruction. In this paper, we review relevant literature and identify 10 criteria for rigorously evaluating computational models of the spacing effect. Five relate to evaluating the theoretic adequacy of a model, and five relate to evaluating its application potential. We use these criteria to evaluate a novel computational model of the spacing effect called the Predictive Performance Equation (PPE). Predictive Performance Equation combines elements of earlier models of learning and memory including the General Performance Equation, Adaptive Control of Thought-Rational, and the New Theory of Disuse, giving rise to a novel computational account of the spacing effect that performs favorably across the complete sets of theoretic and applied criteria. We implemented two other previously published computational models of the spacing effect and compare them to PPE using the theoretic and applied criteria as guides. Copyright © 2018 Cognitive Science Society, Inc.
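
    The sketch below is a toy model in the spirit of the models reviewed, not the actual PPE: recall strength decays as a power law of time since each practice event, with a decay rate that shrinks as the average spacing between events grows. All constants are assumptions chosen only to exhibit a spacing advantage.

    import numpy as np

    def recall(practice_times, test_time, a=0.18, b=0.02):
        times = np.asarray(practice_times, dtype=float)
        gaps = np.diff(times, prepend=times[0])
        decay = max(a - b * np.log1p(gaps.mean()), 0.05)  # wider spacing -> slower decay
        strength = np.sum((test_time - times) ** -decay)  # power-law forgetting
        return strength / (1 + strength)                  # squash to a probability

    massed = [100, 101, 102, 103]
    spaced = [10, 40, 70, 100]
    print("massed practice, recall at t=200:", round(recall(massed, 200), 3))
    print("spaced practice, recall at t=200:", round(recall(spaced, 200), 3))

    With these assumed constants the spaced schedule yields the higher recall probability at test, the signature of the spacing effect.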

  9. Scientific approaches to science policy.

    PubMed

    Berg, Jeremy M

    2013-11-01

    The development of robust science policy depends on use of the best available data, rigorous analysis, and inclusion of a wide range of input. While director of the National Institute of General Medical Sciences (NIGMS), I took advantage of available data and emerging tools to analyze training time distribution by new NIGMS grantees, the distribution of the number of publications as a function of total annual National Institutes of Health support per investigator, and the predictive value of peer-review scores on subsequent scientific productivity. Rigorous data analysis should be used to develop new reforms and initiatives that will help build a more sustainable American biomedical research enterprise.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peer, Akshit; Biswas, Rana; Park, Joong -Mok

    Here, we demonstrate enhanced absorption in solar cells and enhanced light emission in OLEDs through light interaction with a periodically structured microlens array. We simulate n-i-p perovskite solar cells with a microlens at the air-glass interface using rigorous scattering matrix simulations. The microlens focuses light into nanoscale regions within the absorber layer, enhancing the solar cell. An optimal period of ~700 nm and a microlens height of ~800-1000 nm provide an absorption (photocurrent) enhancement of 6% (6.3%). An external polymer microlens array on the air-glass side of the OLED generates experimental and theoretical enhancements >100% by outcoupling trapped modes in the glass substrate.

  11. Transmission intensity disturbance in a rotating polarizer

    NASA Astrophysics Data System (ADS)

    Fan, J. Y.; Li, H. X.; Wu, F. Q.

    2008-01-01

    Random disturbance was observed in the transmission intensity of various rotating prism polarizers used in optical systems. As a result, the transmitted intensity exhibited significant cyclic deviation from the Malus cosine-squared law as the prisms rotated. The disturbance degrades the quality of the light transmitted through the polarizer and thus dramatically reduces measurement accuracy when prism polarizers are used in the light path. A rigorous model based on multi-beam interference is presented; theoretical results show good agreement with measured values and also indicate an effective method for reducing the disturbance.
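
    A minimal way to picture the reported deviation (the ripple model below is an assumed stand-in, not the paper's multi-beam interference derivation): superimpose a small cyclic disturbance on the Malus cosine-squared law and measure the departure.

    import numpy as np

    theta = np.radians(np.arange(0, 361, 5))
    I_malus = np.cos(theta) ** 2                        # ideal Malus law, I/I0
    I_meas = I_malus * (1 + 0.02 * np.cos(4 * theta))   # assumed 2% cyclic ripple
    print(f"max deviation from Malus law: {np.max(np.abs(I_meas - I_malus)):.3f} I0")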

  12. Comments on "Theoretical investigation on H abstraction reaction mechanisms and rate constants of sevoflurane with the OH radical" [Chem. Phys. Lett. 692 (2018) 345-352]

    NASA Astrophysics Data System (ADS)

    Mai, Tam V.-T.; Duong, Minh v.; Huynh, Lam K.

    2018-03-01

    This short communication discusses the role of the newly-found lowest-lying structures of the transition states (∼3.0 kcal/mol lower than those previously reported by Ren et al. (2018)), together with the inclusion of the hindered-internal-rotation correction, in obtaining reliable kinetic data for the hydrogen abstraction from sevoflurane by the OH radical. With the new structures and the more rigorous kinetic model, the calculated rate constants agree much better with the experimental data than those suggested by Ren and coworkers.

  13. Lectures on General Relativity, Cosmology and Quantum Black Holes

    NASA Astrophysics Data System (ADS)

    Ydri, Badis

    2017-07-01

    This book is a rigorous text for students in physics and mathematics requiring an introduction to the implications and interpretation of general relativity in areas of cosmology. Readers of this text will be well prepared to follow the theoretical developments in the field and undertake research projects as part of an MSc or PhD programme. This ebook contains interactive Q&A technology, allowing the reader to interact with the text and reveal answers to selected exercises posed by the author within the book. This feature may not function in all formats and on reading devices.

  14. Quantum gravity model with fundamental spinor fields

    NASA Astrophysics Data System (ADS)

    Obukhov, Yu. N.; Hehl, F. W.

    2014-01-01

    We discuss the possibility that gravitational potentials (metric, coframe and connection) may emerge as composite fields from more fundamental spinor constituents. We use the formalism of Poincaré gauge gravity as an appropriate theoretical scheme for the rigorous development of such an approach. We postulate the constitutive relations of an elastic Cosserat type continuum that models spacetime. These generalized Hooke and MacCullagh type laws consistently take into account the translational and Lorentz rotational deformations, respectively. The resulting theory extends the recently proposed Diakonov model. An intriguing feature of our theory is that in the lowest approximation it reproduces Heisenberg's nonlinear spinor model.

  15. Differential geometry based solvation model I: Eulerian formulation

    NASA Astrophysics Data System (ADS)

    Chen, Zhan; Baker, Nathan A.; Wei, G. W.

    2010-11-01

    This paper presents a differential geometry based model for the analysis and computation of the equilibrium property of solvation. Differential geometry theory of surfaces is utilized to define and construct smooth interfaces with good stability and differentiability for use in characterizing the solvent-solute boundaries and in generating continuous dielectric functions across the computational domain. A total free energy functional is constructed to couple polar and nonpolar contributions to the solvation process. Geometric measure theory is employed to rigorously convert a Lagrangian formulation of the surface energy into an Eulerian formulation so as to bring all energy terms onto an equal footing. By optimizing the total free energy functional, we derive a coupled generalized Poisson-Boltzmann equation (GPBE) and generalized geometric flow equation (GGFE) for the electrostatic potential and the construction of realistic solvent-solute boundaries, respectively. By solving the coupled GPBE and GGFE, we obtain the electrostatic potential, the solvent-solute boundary profile, and the smooth dielectric function, and thereby improve the accuracy and stability of implicit solvation calculations. We also design efficient second-order numerical schemes for the solution of the GPBE and GGFE. The matrix resulting from the discretization of the GPBE is accelerated with appropriate preconditioners. An alternating direction implicit (ADI) scheme is designed to improve the stability of solving the GGFE. Two iterative approaches are designed to solve the coupled system of nonlinear partial differential equations. Extensive numerical experiments are designed to validate the present theoretical model, test computational methods, and optimize numerical algorithms. Example solvation analyses of both small compounds and proteins are carried out to further demonstrate the accuracy, stability, efficiency and robustness of the present new model and numerical approaches. Comparison is given to both experimental and theoretical results in the literature.

  16. Differential geometry based solvation model I: Eulerian formulation

    PubMed Central

    Chen, Zhan; Baker, Nathan A.; Wei, G. W.

    2010-01-01

    This paper presents a differential geometry based model for the analysis and computation of the equilibrium property of solvation. Differential geometry theory of surfaces is utilized to define and construct smooth interfaces with good stability and differentiability for use in characterizing the solvent-solute boundaries and in generating continuous dielectric functions across the computational domain. A total free energy functional is constructed to couple polar and nonpolar contributions to the solvation process. Geometric measure theory is employed to rigorously convert a Lagrangian formulation of the surface energy into an Eulerian formulation so as to bring all energy terms onto an equal footing. By minimizing the total free energy functional, we derive a coupled generalized Poisson-Boltzmann equation (GPBE) and generalized geometric flow equation (GGFE) for the electrostatic potential and the construction of realistic solvent-solute boundaries, respectively. By solving the coupled GPBE and GGFE, we obtain the electrostatic potential, the solvent-solute boundary profile, and the smooth dielectric function, and thereby improve the accuracy and stability of implicit solvation calculations. We also design efficient second-order numerical schemes for the solution of the GPBE and GGFE. The matrix resulting from the discretization of the GPBE is accelerated with appropriate preconditioners. An alternating direction implicit (ADI) scheme is designed to improve the stability of solving the GGFE. Two iterative approaches are designed to solve the coupled system of nonlinear partial differential equations. Extensive numerical experiments are designed to validate the present theoretical model, test computational methods, and optimize numerical algorithms. Example solvation analyses of both small compounds and proteins are carried out to further demonstrate the accuracy, stability, efficiency and robustness of the present new model and numerical approaches. Comparison is given to both experimental and theoretical results in the literature. PMID:20938489

  17. Spin-up flow of ferrofluids: Asymptotic theory and experimental measurements

    NASA Astrophysics Data System (ADS)

    Chaves, Arlex; Zahn, Markus; Rinaldi, Carlos

    2008-05-01

    We treat the flow of ferrofluid in a cylindrical container subjected to a uniform rotating magnetic field, commonly referred to as spin-up flow. A review of theoretical and experimental results published since the phenomenon was first observed in 1967 shows that the experimental data from surface observations of tracer particles are inadequate for the assessment of bulk flow theories. We present direct measurements of the bulk flow using the ultrasound velocity profile method, and torque measurements for water- and kerosene-based ferrofluids, showing the fluid corotating with the field in a rigid-body-like fashion throughout most of the bulk region of the container, except near the air-fluid interface, where it was observed to counter-rotate. We obtain an extension of the spin diffusion theory of Zaitsev and Shliomis using the regular perturbation method. The solution is rigorously valid for α_K ≪ √3/2, where α_K is the Langevin parameter evaluated using the applied field magnitude, and provides a means for obtaining successively higher contributions of the nonlinearity of the equilibrium magnetization response and the spin-magnetization coupling in the magnetization relaxation equation. Because of limitations in the sensitivity of our apparatus, experiments were carried out under conditions for which α ∼ 1. Still, under such conditions the predictions of the analysis are in good qualitative agreement with the experimental observations. An estimate of the spin viscosity is obtained from comparison of flow measurements and theoretical results for the extrapolated wall velocity from the regular perturbation method. The estimated value lies in the range of 10⁻⁸-10⁻¹² kg m s⁻¹ and is several orders of magnitude higher than that obtained from dimensional analysis of a suspension of noninteracting particles in a Newtonian fluid.
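
    For orientation on the Langevin parameter used above, the sketch below evaluates alpha = mu0*m*H/(kB*T) and the equilibrium Langevin magnetization L(alpha) = coth(alpha) - 1/alpha. The particle moment and field amplitude are assumed illustrative values.

    import numpy as np

    kB, mu0, T = 1.380649e-23, 4e-7 * np.pi, 298.0   # SI constants, room temperature
    m = 2.2e-19    # magnetic moment of a ~10 nm magnetite particle, A m^2 (assumed)
    H = 4.0e3      # applied field amplitude, A/m (assumed)

    alpha = mu0 * m * H / (kB * T)                   # Langevin parameter
    L = 1.0 / np.tanh(alpha) - 1.0 / alpha           # equilibrium magnetization fraction
    print(f"alpha = {alpha:.2f}, L(alpha) = {L:.3f}")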

  18. Flexoelectricity from density-functional perturbation theory

    NASA Astrophysics Data System (ADS)

    Stengel, Massimiliano

    2013-11-01

    We derive the complete flexoelectric tensor, including electronic and lattice-mediated effects, of an arbitrary insulator in terms of the microscopic linear response of the crystal to atomic displacements. The basic ingredient, which can be readily calculated from first principles in the framework of density-functional perturbation theory, is the quantum-mechanical probability current response to a long-wavelength acoustic phonon. Its second-order Taylor expansion in the wave vector q around the Γ (q=0) point in the Brillouin zone naturally yields the flexoelectric tensor. At order one in q we recover Martin's theory of piezoelectricity [Martin, Phys. Rev. B 5, 1607 (1972)], thus providing an alternative derivation thereof. To put our derivations on firm theoretical grounds, we perform a thorough analysis of the nonanalytic behavior of the dynamical matrix and other response functions in the vicinity of Γ. Based on this analysis, we find that there is an ambiguity in the specification of the “zero macroscopic field” condition in the flexoelectric case; such arbitrariness can be related to an analytic band-structure term, in close analogy to the theory of deformation potentials. As a by-product, we derive a rigorous generalization of the Cochran-Cowley formula [Cochran and Cowley, J. Phys. Chem. Solids 23, 447 (1962)] to higher orders in q. This can be of great utility in building reliable atomistic models of electromechanical phenomena, as well as for improving the accuracy of the calculation of phonon dispersion curves. Finally, we discuss the physical interpretation of the various contributions to the flexoelectric response, either in the static or dynamic regime, and we relate our findings to earlier theoretical works on the subject.
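
    Schematically, and in our own notation rather than necessarily the paper's, the long-wavelength expansion at the heart of this approach reads:

    \[
    P_\alpha(\mathbf{q}) = \left( \Gamma^{(0)}_{\alpha\beta}
      + i\, q_\gamma\, \Gamma^{(1)}_{\alpha\beta\gamma}
      - \tfrac{1}{2}\, q_\gamma q_\delta\, \Gamma^{(2)}_{\alpha\beta\gamma\delta}
      + \cdots \right) u_\beta(\mathbf{q}),
    \]

    where u_β is the displacement amplitude of the acoustic phonon: the first-order coefficient reproduces the piezoelectric response of Martin's theory, while the second-order coefficient yields the flexoelectric tensor.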

  19. Category Theoretic Analysis of Hierarchical Protein Materials and Social Networks

    PubMed Central

    Spivak, David I.; Giesa, Tristan; Wood, Elizabeth; Buehler, Markus J.

    2011-01-01

    Materials in biology span all the scales from Angstroms to meters and typically consist of complex hierarchical assemblies of simple building blocks. Here we describe an application of category theory to describe structural and resulting functional properties of biological protein materials by developing so-called ologs. An olog is like a “concept web” or “semantic network” except that it follows a rigorous mathematical formulation based on category theory. This key difference ensures that an olog is unambiguous, highly adaptable to evolution and change, and suitable for sharing concepts with other ologs. We consider simple cases of beta-helical and amyloid-like protein filaments subjected to axial extension and develop an olog representation of their structural and resulting mechanical properties. We also construct a representation of a social network in which people send text-messages to their nearest neighbors and act as a team to perform a task. We show that the olog for the protein and the olog for the social network feature identical category-theoretic representations, and we proceed to precisely explicate the analogy or isomorphism between them. The examples presented here demonstrate that the intrinsic nature of a complex system, which in particular includes a precise relationship between structure and function at different hierarchical levels, can be effectively represented by an olog. This, in turn, allows for comparative studies between disparate materials or fields of application, and results in novel approaches to derive functionality in the design of de novo hierarchical systems. We discuss opportunities and challenges associated with the description of complex biological materials by using ologs as a powerful tool for analysis and design in the context of materiomics, and we present the potential impact of this approach for engineering, life sciences, and medicine. PMID:21931622

  20. Investigating the Cosmic Web with Topological Data Analysis

    NASA Astrophysics Data System (ADS)

    Cisewski-Kehe, Jessi; Wu, Mike; Fasy, Brittany; Hellwing, Wojciech; Lovell, Mark; Rinaldo, Alessandro; Wasserman, Larry

    2018-01-01

    Data exhibiting complicated spatial structures are common in many areas of science (e.g. cosmology, biology), but can be difficult to analyze. Persistent homology is a popular approach within the area of Topological Data Analysis that offers a new way to represent, visualize, and interpret complex data by extracting topological features, which can be used to infer properties of the underlying structures. In particular, TDA may be useful for analyzing the large-scale structure (LSS) of the Universe, which is an intricate and spatially complex web of matter. In order to understand the physics of the Universe, theoretical and computational cosmologists develop large-scale simulations that allow for visualizing and analyzing the LSS under varying physical assumptions. Each point in the 3D data set represents a galaxy or a cluster of galaxies, and topological summaries ("persistence diagrams") can be obtained summarizing the different ordered holes in the data (e.g. connected components, loops, voids). The topological summaries are interesting and informative descriptors of the Universe on their own, but hypothesis tests using the topological summaries would provide a way to make more rigorous comparisons of LSS under different theoretical models. For example, the prevailing cosmological model assumes cold dark matter (CDM); however, while the case is strong for CDM, there are some observational inconsistencies with this theory. Another possibility is warm dark matter (WDM). It is of interest to see if a CDM Universe and a WDM Universe produce LSS that is topologically distinct. We present several possible test statistics for two-sample hypothesis tests using the topological summaries, carry out a simulation study to investigate the suitability of the proposed test statistics using simulated data from a variation of the Voronoi foam model, and finally apply the proposed inference framework to WDM vs. CDM cosmological simulation data.
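
    As a concrete illustration of the topological summaries involved, the sketch below (our illustration, not the authors' pipeline) computes the 0-dimensional persistence diagram, i.e. the merge scales of connected components of a point cloud under the Vietoris-Rips filtration; loops and voids would require a dedicated TDA library.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def h0_persistence(points):
    """Death scales of H0 (connected-component) features of a point cloud.

    Kruskal-style union-find over pairwise distances: every merge of two
    components kills one H0 class born at scale 0, so the n-1 merge
    distances are the finite death times of the persistence diagram."""
    n = len(points)
    d = squareform(pdist(points))
    edges = sorted((d[i, j], i, j) for i in range(n) for j in range(i + 1, n))
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    deaths = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(w)   # one component dies at scale w
        if len(deaths) == n - 1:
            break
    return np.array(deaths)

# Toy two-sample comparison: compare sorted death vectors of two clouds.
rng = np.random.default_rng(0)
a = h0_persistence(rng.normal(size=(200, 3)))
b = h0_persistence(rng.uniform(-2, 2, size=(200, 3)))
print(np.abs(np.sort(a) - np.sort(b)).max())  # a crude test statistic
```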

  1. Conductance in a bis-terpyridine based single molecular breadboard circuit

    PubMed Central

    Seth, Charu; Suravarapu, Sankarrao; Reber, David; Hong, Wenjing; Wandlowski, Thomas; Lafolet, Frédéric; Broekmann, Peter

    2017-01-01

    Controlling charge flow in single molecule circuits with multiple electrical contacts and conductance pathways is a much sought after goal in molecular electronics. In this joint experimental and theoretical study, we advance the possibility of creating single molecule breadboard circuits through an analysis of the conductance of a bis-terpyridine based molecule (TP1). The TP1 molecule can adopt multiple conformations through relative rotations of 7 aromatic rings and can attach to electrodes in 61 possible single and multi-terminal configurations through 6 pyridyl groups. Despite this complexity, we show that it is possible to extract well defined conductance features for the TP1 breadboard and assign them rigorously to the underlying constituent circuits. Mechanically controllable break-junction (MCBJ) experiments on the TP1 molecular breadboard show an unprecedented 4 conductance states spanning the range 10⁻² G₀ to 10⁻⁷ G₀. Quantitative theoretical examination of the conductance of TP1 reveals that combinations of 5 types of single terminal 2–5 ring subcircuits are accessed as a function of electrode separation to produce the distinct conductance steps observed in the MCBJ experiments. We estimate the absolute conductance for each single terminal subcircuit and its percentage contribution to the 4 experimentally observed conductance states. We also provide a detailed analysis of the role of quantum interference and thermal fluctuations in modulating conductance within the subcircuits of the TP1 molecular breadboard. Finally, we discuss the possible development of molecular circuit theory and experimental advances necessary for mapping conductance through complex single molecular breadboard circuits in terms of their constituent subcircuits. PMID:28451287

  2. Coordinated Optimization of Visual Cortical Maps (I) Symmetry-based Analysis

    PubMed Central

    Reichl, Lars; Heide, Dominik; Löwel, Siegrid; Crowley, Justin C.; Kaschube, Matthias; Wolf, Fred

    2012-01-01

    In the primary visual cortex of primates and carnivores, functional architecture can be characterized by maps of various stimulus features such as orientation preference (OP), ocular dominance (OD), and spatial frequency. It is a long-standing question in theoretical neuroscience whether the observed maps should be interpreted as optima of a specific energy functional that summarizes the design principles of cortical functional architecture. A rigorous evaluation of this optimization hypothesis is particularly demanded by recent evidence that the functional architecture of orientation columns precisely follows species invariant quantitative laws. Because it would be desirable to infer the form of such an optimization principle from the biological data, the optimization approach to explain cortical functional architecture raises the following questions: i) What are the genuine ground states of candidate energy functionals and how can they be calculated with precision and rigor? ii) How do differences in candidate optimization principles impact on the predicted map structure and conversely what can be learned about a hypothetical underlying optimization principle from observations on map structure? iii) Is there a way to analyze the coordinated organization of cortical maps predicted by optimization principles in general? To answer these questions we developed a general dynamical systems approach to the combined optimization of visual cortical maps of OP and another scalar feature such as OD or spatial frequency preference. From basic symmetry assumptions we obtain a comprehensive phenomenological classification of possible inter-map coupling energies and examine representative examples. We show that each individual coupling energy leads to a different class of OP solutions with different correlations among the maps such that inferences about the optimization principle from map layout appear viable. We systematically assess whether quantitative laws resembling experimental observations can result from the coordinated optimization of orientation columns with other feature maps. PMID:23144599

  3. Tackling complexities in understanding the social determinants of health: the contribution of ethnographic research

    PubMed Central

    2011-01-01

    Objective The complexities inherent in understanding the social determinants of health are often not well-served by quantitative approaches. My aim is to show that well-designed and well-conducted ethnographic studies have an important contribution to make in this regard. Ethnographic research designs are a difficult but rigorous approach to research questions that require us to understand the complexity of people’s social and cultural lives. Approach I draw on an ethnographic study to describe the complexities of studying maternal health in a rural area in India. I then show how the lessons learnt in that setting and context can be applied to studies done in very different settings. Results I show how ethnographic research depends for rigour on a theoretical framework for sample selection; why immersion in the community under study, and rapport building with research participants, is important to ensure rich and meaningful data; and how flexible approaches to data collection lead to the gradual emergence of an analysis based on intense cross-referencing with community views and thus a conclusion that explains the similarities and differences observed. Conclusion When using ethnographic research design it can be difficult to specify in advance the exact details of the study design. Researchers can encounter issues in the field that require them to change what they planned on doing. In rigorous ethnographic studies, the researcher in the field is the research instrument and needs to be well trained in the method. Implication Ethnographic research is challenging, but nevertheless provides a rewarding way of researching complex health problems that require an understanding of the social and cultural determinants of health. PMID:22168509

  4. Near Identifiability of Dynamical Systems

    NASA Technical Reports Server (NTRS)

    Hadaegh, F. Y.; Bekey, G. A.

    1987-01-01

    Concepts regarding approximate mathematical models treated rigorously. Paper presents new results in analysis of structural identifiability, equivalence, and near equivalence between mathematical models and physical processes they represent. Helps establish rigorous mathematical basis for concepts related to structural identifiability and equivalence revealing fundamental requirements, tacit assumptions, and sources of error. "Structural identifiability," as used by workers in this field, loosely translates as meaning ability to specify unique mathematical model and set of model parameters that accurately predict behavior of corresponding physical system.

  5. Mechanical properties of frog skeletal muscles in iodoacetic acid rigor.

    PubMed Central

    Mulvany, M J

    1975-01-01

    1. Methods have been developed for describing the length:tension characteristics of frog skeletal muscles which go into rigor at 4 degrees C following iodoacetic acid poisoning either in the presence of Ca2+ (Ca-rigor) or its absence (Ca-free-rigor). 2. Such rigor muscles showed less resistance to slow stretch (slow rigor resistance) than to fast stretch (fast rigor resistance). The slow and fast rigor resistances of Ca-free-rigor muscles were much lower than those of Ca-rigor muscles. 3. The slow rigor resistance of Ca-rigor muscles was proportional to the amount of overlap between the contractile filaments present when the muscles were put into rigor. 4. Withdrawing Ca2+ from Ca-rigor muscles (induced-Ca-free rigor) reduced their slow and fast rigor resistances. Readdition of Ca2+ (but not Mg2+, Mn2+ or Sr2+) reversed the effect. 5. The slow and fast rigor resistances of Ca-rigor muscles (but not of Ca-free-rigor muscles) decreased with time. 6. The sarcomere structure of Ca-rigor and induced-Ca-free rigor muscles stretched by 0.2 l₀ was destroyed in proportion to the amount of stretch, but the lengths of the remaining intact sarcomeres were essentially unchanged. This suggests that there had been a successive yielding of the weakest sarcomeres. 7. The difference between the slow and fast rigor resistances and the effect of calcium on these resistances are discussed in relation to possible variations in the strength of crossbridges between the thick and thin filaments. PMID:1082023

  6. Reviews of theoretical frameworks: Challenges and judging the quality of theory application.

    PubMed

    Hean, Sarah; Anderson, Liz; Green, Chris; John, Carol; Pitt, Richard; O'Halloran, Cath

    2016-06-01

    Rigorous reviews of available information, from a range of resources, are required to support medical and health educators in their decision making. The aim of this article is to highlight the importance of a review of theoretical frameworks specifically as a supplement to reviews that focus on a synthesis of the empirical evidence alone. Establishing a shared understanding of theory as a concept is highlighted as a challenge and some practical strategies to achieving this are presented. This article also introduces the concept of theoretical quality, arguing that a critique of how theory is applied should complement the methodological appraisal of the literature in a review. We illustrate the challenge of establishing a shared meaning of theory through reference to experiences of an on-going review of this kind conducted in the field of interprofessional education (IPE) and use a high scoring paper selected in this review to illustrate how theoretical quality can be assessed. In reaching a shared understanding of theory as a concept, practical strategies that promote experiential and practical ways of knowing are required in addition to more propositional ways of sharing knowledge. Concepts of parsimony, testability, operational adequacy and empirical adequacy are explored as concepts that establish theoretical quality. Reviews of theoretical frameworks used in medical education are required to inform educational practice. Review teams should make time and effort to reach a shared understanding of the term theory. Theory reviews, and reviews more widely, should add an assessment of theory application to the protocol of their review method.

  7. Systematic theoretical study of non-nuclear electron density maxima in some diatomic molecules.

    PubMed

    Terrabuio, Luiz A; Teodoro, Tiago Q; Rachid, Marina G; Haiduke, Roberto L A

    2013-10-10

    First, exploratory calculations were performed to investigate the presence of non-nuclear maxima (NNMs) in ground-state electron densities of homonuclear diatomic molecules from hydrogen up to calcium at their equilibrium geometries. In a second stage, only for the cases in which these features were previously detected, a rigorous analysis was carried out with several combinations of theoretical methods and basis sets in order to ensure that they are not merely calculation artifacts. Our best results support that Li2, B2, C2, and P2 are molecules that possess true NNMs. An NNM was found with the largest basis sets for Na2, but it disappeared at the experimental geometry because the optimized bond lengths are significantly inaccurate in this case (deviations of 0.10 Å). Two of these maxima are also observed in Si2 with CCSD and large basis sets, but they are no longer detected once core-valence correlation or multiconfigurational wave functions are taken into account. Therefore, the NNMs in Si2 can be considered unphysical features due to an incomplete treatment of electron correlation. Finally, we show that an NNM is encountered in LiNa, representing, to our knowledge, the first discovery of such an electron density maximum in a heteronuclear diatomic system at its equilibrium geometry. Some results for LiNa, obtained by varying the internuclear distance, suggest that molecular electric moments, such as the dipole and quadrupole, are sensitive to the presence of NNMs.

  8. The bacteriorhodopsin model membrane system as a prototype molecular computing element.

    PubMed

    Hong, F T

    1986-01-01

    The quest for more sophisticated integrated circuits to overcome the limitations of currently available silicon integrated circuits has led computer scientists and engineers to propose using biological molecules as computational elements. While the theoretical aspect of this possibility has been pursued by computer scientists, the research and development of experimental prototypes have not been pursued with equal intensity. In this survey, we make an attempt to examine model membrane systems that incorporate the protein pigment bacteriorhodopsin, which is found in Halobacterium halobium. This system was chosen for several reasons. The pigment/membrane system is sufficiently simple and stable for rigorous quantitative study, yet at the same time sufficiently complex in molecular structure to permit alteration of this structure in an attempt to manipulate the photosignal. Several methods of forming the pigment/membrane assembly are described and the potential application to biochip design is discussed. Experimental data using these membranes and measured by a tunable voltage clamp method are presented along with a theoretical analysis based on the Gouy-Chapman diffuse double layer theory to illustrate the usefulness of this approach. It is shown that detailed layouts of the pigment/membrane assembly as well as external loading conditions can modify the time course of the photosignal in a predictable manner. Some problems that may arise in the actual implementation and manufacturing, as well as the use of existing technology in protein chemistry, immunology, and recombinant DNA technology, are discussed.

  9. On analyticity of linear waves scattered by a layered medium

    NASA Astrophysics Data System (ADS)

    Nicholls, David P.

    2017-10-01

    The scattering of linear waves by periodic structures is a crucial phenomenon in many branches of applied physics and engineering. In this paper we establish rigorous analytic results necessary for the proper numerical analysis of a class of High-Order Perturbation of Surfaces methods for simulating such waves. More specifically, we prove a theorem on the existence and uniqueness of solutions to a system of partial differential equations which model the interaction of linear waves with a multiply layered periodic structure in three dimensions. This result provides hypotheses under which a rigorous numerical analysis could be conducted for recent generalizations of the methods of Operator Expansions, Field Expansions, and Transformed Field Expansions.

  10. Extensions to regret-based decision curve analysis: an application to hospice referral for terminal patients.

    PubMed

    Tsalatsanis, Athanasios; Barnes, Laura E; Hozo, Iztok; Djulbegovic, Benjamin

    2011-12-23

    Despite the well documented advantages of hospice care, most terminally ill patients do not reap the maximum benefit from hospice services, with the majority of them receiving hospice care either prematurely or belatedly. Decision systems to improve the hospice referral process are sorely needed. We present a novel theoretical framework that is based on well-established methodologies of prognostication and decision analysis to assist with the hospice referral process for terminally ill patients. We linked the SUPPORT statistical model, widely regarded as one of the most accurate models for prognostication of terminally ill patients, with the recently developed regret-based decision curve analysis (regret DCA). We extend the regret DCA methodology to consider harms associated with the prognostication test as well as harms and effects of the management strategies. To enable patients and physicians to make these complex decisions in real time, we developed an easily accessible web-based decision support system available at the point of care. The web-based decision support system facilitates the hospice referral process in three steps. First, the patient or surrogate is interviewed to elicit his/her personal preferences regarding the continuation of life-sustaining treatment vs. palliative care. Then, regret DCA is employed to identify the best strategy for the particular patient in terms of the threshold probability at which he/she is indifferent between continuation of treatment and hospice referral. Finally, if necessary, the probabilities of survival and death for the particular patient are computed based on the SUPPORT prognostication model and contrasted with the patient's threshold probability. The web-based design of the CDSS enables patients, physicians, and family members to participate in the decision process from anywhere internet access is available. We present a theoretical framework to facilitate the hospice referral process. Further rigorous clinical evaluation, including testing in a prospective randomized controlled trial, is required and planned.
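
    To make the second step concrete, here is a minimal sketch of the threshold logic using the classic decision-threshold relation Pt/(1−Pt) = harm/benefit. This is our simplification with made-up numbers, not the authors' exact regret DCA formulation, which additionally accounts for test harms and management effects.

```python
def threshold_probability(regret_ratio):
    """Threshold from the classic relation Pt / (1 - Pt) = harm / benefit,
    where regret_ratio = harm / benefit is elicited from the patient
    (illustrative simplification of the regret-based elicitation)."""
    return regret_ratio / (1.0 + regret_ratio)

def recommend(p_death, regret_ratio):
    """p_death: model-predicted probability of death (e.g. from SUPPORT).
    Returns the strategy on the patient's side of the threshold."""
    pt = threshold_probability(regret_ratio)
    return "hospice referral" if p_death >= pt else "continue treatment"

# Example: patient regrets an unnecessary referral half as much as a
# missed one -> Pt = 1/3; predicted p_death = 0.45 exceeds it.
print(recommend(p_death=0.45, regret_ratio=0.5))
```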

  11. Extensions to Regret-based Decision Curve Analysis: An application to hospice referral for terminal patients

    PubMed Central

    2011-01-01

    Background Despite the well documented advantages of hospice care, most terminally ill patients do not reap the maximum benefit from hospice services, with the majority of them receiving hospice care either prematurely or belatedly. Decision systems to improve the hospice referral process are sorely needed. Methods We present a novel theoretical framework that is based on well-established methodologies of prognostication and decision analysis to assist with the hospice referral process for terminally ill patients. We linked the SUPPORT statistical model, widely regarded as one of the most accurate models for prognostication of terminally ill patients, with the recently developed regret-based decision curve analysis (regret DCA). We extend the regret DCA methodology to consider harms associated with the prognostication test as well as harms and effects of the management strategies. To enable patients and physicians to make these complex decisions in real time, we developed an easily accessible web-based decision support system available at the point of care. Results The web-based decision support system facilitates the hospice referral process in three steps. First, the patient or surrogate is interviewed to elicit his/her personal preferences regarding the continuation of life-sustaining treatment vs. palliative care. Then, regret DCA is employed to identify the best strategy for the particular patient in terms of the threshold probability at which he/she is indifferent between continuation of treatment and hospice referral. Finally, if necessary, the probabilities of survival and death for the particular patient are computed based on the SUPPORT prognostication model and contrasted with the patient's threshold probability. The web-based design of the CDSS enables patients, physicians, and family members to participate in the decision process from anywhere internet access is available. Conclusions We present a theoretical framework to facilitate the hospice referral process. Further rigorous clinical evaluation, including testing in a prospective randomized controlled trial, is required and planned. PMID:22196308

  12. Improved bounds on the energy-minimizing strains in martensitic polycrystals

    NASA Astrophysics Data System (ADS)

    Peigney, Michaël

    2016-07-01

    This paper is concerned with the theoretical prediction of the energy-minimizing (or recoverable) strains in martensitic polycrystals, considering a nonlinear elasticity model of phase transformation at finite strains. The main results are some rigorous upper bounds on the set of energy-minimizing strains. Those bounds depend on the polycrystalline texture through the volume fractions of the different orientations. The simplest form of the bounds presented is obtained by combining recent results for single crystals with a homogenization approach proposed previously for martensitic polycrystals. However, the polycrystalline bound delivered by that procedure may fail to recover the monocrystalline bound in the homogeneous limit, as is demonstrated in this paper by considering an example related to tetragonal martensite. This motivates the development of a more detailed analysis, leading to improved polycrystalline bounds that are notably consistent with results for single crystals in the homogeneous limit. A two-orientation polycrystal of tetragonal martensite is studied as an illustration. In that case, analytical expressions of the upper bounds are derived and the results are compared with lower bounds obtained by considering laminate textures.

  13. Performance characteristics of two volume phase holographic grisms produced for the ESPRESSO spectrograph

    NASA Astrophysics Data System (ADS)

    Arns, James A.

    2016-08-01

    The ESPRESSO spectrograph [1], a new addition to the European Southern Observatory's (ESO) Very Large Telescope (VLT), requires two volume phase holographic (VPH) grisms, one blue and the other red, splitting the overall spectral range of the instrument to maximize throughput while achieving high resolution. The blue grism covers the spectral range from 375 nm to 520 nm with a dispersion of 0.88 degrees/nm at the central wavelength of 438 nm. The red grism operates from 535 nm to 780 nm with a dispersion of 0.47 degrees/nm at 654.8 nm. Both designs use a single input prism to enhance the dispersion of the grism assembly. The grisms are relatively large, with working apertures of 185 mm x 185 mm for the blue grism and 215 mm x 185 mm for the red grism. This paper describes the specifications of the two grating types, gives the theoretical diffraction-efficiency performance of the production designs from rigorous coupled wave analysis (RCWA), and presents the measured performance of each of the delivered grisms.

  14. Understanding Race and Racism in Nursing: Insights from Aboriginal Nurses

    PubMed Central

    Vukic, Adele; Jesty, Charlotte; Mathews, Sr. Veronica; Etowa, Josephine

    2012-01-01

    Purpose. Indigenous Peoples are underrepresented in the health professions. This paper examines indigenous identity and the quality and nature of nursing work-life. The knowledge generated should enhance strategies to increase representation of indigenous peoples in nursing to reduce health inequities. Design. Community-based participatory research employing Grounded Theory as the method was the design for this study. Theoretical sampling and constant comparison guided the data collection and analysis, and a number of validation strategies including member checks were employed to ensure rigor of the research process. Sample. Twenty-two Aboriginal nurses in Atlantic Canada. Findings. Six major themes emerged from the study: Cultural Context of Work-life, Becoming a Nurse, Navigating Nursing, Race Racism and Nursing, Socio-Political Context of Aboriginal Nursing, and Way Forward. Race and racism in nursing and related subthemes are the focus of this paper. Implications. The experiences of Aboriginal nurses as described in this paper illuminate the need to understand the interplay of race and racism in the health care system. Our paper concludes with Aboriginal nurses' suggestions for systemic change at various levels. PMID:22778991

  15. Deep-etched sinusoidal polarizing beam splitter grating.

    PubMed

    Feng, Jijun; Zhou, Changhe; Cao, Hongchao; Lv, Peng

    2010-04-01

    A sinusoidal-shaped fused-silica grating as a highly efficient polarizing beam splitter (PBS) is investigated based on the simplified modal method. The grating structure depends mainly on the ratio of groove depth to grating period and the ratio of incident wavelength to grating period. These ratios can be used as a guideline for the grating design at different wavelengths. A sinusoidal-groove PBS grating is designed at a wavelength of 1310 nm under Littrow mounting, and the transmitted TM and TE polarized waves are mainly diffracted into the zeroth order and the -1st order, respectively. The grating profile is optimized by using rigorous coupled-wave analysis. The designed PBS grating is highly efficient (>95.98%) over the O-band wavelength range (1260-1360 nm) for both TE and TM polarizations. The sinusoidal grating can exhibit higher diffraction efficiency, larger extinction ratio, and less reflection loss than the rectangular-groove PBS grating. By applying wet etching technology on the rectangular grating, which was manufactured by holographic recording and inductively coupled plasma etching technology, the sinusoidal grating can be approximately fabricated. Experimental results are in agreement with theoretical values.

  16. Key stages of material expansion in dielectrics upon femtosecond laser ablation revealed by double-color illumination time-resolved microscopy

    NASA Astrophysics Data System (ADS)

    Garcia-Lechuga, Mario; Solis, Javier; Siegel, Jan

    2018-03-01

    The physical origin of material removal in dielectrics upon femtosecond laser pulse irradiation (800 nm, 120 fs pulse duration) has been investigated at fluences slightly above the ablation threshold. Making use of a versatile pump-probe microscopy setup, the dynamics and the different key stages of the ablation process in lithium niobate have been monitored. The use of two different illumination wavelengths, 400 and 800 nm, together with rigorous image analysis and theoretical modelling, enables a clear picture of the material excitation and expansion stages to be drawn. Immediately after excitation, a dense electron plasma is generated. A few picoseconds later, direct evidence of a rarefaction wave propagating into the bulk is obtained, with an estimated speed of 3650 m/s. This process marks the onset of material expansion, which is confirmed by the appearance of transient Newton rings that change dynamically during the expansion up to approximately 1 ns. Exploring delays up to 15 ns, a second dynamic Newton ring pattern is observed, consistent with the formation of a second ablation front propagating five times more slowly than the first.

  17. Qualitative Descriptive Methods in Health Science Research.

    PubMed

    Colorafi, Karen Jiggins; Evans, Bronwynne

    2016-07-01

    The purpose of this methodology paper is to describe an approach to qualitative design known as qualitative descriptive that is well suited to junior health sciences researchers because it can be used with a variety of theoretical approaches, sampling techniques, and data collection strategies. It is often difficult for junior qualitative researchers to pull together the tools and resources they need to embark on a high-quality qualitative research study and to manage the volumes of data they collect during qualitative studies. This paper seeks to pull together much needed resources and provide an overview of methods. A step-by-step guide to planning a qualitative descriptive study and analyzing the data is provided, utilizing exemplars from the authors' research. This paper presents steps to conducting a qualitative descriptive study under the following headings: describing the qualitative descriptive approach, designing a qualitative descriptive study, steps to data analysis, and ensuring rigor of findings. The qualitative descriptive approach results in a summary in everyday, factual language that facilitates understanding of a selected phenomenon across disciplines of health science researchers.

  18. The Lévy flight paradigm: random search patterns and mechanisms.

    PubMed

    Reynolds, A M; Rhodes, C J

    2009-04-01

    Over recent years there has been an accumulation of evidence from a variety of experimental, theoretical, and field studies that many organisms use a movement strategy approximated by Lévy flights when they are searching for resources. Lévy flights are random movements that can maximize the efficiency of resource searches in uncertain environments. This is a highly significant finding because it suggests that Lévy flights provide a rigorous mathematical basis for separating out evolved, innate behaviors from environmental influences. We discuss recent developments in random-search theory, as well as the many different experimental and data collection initiatives that have investigated search strategies. Methods for trajectory construction and robust data analysis procedures are presented. The key to prediction and understanding does, however, lie in the elucidation of mechanisms underlying the observed patterns. We discuss candidate neurological, olfactory, and learning mechanisms for the emergence of Lévy flight patterns in some organisms, and note that convergence of behaviors along such different evolutionary pathways is not surprising given the energetic efficiencies that Lévy flight movement patterns confer.
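
    A minimal simulation sketch of the search patterns under discussion (illustrative parameters, not taken from the review): step lengths drawn from a power-law tail p(l) ∝ l^(−μ) with 1 < μ ≤ 3 give a Lévy walk, shown against a Brownian baseline.

```python
import numpy as np

rng = np.random.default_rng(1)

def levy_walk(n_steps, mu=2.0, l0=1.0):
    """2D random search with power-law step lengths p(l) ~ l^-mu, l >= l0.

    Inverse-transform sampling of a Pareto tail: l = l0 * u**(-1/(mu-1));
    headings are drawn uniformly on [0, 2*pi)."""
    u = rng.random(n_steps)
    lengths = l0 * u ** (-1.0 / (mu - 1.0))
    angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)
    steps = lengths[:, None] * np.c_[np.cos(angles), np.sin(angles)]
    return np.cumsum(steps, axis=0)

levy = levy_walk(10_000, mu=2.0)                           # superdiffusive
brown = np.cumsum(rng.normal(size=(10_000, 2)), axis=0)    # Brownian baseline
print(np.linalg.norm(levy[-1]), np.linalg.norm(brown[-1]))
```

    The rare long jumps of the Lévy walk are what reduce oversampling of already-visited regions, the property underlying the search-efficiency argument summarized above.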

  19. Validation of the instrument of health literacy competencies for Chinese-speaking health professionals.

    PubMed

    Chang, Li-Chun; Chen, Yu-Chi; Liao, Li-Ling; Wu, Fei Ling; Hsieh, Pei-Lin; Chen, Hsiao-Jung

    2017-01-01

    The study aimed to illustrate the constructs and test the psychometric properties of an instrument of health literacy competencies (IOHLC) for health professionals. A multi-phase questionnaire development method was used to develop the scale. The categorization of the knowledge and practice domains achieved consensus through a modified Delphi process. To reduce the number of items, the 92-item IOHLC was psychometrically evaluated through internal consistency, Rasch modeling, and two-stage factor analysis. In total, 736 practitioners, including nurses, nurse practitioners, health educators, case managers, and dieticians, completed the 92-item IOHLC online from May 2012 to January 2013. The final version of the IOHLC covered 9 knowledge items and 40 skill items across 9 dimensions, with good model fit, explaining 72% of the total variance. All domains had acceptable internal consistency and discriminant validity. The tool developed in this study is the first to verify health literacy competencies rigorously. Moreover, through psychometric testing, the 49-item IOHLC demonstrates adequate reliability and validity. The IOHLC may serve as a reference for the theoretical and in-service training of health literacy competencies among Chinese-speaking health professionals.

  20. An in-depth stability analysis of nonuniform FDTD combined with novel local implicitization techniques

    NASA Astrophysics Data System (ADS)

    Van Londersele, Arne; De Zutter, Daniël; Vande Ginste, Dries

    2017-08-01

    This work focuses on efficient full-wave solutions of multiscale electromagnetic problems in the time domain. Three local implicitization techniques are proposed and carefully analyzed in order to relax the traditional time step limit of the Finite-Difference Time-Domain (FDTD) method on a nonuniform, staggered, tensor product grid: Newmark, Crank-Nicolson (CN) and Alternating-Direction-Implicit (ADI) implicitization. All of them are applied in preferable directions, in the manner of Hybrid Implicit-Explicit (HIE) methods, so as to limit the rank of the sparse linear systems. Both exponential and linear stability are rigorously investigated for arbitrary grid spacings and arbitrary inhomogeneous, possibly lossy, isotropic media. Numerical examples confirm the conservation of energy inside a cavity for a million iterations if the time step is chosen below the proposed, relaxed limit. Apart from these theoretical contributions, new accomplishments such as the development of the leapfrog Alternating-Direction-Hybrid-Implicit-Explicit (ADHIE) FDTD method and a less stringent Courant-like time step limit for the conventional, fully explicit FDTD method on a nonuniform grid have immediate practical applications.
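
    For the conventional, fully explicit FDTD baseline mentioned in the last sentence, the classical Courant limit on a nonuniform grid is set by the smallest cells; the sketch below computes that textbook bound (the paper's contribution is a less stringent limit, which is not reproduced here).

```python
import numpy as np

c0 = 299_792_458.0  # vacuum speed of light, m/s

def courant_limit(dx, dy, dz, c=c0):
    """Classical explicit-FDTD time step bound on a nonuniform grid,
    governed by the smallest cell in each direction:
        dt <= 1 / (c * sqrt(1/dx_min^2 + 1/dy_min^2 + 1/dz_min^2))."""
    s = dx.min() ** -2 + dy.min() ** -2 + dz.min() ** -2
    return 1.0 / (c * np.sqrt(s))

dx = np.geomspace(50e-6, 1e-3, 40)   # mesh graded down to 50 um in x
dy = dz = np.full(40, 1e-3)
print(f"dt_max = {courant_limit(dx, dy, dz):.3e} s")
```

    The bound shows why a single fine cell throttles the whole simulation, which is precisely the multiscale bottleneck the local implicitization techniques are designed to remove.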

  1. Ghost imaging based on Pearson correlation coefficients

    NASA Astrophysics Data System (ADS)

    Yu, Wen-Kai; Yao, Xu-Ri; Liu, Xue-Feng; Li, Long-Zhen; Zhai, Guang-Jie

    2015-05-01

    Correspondence imaging is a new modality of ghost imaging, which can retrieve a positive/negative image by simple conditional averaging of the reference frames that correspond to relatively large/small values of the total intensity measured at the bucket detector. Here we propose and experimentally demonstrate a more rigorous and general approach, in which a ghost image is retrieved by calculating the Pearson correlation coefficient between the bucket detector intensity and the brightness at each pixel of the reference frames in turn. Furthermore, we provide a theoretical, statistical interpretation of these two imaging phenomena, and explain how the error depends on the sample size and what kind of distribution the error obeys. According to our analysis, the image signal-to-noise ratio can be greatly improved and the sampling number reduced by means of our new method. Project supported by the National Key Scientific Instrument and Equipment Development Project of China (Grant No. 2013YQ030595) and the National High Technology Research and Development Program of China (Grant No. 2013AA122902).
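
    The per-pixel Pearson coefficient described here vectorizes directly; below is a minimal sketch on synthetic speckle data (our illustration, with a made-up object and frame count, not the authors' experimental pipeline).

```python
import numpy as np

def pearson_ghost_image(frames, bucket, eps=1e-12):
    """frames: (N, H, W) reference speckle frames; bucket: (N,) intensities.
    Returns the (H, W) map of Pearson coefficients between the bucket
    signal and each pixel's brightness across the N realizations."""
    f = frames - frames.mean(axis=0)
    b = bucket - bucket.mean()
    cov = np.tensordot(b, f, axes=(0, 0)) / len(b)
    return cov / (f.std(axis=0) * b.std() + eps)

# Synthetic demo: random speckle, binary object, bucket = transmitted light.
rng = np.random.default_rng(0)
obj = np.zeros((32, 32)); obj[8:24, 12:20] = 1.0
frames = rng.random((5000, 32, 32))
bucket = (frames * obj).sum(axis=(1, 2))
img = pearson_ghost_image(frames, bucket)
print(img[obj == 1].mean(), img[obj == 0].mean())  # object pixels correlate higher
```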

  2. Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data.

    PubMed

    Carmichael, Owen; Sakhanenko, Lyudmila

    2015-05-15

    We develop statistical methodology for the popular brain imaging technique HARDI, based on the high order tensor model of Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves, or fibers, which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense as it blends linear algebra concepts from high order tensors with asymptotic statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3] to several fibers per voxel. It is also a pioneering statistical work on tractography from HARDI data. It avoids the typical limitations of deterministic tractography methods while delivering the same information as probabilistic tractography methods. Our method is computationally cheap and provides a well-founded mathematical and statistical framework in which diverse functionals on fibers, directions, and tensors can be studied in a systematic and rigorous way.
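
    For orientation, fiber tracking ultimately integrates curves through an estimated direction field; the sketch below shows plain deterministic streamline tracing with illustrative grid and seed values, and deliberately omits the paper's statistical contribution (the confidence ellipsoids around the curves).

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def trace_fiber(vector_field, grid, seed, step=0.5, n_steps=200):
    """Euler integration of an integral curve through a 3D unit-vector field.

    vector_field: (nx, ny, nz, 3) estimated fiber directions;
    grid: (xs, ys, zs) axis coordinates; seed: starting point."""
    interp = RegularGridInterpolator(grid, vector_field,
                                     bounds_error=False, fill_value=None)
    pts = [np.asarray(seed, float)]
    for _ in range(n_steps):
        v = interp(pts[-1])[0]
        norm = np.linalg.norm(v)
        if norm < 1e-8:          # stop where the direction field vanishes
            break
        pts.append(pts[-1] + step * v / norm)
    return np.array(pts)

# Toy field: uniform diagonal directions on a 20^3 grid.
xs = ys = zs = np.arange(20.0)
field = np.tile(np.array([1.0, 1.0, 0.0]) / np.sqrt(2), (20, 20, 20, 1))
print(trace_fiber(field, (xs, ys, zs), seed=(2.0, 2.0, 10.0))[-1])
```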

  3. Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data

    PubMed Central

    Carmichael, Owen; Sakhanenko, Lyudmila

    2015-01-01

    We develop statistical methodology for the popular brain imaging technique HARDI, based on the high order tensor model of Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves, or fibers, which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense as it blends linear algebra concepts from high order tensors with asymptotic statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3] to several fibers per voxel. It is also a pioneering statistical work on tractography from HARDI data. It avoids the typical limitations of deterministic tractography methods while delivering the same information as probabilistic tractography methods. Our method is computationally cheap and provides a well-founded mathematical and statistical framework in which diverse functionals on fibers, directions, and tensors can be studied in a systematic and rigorous way. PMID:25937674

  4. Dynamics and Breakup of a Contracting Viscous Filament

    NASA Astrophysics Data System (ADS)

    Wilkes, Edward; Notz, Patrick; Ambravaneswaran, Bala; Basaran, Osman

    1999-11-01

    Free viscous filaments are formed during the breakup of liquid drops and jets. Such filaments are typically precursors of satellite droplets that are often undesirable in applications such as ink-jet printing. In this paper, the contraction of an axisymmetric liquid filament due to the action of surface tension is studied theoretically. The analysis is based on solving (a) the full Navier-Stokes system in two dimensions (2-d) and (b) a one-dimensional (1-d) approximation of the exact equations derived from slender-jet theory. The rigorous, 2-d calculations are carried out with finite element algorithms using either algebraic or elliptic mesh generation. As the filament contracts, bulbous regions form at its two ends. When the initial aspect ratio a/b and/or the Reynolds number Re are sufficiently low, the ends coalesce into an oscillating free drop. Filament breakup occurs when a/b and/or Re are sufficiently high. The 2-d algorithms reveal for the first time that liquid filaments of finite viscosity can overturn prior to interface rupture. The power of elliptic mesh generation over algebraic methods in analyzing such situations is highlighted.
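
    For reference, the 1-d slender-jet approximation in (b) is usually written in the following standard (Eggers-Dupont) form for the radius h(z,t) and axial velocity u(z,t); this is the textbook variant, not necessarily the authors' exact equations:

    \[
    \partial_t h + u\,\partial_z h = -\tfrac{1}{2}\, h\, \partial_z u,
    \]
    \[
    \partial_t u + u\,\partial_z u
      = -\frac{\gamma}{\rho}\,\partial_z \kappa
        + \frac{3\nu}{h^2}\,\partial_z\!\left(h^2\, \partial_z u\right),
    \qquad
    \kappa = \frac{1}{h\,(1 + h_z^2)^{1/2}} - \frac{h_{zz}}{(1 + h_z^2)^{3/2}},
    \]

    where γ is the surface tension, ρ the density, and ν the kinematic viscosity. Overturning of the interface is exactly the regime where this 1-d description must fail, since h(z,t) ceases to be single-valued.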

  5. A hybrid method for determination of the acoustic impedance of an unflanged cylindrical duct for multimode wave

    NASA Astrophysics Data System (ADS)

    Snakowska, Anna; Jurkiewicz, Jerzy; Gorazd, Łukasz

    2017-05-01

    The paper presents the derivation of the impedance matrix based on the rigorous solution of the wave equation obtained by the Wiener-Hopf technique for a semi-infinite unflanged cylindrical duct. The impedance matrix allows one, in turn, to calculate the acoustic impedance along the duct and, as a special case, the radiation impedance. The analysis is carried out for a multimode incident wave, accounting for mode coupling at the duct outlet not only qualitatively but also quantitatively for a selected source operating inside. The quantitative evaluation of the acoustic impedance requires the mode amplitudes to be set; these were obtained by applying the mode decomposition method to far-field pressure radiation measurements together with theoretical formulae for single-mode directivity characteristics of an unflanged duct. Calculation of the acoustic impedance for a non-uniform distribution of the sound pressure and the sound velocity over a duct cross section requires determination of the acoustic power transmitted along/radiated from the duct. In the paper, the impedance matrix, the power, and the acoustic impedance were derived as functions of the Helmholtz number and the distance from the outlet.

  6. THE MECHANISM OF LESION FORMATION BY FOCUSED ULTRASOUND ABLATION CATHETER FOR TREATMENT OF ATRIAL FIBRILLATION

    PubMed Central

    Sinelnikov, Y.D.; Fjield, T.; Sapozhnikov, O.A.

    2009-01-01

    The application of therapeutic ultrasound for the treatment of atrial fibrillation (AF) is investigated. The results of a theoretical and experimental investigation of an ultrasound ablation catheter are presented. The major components of the catheter are a high power cylindrical piezoelectric element and a parabolic balloon reflector. Thermal elevation in the ostia of the pulmonary veins is achieved by focusing the ultrasound beam in the shape of a torus that traverses the myocardial tissue. High intensity ultrasound heating in the focal zone results in a lesion surrounding the pulmonary veins that creates an electrical conduction block and provides relief from AF symptoms. The success of the ablation procedure largely depends on the correct choice of reflector geometry and ultrasonic power. We present a theoretical model of the catheter's acoustic field and bioheat transfer modeling of cardiac lesions. The application of an empirically derived relation between lesion formation and acoustic power is shown to correlate with the experimental data. The control methods developed combine theoretical acoustics and thermal lesion formation simulations with experiment, thereby establishing rigorous dosimetry that contributes to a safe and effective ultrasound ablation procedure. PMID:20161431
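
    The bioheat transfer modeling referred to is commonly based on the Pennes equation; the form below is the standard one, not a detail taken from this paper:

    \[
    \rho c\, \frac{\partial T}{\partial t}
      = k\, \nabla^2 T - w_b c_b\,\left(T - T_a\right) + Q_{us},
    \]

    where ρc and k are the tissue heat capacity and thermal conductivity, the perfusion term w_b c_b relaxes the temperature toward the arterial value T_a, and Q_us is the absorbed acoustic power density; lesion extent is then typically scored with a thermal-dose criterion applied to the computed temperature history.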

  7. Can context justify an ethical double standard for clinical research in developing countries?

    PubMed Central

    Landes, Megan

    2005-01-01

    Background The design of clinical research deserves special caution so as to safeguard the rights of participating individuals. While the international community has agreed on ethical standards for the design of research, these frameworks still remain open to interpretation, revision and debate. Recently a breach in the consensus of how to apply these ethical standards to research in developing countries has occurred, notably beginning with the 1994 placebo-controlled trials to reduce maternal to child transmission of HIV-1 in Africa, Asia and the Caribbean. The design of these trials sparked intense debate with the inclusion of a placebo-control group despite the existence of a 'gold standard' and trial supporters grounded their justifications of the trial design on the context of scarcity in resource-poor settings. Discussion These 'contextual' apologetics are arguably an ethical loophole inherent in current bioethical methodology. However, this convenient appropriation of 'contextual' analysis simply fails to acknowledge the underpinnings of feminist ethical analysis upon which it must stand. A more rigorous analysis of the political, social, and economic structures pertaining to the global context of developing countries reveals that the bioethical principles of beneficence and justice fail to be met in this trial design. Conclusion Within this broader, and theoretically necessary, understanding of context, it becomes impossible to justify an ethical double standard for research in developing countries. PMID:16045801

  8. Translational research: a concept analysis.

    PubMed

    Wendler, M Cecilia; Kirkbride, Geri; Wade, Kristen; Ferrell, Lynne

    2013-01-01

    BACKGROUND/CONCEPTUAL FRAMEWORK: Little is known about which approaches facilitate adoption and sustainment of evidence-based practice change in the highly complex care environments that constitute clinical practice today. The purpose of this article was to complete a concept analysis of translational research using a modified Walker and Avant approach. DESIGN/DATA COLLECTION: Using a rigorous and thorough review of the recent health care literature generated by a deep electronic search from 2004-2011, 85 appropriate documents were retrieved. Close reading of the articles by three coresearchers yielded an analysis of the emerging concept of translational research. Using the iterative process described by Walker and Avant, a tentative definition of the concept of translational research, along with antecedents and consequences were identified. Implications for health care professionals in education, practice, and research are offered. Further research is needed to determine the adequacy of the definition, to identify empirical referents, and to guide theory development. The study resulted in a theoretical definition of the concept of translational research, along with identification of antecedents and consequences and a description of an ideal or model case to illustrate the definition. Implications for practice and education include the importance of focusing on translational research approaches that may reduce the research-practice gap in health care, thereby improving patient care delivery. Research is needed to determine the usefulness of the definition in health care clinical practice.

  9. Towards tests of quark-hadron duality with functional analysis and spectral function data

    NASA Astrophysics Data System (ADS)

    Boito, Diogo; Caprini, Irinel

    2017-04-01

    The presence of terms that violate quark-hadron duality in the expansion of QCD Green's functions is a generally accepted fact. Recently, a new approach was proposed for the study of duality violations (DVs), which exploits the existence of a rigorous lower bound on the functional distance, measured in a certain norm, between a "true" correlator and its approximant calculated theoretically along a contour in the complex energy plane. In the present paper, we pursue the investigation of functional-analysis-based tests towards their application to real spectral function data. We derive a closed analytic expression for the minimal functional distance based on the general weighted L2 norm and discuss its relation with the distance measured in the L∞ norm. Using fake data sets obtained from a realistic toy model in which we allow for covariances inspired from the publicly available ALEPH spectral functions, we obtain, by Monte Carlo simulations, the statistical distribution of the strength parameter that measures the magnitude of the DV term added to the usual operator product expansion. The results show that, if the region with large errors near the end point of the spectrum in τ decays is excluded, the functional-analysis-based tests using either L2 or L∞ norms are able to detect, in a statistically significant way, the presence of DVs in realistic spectral function pseudodata.

  10. Digital morphogenesis via Schelling segregation

    NASA Astrophysics Data System (ADS)

    Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew

    2018-04-01

    Schelling's model of segregation looks to explain the way in which particles or agents of two types may come to arrange themselves spatially into configurations consisting of large homogeneous clusters, i.e. connected regions consisting of only one type. As one of the earliest agent based models studied by economists and perhaps the most famous model of self-organising behaviour, it also has direct links to areas at the interface between computer science and statistical mechanics, such as the Ising model and the study of contagion and cascading phenomena in networks. While the model has been extensively studied, it has largely resisted rigorous analysis, prior results from the literature generally pertaining to variants of the model which are tweaked so as to be amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory. Brandt et al (2012 Proc. 44th Annual ACM Symp. on Theory of Computing) provided the first rigorous analysis of the unperturbed model, for a specific set of input parameters. Here we provide a rigorous analysis of the model's behaviour much more generally and establish some surprising forms of threshold behaviour, notably the existence of situations where an increased level of intolerance for neighbouring agents of opposite type leads almost certainly to decreased segregation.
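
    As a concrete reference point for the dynamics being analyzed, here is a minimal random-relocation variant of the model (an illustration of the model class with illustrative parameters, not the specific dynamics or parameter set treated in the paper).

```python
import numpy as np

rng = np.random.default_rng(0)

def schelling(n=40, tau=0.5, empty=0.1, sweeps=100):
    """Random-relocation Schelling dynamics on an n x n grid.

    Cells hold type +1/-1 or 0 (empty). An agent is unhappy when the
    like-typed fraction of its occupied Moore neighbours falls below
    tau, and then jumps to a uniformly chosen empty cell."""
    grid = rng.choice([1, -1, 0], size=(n, n),
                      p=[(1 - empty) / 2, (1 - empty) / 2, empty])
    for _ in range(sweeps):
        moved = False
        for i, j in np.argwhere(grid != 0):
            if grid[i, j] == 0:              # agent already moved away
                continue
            nb = grid[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
            occ = np.abs(nb).sum() - 1           # occupied neighbours
            like = (nb == grid[i, j]).sum() - 1  # like-typed neighbours
            if occ > 0 and like / occ < tau:
                empties = np.argwhere(grid == 0)
                k = empties[rng.integers(len(empties))]
                grid[k[0], k[1]], grid[i, j] = grid[i, j], 0
                moved = True
        if not moved:                        # no unhappy agents left
            break
    return grid

g = schelling()
print((g != 0).mean())  # occupied fraction; clusters visible if plotted
```

    The counterintuitive threshold phenomenon described above concerns exactly this kind of dynamics: raising tau does not monotonically raise the segregation of the absorbing configurations.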

  11. Why Open-Ended Survey Questions Are Unlikely to Support Rigorous Qualitative Insights.

    PubMed

    LaDonna, Kori A; Taylor, Taryn; Lingard, Lorelei

    2018-03-01

    Health professions education researchers are increasingly relying on a combination of quantitative and qualitative research methods to explore complex questions in the field. This important and necessary development, however, creates new methodological challenges that can affect both the rigor of the research process and the quality of the findings. One example is "qualitatively" analyzing free-text responses to survey or assessment instrument questions. In this Invited Commentary, the authors explain why analysis of such responses rarely meets the bar for rigorous qualitative research. While the authors do not discount the potential for free-text responses to enhance quantitative findings or to inspire new research questions, they caution that these responses rarely produce data rich enough to generate robust, stand-alone insights. The authors consider exemplars from health professions education research and propose strategies for treating free-text responses appropriately.

  12. IMPROVING ALTERNATIVES FOR ENVIRONMENTAL IMPACT ASSESSMENT. (R825758)

    EPA Science Inventory

    Environmental impact assessment (EIA), in the US, requires an objective and rigorous analysis of alternatives. Yet the choice of alternatives for that analysis can be subjective and arbitrary. Alternatives often reflect narrow project objectives, agency agendas, and predilecti...

  13. FORMAL SCENARIO DEVELOPMENT FOR ENVIRONMENTAL IMPACT ASSESSMENT STUDIES

    EPA Science Inventory

    Scenario analysis is a process of evaluating possible future events through the consideration of alternative plausible (though not equally likely) outcomes (scenarios). The analysis is designed to enable improved decision-making and assessment through a more rigorous evaluation o...

  14. Theoretical rationale for music selection in oncology intervention research: an integrative review.

    PubMed

    Burns, Debra S

    2012-01-01

    Music-based interventions have helped patients with cancer improve their quality of life, decrease treatment related distress, and manage pain. However, quantitative findings from music intervention studies are inconsistent. The purpose of this review was to explore the theoretical underpinnings for the selection of the music stimuli used to influence targeted outcomes. It was hypothesized that disparate findings were due in part to the atheoretical nature of music selection and the resulting diversity in music stimuli between and within studies. A systematic research synthesis including a comprehensive database and reference list search resulted in 22 studies. Included studies were compiled into two tables cataloging intervention theory, intervention content, and outcomes. A majority of studies did not provide a rationale or intervention theory for the delivery of music or choice of outcomes. Recorded music was the most common delivery method, but the specific music was rarely included within the report. Only two studies that included a theoretical framework reported null results on at least some of the outcomes. Null results are partially explained by an incomplete or mismatch in intervention theory and music selection and delivery. While the inclusion of an intervention theory does not guarantee positive results, including a theoretical rationale for the use of music, particular therapeutic processes or mechanisms, and the specifics of how music is selected and delivered increases scientific rigor and the probability of clinical translation.

  15. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence

    PubMed Central

    Kelly, David; Majda, Andrew J.; Tong, Xin T.

    2015-01-01

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature. PMID:26261335
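
    Since the abstract describes the ensemble Kalman analysis step only in words, a minimal sketch may help. It assumes a linear observation operator and Gaussian observation noise, uses the textbook perturbed-observations update rather than the paper's forecast model, and all dimensions and values are illustrative.

        # Minimal perturbed-observations EnKF analysis step (illustrative only;
        # the paper's forecast model and parameters are not reproduced here).
        import numpy as np

        def enkf_analysis(ensemble, y_obs, H, R, rng):
            """ensemble: (n_state, n_ens); y_obs: (n_obs,); H: (n_obs, n_state)."""
            n_state, n_ens = ensemble.shape
            A = ensemble - ensemble.mean(axis=1, keepdims=True)   # anomalies
            P = A @ A.T / (n_ens - 1)                             # sample covariance
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)          # Kalman gain
            # Perturbing the observations keeps the analysis spread consistent.
            y_pert = y_obs[:, None] + rng.multivariate_normal(
                np.zeros(len(y_obs)), R, size=n_ens).T
            return ensemble + K @ (y_pert - H @ ensemble)

        rng = np.random.default_rng(0)
        ens = rng.normal(size=(10, 5))          # 10 state variables, 5 members
        H, R = np.eye(3, 10), 0.1 * np.eye(3)   # observe the first 3 states
        updated = enkf_analysis(ens, rng.normal(size=3), H, R, rng)

    The catastrophic divergence studied in the paper is a property of exactly this kind of update applied to a particular nonlinear forecast model, not a flaw in the linear algebra above.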

  16. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.

    PubMed

    Kelly, David; Majda, Andrew J; Tong, Xin T

    2015-08-25

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature.

  17. Developing inter-professional learning: tactics, teamwork and talk.

    PubMed

    Begley, Cecily M

    2009-04-01

Teamwork and collaboration between all health professionals results in high quality clinical care, and increased job satisfaction for staff. Encouraging inter-professional learning (IPL) may be advantageous in developing more effective teams. There is little rigorous research in this area, but many small uncontrolled studies do demonstrate positive results. IPL involves structured learning opportunities that enhance problem-solving abilities and conflict resolution. It should be clearly differentiated from shared teaching (or multidisciplinary/multiprofessional learning), where common content is taught to many professions without any intention to develop interaction. To counteract the sometimes negative attitudes in both students and staff, educators need to commence IPL early in the programme, base it in both theoretical and clinical placements and ensure that it is valued and assessed. Difficulties with timetabling and accommodation need to be solved prior to commencement. A facilitator should be employed, and a team of committed lecturers developed, with an emphasis on teamwork and the discouragement of individualism. Opportunities for student interaction and ways of improving group dynamics within non-threatening learning environments should be sought, and instances of conflict embraced and resolved. Future IPL programmes should be rigorously evaluated and may demonstrate enhanced inter-professional relationships and improved quality of patient/client care.

  18. Theory of the deformation of aligned polyethylene.

    PubMed

    Hammad, A; Swinburne, T D; Hasan, H; Del Rosso, S; Iannucci, L; Sutton, A P

    2015-08-08

    Solitons are proposed as the agents of plastic and viscoelastic deformation in aligned polyethylene. Interactions between straight, parallel molecules are mapped rigorously onto the Frenkel-Kontorova model. It is shown that these molecular interactions distribute an applied load between molecules, with a characteristic transfer length equal to the soliton width. Load transfer leads to the introduction of tensile and compressive solitons at the chain ends to mark the onset of plasticity at a well-defined yield stress, which is much less than the theoretical pull-out stress. Interaction energies between solitons and an equation of motion for solitons are derived. The equation of motion is based on Langevin dynamics and the fluctuation-dissipation theorem and it leads to the rigorous definition of an effective mass for solitons. It forms the basis of a soliton dynamics in direct analogy to dislocation dynamics. Close parallels are drawn between solitons in aligned polymers and dislocations in crystals, including the configurational force on a soliton. The origins of the strain rate and temperature dependencies of the viscoelastic behaviour are discussed in terms of the formation energy of solitons. A failure mechanism is proposed involving soliton condensation under a tensile load.
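
    The abstract maps molecular interactions onto the Frenkel-Kontorova model and gives solitons a Langevin equation of motion. The sketch below integrates a generic damped, thermalized Frenkel-Kontorova chain seeded with one kink; the coupling, substrate strength, friction, and temperature are invented illustrative values, and the integrator is plain Euler-Maruyama, not the paper's derivation.

        # Generic Frenkel-Kontorova chain under Langevin dynamics (toy values).
        import numpy as np

        def fk_langevin_step(u, v, dt, k=1.0, V0=0.2, a=1.0, gamma=0.5,
                             kT=0.05, m=1.0, rng=np.random.default_rng(1)):
            """One Euler-Maruyama Langevin step for chain displacements u."""
            # Harmonic coupling to neighbours plus sinusoidal substrate force
            # (periodic boundary; the wrap-around step forms a compensating antikink).
            f = k * (np.roll(u, 1) - 2*u + np.roll(u, -1)) \
                - V0 * (2*np.pi/a) * np.sin(2*np.pi*u/a)
            # Fluctuation-dissipation: noise variance tied to friction and kT.
            v = v + dt*(f - gamma*v)/m \
                + np.sqrt(2*gamma*kT*dt/m) * rng.normal(size=u.size)
            return u + dt*v, v

        u = np.zeros(200); u[100:] += 1.0   # seed one kink (a "tensile soliton")
        v = np.zeros_like(u)
        for _ in range(1000):
            u, v = fk_langevin_step(u, v, dt=0.01)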

  19. Validation of Normalizations, Scaling, and Photofading Corrections for FRAP Data Analysis

    PubMed Central

    Kang, Minchul; Andreani, Manuel; Kenworthy, Anne K.

    2015-01-01

Fluorescence Recovery After Photobleaching (FRAP) has been a versatile tool to study transport and reaction kinetics in live cells. Since the fluorescence data generated by fluorescence microscopy are in a relative scale, a wide variety of scalings and normalizations are used in quantitative FRAP analysis. Scaling and normalization are often required to account for inherent properties of diffusing biomolecules of interest or photochemical properties of the fluorescent tag such as mobile fraction or photofading during image acquisition. In some cases, scaling and normalization are also used for computational simplicity. However, to the best of our knowledge, the validity of those various forms of scaling and normalization has not been studied in a rigorous manner. In this study, we investigate the validity of various scalings and normalizations that have appeared in the literature to calculate mobile fractions and correct for photofading and assess their consistency with FRAP equations. As a test case, we consider linear or affine scaling of normal or anomalous diffusion FRAP equations in combination with scaling for immobile fractions. We also consider exponential scaling of either FRAP equations or FRAP data to correct for photofading. Using a combination of theoretical and experimental approaches, we show that compatible scaling schemes should be applied in the correct sequential order; otherwise, erroneous results may be obtained. We propose a hierarchical workflow to carry out FRAP data analysis and discuss the broader implications of our findings for FRAP data analysis using a variety of kinetic models. PMID:26017223
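
    As a concrete illustration of ordering the corrections, here is one common sequence: photofading correction via a reference trace first, then pre-bleach normalization, then full-scale scaling for the immobile fraction. The variable names and the double-normalization form are assumptions of this sketch, not the authors' workflow.

        # One common FRAP normalization ordering (sketch; names are illustrative).
        import numpy as np

        def normalize_frap(f_roi, f_ref, pre, post):
            """f_roi: bleached-ROI trace; f_ref: reference trace; pre: slice of
            pre-bleach frames; post: index of the first post-bleach frame."""
            fade = f_ref / f_ref[pre].mean()        # photofading estimate
            f_corr = f_roi / fade                   # correct fading first
            f_norm = f_corr / f_corr[pre].mean()    # pre-bleach level -> 1
            f0, f_inf = f_norm[post], f_norm[-1]
            mobile_fraction = (f_inf - f0) / (1.0 - f0)   # full-scale scaling
            return f_norm, mobile_fraction

    Reversing the fading correction and the full-scale scaling changes the recovered mobile fraction, which is the kind of ordering error the paper warns against.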

  20. Orthogonal basis with a conicoid first mode for shape specification of optical surfaces.

    PubMed

    Ferreira, Chelo; López, José L; Navarro, Rafael; Sinusía, Ester Pérez

    2016-03-07

A rigorous and powerful theoretical framework is proposed to obtain systems of orthogonal functions (or shape modes) to represent optical surfaces. The method is general so it can be applied to different initial shapes and different polynomials. Here we present results for surfaces with circular apertures when the first basis function (mode) is a conicoid. The system for aspheres with rotational symmetry is obtained applying an appropriate change of variables to Legendre polynomials, whereas the system for the general freeform case is obtained applying a similar procedure to spherical harmonics. Numerical comparisons with standard systems, such as Forbes and Zernike polynomials, are performed and discussed.
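
    The general recipe (fix a first mode, then orthogonalize the remaining terms against it under the aperture inner product) can be sketched numerically. The conic constant, curvature, and radial grid below are toy assumptions, and the paper's actual construction is analytic via a change of variables, not numerical Gram-Schmidt.

        # Orthogonal basis with a prescribed conicoid first mode (numerical sketch).
        import numpy as np

        r = np.linspace(0.0, 1.0, 2001)
        w = r                                  # area weight on a circular aperture

        def inner(f, g):
            return np.trapz(f * g * w, r)      # radial inner product

        c, kappa = 1.0, -0.5                   # toy curvature and conic constant
        sag = c * r**2 / (1 + np.sqrt(1 - (1 + kappa) * c**2 * r**2))

        basis = [sag / np.sqrt(inner(sag, sag))]          # conicoid first mode
        for n in range(1, 6):                              # then r^(2n) terms
            f = r**(2 * n)
            for b in basis:
                f = f - inner(f, b) * b                    # remove projections
            basis.append(f / np.sqrt(inner(f, f)))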

  1. Social Norms: Do We Love Norms Too Much?

    PubMed

    Bell, David C; Cox, Mary L

    2015-03-01

    Social norms are often cited as the cause of many social phenomena, especially as an explanation for prosocial family and relationship behaviors. And yet maybe we love the idea of social norms too much, as suggested by our failure to subject them to rigorous test. Compared to the detail in social norms theoretical orientations, there is very little detail in tests of normative theories. To provide guidance to researchers who invoke social norms as explanations, we catalog normative orientations that have been proposed to account for consistent patterns of action. We call on researchers to conduct tests of normative theories and the processes such theories assert.

  2. Structure of the Stern layer in Phospholipid Systems

    NASA Astrophysics Data System (ADS)

    Vangaveti, Sweta; Travesset, Alex

    2011-03-01

The structure of the Stern layer in phospholipid systems results from a subtle competition of salt concentration, ionic valence, specific ionic-phospholipid interactions and pH. It becomes very challenging to develop a rigorous theory that encompasses all these effects, yet its understanding is extremely relevant for both model and biological systems, as the structure of the Stern layer determines the interactions of phospholipids with proteins or electrostatic phase separation (rafts). In this talk we will present our theoretical model for the Stern layer and discuss how all these effects are included. Particular emphasis is placed on phosphoinositides and phosphatidic acid. This work is supported by grant NSF DMR-0748475.

  3. Social Norms: Do We Love Norms Too Much?

    PubMed Central

    Bell, David C.; Cox, Mary L.

    2014-01-01

    Social norms are often cited as the cause of many social phenomena, especially as an explanation for prosocial family and relationship behaviors. And yet maybe we love the idea of social norms too much, as suggested by our failure to subject them to rigorous test. Compared to the detail in social norms theoretical orientations, there is very little detail in tests of normative theories. To provide guidance to researchers who invoke social norms as explanations, we catalog normative orientations that have been proposed to account for consistent patterns of action. We call on researchers to conduct tests of normative theories and the processes such theories assert. PMID:25937833

  4. Light management in perovskite solar cells and organic LEDs with microlens arrays

    DOE PAGES

    Peer, Akshit; Biswas, Rana; Park, Joong -Mok; ...

    2017-04-28

Here, we demonstrate enhanced absorption in solar cells and enhanced light emission in OLEDs by light interaction with a periodically structured microlens array. We simulate n-i-p perovskite solar cells with a microlens at the air-glass interface, with rigorous scattering matrix simulations. The microlens focuses light in nanoscale regions within the absorber layer, enhancing the solar cell. An optimal period of ~700 nm and microlens height of ~800-1000 nm provide an absorption (photocurrent) enhancement of 6% (6.3%). An external polymer microlens array on the air-glass side of the OLED generates experimental and theoretical enhancements >100%, by outcoupling trapped modes in the glass substrate.

  5. Experimental Observation and Theoretical Description of Multisoliton Fission in Shallow Water

    NASA Astrophysics Data System (ADS)

    Trillo, S.; Deng, G.; Biondini, G.; Klein, M.; Clauss, G. F.; Chabchoub, A.; Onorato, M.

    2016-09-01

    We observe the dispersive breaking of cosine-type long waves [Phys. Rev. Lett. 15, 240 (1965)] in shallow water, characterizing the highly nonlinear "multisoliton" fission over variable conditions. We provide new insight into the interpretation of the results by analyzing the data in terms of the periodic inverse scattering transform for the Korteweg-de Vries equation. In a wide range of dispersion and nonlinearity, the data compare favorably with our analytical estimate, based on a rigorous WKB approach, of the number of emerging solitons. We are also able to observe experimentally the universal Fermi-Pasta-Ulam recurrence in the regime of moderately weak dispersion.

  6. Multiple p-n junction subwavelength gratings for transmission-mode electro-optic modulators

    PubMed Central

    Lee, Ki Young; Yoon, Jae Woong; Song, Seok Ho; Magnusson, Robert

    2017-01-01

We propose a free-space electro-optic transmission modulator based on multiple p-n-junction semiconductor subwavelength gratings. The proposed device operates with a high-Q guided-mode resonance undergoing an electro-optic resonance shift due to direct electrical control. Using rigorous electrical and optical modeling methods, we theoretically demonstrate a modulation depth of 84%, an on-state efficiency of 85%, and an on-off extinction ratio of 19 dB at 1,550 nm wavelength under electrical control signals within a favorably low bias voltage range from −4 V to +1 V. This functionality operates in the transmission mode and is sustainable in the high-speed operation regime, up to a 10-GHz-scale modulation bandwidth in principle. The theoretical performance prediction is remarkably advantageous over plasmonic tunable metasurfaces in the power-efficiency and absolute modulation-depth aspects. Therefore, further experimental study is of great interest for creating practical-level metasurface components in various application areas. PMID:28417962

  7. The thermodynamics of dense granular flow and jamming

    NASA Astrophysics Data System (ADS)

    Lu, Shih Yu

The scope of the thesis is to propose, based on experimental evidence and theoretical validation, a quantifiable connection between systems that exhibit the jamming phenomenon. When jammed, some materials that flow are able to resist deformation so that they appear solid-like on the laboratory scale. But unlike ordinary fusion, which has a critically defined criterion in pressure and temperature, jamming occurs under a wide range of conditions. These conditions have been rigorously investigated, but at the moment no self-consistent framework can apply to grains, foam and colloids that may have suddenly ceased to flow. To quantify the jamming behavior, a constitutive model of dense granular flows is deduced from shear-flow experiments. The empirical equations are then generalized, via a thermodynamic approach, into an equation-of-state for jamming. Notably, the unifying theory also predicts the experimental data on the behavior of molecular glassy liquids. This analogy paves a crucial road map for a unifying theoretical framework in condensed matter, for example, ranging from sand to fire retardants to toothpaste.

  8. An Informative Interpretation of Decision Theory: The Information Theoretic Basis for Signal-to-Noise Ratio and Log Likelihood Ratio

    DOE PAGES

    Polcari, J.

    2013-08-16

The signal processing concept of signal-to-noise ratio (SNR), in its role as a performance measure, is recast within the more general context of information theory, leading to a series of useful insights. Establishing generalized SNR (GSNR) as a rigorous information theoretic measure inherent in any set of observations significantly strengthens its quantitative performance pedigree while simultaneously providing a specific definition under general conditions. This directly leads to consideration of the log likelihood ratio (LLR): first, as the simplest possible information-preserving transformation (i.e., signal processing algorithm) and subsequently, as an absolute, comparable measure of information for any specific observation exemplar. Furthermore, the information accounting methodology that results permits practical use of both GSNR and LLR as diagnostic scalar performance measurements, directly comparable across alternative system/algorithm designs, applicable at any tap point within any processing string, in a form that is also comparable with the inherent performance bounds due to information conservation.
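
    For a textbook stand-in problem (a known signal in white Gaussian noise), the LLR and its link to SNR can be made concrete; the detection setup below is an assumption for illustration, not the report's application. In this toy problem the expected LLR under the signal-present hypothesis equals half the SNR in nats.

        # LLR for detecting a known signal in white Gaussian noise (toy example).
        import numpy as np

        def llr_known_signal(x, s, sigma2):
            """log p(x|H1)/p(x|H0) for x = s + n (H1) vs x = n (H0), n ~ N(0, sigma2*I)."""
            return (x @ s - 0.5 * (s @ s)) / sigma2

        rng = np.random.default_rng(2)
        s = rng.normal(size=64)                 # known signal template
        sigma2 = 1.0
        snr = (s @ s) / sigma2                  # matched-filter SNR
        x = s + rng.normal(scale=np.sqrt(sigma2), size=64)   # one H1 exemplar
        print(llr_known_signal(x, s, sigma2), snr / 2)       # LLR fluctuates about SNR/2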

  9. Molecular Theory of Detonation Initiation: Insight from First Principles Modeling of the Decomposition Mechanisms of Organic Nitro Energetic Materials.

    PubMed

    Tsyshevsky, Roman V; Sharia, Onise; Kuklja, Maija M

    2016-02-19

    This review presents a concept, which assumes that thermal decomposition processes play a major role in defining the sensitivity of organic energetic materials to detonation initiation. As a science and engineering community we are still far away from having a comprehensive molecular detonation initiation theory in a widely agreed upon form. However, recent advances in experimental and theoretical methods allow for a constructive and rigorous approach to design and test the theory or at least some of its fundamental building blocks. In this review, we analyzed a set of select experimental and theoretical articles, which were augmented by our own first principles modeling and simulations, to reveal new trends in energetic materials and to refine known existing correlations between their structures, properties, and functions. Our consideration is intentionally limited to the processes of thermally stimulated chemical reactions at the earliest stage of decomposition of molecules and materials containing defects.

  10. Molecular Theory of Detonation Initiation: Insight from First Principles Modeling of the Decomposition Mechanisms of Organic Nitro Energetic Materials

    DOE PAGES

    Tsyshevsky, Roman; Sharia, Onise; Kuklja, Maija

    2016-02-19

Our review presents a concept, which assumes that thermal decomposition processes play a major role in defining the sensitivity of organic energetic materials to detonation initiation. As a science and engineering community we are still far away from having a comprehensive molecular detonation initiation theory in a widely agreed upon form. However, recent advances in experimental and theoretical methods allow for a constructive and rigorous approach to design and test the theory or at least some of its fundamental building blocks. In this review, we analyzed a set of select experimental and theoretical articles, which were augmented by our own first principles modeling and simulations, to reveal new trends in energetic materials and to refine known existing correlations between their structures, properties, and functions. Lastly, our consideration is intentionally limited to the processes of thermally stimulated chemical reactions at the earliest stage of decomposition of molecules and materials containing defects.

  11. Potential energy landscapes identify the information-theoretic nature of the epigenome

    PubMed Central

    Jenkinson, Garrett; Pujadas, Elisabet; Goutsias, John; Feinberg, Andrew P.

    2017-01-01

    Epigenetics studies genomic modifications carrying information independent of DNA sequence heritable through cell division. In 1940, Waddington coined the term “epigenetic landscape” as a metaphor for pluripotency and differentiation, but methylation landscapes have not yet been rigorously computed. By using principles of statistical physics and information theory, we derive epigenetic energy landscapes from whole-genome bisulfite sequencing data that allow us to quantify methylation stochasticity genome-wide using Shannon’s entropy and associate entropy with chromatin structure. Moreover, we consider the Jensen-Shannon distance between sample-specific energy landscapes as a measure of epigenetic dissimilarity and demonstrate its effectiveness for discerning epigenetic differences. By viewing methylation maintenance as a communications system, we introduce methylation channels and show that higher-order chromatin organization can be predicted from their informational properties. Our results provide a fundamental understanding of the information-theoretic nature of the epigenome that leads to a powerful approach for studying its role in disease and aging. PMID:28346445
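
    The two information-theoretic ingredients named above, Shannon entropy and the Jensen-Shannon distance, are standard and easy to illustrate on toy methylation-state distributions; the three-state probabilities below are invented for the example.

        # Toy illustration of Shannon entropy and Jensen-Shannon distance;
        # real landscapes in the paper come from whole-genome bisulfite data.
        import numpy as np
        from scipy.spatial.distance import jensenshannon

        p = np.array([0.70, 0.20, 0.10])    # methylation-state probabilities, sample A
        q = np.array([0.45, 0.30, 0.25])    # same states, sample B
        entropy_p = -(p * np.log2(p)).sum() # methylation stochasticity (bits)
        jsd = jensenshannon(p, q, base=2)   # sqrt of JS divergence, in [0, 1]
        print(entropy_p, jsd)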

  12. Late-Onset ADHD: Understanding the Evidence and Building Theoretical Frameworks.

    PubMed

    Caye, Arthur; Sibley, Margaret H; Swanson, James M; Rohde, Luis Augusto

    2017-11-13

    The traditional definition of Attention-Deficit/Hyperactivity Disorder (ADHD), assuming onset in childhood, has been challenged by evidence from four recent birth-cohort studies that reported most adults with ADHD lacked a childhood categorical ADHD diagnosis. Late onset of symptoms was evaluated in the long-term follow-up of the Multimodal Treatment study of ADHD (MTA). In most cases, other factors were present that discounted the late onset of ADHD symptoms and excluded the diagnosis of ADHD. We offer two theoretical frameworks for understanding the ADHD trajectory throughout the life cycle: (1) the complex phenotype model, and (2) the restricted phenotype model. We conclude that (a) late onset (after age 12) is a valid trajectory for ADHD symptoms, (b) the percentage of these cases with onset after adolescence is yet uncertain, and (c) the percentage meeting exclusion criteria for diagnosis of ADHD is influenced by the rigor of the methodology used to obtain evidence and whether or not DSM exclusionary criteria are applied.

  13. Unique geologic insights from "non-unique" gravity and magnetic interpretation

    USGS Publications Warehouse

    Saltus, R.W.; Blakely, R.J.

    2011-01-01

    Interpretation of gravity and magnetic anomalies is mathematically non-unique because multiple theoretical solutions are always possible. The rigorous mathematical label of "nonuniqueness" can lead to the erroneous impression that no single interpretation is better in a geologic sense than any other. The purpose of this article is to present a practical perspective on the theoretical non-uniqueness of potential-field interpretation in geology. There are multiple ways to approach and constrain potential-field studies to produce significant, robust, and definitive results. The "non-uniqueness" of potential-field studies is closely related to the more general topic of scientific uncertainty in the Earth sciences and beyond. Nearly all results in the Earth sciences are subject to significant uncertainty because problems are generally addressed with incomplete and imprecise data. The increasing need to combine results from multiple disciplines into integrated solutions in order to address complex global issues requires special attention to the appreciation and communication of uncertainty in geologic interpretation.

  14. Wolf Attack Probability: A Theoretical Security Measure in Biometric Authentication Systems

    NASA Astrophysics Data System (ADS)

    Une, Masashi; Otsuka, Akira; Imai, Hideki

    This paper will propose a wolf attack probability (WAP) as a new measure for evaluating security of biometric authentication systems. The wolf attack is an attempt to impersonate a victim by feeding “wolves” into the system to be attacked. The “wolf” means an input value which can be falsely accepted as a match with multiple templates. WAP is defined as a maximum success probability of the wolf attack with one wolf sample. In this paper, we give a rigorous definition of the new security measure which gives strength estimation of an individual biometric authentication system against impersonation attacks. We show that if one reestimates using our WAP measure, a typical fingerprint algorithm turns out to be much weaker than theoretically estimated by Ratha et al. Moreover, we apply the wolf attack to a finger-vein-pattern based algorithm. Surprisingly, we show that there exists an extremely strong wolf which falsely matches all templates for any threshold value.
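
    Reading WAP as the maximum, over candidate inputs, of the probability that one input is falsely accepted against the enrolled templates suggests a direct empirical estimator. The matcher, threshold, and data below are placeholders, not the paper's fingerprint or finger-vein algorithms.

        # Empirical wolf-attack-probability estimate (toy matcher and data).
        import numpy as np

        def wolf_attack_probability(candidates, templates, match):
            """Max over candidate wolves of the fraction of templates matched."""
            return max(
                np.mean([match(c, t) for t in templates]) for c in candidates
            )

        match = lambda c, t, thr=0.8: np.linalg.norm(c - t) < thr   # toy matcher
        rng = np.random.default_rng(3)
        templates = rng.normal(size=(100, 8))                # enrolled users
        candidates = rng.normal(scale=0.1, size=(50, 8))     # near-centre "wolves"
        print(wolf_attack_probability(candidates, templates, match))

    A "strong wolf" in the paper's sense is a candidate for which the inner fraction approaches 1, i.e. a single input accepted against every template.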

  15. Status of rates and rate equations for thermal leptogenesis

    NASA Astrophysics Data System (ADS)

    Biondini, S.; Bödeker, D.; Brambilla, N.; Garny, M.; Ghiglieri, J.; Hohenegger, A.; Laine, M.; Mendizabal, S.; Millington, P.; Salvio, A.; Vairo, A.

    2018-02-01

    In many realizations of leptogenesis, heavy right-handed neutrinos play the main role in the generation of an imbalance between matter and antimatter in the early Universe. Hence, it is relevant to address quantitatively their dynamics in a hot and dense environment by taking into account the various thermal aspects of the problem at hand. The strong washout regime offers an interesting framework to carry out calculations systematically and reduce theoretical uncertainties. Indeed, any matter-antimatter asymmetry generated when the temperature of the hot plasma T exceeds the right-handed neutrino mass scale M is efficiently erased, and one can focus on the temperature window T ≪ M. We review recent progress in the thermal field theoretic derivation of the key ingredients for the leptogenesis mechanism: the right-handed neutrino production rate, the CP asymmetry in the heavy-neutrino decays and the washout rates. The derivation of evolution equations for the heavy-neutrino and lepton-asymmetry number densities, their rigorous formulation and applicability are also discussed.

  16. Establishing a Research Agenda for Understanding the Role and Impact of Mental Health Peer Specialists.

    PubMed

    Chinman, Matthew; McInnes, D Keith; Eisen, Susan; Ellison, Marsha; Farkas, Marianne; Armstrong, Moe; Resnick, Sandra G

    2017-09-01

    Mental health peer specialists are individuals with serious mental illnesses who receive training to use their lived experiences to help others with serious mental illnesses in clinical settings. This Open Forum discusses the state of the research for mental health peer specialists and suggests a research agenda to advance the field. Studies have suggested that peer specialists vary widely in their roles, settings, and theoretical orientations. Theories of action have been proposed, but none have been tested. Outcome studies have shown benefits of peer specialists; however, many studies have methodological shortcomings. Qualitative descriptions of peer specialists are plentiful but lack grounding in implementation science frameworks. A research agenda advancing the field could include empirically testing theoretical mechanisms of peer specialists, developing a measure of peer specialist fidelity, conducting more rigorous outcomes studies, involving peer specialists in executing the research, and assessing various factors that influence implementing peer specialist services and testing strategies that could address those factors.

  17. Development of rigor mortis is not affected by muscle volume.

    PubMed

    Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H

    2001-04-01

    There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.

  18. CSF analysis

    MedlinePlus

... A, Sancesario GM, Esposito Z, et al. Plasmin system of Alzheimer's disease: CSF analysis. J Neural Transm (Vienna).

  19. Optimum Laser Beam Characteristics for Achieving Smoother Ablations in Laser Vision Correction.

    PubMed

    Verma, Shwetabh; Hesser, Juergen; Arba-Mosquera, Samuel

    2017-04-01

Controversial opinions exist regarding the optimum laser beam characteristics for achieving smoother ablations in laser-based vision correction. The purpose of the study was to outline a rigorous simulation model for simulating the shot-by-shot ablation process. The impact of laser beam characteristics such as super Gaussian order, truncation radius, spot geometry, spot overlap, and lattice geometry was tested on ablation smoothness. Given the super Gaussian order, the theoretical beam profile was determined following the Lambert-Beer model. The intensity beam profile originating from an excimer laser was measured with a beam profiler camera. For both the measured and theoretical beam profiles, two spot geometries (round and square spots) were considered, and two types of lattices (reticular and triangular) were simulated with varying spot overlaps and ablated material (cornea or polymethylmethacrylate [PMMA]). The roughness in ablation was determined by the root-mean-square per square root of layer depth. Truncating the beam profile increases the roughness in ablation; Gaussian profiles theoretically result in smoother ablations; round spot geometries produce lower roughness in ablation than square geometries; triangular lattices theoretically produce lower roughness than reticular lattices; theoretically modeled beam profiles show lower roughness than the measured beam profile; and the simulated roughness in ablation on PMMA tends to be lower than on human cornea. For given input parameters, optimum parameters for minimizing the roughness have been found. Theoretically, the proposed model can be used for achieving smoothness with laser systems used for ablation processes at relatively low cost. This model may improve the quality of results and could be directly applied for improving postoperative surface quality.
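
    A minimal sketch of the beam-profile ingredient, under one common super-Gaussian convention; the waist, order, and truncation radius are illustrative, and summing such spots over a lattice (not shown) is what the full simulation model does.

        # Super-Gaussian spot fluence with optional hard truncation (sketch).
        import numpy as np

        def super_gaussian(r, w0=0.5, p=2, r_trunc=None):
            """Fluence exp(-2 (r/w0)^(2p)); p = 1 is Gaussian, large p is flat-top."""
            f = np.exp(-2.0 * (r / w0) ** (2 * p))
            if r_trunc is not None:
                f = np.where(r <= r_trunc, f, 0.0)   # hard truncation of the wings
            return f

        r = np.linspace(0.0, 1.0, 500)
        gauss = super_gaussian(r, p=1)               # smooth wings, smoother ablation
        flat = super_gaussian(r, p=4, r_trunc=0.6)   # truncated flat-top, rougher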

  20. Quality and rigor of the concept mapping methodology: a pooled study analysis.

    PubMed

    Rosas, Scott R; Kane, Mary

    2012-05-01

    The use of concept mapping in research and evaluation has expanded dramatically over the past 20 years. Researchers in academic, organizational, and community-based settings have applied concept mapping successfully without the benefit of systematic analyses across studies to identify the features of a methodologically sound study. Quantitative characteristics and estimates of quality and rigor that may guide for future studies are lacking. To address this gap, we conducted a pooled analysis of 69 concept mapping studies to describe characteristics across study phases, generate specific indicators of validity and reliability, and examine the relationship between select study characteristics and quality indicators. Individual study characteristics and estimates were pooled and quantitatively summarized, describing the distribution, variation and parameters for each. In addition, variation in the concept mapping data collection in relation to characteristics and estimates was examined. Overall, results suggest concept mapping yields strong internal representational validity and very strong sorting and rating reliability estimates. Validity and reliability were consistently high despite variation in participation and task completion percentages across data collection modes. The implications of these findings as a practical reference to assess the quality and rigor for future concept mapping studies are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Using qualitative mixed methods to study small health care organizations while maximising trustworthiness and authenticity.

    PubMed

    Phillips, Christine B; Dwan, Kathryn; Hepworth, Julie; Pearce, Christopher; Hall, Sally

    2014-11-19

The primary health care sector delivers the majority of health care in western countries through small, community-based organizations. However, research into these healthcare organizations is limited by the time constraints and pressure facing them, and the concern by staff that research is peripheral to their work. We developed Q-RARA (Qualitative Rapid Appraisal, Rigorous Analysis) to study small, primary health care organizations in a way that is efficient, acceptable to participants and methodologically rigorous. Q-RARA comprises a site visit, semi-structured interviews, structured and unstructured observations, photographs, floor plans, and social scanning data. Data were collected over the course of one day per site and the qualitative analysis was integrated and iterative. We found Q-RARA to be acceptable to participants and effective in collecting data on organizational function in multiple sites without disrupting the practice, while maintaining a balance between speed and trustworthiness. The Q-RARA approach is capable of providing a richly textured, rigorous understanding of the processes of the primary care practice while also allowing researchers to develop an organizational perspective. For these reasons the approach is recommended for use in small-scale organizations both within and outside the primary health care sector.

  2. Preserving pre-rigor meat functionality for beef patty production.

    PubMed

    Claus, J R; Sørheim, O

    2006-06-01

    Three methods were examined for preserving pre-rigor meat functionality in beef patties. Hot-boned semimembranosus muscles were processed as follows: (1) pre-rigor ground, salted, patties immediately cooked; (2) pre-rigor ground, salted and stored overnight; (3) pre-rigor injected with brine; and (4) post-rigor ground and salted. Raw patties contained 60% lean beef, 19.7% beef fat trim, 1.7% NaCl, 3.6% starch, and 15% water. Pre-rigor processing occurred at 3-3.5h postmortem. Patties made from pre-rigor ground meat had higher pH values; greater protein solubility; firmer, more cohesive, and chewier texture; and substantially lower cooking losses than the other treatments. Addition of salt was sufficient to reduce the rate and extent of glycolysis. Brine injection of intact pre-rigor muscles resulted in some preservation of the functional properties but not as pronounced as with salt addition to pre-rigor ground meat.

3. Investigation of possible observable effects in a proposed theory of physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Friedan, Daniel

    2015-03-31

The work supported by this grant produced rigorous mathematical results on what is possible in quantum field theory. Quantum field theory is the well-established mathematical language for fundamental particle physics, for critical phenomena in condensed matter physics, and for Physical Mathematics (the numerous branches of Mathematics that have benefitted from ideas, constructions, and conjectures imported from Theoretical Physics). Proving rigorous constraints on what is possible in quantum field theories thus guides the field, puts actual constraints on what is physically possible in physical or mathematical systems described by quantum field theories, and saves the community the effort of trying to do what is proved impossible. Results were obtained in two dimensional qft (describing, e.g., quantum circuits) and in higher dimensional qft. Rigorous bounds were derived on basic quantities in 2d conformal field theories, i.e., in 2d critical phenomena. Conformal field theories are the basic objects in quantum field theory, the scale invariant theories describing renormalization group fixed points from which all qfts flow. The first known lower bounds on the 2d boundary entropy were found. This is the entropy (information content) in junctions in critical quantum circuits. For dimensions d > 2, a no-go theorem was proved on the possibilities of Cauchy fields, which are the analogs of the holomorphic fields in d = 2 dimensions, which have had enormously useful applications in Physics and Mathematics over the last four decades. This closed off the possibility of finding analogously rich theories in dimensions above 2. The work of two postdoctoral research fellows was partially supported by this grant. Both have gone on to tenure track positions.

  4. The Role of Data Analysis Software in Graduate Programs in Education and Post-Graduate Research

    ERIC Educational Resources Information Center

    Harwell, Michael

    2018-01-01

    The importance of data analysis software in graduate programs in education and post-graduate educational research is self-evident. However the role of this software in facilitating supererogated statistical practice versus "cookbookery" is unclear. The need to rigorously document the role of data analysis software in students' graduate…

  5. Using Content Analysis to Examine the Verbal or Written Communication of Stakeholders within Early Intervention.

    ERIC Educational Resources Information Center

    Johnson, Lawrence J.; LaMontagne, M. J.

    1993-01-01

    This paper describes content analysis as a data analysis technique useful for examining written or verbal communication within early intervention. The article outlines the use of referential or thematic recording units derived from interview data, identifies procedural guidelines, and addresses issues of rigor and validity. (Author/JDD)

  6. Integrated Sensitivity Analysis Workflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest J.; Hoffman, Edward L.; Gibson, Marcus J.

    2014-08-01

    Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.

  7. A Meta-Analysis of Single-Subject Research on Behavioral Momentum to Enhance Success in Students with Autism.

    PubMed

    Cowan, Richard J; Abel, Leah; Candel, Lindsay

    2017-05-01

We conducted a meta-analysis of single-subject research studies investigating the effectiveness of antecedent strategies grounded in behavioral momentum for improving compliance and on-task performance for students with autism. First, we assessed the research rigor of those studies meeting our inclusionary criteria. Next, in order to apply a universal metric to help determine the effectiveness of this category of antecedent strategies investigated via single-subject research methods, we calculated effect sizes via omnibus improvement rate differences (IRDs). Outcomes provide additional support for behavioral momentum, especially interventions incorporating the high-probability command sequence. Implications for research and practice are discussed, including the consideration of how single-subject research is systematically reviewed to assess the rigor of studies and assist in determining overall intervention effectiveness.
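
    As a rough illustration of the IRD metric named above, the sketch below uses a simplified nonoverlap criterion (a treatment point counts as improved if it exceeds every baseline point). Published IRD procedures instead remove the minimal set of overlapping points, so this is an approximation, not the review's exact computation.

        # Simplified IRD sketch (assumed criterion: a treatment point "improves"
        # if it exceeds the baseline maximum; real IRD removes minimal overlap).
        def improvement_rate_difference(baseline, treatment):
            ceiling = max(baseline)
            ir_treatment = sum(x > ceiling for x in treatment) / len(treatment)
            ir_baseline = 0.0   # under this criterion no baseline point improves
            return ir_treatment - ir_baseline

        print(improvement_rate_difference([2, 3, 4], [5, 6, 4, 7]))  # 0.75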

  8. A Research Communication Brief: Gluten Analysis in Beef Samples Collected Using a Rigorous, Nationally Representative Sampling Protocol Confirms That Grain-Finished Beef Is Naturally Gluten-Free.

    PubMed

    McNeill, Shalene H; Cifelli, Amy M; Roseland, Janet M; Belk, Keith E; Woerner, Dale R; Gehring, Kerri B; Savell, Jeffrey W; Brooks, J Chance; Thompson, Leslie D

    2017-08-25

    Knowing whether or not a food contains gluten is vital for the growing number of individuals with celiac disease and non-celiac gluten sensitivity. Questions have recently been raised about whether beef from conventionally-raised, grain-finished cattle may contain gluten. To date, basic principles of ruminant digestion have been cited in support of the prevailing expert opinion that beef is inherently gluten-free. For this study, gluten analysis was conducted in beef samples collected using a rigorous nationally representative sampling protocol to determine whether gluten was present. The findings of our research uphold the understanding of the principles of gluten digestion in beef cattle and corroborate recommendations that recognize beef as a naturally gluten-free food.

  9. From psycho-social theory to sustainable classroom practice: developing a research-based teacher-delivered sex education programme.

    PubMed

    Wight, D; Abraham, C

    2000-02-01

    This paper describes the development of a theoretically based sex education programme currently undergoing a randomized controlled trial in the UK. It considers some of the practical difficulties involved in translating research-based conclusions into acceptable, replicable and potentially effective classroom lessons. The discussion acknowledges that the implications of social psychological research and the requirements of rigorous evaluation may conflict with accepted principles inherent in current sex education practice. It also emphasizes that theoretical ideas must be carefully embedded in lessons which are informed by an awareness of classroom culture, and the needs and skills of teachers. For example, the use of same-sex student groups to reflect on the gendered construction of sexuality may be problematic. Materials must be tailored to recipients' circumstances, which may require substituting for limited experience with the use of detailed scripts and scenarios. Furthermore, role-play techniques for sexual negotiation that work elsewhere may not be effective in the UK. The use of trigger video sessions and other techniques are recommended. Finally, the problems involved in promoting condom-related skills are discussed. The paper concludes that, if an intervention is to be sustainable beyond the research stage, it must be designed to overcome such problems while remaining theoretically informed.

  10. Interfacing theories of program with theories of evaluation for advancing evaluation practice: Reductionism, systems thinking, and pragmatic synthesis.

    PubMed

    Chen, Huey T

    2016-12-01

Theories of program and theories of evaluation form the foundation of program evaluation theories. Theories of program reflect assumptions on how to conceptualize an intervention program for evaluation purposes, while theories of evaluation reflect assumptions on how to design useful evaluation. These two types of theories are related, but often discussed separately. This paper attempts to use three theoretical perspectives (reductionism, systems thinking, and pragmatic synthesis) to interface them and discuss the implications for evaluation practice. Reductionism proposes that an intervention program can be broken into crucial components for rigorous analyses; systems thinking views an intervention program as dynamic and complex, requiring a holistic examination. In spite of their contributions, reductionism and systems thinking represent the extreme ends of a theoretical spectrum; many real-world programs, however, may fall in the middle. Pragmatic synthesis is being developed to serve these moderate-complexity programs. These three theoretical perspectives have their own strengths and challenges. Knowledge of these three perspectives and their evaluation implications can provide a better guide for designing fruitful evaluations, improving the quality of evaluation practice, informing potential areas for developing cutting-edge evaluation approaches, and contributing to advancing program evaluation toward a mature applied science. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. What we know about the purpose, theoretical foundation, scope and dimensionality of existing self-management measurement tools: A scoping review.

    PubMed

    Packer, Tanya L; Fracini, America; Audulv, Åsa; Alizadeh, Neda; van Gaal, Betsie G I; Warner, Grace; Kephart, George

    2018-04-01

To identify self-report, self-management measures for adults with chronic conditions, and describe their purpose, theoretical foundation, dimensionality (multi versus uni), and scope (generic versus condition specific). A search of four databases (8479 articles) resulted in a scoping review of 28 self-management measures. Although authors identified tools as measures of self-management, wide variation existed in the constructs measured, purpose, and theoretical foundations. Subscales on 13 multidimensional tools collectively measure domains of self-management relevant to clients; however, no one tool's subscales cover all domains. Viewing self-management as a complex, multidimensional whole demonstrated that existing measures assess different, related aspects of self-management. Activities and social roles, though important to patients, are rarely measured. Measures with the capacity to quantify and distinguish aspects of self-management may promote tailored patient care. In selecting tools for research or assessment, the reason for development, definitions, and theories underpinning the measure should be scrutinized. Our ability to measure self-management must be rigorously mapped to provide comprehensive and system-wide care for clients with chronic conditions. Viewing self-management as a complex whole will help practitioners to understand the patient perspective and their contribution in supporting each individual patient. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Scientific Data Analysis Toolkit: A Versatile Add-in to Microsoft Excel for Windows

    ERIC Educational Resources Information Center

    Halpern, Arthur M.; Frye, Stephen L.; Marzzacco, Charles J.

    2018-01-01

    Scientific Data Analysis Toolkit (SDAT) is a rigorous, versatile, and user-friendly data analysis add-in application for Microsoft Excel for Windows (PC). SDAT uses the familiar Excel environment to carry out most of the analytical tasks used in data analysis. It has been designed for student use in manipulating and analyzing data encountered in…

13. On the Tracy-Widom β Distribution for β=6

    NASA Astrophysics Data System (ADS)

    Grava, Tamara; Its, Alexander; Kapaev, Andrei; Mezzadri, Francesco

    2016-11-01

We study the Tracy-Widom distribution function for Dyson's β-ensemble with β = 6. The starting point of our analysis is the recent work of I. Rumanov where he produces a Lax-pair representation for the Bloemendal-Virág equation. The latter is a linear PDE which describes the Tracy-Widom functions corresponding to general values of β. Using his Lax pair, Rumanov derives an explicit formula for the Tracy-Widom β=6 function in terms of the second Painlevé transcendent and the solution of an auxiliary ODE. Rumanov also shows that this formula allows him to derive formally the asymptotic expansion of the Tracy-Widom function. Our goal is to make Rumanov's approach, and hence the asymptotic analysis it provides, rigorous. In this paper, the first in a series, we show that Rumanov's Lax pair can be interpreted as a certain gauge transformation of the standard Lax pair for the second Painlevé equation. This gauge transformation, however, contains functional parameters which are defined via an auxiliary nonlinear ODE equivalent to the auxiliary ODE of Rumanov's formula. The gauge interpretation of Rumanov's Lax pair allows us to highlight the steps of Rumanov's original method that need rigorous justification in order to make the method complete. We provide a rigorous justification of one of these steps. Namely, we prove that the Painlevé function involved in Rumanov's formula is indeed, as Rumanov suggested, the Hastings-McLeod solution of the second Painlevé equation. The key issue, which we also discuss and which is still open, is the question of the integrability of the auxiliary ODE in Rumanov's formula. We note that this question is crucial for the rigorous asymptotic analysis of the Tracy-Widom function. We also note that our work is a partial answer to one of the problems related to the β-ensembles formulated by Percy Deift during the June 2015 Montreal Conference on integrable systems.
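
    For reference, the Hastings-McLeod solution mentioned above is the distinguished solution of Painlevé II fixed by Airy-function asymptotics; these are standard definitions, stated here in the usual normalization rather than taken from the paper.

        % Painlevé II with the Hastings-McLeod boundary condition:
        \begin{equation}
          q''(s) = 2\,q(s)^{3} + s\,q(s), \qquad
          q(s) \sim \operatorname{Ai}(s) \quad (s \to +\infty).
        \end{equation}
        % For \beta = 2 this same solution yields the classical Tracy-Widom law:
        \begin{equation}
          F_{2}(s) = \exp\Bigl(-\int_{s}^{\infty} (x - s)\, q(x)^{2}\, dx\Bigr).
        \end{equation}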

  14. Causality analysis in business performance measurement system using system dynamics methodology

    NASA Astrophysics Data System (ADS)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of the causality have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using an econometric analysis, the Granger causality test, on 45 data points. However, well-established causality models were found to be insufficient, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for use in situations where historical data are insufficient. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts, using 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. Evidence of bidirectional causality, demonstrating significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity, and extreme-condition tests were conducted on the developed SD model to ensure its capability to mimic reality, and its robustness and validity as a platform for causality analysis. This study applied a theoretical service management model within the BSC domain to a practical situation, where very limited work has been done, using the SD methodology.
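
    A hedged sketch of the econometric step described above: a pairwise Granger causality test on 45 observations, here run on synthetic stand-ins for two linked scorecard measures (the real study used BSC measure data and the service profit chain propositions).

        # Pairwise Granger causality test on 45 synthetic observations (sketch).
        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(4)
        driver = rng.normal(size=45)                 # e.g. employee satisfaction
        effect = np.roll(driver, 2) + 0.5 * rng.normal(size=45)  # lagged response
        # Convention: tests whether the 2nd column Granger-causes the 1st.
        results = grangercausalitytests(np.column_stack([effect, driver]), maxlag=3)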

  15. Statistical testing and power analysis for brain-wide association study.

    PubMed

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, multiple-comparison correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and the false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons; thus, it can efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study. Our approach can identify altered functional connectivities in a major depression disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.
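
    A back-of-envelope calculation shows why naive correction is hopeless at connexel scale, and hence why the spatially informed random-field approach matters; the voxel count below is an illustrative assumption.

        # Scale of the connexel-wise multiple-comparison problem (toy numbers).
        n_voxels = 47_000                              # illustrative grey-matter voxel count
        n_connexels = n_voxels * (n_voxels - 1) // 2   # ~1.1e9 pairwise tests
        bonferroni_p = 0.05 / n_connexels              # ~4.5e-11 per-test threshold
        print(n_connexels, bonferroni_p)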

  16. Irrelevance of phase size in purification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, A.H.

    1988-11-03

Recently, Reis has suggested that it might be possible to remove a solute species completely from a small (or finely dispersed) phase by a reduction to some low but finite value of the chemical potential of that species in the medium surrounding the phase. Sciamanna and Prausnitz, while expressing some doubts about the rigor of the theoretical approach, used similar arguments to examine the possibility of obtaining ultrapurity in a small dispersed phase by equilibrium purification operations such as distillation and extraction. Here they demonstrate that Reis' original suggestion is incorrect. Furthermore, they show that, under well-defined and reasonable assumptions, the size of a phase has no influence on its purity.

  17. Reciprocal relations for transmission coefficients - Theory and application

    NASA Technical Reports Server (NTRS)

    Qu, Jianmin; Achenbach, Jan D.; Roberts, Ronald A.

    1989-01-01

    The authors present a rigorous proof of certain intuitively plausible reciprocal relations for time harmonic plane-wave transmission and reflection at the interface between a fluid and an anisotropic elastic solid. Precise forms of the reciprocity relations for the transmission coefficients and for the transmitted energy fluxes are derived, based on the reciprocity theorem of elastodynamics. It is shown that the reciprocity relations can be used in conjunction with measured values of peak amplitudes for transmission through a slab of the solid (water-solid-water) to obtain the water-solid coefficients. Experiments were performed for a slab of a unidirectional fiber-reinforced composite. Good agreement of the experimentally measured transmission coefficients with theoretical values was obtained.

  18. Ground-state hyperfine splitting for Rb, Cs, Fr, Ba+, and Ra+

    NASA Astrophysics Data System (ADS)

    Ginges, J. S. M.; Volotka, A. V.; Fritzsche, S.

    2017-12-01

We have systematically investigated the ground-state hyperfine structure of the alkali-metal atoms 87Rb, 133Cs, and 211Fr and the alkali-metal-like ions 135Ba+ and 225Ra+, which are of particular interest for parity violation studies. The quantum electrodynamic one-loop radiative corrections have been rigorously evaluated within an extended Furry picture employing core-Hartree and Kohn-Sham atomic potentials. Moreover, the effect of the nuclear magnetization distribution on the hyperfine structure intervals has been studied in detail and its uncertainty has been estimated. Finally, the theoretical description of the hyperfine structure has been completed with full many-body calculations performed in the all-orders correlation potential method.

  19. High-Contrast Gratings based Spoof Surface Plasmons

    NASA Astrophysics Data System (ADS)

    Li, Zhuo; Liu, Liangliang; Xu, Bingzheng; Ning, Pingping; Chen, Chen; Xu, Jia; Chen, Xinlei; Gu, Changqing; Qing, Quan

    2016-02-01

In this work, we explore the existence of spoof surface plasmons (SSPs) supported by deep-subwavelength high-contrast gratings (HCGs) on a perfect electric conductor plane. The dispersion relation of the HCGs-based SSPs is derived analytically by combining multimode network theory with a rigorous mode matching method; it has nearly the same form as, and can be degenerated into, that of the SSPs arising from deep-subwavelength metallic gratings (MGs). Numerical simulations validate the analytical dispersion relation and an effective medium approximation is also presented to obtain the same analytical dispersion formula. This work sets up a unified theoretical framework for SSPs and opens up new vistas in surface plasmon optics.

  20. Beam-splitter switches based on zenithal bistable liquid-crystal gratings.

    PubMed

    Zografopoulos, Dimitrios C; Beccherelli, Romeo; Kriezis, Emmanouil E

    2014-10-01

    The tunable optical diffractive properties of zenithal bistable nematic liquid-crystal gratings are theoretically investigated. The liquid-crystal orientation is rigorously solved via a tensorial formulation of the Landau-de Gennes theory and the optical transmission properties of the gratings are investigated via full-wave finite-element frequency-domain simulations. It is demonstrated that by proper design the two stable states of the grating can provide nondiffracting and diffracting operation, the latter with equal power splitting among different diffraction orders. An electro-optic switching mechanism, based on dual-frequency nematic materials, and its temporal dynamics are further discussed. Such gratings provide a solution towards tunable beam-steering and beam-splitting components with extremely low power consumption.

  1. Award for Distinguished Contributions to Research in Public Policy: Dorothy L. Espelage.

    PubMed

    2016-11-01

    APA's Award for Distinguished Contributions to Research in Public Policy is given to a psychologist who has made a distinguished empirical and/or theoretical contribution to research in public policy, either through a single extraordinary achievement or a lifetime of work. Dorothy L. Espelage is the 2016 recipient of this award for her exceptional work on bullying, gender, and school violence. "She is an outstanding rigorous researcher who uses the most sophisticated methods in assessing the effects of interventions designed to improve the social and emotional lives of children both within and outside of school." Espelage's citation, biography, and selected bibliography are presented here. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  2. Evaluating WHO Healthy Cities in Europe: issues and perspectives.

    PubMed

    de Leeuw, Evelyne

    2013-10-01

    In this introductory article, we situate the findings of the Phase IV evaluation effort of the WHO European Healthy Cities Network in its historical, evolutionary development. We review each of the contributions to this supplement in terms of the theoretical and methodological frameworks applied. Although the findings of each are both relevant and generated with a scholarly rigor appropriate to the context in which the evaluation took place, we find that it is precisely these contextual factors that have kept the research from reaching optimal quality. Any drawbacks in individual contributions cannot be attributed to their analysts and authors but relate to the complicated and evolving nature of the project. These factors are also reviewed.

  3. Interpretation of high-dimensional numerical results for the Anderson transition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suslov, I. M., E-mail: suslov@kapitza.ras.ru

    The existence of the upper critical dimension d_c2 = 4 for the Anderson transition is a rigorous consequence of the Bogoliubov theorem on renormalizability of φ⁴ theory. For d ≥ 4 dimensions, one-parameter scaling does not hold and all existing numerical data should be reinterpreted. These data are exhausted by the results for d = 4, 5 from scaling in quasi-one-dimensional systems and the results for d = 4, 5, 6 from level statistics. All these data are compatible with the theoretical scaling dependences obtained from Vollhardt and Wolfle's self-consistent theory of localization. The widespread viewpoint that d_c2 = ∞ is critically discussed.

  4. Quantum Approximate Methods for the Atomistic Modeling of Multicomponent Alloys. Chapter 7

    NASA Technical Reports Server (NTRS)

    Bozzolo, Guillermo; Garces, Jorge; Mosca, Hugo; Gargano, Pablo; Noebe, Ronald D.; Abel, Phillip

    2007-01-01

    This chapter describes the role of quantum approximate methods in the understanding of complex multicomponent alloys at the atomic level. The need to accelerate materials design programs based on economical and efficient modeling techniques provides the framework for the introduction of approximations and simplifications in otherwise rigorous theoretical schemes. As a promising example of the role that such approximate methods might have in the development of complex systems, the BFS method for alloys is presented and applied to Ru-rich Ni-base superalloys and also to the NiAl(Ti,Cu) system, highlighting the benefits that can be obtained from introducing simple modeling techniques to the investigation of such complex systems.

  5. Cost-Effectiveness Analysis of Early Reading Programs: A Demonstration with Recommendations for Future Research

    ERIC Educational Resources Information Center

    Hollands, Fiona M.; Kieffer, Michael J.; Shand, Robert; Pan, Yilin; Cheng, Henan; Levin, Henry M.

    2016-01-01

    We review the value of cost-effectiveness analysis for evaluation and decision making with respect to educational programs and discuss its application to early reading interventions. We describe the conditions for a rigorous cost-effectiveness analysis and illustrate the challenges of applying the method in practice, providing examples of programs…

  6. Feeding Problems and Nutrient Intake in Children with Autism Spectrum Disorders: A Meta-Analysis and Comprehensive Review of the Literature

    ERIC Educational Resources Information Center

    Sharp, William G.; Berry, Rashelle C.; McCracken, Courtney; Nuhu, Nadrat N.; Marvel, Elizabeth; Saulnier, Celine A.; Klin, Ami; Jones, Warren; Jaquess, David L.

    2013-01-01

    We conducted a comprehensive review and meta-analysis of research regarding feeding problems and nutrient status among children with autism spectrum disorders (ASD). The systematic search yielded 17 prospective studies involving a comparison group. Using rigorous meta-analysis techniques, we calculated the standardized mean difference (SMD) with…

  7. Can power-law scaling and neuronal avalanches arise from stochastic dynamics?

    PubMed

    Touboul, Jonathan; Destexhe, Alain

    2010-02-11

    The presence of self-organized criticality in biology is often evidenced by a power-law scaling of event size distributions, which can be measured by linear regression on logarithmic axes. We show here that such a procedure does not necessarily mean that the system exhibits self-organized criticality. We first provide an analysis of multisite local field potential (LFP) recordings of brain activity and show that event size distributions defined as negative LFP peaks can be close to power-law distributions. However, this result is not robust to change in detection threshold, or when tested using more rigorous statistical analyses such as the Kolmogorov-Smirnov test. Similar power-law scaling is observed for surrogate signals, suggesting that power-law scaling may be a generic property of thresholded stochastic processes. We next investigate this problem analytically, and show that, indeed, stochastic processes can produce spurious power-law scaling without the presence of underlying self-organized criticality. However, this power-law is only apparent in logarithmic representations, and does not survive more rigorous analysis such as the Kolmogorov-Smirnov test. The same analysis was also performed on an artificial network known to display self-organized criticality. In this case, both the graphical representations and the rigorous statistical analysis reveal with no ambiguity that the avalanche size is distributed as a power-law. We conclude that logarithmic representations can lead to spurious power-law scaling induced by the stochastic nature of the phenomenon. This apparent power-law scaling does not constitute a proof of self-organized criticality, which should be demonstrated by more stringent statistical tests.
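    A minimal sketch (not the authors' code; the AR(1) surrogate, detection threshold, and lower cutoff are illustrative assumptions) of the paper's central contrast: log-log regression on a histogram of thresholded-event sizes can report a convincingly straight line, while a maximum-likelihood exponent fit followed by a Kolmogorov-Smirnov test typically rejects the power law.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)

      # Surrogate "LFP": AR(1) noise standing in for a stochastic recording.
      n = 200_000
      eps = rng.standard_normal(n)
      x = np.empty(n)
      x[0] = 0.0
      for t in range(1, n):
          x[t] = 0.99 * x[t - 1] + eps[t]

      # Events: contiguous excursions below a detection threshold;
      # event size = integrated depth below the threshold.
      thr = -2.0 * x.std()
      below = x < thr
      d = np.diff(below.astype(int))
      starts = np.flatnonzero(d == 1) + 1
      ends = np.flatnonzero(d == -1) + 1
      if ends.size and starts.size and ends[0] < starts[0]:
          ends = ends[1:]
      m = min(starts.size, ends.size)
      sizes = np.array([(thr - x[s:e]).sum() for s, e in zip(starts[:m], ends[:m])])

      # (1) Linear regression on logarithmic axes: a deceptively good fit.
      counts, bins = np.histogram(sizes, bins=np.logspace(
          np.log10(sizes.min()), np.log10(sizes.max()), 30))
      centers = np.sqrt(bins[:-1] * bins[1:])
      keep = counts > 0
      fit = stats.linregress(np.log10(centers[keep]), np.log10(counts[keep]))
      print(f"log-log slope {fit.slope:.2f}, r^2 {fit.rvalue**2:.3f}")

      # (2) MLE exponent plus Kolmogorov-Smirnov test: the stricter check.
      s_min = np.quantile(sizes, 0.5)  # illustrative lower cutoff
      tail = sizes[sizes >= s_min]
      alpha = 1.0 + tail.size / np.log(tail / s_min).sum()
      ks = stats.kstest(tail, "pareto", args=(alpha - 1.0, 0.0, s_min))
      print(f"alpha {alpha:.2f}, KS p-value {ks.pvalue:.3g}")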

  8. Mobile Health Technology Evaluation

    PubMed Central

    Kumar, Santosh; Nilsen, Wendy J.; Abernethy, Amy; Atienza, Audie; Patrick, Kevin; Pavel, Misha; Riley, William T.; Shar, Albert; Spring, Bonnie; Spruijt-Metz, Donna; Hedeker, Donald; Honavar, Vasant; Kravitz, Richard; Lefebvre, R. Craig; Mohr, David C.; Murphy, Susan A.; Quinn, Charlene; Shusterman, Vladimir; Swendeman, Dallas

    2013-01-01

    Creative use of new mobile and wearable health information and sensing technologies (mHealth) has the potential to reduce the cost of health care and improve well-being in numerous ways. These applications are being developed in a variety of domains, but rigorous research is needed to examine the potential, as well as the challenges, of utilizing mobile technologies to improve health outcomes. Currently, evidence is sparse for the efficacy of mHealth. Although these technologies may be appealing and seemingly innocuous, research is needed to assess when, where, and for whom mHealth devices, apps, and systems are efficacious. In order to outline an approach to evidence generation in the field of mHealth that would ensure research is conducted on a rigorous empirical and theoretic foundation, on August 16, 2011, researchers gathered for the mHealth Evidence Workshop at NIH. The current paper presents the results of the workshop. Although the discussions at the meeting were cross-cutting, the areas covered can be categorized broadly into three areas: (1) evaluating assessments; (2) evaluating interventions; and, (3) reshaping evidence generation using mHealth. This paper brings these concepts together to describe current evaluation standards, future possibilities and set a grand goal for the emerging field of mHealth research. PMID:23867031

  9. Culture and symptom reporting at menopause.

    PubMed

    Melby, Melissa K; Lock, Margaret; Kaufert, Patricia

    2005-01-01

    The purpose of the present paper is to review recent research on the relationship of culture and menopausal symptoms and propose a biocultural framework that makes use of both biological and cultural parameters in future research. Medline was searched for English-language articles published from 2000 to 2004 using the keyword 'menopause' in the journals--Menopause, Maturitas, Climacteric, Social Science and Medicine, Medical Anthropology Quarterly, Journal of Women's Health, Journal of the American Medical Association, American Journal of Epidemiology, Lancet and British Medical Journal, excluding articles concerning small clinical samples, surgical menopause or HRT. Additionally, references of retrieved articles and reviews were hand-searched. Although a large number of studies and publications exist, methodological differences limit attempts at comparison or systematic review. We outline a theoretical framework in which relevant biological and cultural variables can be operationalized and measured, making it possible for rigorous comparisons in the future. Several studies carried out in Japan, North America and Australia, using similar methodology but different culture/ethnic groups, indicate that differences in symptom reporting are real and highlight the importance of biocultural research. We suggest that both biological variation and cultural differences contribute to the menopausal transition, and that more rigorous data collection is required to elucidate how biology and culture interact in female ageing.

  10. Understanding information exchange during disaster response: Methodological insights from infocentric analysis

    Treesearch

    Toddi A. Steelman; Branda Nowell; Deena Bayoumi; Sarah McCaffrey

    2014-01-01

    We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis"—a term and...

  11. Driven and No Regrets: A Qualitative Analysis of Students Earning Baccalaureate Degrees in Three Years

    ERIC Educational Resources Information Center

    Firmin, Michael W.; Gilson, Krista Merrick

    2007-01-01

    Using rigorous qualitative research methodology, twenty-four college students receiving their undergraduate degrees in three years were interviewed. Following analysis of the semi-structured interview transcripts and coding, themes emerged, indicating that these students possessed self-discipline, self-motivation, and drive. Overall, the results…

  12. Gender, Discourse, and "Gender and Discourse."

    ERIC Educational Resources Information Center

    Davis, Hayley

    1997-01-01

    A critic of Deborah Tannen's book "Gender and Discourse" responds to comments made about her critique, arguing that the book's analysis of the relationship of gender and discourse tends to seek, and perhaps force, explanations only in those terms. Another linguist's analysis of similar phenomena is found to be more rigorous. (MSE)

  13. Exploration of the Maximum Entropy/Optimal Projection Approach to Control Design Synthesis for Large Space Structures.

    DTIC Science & Technology

    1985-02-01

    Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems, inspired by Statistical Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems; in its present stage of development it embodies a...

  14. Evaluating Computer-Related Incidents on Campus

    ERIC Educational Resources Information Center

    Rothschild, Daniel; Rezmierski, Virginia

    2004-01-01

    The Computer Incident Factor Analysis and Categorization (CIFAC) Project at the University of Michigan began in September 2003 with grants from EDUCAUSE and the National Science Foundation (NSF). The project's primary goal is to create a best-practices security framework for colleges and universities based on rigorous quantitative analysis of…

  15. Addressing Methodological Challenges in Large Communication Data Sets: Collecting and Coding Longitudinal Interactions in Home Hospice Cancer Care.

    PubMed

    Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee

    2016-07-01

    In this article, we present strategies for collecting and coding a large longitudinal communication data set collected across multiple sites, consisting of more than 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication data sets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multisite secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a "how-to" example for managing large, digitally recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research.

  16. Defect-free atomic array formation using the Hungarian matching algorithm

    NASA Astrophysics Data System (ADS)

    Lee, Woojun; Kim, Hyosub; Ahn, Jaewook

    2017-05-01

    Deterministic loading of single atoms onto arbitrary two-dimensional lattice points has recently been demonstrated, where by dynamically controlling the optical-dipole potential, atoms from a probabilistically loaded lattice were relocated to target lattice points to form a zero-entropy atomic lattice. In this atom rearrangement, how to pair atoms with the target sites is a combinatorial optimization problem: brute-force methods search all possible combinations, so the process is slow, while heuristic methods are time efficient but optimal solutions are not guaranteed. Here, we use the Hungarian matching algorithm as a fast and rigorous alternative for this problem of defect-free atomic lattice formation. Our approach utilizes an optimization cost function that restricts collision-free guiding paths so that atom loss due to collision is minimized during rearrangement. Experiments were performed with cold rubidium atoms that were trapped and guided with holographically controlled optical-dipole traps. The result of atom relocation from a partially filled 7 × 7 lattice to a 3 × 3 target lattice strongly agrees with the theoretical analysis: using the Hungarian algorithm minimizes the collisional and trespassing paths and results in improved performance, with over 50% higher success probability than the heuristic shortest-move method.
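    A minimal sketch of the assignment step under simplifying assumptions (squared travel distance as the cost; the paper's actual cost additionally penalizes colliding and trespassing guide paths):

      import numpy as np
      from scipy.optimize import linear_sum_assignment

      rng = np.random.default_rng(1)

      # Occupied sites of a probabilistically (~50%) loaded 7 x 7 lattice.
      grid = np.array([(i, j) for i in range(7) for j in range(7)], float)
      atoms = grid[rng.random(grid.shape[0]) < 0.5]

      # 3 x 3 target lattice centered in the 7 x 7 array.
      targets = np.array([(i, j) for i in range(2, 5) for j in range(2, 5)], float)

      # Cost matrix: squared Euclidean move distance, atom -> target site.
      cost = ((atoms[:, None, :] - targets[None, :, :]) ** 2).sum(axis=-1)

      # Hungarian algorithm: an optimal pairing in polynomial time, no heuristics.
      rows, cols = linear_sum_assignment(cost)
      for r, c in zip(rows, cols):
          print(f"atom {atoms[r]} -> target {targets[c]}")
      print("total cost:", cost[rows, cols].sum())

    Unlike the heuristic shortest-move method, the pairing returned here is a guaranteed optimum of the chosen cost, which is consistent with the improved success probability reported above.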

  17. Efficacy and enlightenment: LSD psychotherapy and the Drug Amendments of 1962.

    PubMed

    Oram, Matthew

    2014-04-01

    The decline in therapeutic research with lysergic acid diethylamide (LSD) in the United States over the course of the 1960s has commonly been attributed to the growing controversy surrounding its recreational use. However, research difficulties played an equal role in LSD psychotherapy's demise, as they frustrated researchers' efforts to clearly establish the efficacy of treatment. Once the Kefauver Harris Drug Amendments of 1962 introduced the requirement that proof of efficacy be established through controlled clinical trials before a drug could be approved to market, the value of clinical research became increasingly dependent on the scientific rigor of the trial's design. LSD psychotherapy's complex method of utilizing drug effects to catalyze a psychological treatment clashed with the controlled trial methodology on both theoretical and practical levels, making proof of efficacy difficult to obtain. Through a close examination of clinical trials performed after 1962, this article explores how the new emphasis on controlled clinical trials frustrated the progress of LSD psychotherapy research by focusing researchers' attention on trial design to the detriment of their therapeutic method. This analysis provides a new perspective on the death of LSD psychotherapy and explores the implications of the Drug Amendments of 1962.

  18. Digital coherent receiver based transmitter penalty characterization.

    PubMed

    Geisler, David J; Kaufmann, John E

    2016-12-26

    For optical communications links where receivers are signal-power-starved, such as free-space links, it is important to design transmitters and receivers that can operate as close as practically possible to theoretical limits. A total system penalty is typically assessed in terms of how far the end-to-end bit-error rate (BER) is from these limits. It is desirable, but usually difficult, to determine the division of this penalty between the transmitter and receiver. This paper describes a new, rigorous, computation-based method that isolates the portion of the penalty that can be assessed against the transmitter. There are two basic parts to this approach: (1) use of a coherent optical receiver to perform frequency down-conversion of a transmitter's optical signal waveform to the electrical domain, preserving both optical field amplitude and phase information, and (2) software-based analysis of the digitized electrical waveform. The result is a single numerical metric that quantifies how close a transmitter's signal waveform is to the ideal, based on its BER performance with a perfect software-defined matched-filter receiver demodulator. A detailed application of the proposed methodology to the waveform characterization of an optical burst-mode differential phase-shift keying (DPSK) transmitter is experimentally demonstrated.
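    A minimal sketch of the software side of such a metric, under stated assumptions (rectangular pulses, additive white Gaussian noise, and the ideal differentially coherent DPSK limit Pb = 0.5 exp(-Eb/N0); the authors' actual processing chain is more elaborate): matched-filter the digitized waveform, demodulate, and convert the measured BER into an equivalent Eb/N0 shortfall.

      import numpy as np

      rng = np.random.default_rng(2)

      sps, nsym = 8, 20000                 # samples/symbol and symbol count (assumed)
      bits = rng.integers(0, 2, nsym)
      phase = np.cumsum(np.pi * bits)      # DPSK: a 1-bit flips the carrier phase
      tx = np.repeat(np.exp(1j * phase), sps)

      ebn0 = 10 ** (8.0 / 10)              # 8 dB Eb/N0 test point
      noise = rng.standard_normal(tx.size) + 1j * rng.standard_normal(tx.size)
      rx = tx + noise * np.sqrt(sps / (2 * ebn0))

      # Matched filter for rectangular pulses = per-symbol average.
      sym = rx.reshape(nsym, sps).mean(axis=1)

      # Differentially coherent detection: sign of adjacent-symbol correlation.
      dec = (np.real(sym[1:] * np.conj(sym[:-1])) < 0).astype(int)
      ber = np.mean(dec != bits[1:])

      # Effective Eb/N0 implied by the measured BER; the shortfall is the penalty.
      ebn0_eff = -np.log(2 * max(ber, 1e-12))
      print(f"BER {ber:.2e}, ideal {0.5 * np.exp(-ebn0):.2e}, "
            f"penalty {10 * np.log10(ebn0 / ebn0_eff):.2f} dB")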

  19. Full-dimensional quantum calculations of the dissociation energy, zero-point, and 10 K properties of H7+/D7+ clusters using an ab initio potential energy surface.

    PubMed

    Barragán, Patricia; Pérez de Tudela, Ricardo; Qu, Chen; Prosmiti, Rita; Bowman, Joel M

    2013-07-14

    Diffusion Monte Carlo (DMC) and path-integral Monte Carlo computations of the vibrational ground state and 10 K equilibrium state properties of the H7+/D7+ cations are presented, using an ab initio full-dimensional potential energy surface. The DMC zero-point energies of the dissociated fragments H5+(D5+) + H2(D2) are also calculated, and from these results and the electronic dissociation energy, dissociation energies, D0, of 752 ± 15 and 980 ± 14 cm⁻¹ are reported for H7+ and D7+, respectively. Due to the known error in the electronic dissociation energy of the potential surface, these quantities are underestimated by roughly 65 cm⁻¹. These values are rigorously determined for the first time and compared with previous theoretical estimates from electronic structure calculations using standard harmonic analysis, as well as available experimental measurements. Probability density distributions are also computed for the ground vibrational and 10 K states of H7+ and D7+. These are qualitatively described as a central H3+/D3+ core surrounded by "solvent" H2/D2 molecules that nearly freely rotate.
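    In LaTeX form, the reported values combine the electronic dissociation energy with the DMC zero-point energies as

        D_{0}(\mathrm{H}_{7}^{+}) \;=\; D_{e} \;-\; \mathrm{ZPE}(\mathrm{H}_{7}^{+}) \;+\; \mathrm{ZPE}(\mathrm{H}_{5}^{+}) \;+\; \mathrm{ZPE}(\mathrm{H}_{2}),

    and analogously for D7+ with the deuterated fragments.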

  20. A Theoretical Analysis of the Effect of the Hydrogenation of Graphene to Graphane on Its Mechanical Properties

    NASA Astrophysics Data System (ADS)

    Peng, Q.; Liang, Chao; Ji, Wei; de, Suvranu

    2013-03-01

    We investigated the mechanical properties of graphene and graphane using first-principles calculations based on density-functional theory. A conventional unit cell containing a hexagonal ring made of carbon atoms was chosen to capture the finite-wave-vector "soft modes", which considerably affect the fourth and fifth elastic constants. Graphane has about 2/3 the ultimate strength of graphene in all three tested deformation modes (armchair, zigzag, and biaxial). However, graphane has a larger ultimate strain in zigzag deformation and a smaller one in armchair deformation. We obtained the second-, third-, fourth-, and fifth-order elastic constants for a rigorous continuum description of the elastic response. Graphane has a relatively low in-plane stiffness of 240 N/m, about 2/3 that of graphene, and a very small Poisson ratio of 0.078, 44% of that of graphene. The pressure dependence of the second-order elastic constants was predicted from the third-order elastic constants; the Poisson's ratio monotonically decreases with increasing pressure. We acknowledge financial support from DTRA Grant # BRBAA08-C-2-0130, the U.S. NRCFDP # NRC-38-08-950, and U.S. DOE NEUP Grant # DE-NE0000325.
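    The continuum description referred to is, in standard higher-order elasticity notation (a hedged restatement, not necessarily the authors' exact convention), the series expansion of the strain energy density in the Lagrangian strain \eta,

        \Phi \;=\; \frac{1}{2!}\,C_{ijkl}\,\eta_{ij}\eta_{kl} \;+\; \frac{1}{3!}\,C_{ijklmn}\,\eta_{ij}\eta_{kl}\eta_{mn} \;+\; \frac{1}{4!}\,C_{ijklmnop}\,\eta_{ij}\eta_{kl}\eta_{mn}\eta_{op} \;+\; \cdots,

    truncated here at fifth order; the pressure dependence of the second-order constants then follows from the third-order terms.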

  1. Curvature-undulation coupling as a basis for curvature sensing and generation in bilayer membranes.

    PubMed

    Bradley, Ryan P; Radhakrishnan, Ravi

    2016-08-30

    We present coarse-grained molecular dynamics simulations of the epsin N-terminal homology domain interacting with a lipid bilayer and demonstrate a rigorous theoretical formalism and analysis method for computing the induced curvature field in varying concentrations of the protein in the dilute limit. Our theory is based on the description of the height-height undulation spectrum in the presence of a curvature field. We formulated an objective function to compare the acquired undulation spectrum from the simulations to that of the theory. We recover the curvature field parameters by minimizing the objective function even in the limit where the protein-induced membrane curvature is of the same order as the amplitude due to thermal undulations. The coupling between curvature and undulations leads to significant predictions: (i) Under dilute conditions, the proteins can sense a site of spontaneous curvature at distances much larger than their size; (ii) as the density of proteins increases the coupling focuses and stabilizes the curvature field to the site of the proteins; and (iii) the mapping of the protein localization and the induction of a stable curvature is a cooperative process that can be described through a Hill function.
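    A minimal sketch of the baseline spectrum fit (the bare Helfrich form without the protein-induced curvature field, which is the paper's extension; the units and synthetic data are assumptions):

      import numpy as np
      from scipy.optimize import curve_fit

      kBT, A = 1.0, 1.0  # energy and projected-area units for this sketch

      def helfrich(q, kappa, gamma):
          # Undulation spectrum of a nearly flat bilayer patch:
          # <|h_q|^2> = kBT / (A * (kappa q^4 + gamma q^2)).
          return kBT / (A * (kappa * q**4 + gamma * q**2))

      # Synthetic "measured" spectrum: kappa = 20 kBT, tension gamma = 0.5.
      rng = np.random.default_rng(3)
      q = np.linspace(0.2, 2.0, 40)
      hq2 = helfrich(q, 20.0, 0.5) * rng.normal(1.0, 0.05, q.size)

      (kappa_fit, gamma_fit), _ = curve_fit(helfrich, q, hq2, p0=(10.0, 1.0))
      print(f"kappa = {kappa_fit:.1f} kBT, gamma = {gamma_fit:.2f}")

    The authors' objective function generalizes this comparison to spectra computed in the presence of a curvature field, which is what lets them recover the field parameters.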

  2. Curvature–undulation coupling as a basis for curvature sensing and generation in bilayer membranes

    PubMed Central

    Bradley, Ryan P.; Radhakrishnan, Ravi

    2016-01-01

    We present coarse-grained molecular dynamics simulations of the epsin N-terminal homology domain interacting with a lipid bilayer and demonstrate a rigorous theoretical formalism and analysis method for computing the induced curvature field in varying concentrations of the protein in the dilute limit. Our theory is based on the description of the height–height undulation spectrum in the presence of a curvature field. We formulated an objective function to compare the acquired undulation spectrum from the simulations to that of the theory. We recover the curvature field parameters by minimizing the objective function even in the limit where the protein-induced membrane curvature is of the same order as the amplitude due to thermal undulations. The coupling between curvature and undulations leads to significant predictions: (i) Under dilute conditions, the proteins can sense a site of spontaneous curvature at distances much larger than their size; (ii) as the density of proteins increases the coupling focuses and stabilizes the curvature field to the site of the proteins; and (iii) the mapping of the protein localization and the induction of a stable curvature is a cooperative process that can be described through a Hill function. PMID:27531962

  3. Stochastic-analytic approach to the calculation of multiply scattered lidar returns

    NASA Astrophysics Data System (ADS)

    Gillespie, D. T.

    1985-08-01

    The problem of calculating the nth-order backscattered power of a laser firing short pulses at time zero into a homogeneous cloud with specified scattering and absorption parameters is discussed. In the problem, backscattered power is measured at any time greater than zero by a small receiver colocated with the laser and fitted with a forward-looking conical baffle. Theoretical calculations are made on the premise that the laser pulse is composed of propagating photons which are scattered and absorbed by the cloud particles in a probabilistic manner. The effect of polarization was not taken into account in the calculations. An exact formula is derived for the backscattered power, based on direct physical arguments together with a rigorous analysis of random variables. It is shown that, for values of n greater than or equal to 2, the obtained formula is a well-behaved (3n-4)-dimensional integral. The computational feasibility of the integral formula is demonstrated for a model cloud of isotropically scattering particles. An analytical formula is obtained for n = 2, and a Monte Carlo program was used to obtain numerical results for n = 3, . . ., 6.

  4. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems

    PubMed Central

    Albattat, Ali; Gruenwald, Benjamin C.; Yucelen, Tansel

    2016-01-01

    The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, with the feedback loops closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems, that is, systems consisting of physically interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling errors and degraded modes of operation of the modules and their interconnections. In addition to the theoretical findings, including rigorous system stability and boundedness analysis of the closed-loop dynamical system and a characterization of the effect of user-defined event-triggering thresholds and of the design parameters of the proposed adaptive architectures on overall system performance, an illustrative numerical example is provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches. PMID:27537894
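    A toy sketch of the event-triggering mechanism alone (a scalar plant with a hypothetical threshold eps and gain k; the paper's adaptive, modular architecture is far richer): the state is transmitted over the network only when it drifts from the last sent value by more than eps, trading network utilization against tracking error.

      a, k, eps = 0.5, 5.0, 0.05        # plant pole, feedback gain, trigger threshold
      dt, steps = 1e-3, 10_000
      x, x_sent, events = 1.0, 1.0, 0
      for _ in range(steps):
          if abs(x - x_sent) > eps:     # event: transmit a fresh measurement
              x_sent = x
              events += 1
          u = -k * x_sent               # controller acts on the last transmitted state
          x += dt * (a * x + u)         # Euler step of x' = a x + u
      print(f"final |x| = {abs(x):.3f}, transmissions = {events} of {steps} steps")

    With a larger eps, transmissions drop sharply while the state settles into a wider band around the origin, which is the trade-off the event-triggering thresholds control.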

  5. Thermal conduction in particle packs via finite elements

    NASA Astrophysics Data System (ADS)

    Lechman, Jeremy B.; Yarrington, Cole; Erikson, William; Noble, David R.

    2013-06-01

    Conductive transport in heterogeneous materials composed of discrete particles is a fundamental problem for a number of applications. While analytical results and rigorous bounds on effective conductivity in mono-sized particle dispersions are well established in the literature, the methods used to arrive at these results often fail when the average size of particle clusters becomes large (i.e., near the percolation transition where particle contact networks dominate the bulk conductivity). Our aim is to develop general, efficient numerical methods that would allow us to explore this behavior and compare to a recent microstructural description of conduction in this regime. To this end, we present a finite element analysis approach to modeling heat transfer in granular media with the goal of predicting effective bulk thermal conductivities of particle-based heterogeneous composites. Our approach is verified against theoretical predictions for random isotropic dispersions of mono-disperse particles at various volume fractions up to close packing. Finally, we present results for the probability distribution of the effective conductivity in particle dispersions generated by Brownian dynamics, and suggest how this might be useful in developing stochastic models of effective properties based on the dynamical process involved in creating heterogeneous dispersions.

  6. Processing capacity under perceptual and cognitive load: a closer look at load theory.

    PubMed

    Fitousi, Daniel; Wenger, Michael J

    2011-06-01

    Variations in perceptual and cognitive demands (load) play a major role in determining the efficiency of selective attention. According to load theory (Lavie, Hirst, Fockert, & Viding, 2004) these factors (a) improve or hamper selectivity by altering the way resources (e.g., processing capacity) are allocated, and (b) tap resources rather than data limitations (Norman & Bobrow, 1975). Here we provide an extensive and rigorous set of tests of these assumptions. Predictions regarding changes in processing capacity are tested using the hazard function of the response time (RT) distribution (Townsend & Ashby, 1978; Wenger & Gibson, 2004). The assumption that load taps resource rather than data limitations is examined using measures of sensitivity and bias drawn from signal detection theory (Swets, 1964). All analyses were performed at two levels: the individual and the aggregate. Hypotheses regarding changes in processing capacity were confirmed at the level of the aggregate. Hypotheses regarding resource and data limitations were not completely supported at either level of analysis. And in all of the analyses, we observed substantial individual differences. In sum, the results suggest a need to expand the theoretical vocabulary of load theory, rather than a need to discard it.

  7. Adaptive tracking control for active suspension systems with non-ideal actuators

    NASA Astrophysics Data System (ADS)

    Pan, Huihui; Sun, Weichao; Jing, Xingjian; Gao, Huijun; Yao, Jianyong

    2017-07-01

    As a critical component of transportation vehicles, active suspension systems are instrumental in the improvement of ride comfort and maneuverability. However, practical active suspensions commonly suffer from parameter uncertainties (e.g., the variations of payload mass and suspension component parameters), external disturbances and especially the unknown non-ideal actuators (i.e., dead-zone and hysteresis nonlinearities), which always significantly deteriorate the control performance in practice. To overcome these issues, this paper synthesizes an adaptive tracking control strategy for vehicle suspension systems to achieve suspension performance improvements. The proposed control algorithm is formulated by developing a unified framework of non-ideal actuators rather than a separate way, which is a simple yet effective approach to remove the unexpected nonlinear effects. From the perspective of practical implementation, the advantages of the presented controller for active suspensions include that the assumptions on the measurable actuator outputs, the prior knowledge of nonlinear actuator parameters and the uncertain parameters within a known compact set are not required. Furthermore, the stability of the closed-loop suspension system is theoretically guaranteed by rigorous mathematical analysis. Finally, the effectiveness of the presented adaptive control scheme is confirmed using comparative numerical simulation validations.

  8. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems.

    PubMed

    Albattat, Ali; Gruenwald, Benjamin C; Yucelen, Tansel

    2016-08-16

    The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, with the feedback loops closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems, that is, systems consisting of physically interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling errors and degraded modes of operation of the modules and their interconnections. In addition to the theoretical findings, including rigorous system stability and boundedness analysis of the closed-loop dynamical system and a characterization of the effect of user-defined event-triggering thresholds and of the design parameters of the proposed adaptive architectures on overall system performance, an illustrative numerical example is provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches.

  9. Assessing photocatalytic power of g-C{sub 3}N{sub 4} for solar fuel production: A first-principles study involving quasi-particle theory and dispersive forces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osorio-Guillén, J. M., E-mail: mario.osorio@udea.edu.co; Espinosa-García, W. F.; Grupo de Investigación en Modelamiento y Simulación Computacional, Facultad de Ingenierías, Universidad de San Buenaventura Seccional Medellín, Carrera 56C No 51-110, Medellín

    2015-09-07

    First-principles quasi-particle theory has been employed to assess the catalytic power of graphitic carbon nitride, g-C3N4, for solar fuel production. A comparative study between g-h-triazine and g-h-heptazine has been carried out, also taking into account van der Waals dispersive forces. The band edge potentials have been calculated using a recently developed approach where quasi-particle effects are taken into account through the GW approximation. First, it was found that the description of ground-state properties such as cohesive and surface formation energies requires the proper treatment of dispersive interactions. Furthermore, through the analysis of calculated band-edge potentials, it is shown that g-h-triazine has high reductive power, reaching the potential to reduce CO2 to formic acid; coplanar g-h-heptazine displays the highest thermodynamic force toward the H2O/O2 oxidation reaction; and corrugated g-h-heptazine exhibits a good capacity for both reactions. This rigorous theoretical study shows a route to further improve the catalytic performance of g-C3N4.

  10. What can graph theory tell us about word learning and lexical retrieval?

    PubMed

    Vitevitch, Michael S

    2008-04-01

    Graph theory and the new science of networks provide a mathematically rigorous approach to examine the development and organization of complex systems. These tools were applied to the mental lexicon to examine the organization of words in the lexicon and to explore how that structure might influence the acquisition and retrieval of phonological word-forms. Pajek, a program for large network analysis and visualization (V. Batagelj & A. Mrvar, 1998), was used to examine several characteristics of a network derived from a computerized database of the adult lexicon. Nodes in the network represented words, and a link connected two nodes if the words were phonological neighbors. The average path length and clustering coefficient suggest that the phonological network exhibits small-world characteristics. The degree distribution was fit better by an exponential rather than a power-law function. Finally, the network exhibited assortative mixing by degree. Some of these structural characteristics were also found in graphs that were formed by 2 simple stochastic processes, suggesting that similar processes might influence the development of the lexicon. The graph theoretic perspective may provide novel insights about the mental lexicon and lead to future studies that help us better understand language development and processing.
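    A minimal sketch of the graph construction and the reported measures (the tiny word list and the one-edit neighbor rule are illustrative assumptions; the study's network was built from a large lexical database):

      import itertools
      import networkx as nx

      def one_edit_apart(w1, w2):
          # Phonological-neighbor rule: one substitution, insertion, or deletion.
          if abs(len(w1) - len(w2)) > 1:
              return False
          if len(w1) == len(w2):
              return sum(a != b for a, b in zip(w1, w2)) == 1
          s, l = sorted((w1, w2), key=len)
          return any(l[:i] + l[i + 1:] == s for i in range(len(l)))

      words = ["cat", "bat", "hat", "cot", "cut", "coat", "dog", "dot", "cog"]
      G = nx.Graph()
      G.add_nodes_from(words)
      G.add_edges_from((u, v) for u, v in itertools.combinations(words, 2)
                       if one_edit_apart(u, v))

      # Small-world and mixing diagnostics reported in the abstract.
      giant = G.subgraph(max(nx.connected_components(G), key=len))
      print("average path length:", nx.average_shortest_path_length(giant))
      print("clustering coefficient:", nx.average_clustering(G))
      print("degree assortativity:", nx.degree_assortativity_coefficient(G))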

  11. Multi-Source Multi-Target Dictionary Learning for Prediction of Cognitive Decline.

    PubMed

    Zhang, Jie; Li, Qingyang; Caselli, Richard J; Thompson, Paul M; Ye, Jieping; Wang, Yalin

    2017-06-01

    Alzheimer's Disease (AD) is the most common type of dementia. Identifying correct biomarkers may determine pre-symptomatic AD subjects and enable early intervention. Recently, multi-task sparse feature learning has been successfully applied to many computer vision and biomedical informatics problems. It aims to improve the generalization performance by exploiting the shared features among different tasks. However, most of the existing algorithms are formulated as a supervised learning scheme, whose drawback is either an insufficient number of features or missing label information. To address these challenges, we formulate an unsupervised framework for multi-task sparse feature learning based on a novel dictionary learning algorithm. To solve the unsupervised learning problem, we propose a two-stage Multi-Source Multi-Target Dictionary Learning (MMDL) algorithm. In stage 1, we propose a multi-source dictionary learning method to utilize the common and individual sparse features in different time slots. In stage 2, supported by a rigorous theoretical analysis, we develop a multi-task learning method to solve the missing label problem. Empirical studies on an N = 3970 longitudinal brain image data set, which involves 2 sources and 5 targets, demonstrate the improved prediction accuracy and speed efficiency of MMDL in comparison with other state-of-the-art algorithms.

  12. Analysis of MUSIC-type imaging functional for single, thin electromagnetic inhomogeneity in limited-view inverse scattering problem

    NASA Astrophysics Data System (ADS)

    Ahn, Chi Young; Jeon, Kiwan; Park, Won-Kwang

    2015-06-01

    This study analyzes the well-known MUltiple SIgnal Classification (MUSIC) algorithm to identify unknown support of thin penetrable electromagnetic inhomogeneity from scattered field data collected within the so-called multi-static response matrix in limited-view inverse scattering problems. The mathematical theories of MUSIC are partially discovered, e.g., in the full-view problem, for an unknown target of dielectric contrast or a perfectly conducting crack with the Dirichlet boundary condition (Transverse Magnetic-TM polarization) and so on. Hence, we perform further research to analyze the MUSIC-type imaging functional and to certify some well-known but theoretically unexplained phenomena. For this purpose, we establish a relationship between the MUSIC imaging functional and an infinite series of Bessel functions of integer order of the first kind. This relationship is based on the rigorous asymptotic expansion formula in the existence of a thin inhomogeneity with a smooth supporting curve. Various results of numerical simulation are presented in order to support the identified structure of MUSIC. Although a priori information of the target is needed, we suggest a least condition of range of incident and observation directions to apply MUSIC in the limited-view problem.
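    The Bessel-series connection is of the kind produced by the Jacobi-Anger expansion of the plane-wave factors appearing in the imaging functional (the expansion itself is exact; its precise role in the authors' derivation is our reading of the abstract),

        e^{\,i k |x| \cos\varphi} \;=\; \sum_{n=-\infty}^{\infty} i^{\,n} J_{n}(k|x|)\, e^{\,i n \varphi},

    so that restricting the incident and observation directions to a limited aperture truncates the series and perturbs the imaging functional away from its full-view behavior.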

  13. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity.

    PubMed

    Pecevski, Dejan; Maass, Wolfgang

    2016-01-01

    Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such a neural network could acquire the necessary information for that from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent through their firing current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference.

  14. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity123

    PubMed Central

    Pecevski, Dejan

    2016-01-01

    Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such a neural network could acquire the necessary information for that from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent through their firing current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference. PMID:27419214

  15. Assessing photocatalytic power of g-C3N4 for solar fuel production: A first-principles study involving quasi-particle theory and dispersive forces.

    PubMed

    Osorio-Guillén, J M; Espinosa-García, W F; Moyses Araujo, C

    2015-09-07

    First-principles quasi-particle theory has been employed to assess the catalytic power of graphitic carbon nitride, g-C3N4, for solar fuel production. A comparative study between g-h-triazine and g-h-heptazine has been carried out, also taking into account van der Waals dispersive forces. The band edge potentials have been calculated using a recently developed approach where quasi-particle effects are taken into account through the GW approximation. First, it was found that the description of ground-state properties such as cohesive and surface formation energies requires the proper treatment of dispersive interactions. Furthermore, through the analysis of calculated band-edge potentials, it is shown that g-h-triazine has high reductive power, reaching the potential to reduce CO2 to formic acid; coplanar g-h-heptazine displays the highest thermodynamic force toward the H2O/O2 oxidation reaction; and corrugated g-h-heptazine exhibits a good capacity for both reactions. This rigorous theoretical study shows a route to further improve the catalytic performance of g-C3N4.

  16. Addressing Methodological Challenges in Large Communication Datasets: Collecting and Coding Longitudinal Interactions in Home Hospice Cancer Care

    PubMed Central

    Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee

    2015-01-01

    In this paper, we present strategies for collecting and coding a large longitudinal communication dataset collected across multiple sites, consisting of over 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication datasets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multi-site secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a "how-to" example for managing large, digitally-recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research. PMID:26580414

  17. Effectiveness of the implementation of an evidence-based nursing model using participatory action research in oncohematology: research protocol.

    PubMed

    Abad-Corpa, Eva; Meseguer-Liza, Cristobal; Martínez-Corbalán, José Tomás; Zárate-Riscal, Lourdes; Caravaca-Hernández, Amor; Paredes-Sidrach de Cardona, Antonio; Carrillo-Alcaraz, Andrés; Delgado-Hito, Pilar; Cabrero-García, Julio

    2010-08-01

    To generate changes in nursing practice by introducing an evidence-based clinical practice (EBCP) model through a participatory process. To evaluate the effectiveness of the changes in terms of nurse-sensitive outcomes (NSO). For international nursing science, it is necessary to explore the reasons for supporting EBCP and to evaluate its real repercussions and effectiveness. A mixed-methods study with a sequential transformative design will be conducted in the bone marrow transplant unit of a tertiary-level Spanish hospital, in two time periods of more than 12 months (date of approval of the protocol: 2006). To evaluate the effectiveness of the intervention, we will use a prospective quasi-experimental design with two non-equivalent and non-concurrent groups. NSO and patient health data will be collected: (a) impact on psycho-social adjustment; (b) patient satisfaction; (c) symptom control; (d) adverse effects. All patients admitted during the study period will be included, and all staff working on the unit will take part in a participatory action research (PAR) process. The PAR design will be adopted from a constructivist paradigm perspective, following Checkland's "Soft Systems" theoretical model. Qualitative techniques will be used: 2-hour group meetings with nursing professionals, to be recorded and transcribed. Field diaries (participants and researchers) will be drawn up, and data analysis will be carried out by content analysis. PAR is a rigorous research method for introducing changes into practice to improve NSO.

  18. Voices of successful science teachers in an urban diverse single gender girls' school

    NASA Astrophysics Data System (ADS)

    Malhan, Jyoti

    This research study was conducted as a qualitative case study of four successful science teachers of female students in a diverse, Title I, urban, public girls' school. The study was designed to hear the 'muted' voices of successful science teachers concerning their beliefs and practices when they effectively provide learning opportunities for female students of color in their classrooms. Ethic of care, equity pedagogy, and culturally responsive pedagogy created the theoretical framework for interpretation of the powerful narratives and storytelling that influenced this group of successful teachers. Data were collected by conducting in-depth, semi-structured interviews. The constant comparative method and narrative analysis were used to code and categorize the data. Analysis was conducted after each interview to discover emergent themes. Teachers conducted member checks throughout the process. The findings from the study yielded the following: (1) teachers had a passion for science and incorporated ongoing scientific developments and real-life examples and applications in their teaching; (2) teachers adopted a caring, concerned, and student-centered approach to teaching; (3) teachers acknowledged certain benefits of single-sex girls' education, including fewer distractions and increased confidence and comfort level of students; and (4) teachers built relationships with students that encouraged students to engage with rigorous course content and meet higher expectations for performance. Themes that emerged included care, culturally responsive pedagogy, and culturally relevant curriculum.

  19. Stability analysis of an implicitly defined labor market model

    NASA Astrophysics Data System (ADS)

    Mendes, Diana A.; Mendes, Vivaldo M.

    2008-06-01

    Until very recently, the pervasive existence of models exhibiting well-defined backward dynamics but ill-defined forward dynamics in economics and finance has apparently posed no serious obstacles to the analysis of their dynamics and stability, despite the problems that may arise from possible erroneous conclusions regarding theoretical considerations and policy prescriptions from such models. A large number of papers have dealt with this problem in the past by assuming the existence of symmetry between forward and backward dynamics, even in the case when the map cannot be invertible either forward or backwards. However, this procedure has been seriously questioned over the last few years in a series of papers dealing with implicit difference equations and inverse limit spaces. This paper explores the search and matching labor market model developed by Bhattacharya and Bunzel [J. Bhattacharya, H. Bunzel, Chaotic Planning Solution in the Textbook Model of Equilibrium Labor Market Search and Matching, Mimeo, Iowa State University, 2002; J. Bhattacharya, H. Bunzel, Economics Bulletin 5 (19) (2003) 1-10], with the following objectives in mind: (i) to show that chaotic dynamics may still be present in the model for acceptable parameter values, (ii) to clarify some open questions related with the admissible dynamics in the forward looking setting, by providing a rigorous proof of the existence of cyclic and chaotic dynamics through the application of tools from symbolic dynamics and inverse limit theory.

  20. Shrinkage Degree in L2-Rescale Boosting for Regression.

    PubMed

    Xu, Lin; Lin, Shaobo; Wang, Yao; Xu, Zongben

    2017-08-01

    L2-rescale boosting (L2-RBoosting) is a variant of L2-Boosting, which can essentially improve the generalization performance of L2-Boosting. The key feature of L2-RBoosting lies in introducing a shrinkage degree to rescale the ensemble estimate in each iteration. Thus, the shrinkage degree determines the performance of L2-RBoosting. The aim of this paper is to develop a concrete analysis concerning how to determine the shrinkage degree in L2-RBoosting. We propose two feasible ways to select the shrinkage degree: the first is to parameterize it, and the second is to develop a data-driven approach. After rigorously analyzing the importance of the shrinkage degree in L2-RBoosting, we compare the pros and cons of the proposed methods. We find that although both approaches can reach the same learning rates, the structure of the final estimator of the parameterized approach is better, which sometimes yields a better generalization capability when the number of samples is finite. We therefore recommend parameterizing the shrinkage degree of L2-RBoosting. We also present an adaptive parameter-selection strategy for the shrinkage degree and verify its feasibility through both theoretical analysis and numerical verification. The obtained results enhance the understanding of L2-RBoosting and give guidance on how to use it for regression tasks.
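    One plausible reading of the rescaling step, as a hedged sketch (stump learners, a least-squares step size, and the schedule alpha_k = 2/(k+2) standing in for the parameterized shrinkage degree; none of these specific choices is taken from the paper):

      import numpy as np
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(4)
      X = rng.uniform(-3, 3, (300, 1))
      y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(300)

      F = np.zeros_like(y)                      # current ensemble estimate
      for k in range(1, 201):
          alpha = 2.0 / (k + 2.0)               # assumed shrinkage schedule
          resid = y - (1.0 - alpha) * F         # residual after rescaling
          g = DecisionTreeRegressor(max_depth=1).fit(X, resid).predict(X)
          beta = g @ resid / (g @ g + 1e-12)    # least-squares step size
          F = (1.0 - alpha) * F + beta * g      # rescale the ensemble, then add
      print("training MSE:", np.mean((y - F) ** 2))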

  1. Development of the Sexual Minority Adolescent Stress Inventory

    PubMed Central

    Schrager, Sheree M.; Goldbach, Jeremy T.; Mamey, Mary Rose

    2018-01-01

    Although construct measurement is critical to explanatory research and intervention efforts, rigorous measure development remains a notable challenge. For example, though the primary theoretical model for understanding health disparities among sexual minority (e.g., lesbian, gay, bisexual) adolescents is minority stress theory, nearly all published studies of this population rely on minority stress measures with poor psychometric properties and development procedures. In response, we developed the Sexual Minority Adolescent Stress Inventory (SMASI) with N = 346 diverse adolescents ages 14–17, using a comprehensive approach to de novo measure development designed to produce a measure with desirable psychometric properties. After exploratory factor analysis on 102 candidate items informed by a modified Delphi process, we applied item response theory techniques to the remaining 72 items. Discrimination and difficulty parameters and item characteristic curves were estimated overall, within each of 12 initially derived factors, and across demographic subgroups. Two items were removed for excessive discrimination and three were removed following reliability analysis. The measure demonstrated configural and scalar invariance for gender and age; a three-item factor was excluded for demonstrating substantial differences by sexual identity and race/ethnicity. The final 64-item measure comprised 11 subscales and demonstrated excellent overall (α = 0.98), subscale (α range 0.75–0.96), and test–retest (scale r > 0.99; subscale r range 0.89–0.99) reliabilities. Subscales represented a mix of proximal and distal stressors, including domains of internalized homonegativity, identity management, intersectionality, and negative expectancies (proximal) and social marginalization, family rejection, homonegative climate, homonegative communication, negative disclosure experiences, religion, and work domains (distal). Thus, the SMASI development process illustrates a method to incorporate information from multiple sources, including item response theory models, to guide item selection in building a psychometrically sound measure. We posit that similar methods can be used to improve construct measurement across all areas of psychological research, particularly in areas where a strong theoretical framework exists but existing measures are limited. PMID:29599737
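    The reliabilities quoted above are Cronbach's alpha; a minimal sketch of the computation on a respondents-by-items matrix (the synthetic single-trait data stands in for actual SMASI responses):

      import numpy as np

      def cronbach_alpha(items):
          # items: array of shape (n_respondents, n_items) of scored responses.
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_var / total_var)

      rng = np.random.default_rng(5)
      trait = rng.standard_normal((346, 1))              # one shared latent trait
      items = trait + 0.5 * rng.standard_normal((346, 8))
      print(f"alpha = {cronbach_alpha(items):.2f}")      # high when items cohere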

  2. Experimental evaluation of rigor mortis. V. Effect of various temperatures on the evolution of rigor mortis.

    PubMed

    Krompecher, T

    1981-01-01

    Objective measurements were carried out to study the evolution of rigor mortis on rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem, and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem, and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem, and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increase in temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.

  3. The Right Tools for the Job: The Challenges of Theory and Method in Geoscience Education Research

    NASA Astrophysics Data System (ADS)

    Riggs, E. M.

    2011-12-01

    As geoscience education has matured as a research field over the last decade, workers in this area have been challenged to adapt methodologies and theoretical approaches to study design and data collection. These techniques are as diverse as the earth sciences themselves, and researchers have drawn on established methods and traditions from science education research, social science research, and the cognitive and learning sciences. While the diversity of methodological and theoretical approaches is powerful, the challenge is to ground geoscience education research in rigorous methodologies that are appropriate for the epistemological and functional realities of the content area and the environment in which the research is conducted. The issue of theory is the first hurdle. After techniques are proven, earth scientists typically need not worry much about the theoretical value or theory-laden nature of measurements they make in the field or laboratory. As an example, a field geologist does not question the validity of the gravitational field that levels the spirit level within a Brunton compass. However, in earth science education research, these issues are magnified because the theoretical approach to a study affects what is admitted as data and the weight that can be given to conclusions. Not only must one be concerned about the validity of measurements and observations, but also about the value of this information from an epistemological standpoint. The assigning of meaning to student gestures, utterances, writing, and actions all carries theoretical implications. For example, when working with geologists learning or working in the field, purely experimental research designs are very difficult, and the majority of the work must be conducted in a naturalistic environment. In fact, dealing with the time pressure, distractions, and complexity of a field environment is part of the intellectual backdrop for field geology that separates experts from novices and advanced students from beginners. Thus researchers must embrace the uncontrolled nature of the setting, the qualitative nature of the data collected, and the researcher's role in interpreting geologically appropriate actions as evidence of successful problem solving and investigation. Working to understand the role of diversity and culture in the geosciences also involves a wide array of theory, from affective issues through culturally and linguistically influenced cognition, through gender, self-efficacy, and many other areas of inquiry. Research in understanding spatial skills draws heavily on techniques from cognition research but must also involve the field-specific knowledge of geoscientists to infuse these techniques with exemplars, a catalog of meaningful actions by students, and an understanding of how to recognize success. These examples briefly illustrate the wide array of tools from other fields being brought to bear to advance rigorous geoscience education research. We will illustrate a few of these, the insights we have gained, and the power of theory and method from other fields to enlighten us as we attempt to educate a broader array of earth scientists.

  4. A Research Communication Brief: Gluten Analysis in Beef Samples Collected Using a Rigorous, Nationally Representative Sampling Protocol Confirms That Grain-Finished Beef Is Naturally Gluten-Free

    PubMed Central

    McNeill, Shalene H.; Cifelli, Amy M.; Roseland, Janet M.; Belk, Keith E.; Gehring, Kerri B.; Brooks, J. Chance; Thompson, Leslie D.

    2017-01-01

    Knowing whether or not a food contains gluten is vital for the growing number of individuals with celiac disease and non-celiac gluten sensitivity. Questions have recently been raised about whether beef from conventionally-raised, grain-finished cattle may contain gluten. To date, basic principles of ruminant digestion have been cited in support of the prevailing expert opinion that beef is inherently gluten-free. For this study, gluten analysis was conducted in beef samples collected using a rigorous nationally representative sampling protocol to determine whether gluten was present. The findings of our research uphold the understanding of the principles of gluten digestion in beef cattle and corroborate recommendations that recognize beef as a naturally gluten-free food. PMID:28841165

  5. Treetrimmer: a method for phylogenetic dataset size reduction.

    PubMed

    Maruyama, Shinichiro; Eveleigh, Robert J M; Archibald, John M

    2013-04-12

    With rapid advances in genome sequencing and bioinformatics, it is now possible to generate phylogenetic trees containing thousands of operational taxonomic units (OTUs) from a wide range of organisms. However, use of rigorous tree-building methods on such large datasets is prohibitive and manual 'pruning' of sequence alignments is time consuming and raises concerns over reproducibility. There is a need for bioinformatic tools with which to objectively carry out such pruning procedures. Here we present 'TreeTrimmer', a bioinformatics procedure that removes unnecessary redundancy in large phylogenetic datasets, alleviating the size effect on more rigorous downstream analyses. The method identifies and removes user-defined 'redundant' sequences, e.g., orthologous sequences from closely related organisms and 'recently' evolved lineage-specific paralogs. Representative OTUs are retained for more rigorous re-analysis. TreeTrimmer reduces the OTU density of phylogenetic trees without sacrificing taxonomic diversity while retaining the original tree topology, thereby speeding up downstream computer-intensive analyses, e.g., Bayesian and maximum likelihood tree reconstructions, in a reproducible fashion.
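
    The core pruning idea lends itself to a short sketch (assuming the ete3 toolkit and a user-supplied OTU-to-group mapping; TreeTrimmer's actual criteria are richer than this):

    ```python
    from ete3 import Tree

    def trim_redundant(newick, group_of, max_reps=2):
        """Keep at most `max_reps` representatives from each maximal clade
        whose leaves all belong to one user-defined group; prune the rest
        while preserving the remaining topology and branch lengths."""
        tree = Tree(newick)
        keep = []
        def visit(node):
            leaves = node.get_leaf_names()
            groups = {group_of.get(name, name) for name in leaves}
            if len(groups) == 1:                # single-group clade: redundant
                keep.extend(leaves[:max_reps])  # retain representatives only
            else:
                for child in node.children:
                    visit(child)
        visit(tree)
        tree.prune(keep, preserve_branch_length=True)
        return tree

    # Hypothetical usage: OTUs a1-a4 belong to group A, b1-b2 to group B
    nwk = "((a1:1,a2:1):1,((a3:1,a4:1):1,(b1:1,b2:1):1):1);"
    groups = {"a1": "A", "a2": "A", "a3": "A", "a4": "A", "b1": "B", "b2": "B"}
    print(trim_redundant(nwk, groups, max_reps=1).write())
    ```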

  6. Academic Rigor in the College Classroom: Two Federal Commissions Strive to Define Rigor in the Past 70 Years

    ERIC Educational Resources Information Center

    Francis, Clay

    2018-01-01

    Historic notions of academic rigor usually follow from critiques of the system--we often define our goals for academically rigorous work through the lens of our shortcomings. This chapter discusses how the Truman Commission in 1947 and the Spellings Commission in 2006 shaped the way we think about academic rigor in today's context.

  7. Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.

    PubMed

    Lee, Jae Hee; Jung, Koo Young

    2012-07-01

    Instantaneous rigor as muscle stiffening occurring in the moment of death (or cardiac arrest) can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway should be secured. Here, we report 2 patients who had emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of securing a surgical airway quickly on the occurrence of trismus.

  8. Formalization of Generalized Constraint Language: A Crucial Prelude to Computing With Words.

    PubMed

    Khorasani, Elham S; Rahimi, Shahram; Calvert, Wesley

    2013-02-01

    The generalized constraint language (GCL), introduced by Zadeh, serves as a basis for computing with words (CW). It provides an agenda to express the imprecise and fuzzy information embedded in natural language and allows reasoning with perceptions. Despite its fundamental role, the definition of GCL has remained informal since its introduction by Zadeh, and to our knowledge, no attempt has been made to formulate a rigorous theoretical framework for GCL. Such formalization is necessary for further theoretical and practical advancement of CW for two important reasons. First, it provides the underlying infrastructure for the development of useful inference patterns based on sound theories. Second, it determines the scope of GCL and hence facilitates the translation of natural language expressions into GCL. This paper is an attempt to step in this direction by providing a formal syntax together with a compositional semantics for GCL. A soundness theorem is defined, and Zadeh's deduction rules are proved to be valid in the defined semantics. Furthermore, a discussion is provided on how the proposed language may be used in practice.
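
    For flavor, a generalized constraint takes the canonical form "X isr R", where r indicates the modality of the constraint; a minimal possibilistic sketch (all names, numbers, and the trapezoidal relation are illustrative, not the paper's formal semantics) might be:

    ```python
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class GeneralizedConstraint:
        """Zadeh-style generalized constraint 'X isr R': a variable name, a
        modality r (e.g. 'possibilistic'), and a relation R represented here
        by a membership function mu_R: value -> degree in [0, 1]."""
        variable: str
        modality: str
        mu: Callable[[float], float]

    def trapezoid(a, b, c, d):
        """Trapezoidal membership function, a common choice for fuzzy sets."""
        def mu(x):
            if x <= a or x >= d:
                return 0.0
            if b <= x <= c:
                return 1.0
            return (x - a) / (b - a) if x < b else (d - x) / (d - c)
        return mu

    # "Age is young" as a possibilistic constraint (illustrative numbers)
    young = GeneralizedConstraint("Age", "possibilistic", trapezoid(-1, 0, 25, 35))
    print(young.mu(30))   # degree to which Age = 30 satisfies the constraint
    ```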

  9. Toward an Accurate Theoretical Framework for Describing Ensembles for Proteins under Strongly Denaturing Conditions

    PubMed Central

    Tran, Hoang T.; Pappu, Rohit V.

    2006-01-01

    Our focus is on an appropriate theoretical framework for describing highly denatured proteins. In high concentrations of denaturants, proteins behave like polymers in a good solvent and ensembles for denatured proteins can be modeled by ignoring all interactions except excluded volume (EV) effects. To assay conformational preferences of highly denatured proteins, we quantify a variety of properties for EV-limit ensembles of 23 two-state proteins. We find that modeled denatured proteins can be best described as follows. Average shapes are consistent with prolate ellipsoids. Ensembles are characterized by large correlated fluctuations. Sequence-specific conformational preferences are restricted to local length scales that span five to nine residues. Beyond local length scales, chain properties follow well-defined power laws that are expected for generic polymers in the EV limit. The average available volume is filled inefficiently, and cavities of all sizes are found within the interiors of denatured proteins. All properties characterized from simulated ensembles match predictions from rigorous field theories. We use our results to resolve between conflicting proposals for structure in ensembles for highly denatured states. PMID:16766618
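
    The power laws referred to above are of the Flory type, R_g ∝ N^ν with ν ≈ 0.588 in the EV limit; a minimal sketch of how one would test such scaling on ensemble data (synthetic here):

    ```python
    import numpy as np

    # Synthetic radii of gyration for chains of length N, assuming
    # Flory-type scaling Rg = R0 * N**nu with EV-limit nu ~ 0.588.
    rng = np.random.default_rng(1)
    N = np.array([20, 40, 80, 160, 320])
    Rg = 2.0 * N**0.588 * np.exp(rng.normal(scale=0.02, size=N.size))

    # Fit the exponent by least squares in log-log coordinates.
    slope, intercept = np.polyfit(np.log(N), np.log(Rg), 1)
    print(f"estimated nu = {slope:.3f}, R0 = {np.exp(intercept):.2f}")
    ```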

  10. Theoretical modeling of PEB procedure on EUV resist using FDM formulation

    NASA Astrophysics Data System (ADS)

    Kim, Muyoung; Moon, Junghwan; Choi, Joonmyung; Lee, Byunghoon; Jeong, Changyoung; Kim, Heebom; Cho, Maenghyo

    2018-03-01

    The semiconductor manufacturing industry has steadily reduced device feature sizes for enhanced productivity and performance, and the extreme ultraviolet (EUV) light source is considered a promising solution for further downsizing. EUV lithography involves a series of complex photochemical reactions in the photoresist, which makes it difficult to construct a theoretical framework that facilitates rigorous investigation of the underlying mechanism. We therefore formulated a finite difference method (FDM) model of the post-exposure bake (PEB) process on a positive chemically amplified resist (CAR), covering acid diffusion coupled to the deprotection reaction. The model is based on Fick's second law for diffusion and a first-order chemical rate law for deprotection. Two kinetic parameters, the diffusion coefficient of the acid and the rate constant of deprotection, obtained from experiment and atomic-scale simulation, were applied to the model. As a result, we obtained the time evolution of the protection ratio of each functional group in the resist monomer, which can be used to predict the resulting polymer morphology after the overall chemical reactions. This achievement will be the cornerstone of multiscale modeling that provides fundamental understanding of the factors important for EUV performance and the rational design of next-generation photoresists.
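
    A minimal numerical sketch of such a coupled diffusion-deprotection model (explicit 1D FDM; all parameter values are illustrative, not the paper's calibrated constants):

    ```python
    import numpy as np

    def peb_fdm(n=200, L=100e-9, D=5e-18, k=2.0, t_end=60.0):
        """1D explicit FDM sketch of post-exposure bake: acid diffusion by
        Fick's second law, da/dt = D d2a/dx2, coupled to first-order
        deprotection, dp/dt = -k * a * p."""
        dx = L / n
        dt = 0.4 * dx * dx / D          # explicit stability needs dt <= dx^2/(2D)
        x = np.linspace(0, L, n)
        a = np.exp(-((x - L / 2) / (L / 10))**2)   # initial acid: exposed line
        p = np.ones(n)                  # protecting-group fraction
        t = 0.0
        while t < t_end:
            lap = np.zeros(n)
            lap[1:-1] = (a[2:] - 2 * a[1:-1] + a[:-2]) / dx**2
            lap[0], lap[-1] = lap[1], lap[-2]      # crude zero-flux boundaries
            a = a + dt * D * lap                   # Fick diffusion step
            p = p * np.exp(-k * a * dt)            # exact update of linear ODE in p
            t += dt
        return x, a, p

    x, acid, protect = peb_fdm()
    print(protect.min())   # deprotection is strongest where acid peaked
    ```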

  11. A theoretical reassessment of microbial maintenance and implications for microbial ecology modeling.

    PubMed

    Wang, Gangsheng; Post, Wilfred M

    2012-09-01

    We attempted to reconcile three microbial maintenance models (Herbert, Pirt, and Compromise) through a theoretical reassessment. We provided a rigorous proof that the true growth yield coefficient (Y(G)) is the ratio of the specific maintenance rate (a in Herbert) to the maintenance coefficient (m in Pirt). Other findings from this study include: (1) the Compromise model is identical to the Herbert for computing microbial growth and substrate consumption, but it expresses the dependence of maintenance on both microbial biomass and substrate; (2) the maximum specific growth rate in the Herbert (μ(max,H)) is higher than those in the other two models (μ(max,P) and μ(max,C)), and the difference is the physiological maintenance factor (m(q) = a); and (3) the overall maintenance coefficient (m(T)) is more sensitive to m(q) than to the specific growth rate (μ(G)) and Y(G). Our critical reassessment of microbial maintenance provides a new approach for quantifying some important components in soil microbial ecology models.
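
    In symbols, the relations stated above read (using the abstract's notation):

    ```latex
    \[
      Y_G = \frac{a}{m}, \qquad
      \mu_{\max,H} - \mu_{\max,P} \;=\; \mu_{\max,H} - \mu_{\max,C} \;=\; m_q = a .
    \]
    ```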

  12. Steady-state distributions of probability fluxes on complex networks

    NASA Astrophysics Data System (ADS)

    Chełminiak, Przemysław; Kurzyński, Michał

    2017-02-01

    We consider a simple model of Markovian stochastic dynamics on complex networks to examine the statistical properties of the probability fluxes. An additional transition, called hereafter a gate, powered by an external constant force, breaks detailed balance in the network. We argue, using a theoretical approach and numerical simulations, that the stationary distributions of the probability fluxes emergent under such conditions converge to the Gaussian distribution. By virtue of the stationary fluctuation theorem, its standard deviation depends directly on the square root of the mean flux. In turn, the nonlinear relation between the mean flux and the external force, which provides the key result of the present study, allows us to calculate the two parameters that entirely characterize the Gaussian distribution of the probability fluxes both close to and far from the equilibrium state. Other effects that modify these parameters, such as the addition of shortcuts to the tree-like network, the extension and configuration of the gate, and a change in the network size, are studied by means of computer simulations and discussed in terms of the rigorous theoretical predictions.
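
    A minimal simulation sketch of this setup (a biased gate edge on a tree-like network; all parameters are illustrative):

    ```python
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(2)
    G = nx.path_graph(50)      # a tree-like network (here simply a path)
    G.add_edge(0, 49)          # the "gate" edge closing a cycle
    bias = 4.0                 # constant force: forward (0 -> 49) jumps 4x likelier

    def flux_samples(n_steps=200_000, window=2_000):
        """Random walk with a biased gate; returns net gate flux per window."""
        v, net, samples = 0, 0, []
        for t in range(1, n_steps + 1):
            nbrs = list(G.neighbors(v))
            w = np.array([bias if (v, u) == (0, 49) else 1.0 for u in nbrs])
            u = nbrs[rng.choice(len(nbrs), p=w / w.sum())]
            net += (v, u) == (0, 49)    # forward gate crossing
            net -= (v, u) == (49, 0)    # backward gate crossing
            v = u
            if t % window == 0:
                samples.append(net / window)
                net = 0
        return np.array(samples)

    J = flux_samples()
    # The claim to probe: flux samples look Gaussian with std ~ sqrt(mean flux).
    print(J.mean(), J.std())
    ```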

  13. Translational medicine in the Age of Big Data.

    PubMed

    Tatonetti, Nicholas P

    2017-10-12

    The ability to collect, store and analyze massive amounts of molecular and clinical data is fundamentally transforming the scientific method and its application in translational medicine. Collecting observations has always been a prerequisite for discovery, and great leaps in scientific understanding are accompanied by an expansion of this ability. Particle physics, astronomy and climate science, for example, have all greatly benefited from the development of new technologies enabling the collection of larger and more diverse data. Unlike medicine, however, each of these fields has a mature theoretical framework against which new data can be evaluated and incorporated. To say it another way, there are no 'first principles' from which a healthy human could be analytically derived. The worry, and it is a valid concern, is that, without a strong theoretical underpinning, the inundation of data will cause medical research to devolve into a haphazard enterprise without discipline or rigor. The Age of Big Data harbors tremendous opportunity for biomedical advances, but will also be treacherous and demanding on future scientists.

  14. Inelastic electron tunneling mediated by a molecular quantum rotator

    NASA Astrophysics Data System (ADS)

    Sugimoto, Toshiki; Kunisada, Yuji; Fukutani, Katsuyuki

    2017-12-01

    Inelastic electron tunneling (IET) accompanying nuclear motion is not only of fundamental physical interest but also has strong impacts on chemical and biological processes in nature. Although excitation of rotational motion plays an important role in enhancing electric conductance at a low bias, the mechanism of rotational excitation remains veiled. Here, we present a basic theoretical framework of IET that explicitly takes into consideration quantum angular momentum, focusing on a molecular H2 rotator trapped in a nanocavity between two metallic electrodes as a model system. It is shown that orientationally anisotropic electrode-rotator coupling is the origin of angular-momentum exchange between the electron and molecule; we found that the anisotropic coupling imposes rigorous selection rules on rotational excitation. In addition, rotational symmetry breaking induced by the anisotropic potential lifts the degeneracy of the rotational states of the quantum rotator and tunes the threshold bias voltage that triggers rotational IET. Our theoretical results provide a paradigm for physical understanding of the rotational IET process and spectroscopy, as well as molecular-level design of electron-rotation coupling in nanoelectronics.
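
    For orientation only (a back-of-envelope estimate for a free H2 rotor, not the paper's calculation): with rotational levels E_J = B J(J+1) and B ≈ 7.5 meV for H2, a parity-preserving ΔJ = ±2 excitation would set a threshold bias near

    ```latex
    \[
      E_J = B\,J(J+1), \qquad B_{\mathrm{H}_2} \approx 7.5\ \text{meV}, \qquad
      e V_{\mathrm{th}} \simeq E_2 - E_0 = 6B \approx 45\ \text{meV}.
    \]
    ```

    The anisotropic nanocavity potential then lifts the degeneracy within each J level and shifts this threshold, as described above.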

  15. Psycho-educational strategies to promote fluid adherence in adult hemodialysis patients: a review of intervention studies.

    PubMed

    Welch, Janet L; Thomas-Hawkins, Charlotte

    2005-07-01

    We reviewed psycho-educational intervention studies that were designed to reduce interdialytic weight gain (IDWG) in adult hemodialysis patients. Our goals were to critique research methods, describe the effectiveness of tested interventions, and make recommendations for future research. Medline, PsychInfo, and the Cumulative Index to Nursing and Applied Health (CINAHL) databases were searched to identify empirical work. Each study was evaluated in terms of sample, design, theoretical framework, intervention delivery, and outcome. Nine studies were reviewed. Self-monitoring appears to be a promising strategy for reducing IDWG. Theory was not usually used to guide interventions, designs generally had control groups, interventions were delivered individually, more than one intervention was delivered at a time, the duration of the intervention varied greatly, there was no long-term follow-up, IDWG was the only outcome, and IDWG was operationalized in different ways. Theoretical models and methodological rigor are needed to guide future research. Specific recommendations on design, measurement, and conceptual issues are offered to enhance the effectiveness of future research.

  16. 5D-Tracking of a nanorod in a focused laser beam--a theoretical concept.

    PubMed

    Griesshammer, Markus; Rohrbach, Alexander

    2014-03-10

    Back-focal plane (BFP) interferometry is a very fast and precise method to track the 3D position of a sphere within a focused laser beam using a simple quadrant photo diode (QPD). Here we present a concept of how to track and recover the 5D state of a cylindrical nanorod (3D position and 2 tilt angles) in a laser focus by analyzing the interference of unscattered light and light scattered at the cylinder. The analytical theoretical approach is based on Rayleigh-Gans scattering together with a local field approximation for an infinitely thin cylinder. The approximated BFP intensities compare well with those from a more rigorous numerical approach. It turns out that a displacement of the cylinder results in a modulation of the BFP intensity pattern, whereas a tilt of the cylinder results in a shift of this pattern. We therefore propose the concept of a local QPD in the BFP of a detection lens, where the QPD center is shifted by the angular coordinates of the cylinder tilt.

  17. On Mathematical Modeling Of Quantum Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Achuthan, P.; Dept. of Mathematics, Indian Institute of Technology, Madras, 600 036; Narayanankutty, Karuppath

    2009-07-02

    The world of physical systems at the most fundamental levels is replete with efficient, interesting models possessing sufficient ability to represent reality to a considerable extent. So far, quantum mechanics (QM), forming the basis of almost all natural phenomena, has found beyond doubt its intrinsic ingenuity, capacity and robustness to stand rigorous tests of validity through appropriate calculations and experiments. No serious failures of quantum mechanical predictions have been reported, yet. However, Albert Einstein, the greatest theoretical physicist of the twentieth century, and some other eminent men of science have stated firmly and categorically that QM, though successful by and large, is incomplete. There are classical and quantum reality models including those based on consciousness. Relativistic quantum theoretical approaches to clearly understand the ultimate nature of matter as well as radiation have still much to accomplish in order to qualify for a final theory of everything (TOE). Mathematical models of better, suitable character and strength are needed to achieve satisfactory explanation of natural processes and phenomena. We, in this paper, discuss some of these matters with certain apt illustrations as well.

  18. Using the Origin and Pawn, Positive Affect, CASPM, and Cognitive Anxiety Content Analysis Scales in Counseling Research

    ERIC Educational Resources Information Center

    Viney, Linda L.; Caputi, Peter

    2005-01-01

    Content analysis scales apply rigorous measurement to verbal communications and make possible the quantification of text in counseling research. The limitations of the Origin and Pawn Scales (M. T. Westbrook & L. L. Viney, 1980), the Positive Affect Scale (M. T. Westbrook, 1976), the Content Analysis Scales of Psychosocial Maturity (CASPM; L.…

  19. Design Optimization and Analysis of a Composite Honeycomb Intertank

    NASA Technical Reports Server (NTRS)

    Finckenor, Jeffrey; Spurrier, Mike

    1998-01-01

    Intertanks, the structure between tanks of launch vehicles, are prime candidates for weight reduction of rockets. This paper discusses the optimization and detailed analysis of a 96 in (2.44 m) diameter, 77 in (1.85 m) tall intertank. The structure has composite face sheets and an aluminum honeycomb core. The ends taper to a thick built-up laminate for a double-lap bolted shear joint. It is made in 8 full-length panels joined with bonded double-lap joints. The nominal load is 4000 lb/in (7 x 10(exp 5) N/m). Optimization is by Genetic Algorithm and minimizes weight by varying core thickness, the number and orientation of acreage and buildup plies, and the size, number, and spacing of bolts. A variety of cases were run with populations up to 2000 and chromosomes as long as 150 bits. Constraints were buckling, face stresses (normal, shear, wrinkling, and dimpling), bolt stress, and bolt-hole stresses (bearing, net tension, wedge splitting, shear-out, and tension/shear-out). Analysis is by a combination of theoretical solutions and empirical data. After optimization, a series of coupon tests were performed in conjunction with a rigorous analysis involving a variety of finite element models. The analysis and tests resulted in several small changes to the optimized design. The intertank has undergone a 250,000 lb (1.1 x 10(exp 6) N) limit load test and been mated with a composite liquid hydrogen tank. The tank/intertank unit is being installed in a test stand where it will see 200 thermal/load cycles. Afterwards the intertank will be demated and loaded in compression to failure.
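
    The optimization loop described above can be sketched generically (toy objective, toy constraint, and an invented encoding; the real design space and failure checks are far richer):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N_BITS, POP, GENS = 24, 200, 100

    def decode(bits):
        """Map a bit chromosome to design variables, e.g. core thickness [in]
        and ply count (purely illustrative encoding)."""
        core = 0.25 + int("".join(map(str, bits[:12])), 2) / 4095 * 1.75
        plies = 4 + int("".join(map(str, bits[12:])), 2) % 29
        return core, plies

    def fitness(bits):
        core, plies = decode(bits)
        weight = 10 * core + 3 * plies          # toy weight model
        buckling_margin = core * plies - 12     # toy constraint: must be >= 0
        return -weight - 1e3 * max(0.0, -buckling_margin)  # penalty method

    pop = rng.integers(0, 2, size=(POP, N_BITS))
    for g in range(GENS):
        fit = np.array([fitness(ind) for ind in pop])
        idx = rng.integers(0, POP, size=(POP, 2))               # tournament selection
        parents = pop[np.where(fit[idx[:, 0]] > fit[idx[:, 1]], idx[:, 0], idx[:, 1])]
        children = parents.copy()
        cut = rng.integers(1, N_BITS, size=POP // 2)            # single-point crossover
        for i, c in enumerate(cut):
            children[2*i, c:], children[2*i+1, c:] = \
                parents[2*i+1, c:].copy(), parents[2*i, c:].copy()
        flip = rng.random(children.shape) < 0.01                # bit-flip mutation
        pop = np.where(flip, 1 - children, children)

    best = pop[np.argmax([fitness(ind) for ind in pop])]
    print(decode(best))
    ```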

  20. Semiconductor Quantum Electron Wave Transport, Diffraction, and Interference: Analysis, Device, and Measurement.

    NASA Astrophysics Data System (ADS)

    Henderson, Gregory Newell

    Semiconductor device dimensions are rapidly approaching a fundamental limit where drift-diffusion equations and the depletion approximation are no longer valid. In this regime, quantum effects can dominate device response. To further increase device density and speed, new devices must be designed that use these phenomena to positive advantage. In addition, quantum effects provide opportunities for a new class of devices which can perform functions previously unattainable with "conventional" semiconductor devices. This thesis has described research in the analysis of electron wave effects in semiconductors and the development of methods for the design, fabrication, and characterization of quantum devices based on these effects. First, an exact set of quantitative analogies is presented which allows the use of well-understood optical design and analysis tools for the development of electron wave semiconductor devices. Motivated by these analogies, methods are presented for modeling electron wave grating diffraction using both an exact rigorous coupled-wave analysis and approximate analyses which are useful for grating design. Example electron wave grating switch and multiplexer designs are presented. In analogy to thin-film optics, the design and analysis of electron wave Fabry-Perot interference filters are also discussed. An innovative technique has been developed for testing these (and other) electron wave structures using Ballistic Electron Emission Microscopy (BEEM). This technique uses a liquid-helium temperature scanning tunneling microscope (STM) to perform spectroscopy of the electron transmittance as a function of electron energy. Experimental results show that BEEM can resolve even weak quantum effects, such as the reflectivity of a single interface between materials. Finally, methods are discussed for incorporating asymmetric electron wave Fabry-Perot filters into optoelectronic devices. Theoretical and experimental results show that such structures could be the basis for a new type of electrically pumped mid- to far-infrared semiconductor laser.

  1. Study of the quality characteristics in cold-smoked salmon (Salmo salar) originating from pre- or post-rigor raw material.

    PubMed

    Birkeland, S; Akse, L

    2010-01-01

    Improved slaughtering procedures in the salmon industry have caused a delayed onset of rigor mortis and, thus, a potential for pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at the time of processing on quality traits (color, texture, sensory attributes, and microbiology) in injection-salted, cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets caused a significant (P<0.001) contraction (-7.9% ± 0.9%) on the caudal-cranial axis. No significant differences in instrumental color (a*, b*, C*, or h*), texture (hardness), or sensory traits (aroma, color, taste, and texture) were observed between pre- or post-rigor processed fillets; however, post-rigor fillets (1477 ± 38 g) had significantly (P<0.05) higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets were significantly (P<0.01) lighter, L*, (39.7 ± 1.0) than post-rigor fillets (37.8 ± 0.8) and had a significantly lower (P<0.05) aerobic plate count (APC), 1.4 ± 0.4 log CFU/g against 2.6 ± 0.6 log CFU/g, and psychrotrophic count (PC), 2.1 ± 0.2 log CFU/g against 3.0 ± 0.5 log CFU/g, than post-rigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when using suitable injection salting protocols and smoking techniques.

  2. Leveraging graph topology and semantic context for pharmacovigilance through twitter-streams.

    PubMed

    Eshleman, Ryan; Singh, Rahul

    2016-10-06

    Adverse drug events (ADEs) constitute one of the leading causes of post-therapeutic death and their identification constitutes an important challenge of modern precision medicine. Unfortunately, the onset and effects of ADEs are often underreported, complicating timely intervention. At over 500 million posts per day, Twitter is a commonly used social media platform. The ubiquity of day-to-day personal information exchange on Twitter makes it a promising target for data mining for ADE identification and intervention. Three technical challenges are central to this problem: (1) identification of salient medical keywords in (noisy) tweets, (2) mapping drug-effect relationships, and (3) classification of such relationships as adverse or non-adverse. We use a bipartite graph-theoretic representation called a drug-effect graph (DEG) for modeling drug and side effect relationships by representing the drugs and side effects as vertices. We construct individual DEGs on two data sources. The first DEG is constructed from the drug-effect relationships found in FDA package inserts as recorded in the SIDER database. The second DEG is constructed by mining the history of Twitter users. We use dictionary-based information extraction to identify medically-relevant concepts in tweets. Drugs, along with co-occurring symptoms, are connected with edges weighted by temporal distance and frequency. Finally, information from the SIDER DEG is integrated with the Twitter DEG and edges are classified as either adverse or non-adverse using supervised machine learning. We examine both graph-theoretic and semantic features for the classification task. The proposed approach can identify adverse drug effects with high accuracy, with precision exceeding 85% and F1 exceeding 81%. When compared with leading methods at the state-of-the-art, which employ un-enriched graph-theoretic analysis alone, our method leads to improvements ranging between 5 and 8% in terms of the aforementioned measures. Additionally, we employ our method to discover several ADEs which, though present in medical literature and Twitter-streams, are not represented in the SIDER databases. We present a DEG integration model as a powerful formalism for the analysis of drug-effect relationships that is general enough to accommodate diverse data sources, yet rigorous enough to provide a strong mechanism for ADE identification.
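
    A minimal sketch of the DEG construction and edge classification (toy data; the feature choices are illustrative, not the paper's exact feature set):

    ```python
    import networkx as nx
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Drug-effect graph (DEG): drugs and effects as the two vertex sets of a
    # bipartite graph, edges weighted by co-occurrence frequency and mean
    # temporal distance mined from tweets (numbers made up here).
    deg = nx.Graph()
    deg.add_nodes_from(["ibuprofen", "metformin"], bipartite="drug")
    deg.add_nodes_from(["nausea", "headache", "rash"], bipartite="effect")
    deg.add_edge("ibuprofen", "rash", freq=14, tdist=2.5)
    deg.add_edge("ibuprofen", "headache", freq=40, tdist=6.0)
    deg.add_edge("metformin", "nausea", freq=22, tdist=1.5)

    def edge_features(g, u, v):
        """Simple co-occurrence plus graph-theoretic features for one edge."""
        d = g[u][v]
        return [d["freq"], d["tdist"], g.degree(u), g.degree(v)]

    # Supervised edge classification: 1 = adverse, 0 = non-adverse.
    # Real labels would come from SIDER-derived edges; these are invented.
    X = np.array([edge_features(deg, u, v) for u, v in deg.edges])
    y = np.array([1, 0, 1])
    clf = LogisticRegression().fit(X, y)
    print(clf.predict(X))
    ```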

  3. Interventions to Increase Attendance at Psychotherapy: A Meta-Analysis of Randomized Controlled Trials

    ERIC Educational Resources Information Center

    Oldham, Mary; Kellett, Stephen; Miles, Eleanor; Sheeran, Paschal

    2012-01-01

    Objective: Rates of nonattendance for psychotherapy hinder the effective delivery of evidence-based treatments. Although many strategies have been developed to increase attendance, the effectiveness of these strategies has not been quantified. Our aim in the present study was to undertake a meta-analysis of rigorously controlled studies to…

  4. A Comparative Study of Definitions on Limit and Continuity of Functions

    ERIC Educational Resources Information Center

    Shipman, Barbara A.

    2012-01-01

    Differences in definitions of limit and continuity of functions as treated in courses on calculus and in rigorous undergraduate analysis yield contradictory outcomes and unexpected language. There are results about limits in calculus that are false by the definitions of analysis, functions not continuous by one definition and continuous by…
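
    One standard instance of the kind of mismatch at issue (an illustration, not necessarily the paper's example): many analysis texts quantify only over points of the domain, while many calculus texts require a two-sided limit:

    ```latex
    \[
      \textbf{Analysis: } \forall \varepsilon > 0\ \exists \delta > 0:\;
      x \in \operatorname{dom} f,\ |x - a| < \delta \;\Longrightarrow\; |f(x) - f(a)| < \varepsilon.
    \]
    \[
      \textbf{Calculus: } f \text{ is continuous at } a \iff \lim_{x \to a} f(x) = f(a).
    \]
    ```

    By the first definition, f(x) = √x is continuous at a = 0 (only points of the domain matter); by a strict reading of the second it is not, since the two-sided limit does not exist.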

  5. Tutoring Adolescents in Literacy: A Meta-Analysis

    ERIC Educational Resources Information Center

    Jun, Seung Won; Ramirez, Gloria; Cumming, Alister

    2010-01-01

    What does research reveal about tutoring adolescents in literacy? We conducted a meta-analysis, identifying 152 published studies, of which 12 met rigorous inclusion criteria. We analyzed the 12 studies for the effects of tutoring according to the type, focus, and amount of tutoring; the number, age, and language background of students; and the…

  6. Interactive visual analysis promotes exploration of long-term ecological data

    Treesearch

    T.N. Pham; J.A. Jones; R. Metoyer; F.J. Swanson; R.J. Pabst

    2013-01-01

    Long-term ecological data are crucial in helping ecologists understand ecosystem function and environmental change. Nevertheless, these kinds of data sets are difficult to analyze because they are usually large, multivariate, and spatiotemporal. Although existing analysis tools such as statistical methods and spreadsheet software permit rigorous tests of pre-conceived...

  7. An International Meta-Analysis of Reading Recovery

    ERIC Educational Resources Information Center

    D'Agostino, Jerome V.; Harmey, Sinéad J.

    2016-01-01

    Reading Recovery is one of the most researched literacy programs worldwide. Although there have been at least 4 quantitative reviews of its effectiveness, none have considered all rigorous group-comparison studies from all implementing nations from the late 1970s to 2015. Using a hierarchical linear modeling (HLM) v-known analysis, we examined if…

  8. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles.

    PubMed

    Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Ham, Youn-Kyung; Yeo, Eui-Joo; Jeong, Tae-Jun; Choi, Yun-Sang; Kim, Cheon-Jei

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). On the other hand, the increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, NaCl concentrations of 3% and 4% showed no great differences in physicochemical and textural properties due to pre-rigor salting effects (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that the 2% NaCl concentration is minimally required to ensure the definite pre-rigor salting effect on chicken breast muscle.

  9. A Proposed Solution to the Problem with Using Completely Random Data to Assess the Number of Factors with Parallel Analysis

    ERIC Educational Resources Information Center

    Green, Samuel B.; Levy, Roy; Thompson, Marilyn S.; Lu, Min; Lo, Wen-Juo

    2012-01-01

    A number of psychometricians have argued for the use of parallel analysis to determine the number of factors. However, parallel analysis must be viewed at best as a heuristic approach rather than a mathematically rigorous one. The authors suggest a revision to parallel analysis that could improve its accuracy. A Monte Carlo study is conducted to…
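
    Horn-style parallel analysis is compact to sketch (synthetic data; the authors' proposed revision is not reproduced here):

    ```python
    import numpy as np

    def parallel_analysis(data, n_sims=500, quantile=95, seed=0):
        """Retain a factor when the observed correlation-matrix eigenvalue
        exceeds the chosen quantile of eigenvalues from same-sized random
        normal data (Horn's heuristic)."""
        rng = np.random.default_rng(seed)
        n, p = data.shape
        obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
        sim = np.empty((n_sims, p))
        for s in range(n_sims):
            r = rng.normal(size=(n, p))
            sim[s] = np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
        thresh = np.percentile(sim, quantile, axis=0)
        return int(np.sum(obs > thresh)), obs, thresh

    # Hypothetical two-factor data: 300 cases, 8 variables
    rng = np.random.default_rng(1)
    f = rng.normal(size=(300, 2))
    data = f @ rng.normal(size=(2, 8)) + 0.7 * rng.normal(size=(300, 8))
    print(parallel_analysis(data)[0])   # expected to suggest 2 factors
    ```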

  10. Experimental evaluation of rigor mortis. VI. Effect of various causes of death on the evolution of rigor mortis.

    PubMed

    Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R

    1983-07-01

    The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to verify the eventual presence of factors that could play a role in the modification of its development.

  11. Numerical investigation of exact coherent structures in turbulent small-aspect-ratio Taylor-Couette flow

    NASA Astrophysics Data System (ADS)

    Krygier, Michael; Crowley, Christopher J.; Schatz, Michael F.; Grigoriev, Roman O.

    2017-11-01

    As suggested by recent theoretical and experimental studies, fluid turbulence can be described as a walk between neighborhoods of unstable nonchaotic solutions of the Navier-Stokes equation known as exact coherent structures (ECS). Finding ECS in an experimentally-accessible setting is the first step toward rigorous testing of the dynamical role of ECS in 3D turbulence. We found several ECS (both relative periodic orbits and relative equilibria) in a weakly turbulent regime of small-aspect-ratio Taylor-Couette flow with counter-rotating cylinders. This talk will discuss how the geometry of these solutions guides the evolution of turbulent flow in the simulations. This work is supported by the Army Research Office (Contract # W911NF-15-1-0471).

  12. Multifrequency multi-qubit entanglement based on plasmonic hot spots

    PubMed Central

    Ren, Jun; Wu, Tong; Zhang, Xiangdong

    2015-01-01

    A theoretical method to study strong coupling between an ensemble of quantum emitters (QEs) and surface plasmons excited by a nanoparticle cluster is presented, using a rigorous first-principles electromagnetic Green's tensor technique. We have demonstrated that multi-qubit entanglement for two-level QEs can be produced at different coupling resonance frequencies when the QEs are located in the hot spots of the metallic nanoparticle cluster. The duration of quantum beats for such an entanglement can be two orders of magnitude longer than that for entanglement in a photonic cavity. The phenomenon originates from collective coupling resonance excitation of the cluster. At the frequency of the single-scattering resonance, the entanglement cannot be produced even though the single-QE spontaneous decay rate is very large. PMID:26350051

  13. Heterogeneous Combustion of Porous Graphite Particles in Normal and Microgravity

    NASA Technical Reports Server (NTRS)

    Chelliah, Harsha K.; Miller, Fletcher J.; Delisle, Andrew J.

    2001-01-01

    Combustion of solid fuel particles has many important applications, including power generation and space propulsion systems. The current models available for describing the combustion process of these particles, especially porous solid particles, include various simplifying approximations. One of the most limiting approximations is the lumping of the physical properties of the porous fuel with the heterogeneous chemical reaction rate constants. The primary objective of the present work is to develop a rigorous model that could decouple such physical and chemical effects from the global heterogeneous reaction rates. For the purpose of validating this model, experiments with porous graphite particles of varying sizes and porosity are being performed. The details of this experimental and theoretical model development effort are described.

  14. Theoretical study of energy states of two-dimensional electron gas in pseudomorphically strained InAs HEMTs taking into account the non-parabolicity of the conduction band

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nishio, Yui; Yamaguchi, Satoshi; Yamazaki, Youichi

    2013-12-04

    We rigorously determined the energy states of a two-dimensional electron gas (2DEG) in high electron mobility transistors (HEMTs) with a pseudomorphically strained InAs channel (InAs PHEMTs), taking into account the non-parabolicity of the conduction band of InAs. The sheet carrier concentration of the 2DEG for the non-parabolic energy band was about 50% larger than that for the parabolic energy band, and most of the electrons are confined strongly in the InAs layer. In addition, the threshold voltage for InAs PHEMTs was about 0.21 V lower than that for conventional InGaAs HEMTs.
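
    One standard way to model such non-parabolicity, assumed here for illustration (the paper's treatment may be more rigorous), is Kane's dispersion relation:

    ```latex
    \[
      \frac{\hbar^2 k^2}{2 m^{*}} = E\,(1 + \alpha E), \qquad \alpha \approx \frac{1}{E_g},
    \]
    ```

    which enlarges the density of states at a given energy relative to the parabolic case, and hence raises the sheet carrier concentration at a fixed Fermi level, consistent with the roughly 50% increase reported above.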

  15. Prediction of the limit of detection of an optical resonant reflection biosensor.

    PubMed

    Hong, Jongcheol; Kim, Kyung-Hyun; Shin, Jae-Heon; Huh, Chul; Sung, Gun Yong

    2007-07-09

    A prediction of the limit of detection of an optical resonant reflection biosensor is presented. An optical resonant reflection biosensor using a guided-mode resonance filter is one of the most promising label-free optical immunosensors due to its sharp reflectance peak and high sensitivity to changes of optical path length. We have simulated this type of biosensor using rigorous coupled-wave theory to calculate the limit of detection of the thickness of the target protein layer. Theoretically, our biosensor has an estimated ability to detect thickness changes approximately the size of typical antigen proteins. We have also investigated the effects of the absorption and divergence of the incident light on the detection ability of the biosensor.

  16. Habit, custom, and power: a multi-level theory of population health.

    PubMed

    Zimmerman, Frederick J

    2013-03-01

    In multi-level theory, individual behavior flows from cognitive habits, either directly through social referencing, rules of thumb, or automatic behaviors; or indirectly through the shaping of rationality itself by framing or heuristics. Although behavior does not arise from individually rational optimization, it generally appears to be rational, because the cognitive habits that guide behavior evolve toward optimality. However, power imbalances shaped by particular social, political, and economic structures can distort this evolution, leading to individual behavior that fails to maximize individual or social well-being. Replacing the dominant rational-choice paradigm with a multi-level theoretical paradigm involving habit, custom, and power will enable public health to engage in rigorous new areas of research.

  17. A Method for Computing Leading-Edge Loads

    NASA Technical Reports Server (NTRS)

    Rhode, Richard V; Pearson, Henry A

    1933-01-01

    In this report a formula is developed that enables the determination of the proper design load for the portion of the wing forward of the front spar. The formula is inherently rational in concept, as it takes into account the most important variables that affect the leading-edge load, although theoretical rigor has been sacrificed for simplicity and ease of application. Some empirical corrections, based on pressure distribution measurements on the PW-9 and M-3 airplanes have been introduced to provide properly for biplanes. Results from the formula check experimental values in a variety of cases with good accuracy in the critical loading conditions. The use of the method for design purposes is therefore felt to be justified and is recommended.

  18. Basis for paraxial surface-plasmon-polariton packets

    NASA Astrophysics Data System (ADS)

    Martinez-Herrero, Rosario; Manjavacas, Alejandro

    2016-12-01

    We present a theoretical framework for the study of surface-plasmon polariton (SPP) packets propagating along a lossy metal-dielectric interface within the paraxial approximation. Using a rigorous formulation based on the plane-wave spectrum formalism, we introduce a set of modes that constitute a complete basis set for the solutions of Maxwell's equations for a metal-dielectric interface in the paraxial approximation. The use of this set of modes allows us to fully analyze the evolution of the transversal structure of SPP packets beyond the single plane-wave approximation. As a paradigmatic example, we analyze the case of a Gaussian SPP mode, for which, exploiting the analogy with paraxial optical beams, we introduce a set of parameters that characterize its propagation.

  1. Electrodynamic multiple-scattering method for the simulation of optical trapping atop periodic metamaterials

    NASA Astrophysics Data System (ADS)

    Yannopapas, Vassilios; Paspalakis, Emmanuel

    2018-07-01

    We present a new theoretical tool for simulating optical trapping of nanoparticles in the presence of an arbitrary metamaterial design. The method is based on rigorously solving Maxwell's equations for the metamaterial via a hybrid discrete-dipole approximation/multiple-scattering technique and direct calculation of the optical force exerted on the nanoparticle by means of the Maxwell stress tensor. We apply the method to the case of a spherical polystyrene probe trapped within the optical landscape created by illuminating a plasmonic metamaterial consisting of periodically arranged tapered metallic nanopyramids. The developed technique is ideally suited for general optomechanical calculations involving metamaterial designs and can compete with purely numerical methods such as finite-difference or finite-element schemes.

  2. Guidelines for preparing high school psychology teachers: course-based and standards-based approaches.

    PubMed

    2013-01-01

    Psychology is one of the most popular elective high school courses. The high school psychology course provides the foundation for students to benefit from psychological perspectives on personal and contemporary issues and learn the rules of evidence and theoretical frameworks of the discipline. The guidelines presented here constitute the second of two reports in this issue of the American Psychologist (January 2013) representing recent American Psychological Association (APA) policies that support high-quality instruction in the teaching of high school psychology. These guidelines, aligned to the standards presented in the preceding report, describe models for the preparation of preservice psychology teachers. The two reports together demonstrate the rigor and competency that should be expected in psychology instruction at the high school level.

  3. Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems.

    PubMed

    Gogolin, Christian; Eisert, Jens

    2016-05-01

    We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.

  4. Total focusing method (TFM) robustness to material deviations

    NASA Astrophysics Data System (ADS)

    Painchaud-April, Guillaume; Badeau, Nicolas; Lepage, Benoit

    2018-04-01

    The total focusing method (TFM) is becoming an accepted nondestructive evaluation method for industrial inspection. What was a topic of discussion in the applied research community just a few years ago is now being deployed in critical industrial applications, such as inspecting welds in pipelines. However, the method's sensitivity to unexpected parametric changes (material and geometric) has not been rigorously assessed. In this article, we investigate the robustness of TFM in relation to unavoidable deviations from modeled nominal inspection component characteristics, such as sound velocities and uncertainties about the parts' internal and external diameters. We also review TFM's impact on the standard inspection modes often encountered in industrial inspections, and we present a theoretical model supported by empirical observations to illustrate the discussion.
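
    For reference, the delay-and-sum kernel at the heart of TFM is compact (a minimal sketch; real implementations interpolate sub-sample delays, apodize, and take signal envelopes):

    ```python
    import numpy as np

    def tfm_image(fmc, elem_x, fs, c, xs, zs):
        """Delay-and-sum total focusing method sketch.
        fmc: (n_el, n_el, n_t) full matrix capture A-scans
        elem_x: (n_el,) element x-positions [m]; fs: sampling rate [Hz]
        c: assumed sound velocity [m/s]; xs, zs: image grid vectors [m].
        Velocity errors (the robustness question above) enter through c."""
        n_el, n_t = len(elem_x), fmc.shape[-1]
        img = np.zeros((len(zs), len(xs)))
        for iz, z in enumerate(zs):
            for ix, x in enumerate(xs):
                # one-way distances from every element to the focal point
                d = np.hypot(elem_x - x, z)                        # (n_el,)
                t_idx = np.rint((d[:, None] + d[None, :]) / c * fs).astype(int)
                valid = t_idx < n_t
                tx, rx = np.nonzero(valid)
                img[iz, ix] = np.abs(fmc[tx, rx, t_idx[valid]].sum())
        return img
    ```

    With a measured FMC dataset, scanning c over a plausible range and watching the focus degrade is a simple way to probe the velocity sensitivity discussed above.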

  5. Current thinking in qualitative research: evidence-based practice, moral philosophies, and political struggle.

    PubMed

    Papadimitriou, Christina; Magasi, Susan; Frank, Gelya

    2012-01-01

    In this introduction to the special issue on current thinking in qualitative research and occupational therapy and science, the authors focus on the importance of rigorous qualitative research to inform occupational therapy practice. The authors chosen for this special issue reflect a "second generation of qualitative researchers" who are critical, theoretically sophisticated, methodologically productive, and politically relevant to show that working with disabled clients is political work. Three themes emerged across the articles included in this special issue: (1) recognizing and addressing social justice issues; (2) learning from clients' experiences; and (3) critically reframing occupational therapy's role. These themes can inform occupational therapy practice, research, and education to reflect a more client-centered and politically engaging approach.

  6. Darwinism and positivism as methodological influences on the development of psychology.

    PubMed

    Mackenzie, B

    1976-10-01

    The methodological significance of evolutionary theory for psychology may be distinguished from its substantive or theoretical significance. The methodological significance was that evolutionary theory broadened the current conceptions of scientific method and rendered them relatively independent of physics. It thereby made the application of the "scientific method" to psychology much more feasible than it had been previously, and thus established the possibility of a wide-ranging scientific psychology for the first time. The methodological eclecticism that made scientific psychology possible did not, however, remain a feature of psychology for very long. Psychology's methodology rapidly became restricted and codified through the influence of, and in imitation of, the rigorously positivistic orientation of physics around the turn of the twentieth century.

  7. Sympathetic cooling of polyatomic molecules with S-state atoms in a magnetic trap.

    PubMed

    Tscherbul, T V; Yu, H-G; Dalgarno, A

    2011-02-18

    We present a rigorous theoretical study of low-temperature collisions of polyatomic molecular radicals with ¹S₀ atoms in the presence of an external magnetic field. Accurate quantum scattering calculations based on ab initio and scaled interaction potentials show that collision-induced spin relaxation of the prototypical organic molecule CH₂(X ³B₁) (methylene) and nine other triatomic radicals in cold ³He gas occurs at a slow rate, demonstrating that cryogenic buffer-gas cooling and magnetic trapping of these molecules is feasible with current technology. Our calculations further suggest that it may be possible to create ultracold gases of polyatomic molecules by sympathetic cooling with alkaline-earth atoms in a magnetic trap.

  8. Nonlinear dynamics of mini-satellite respinup by weak internal controllable torques

    NASA Astrophysics Data System (ADS)

    Somov, Yevgeny

    2014-12-01

    Contemporary space engineering has posed a new problem for theoretical mechanics and motion control theory: directed respinup of a spacecraft by weak, restricted internal control forces. The paper presents some results on this problem, which is highly relevant to the energy supply of information mini-satellites (for communication, geodesy, and radio- and opto-electronic observation of the Earth, among others) with electro-reaction plasma thrusters and a gyro moment cluster based on reaction wheels or control moment gyros. The solution achieved is based on methods for the synthesis of nonlinear robust control and on a rigorous analytical proof of the required spacecraft rotation stability by the Lyapunov function method. These results were verified by computer simulation of the strongly nonlinear oscillatory processes during respinup of a flexible spacecraft.

  9. Neutron-neutron quasifree scattering in nd breakup at 10 MeV

    NASA Astrophysics Data System (ADS)

    Malone, R. C.; Crowe, B.; Crowell, A. S.; Cumberbatch, L. C.; Esterline, J. H.; Fallin, B. A.; Friesen, F. Q. L.; Han, Z.; Howell, C. R.; Markoff, D.; Ticehurst, D.; Tornow, W.; Witała, H.

    2016-03-01

    The neutron-deuteron (nd) breakup reaction provides a rich environment for testing theoretical models of the neutron-neutron (nn) interaction. Current theoretical predictions based on rigorous ab initio calculations agree well with most experimental data for this system, but there remain a few notable discrepancies. The cross section for nn quasifree (QFS) scattering is one such anomaly. Two recent experiments reported cross sections for this particular nd breakup configuration that exceed theoretical calculations by almost 20% at incident neutron energies of 26 and 25 MeV [1, 2]. The theoretical values can be brought into agreement with these results by increasing the strength of the ¹S₀ nn potential matrix element by roughly 10%. However, this modification of the nn effective range parameter and/or the ¹S₀ scattering length causes substantial charge-symmetry breaking in the nucleon-nucleon force and suggests the possibility of a weakly bound di-neutron state [3]. We are conducting new measurements of the cross section for nn QFS in nd breakup. The measurements are performed at incident neutron beam energies below 20 MeV. The neutron beam is produced via the ²H(d, n)³He reaction. The target is a deuterated plastic cylinder. Our measurements utilize time-of-flight techniques with a pulsed neutron beam and detection of the two emitted neutrons in coincidence. A description of our initial measurements at 10 MeV for a single scattering angle will be presented along with preliminary results. Also, plans for measurements at other energies with broad angular coverage will be discussed.

  10. RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT

    PubMed Central

    Meltzer, S. J.; Auer, John

    1908-01-01

    Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions—nearly equimolecular to "physiological" solutions of sodium chloride—are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case, as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor; only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium also hastens the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle. PMID:19867124

  11. Attitude stability of spinning satellites

    NASA Technical Reports Server (NTRS)

    Caughey, T. K.

    1980-01-01

    Some problems of the attitude stability of spinning satellites are treated in a rigorous manner. With certain restrictions, linearized stability analysis correctly predicts the attitude stability of spinning satellites, even in the critical cases of Liapunov-Poincare stability theory.
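
    The classic linearized result is easy to reproduce. The sketch below (with assumed inertia values, not taken from the paper) examines torque-free spin about each principal axis: spin about the axes of largest and smallest inertia yields purely imaginary eigenvalues (oscillatory, linearly stable), while spin about the intermediate axis yields a saddle.

```python
import numpy as np

# Hedged check with assumed principal inertias (not from the paper): linearize
# the torque-free Euler equations about spin rate Omega around each principal
# axis.  Transverse perturbations obey wdot_i = a*w_j, wdot_j = b*w_i, so the
# eigenvalues are +/- sqrt(a*b): purely imaginary (stable) iff a*b < 0.
I = np.array([1.0, 2.0, 3.0])   # principal moments of inertia, kg m^2 (assumed)
Omega = 1.0                     # spin rate, rad/s

for k in range(3):
    i, j = [(1, 2), (2, 0), (0, 1)][k]      # the two transverse axes
    a = (I[j] - I[k]) / I[i] * Omega
    b = (I[k] - I[i]) / I[j] * Omega
    lam = np.sqrt(complex(a * b))
    verdict = "stable (oscillatory)" if a * b < 0 else "unstable (saddle)"
    print(f"spin about axis {k + 1}: eigenvalues +/-{lam:.2f} -> {verdict}")
```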

  12. Analysis of Perfluorinated Chemicals and Their Fluorinated Precursors in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A rigorous method was developed to maximize the extraction efficacy for perfluorocarboxylic acids (PFCAs), perfluorosulfonates (PFSAs), fluorotelomer alcohols (FTOHs), fluorotelomer acrylates (FTAc), perfluorosulfonamides (FOSAs), and perfluorosulfonamidoethanols (FOSEs) from was...

  13. Random waves in the brain: Symmetries and defect generation in the visual cortex

    NASA Astrophysics Data System (ADS)

    Schnabel, M.; Kaschube, M.; Löwel, S.; Wolf, F.

    2007-06-01

    How orientation maps in the visual cortex of the brain develop is a matter of long-standing debate. Experimental and theoretical evidence suggests that their development represents an activity-dependent self-organization process. Theoretical analysis [1] exploring this hypothesis predicted that maps at an early developmental stage are realizations of Gaussian random fields, exhibiting a rigorous lower bound for their densities of topological defects, called pinwheels. As a consequence, lower pinwheel densities, if observed in adult animals, are predicted to develop through the motion and annihilation of pinwheel pairs. Despite being valid for a large class of developmental models, this result depends on the symmetries of the models and thus of the predicted random field ensembles. In [1], invariance of the orientation map's statistical properties under independent space rotations and orientation shifts was assumed. However, full rotation symmetry appears to be broken by interactions of cortical neurons, e.g. selective couplings between groups of neurons with collinear orientation preferences [2]. A recently proposed symmetry, called shift-twist symmetry [3], stating that spatial rotations must occur together with orientation shifts to be an appropriate symmetry transformation, is more consistent with this organization. Here we generalize our random field approach to this important symmetry class. We propose a new class of shift-twist symmetric Gaussian random fields and derive the general correlation functions of this ensemble. It turns out that, despite strong effects of the shift-twist symmetry on the structure of the correlation functions and on the map layout, the lower bound on the pinwheel densities remains unaffected, predicting pinwheel annihilation in systems with low pinwheel densities.
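
    A minimal numerical illustration of the random-field picture (our sketch, not the paper's analysis; the grid size, box size, and spectral width are assumed): draw a complex Gaussian random field with an annular power spectrum, read the orientation map off its phase, and count pinwheels as phase singularities via plaquette winding numbers.

```python
import numpy as np

# Hedged sketch (grid and spectrum parameters assumed): sample an early-map
# ensemble as a complex Gaussian random field z with an annular power spectrum;
# orientation preference is arg(z)/2 and pinwheels are the zeros of z, counted
# here via the phase winding around each lattice plaquette.
rng = np.random.default_rng(0)
N, L = 256, 32.0          # grid points; box size in units of the map wavelength
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
KX, KY = np.meshgrid(k, k, indexing="ij")
ring = np.exp(-0.5 * ((np.hypot(KX, KY) - 2 * np.pi) / (0.15 * 2 * np.pi)) ** 2)

white = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
z = np.fft.ifft2(np.sqrt(ring) * np.fft.fft2(white))   # correlated complex field

def wrap(a):
    """Wrap phase differences into [-pi, pi)."""
    return (a + np.pi) % (2 * np.pi) - np.pi

p = np.angle(z)
winding = (wrap(p[1:, :-1] - p[:-1, :-1]) + wrap(p[1:, 1:] - p[1:, :-1]) +
           wrap(p[:-1, 1:] - p[1:, 1:]) + wrap(p[:-1, :-1] - p[:-1, 1:]))
pinwheels = int(np.sum(np.abs(winding) > np.pi))   # plaquettes enclosing a zero
print(f"pinwheel density: {pinwheels / L**2:.2f} per Lambda^2 "
      f"(Gaussian-field prediction: approximately pi)")
```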

  14. Density profiles in the Scrape-Off Layer interpreted through filament dynamics

    NASA Astrophysics Data System (ADS)

    Militello, Fulvio

    2017-10-01

    We developed a new theoretical framework to clarify the relation between radial Scrape-Off Layer (SOL) density profiles and the fluctuations that generate them. The framework provides an interpretation of the experimental features of the profiles and of the turbulence statistics on the basis of simple properties of the filaments, such as their radial motion and their draining towards the divertor. L-mode and inter-ELM filaments are described as a Poisson process in which each event is independent and modelled with a wave function whose amplitude and width are statistically distributed according to experimental observations and evolve according to fluid equations. We will rigorously show that radially accelerating filaments, less efficient parallel exhaust, and a statistical distribution of the radial velocity can all contribute to flatter profiles in the far SOL and therefore enhance plasma-wall interactions. A quite general result of our analysis is the resiliency of this non-exponential character of the profiles and the increase of the relative fluctuation amplitude towards the wall, as experimentally observed. According to the framework, profile broadening at high fueling rates can be caused by interactions with neutrals (e.g., charge exchange) in the divertor or by a significant radial acceleration of the filaments. The framework assumptions were tested with 3D numerical simulations of seeded SOL filaments based on a two-fluid model. In particular, filaments interact through the electrostatic field they generate only when in close proximity (separation comparable to their width in the drift plane), thus justifying our independence hypothesis. In addition, we will discuss how isolated filament motion responds to variations in the plasma conditions, specifically divertor conditions. Finally, using the theoretical framework we will reproduce and interpret experimental results obtained on JET, MAST and HL-2A.
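
    The shot-noise picture can be illustrated with a toy calculation (ours, with assumed parameters, not the paper's model): superpose many independent, exponentially draining filaments and compare the time-averaged profile for a fixed radial velocity against a statistically distributed one; the velocity spread visibly flattens the far-SOL profile.

```python
import numpy as np

# Hedged toy (assumed parameters): a filtered-Poisson ("shot noise") picture of
# SOL profiles.  Each independent filament is born near the separatrix with
# amplitude A, moves radially with velocity v, and drains along the field on a
# parallel-loss time tau, contributing A * exp(-r / (v * tau)) to the
# time-averaged density at radius r.  A spread in v mixes e-folding lengths.
rng = np.random.default_rng(1)
r = np.linspace(0.1, 5.0, 100)                  # radial distance, arbitrary units
tau, n_f = 1.0, 20000                           # draining time; number of filaments
A = rng.exponential(1.0, size=n_f)              # filament amplitudes
for label, v in (("fixed velocity ", np.full(n_f, 1.0)),
                 ("spread velocity", rng.exponential(1.0, size=n_f))):
    prof = (A[:, None] * np.exp(-r[None, :] / (v[:, None] * tau))).mean(axis=0)
    Ln = -1.0 / np.gradient(np.log(prof), r)    # local e-folding length
    print(f"{label}: L_n = {Ln[5]:.2f} near separatrix, {Ln[-5]:.2f} in far SOL")
```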

  15. Perspective: Optical measurement of feature dimensions and shapes by scatterometry

    NASA Astrophysics Data System (ADS)

    Diebold, Alain C.; Antonelli, Andy; Keller, Nick

    2018-05-01

    The use of optical scattering to measure feature shape and dimensions, known as scatterometry, is now routine in semiconductor manufacturing. Scatterometry iteratively improves an optical model of the structure using simulations that are compared to experimental data from an ellipsometer. These simulations solve Maxwell's equations using rigorous coupled wave analysis (RCWA). In this article, we describe scatterometry based on Mueller matrix spectroscopic ellipsometry. Next, the RCWA treatment of Maxwell's equations is presented. Following this, several example measurements are described as they apply to specific process steps in the fabrication of gate-all-around (GAA) transistor structures. First, simulations of measurement sensitivity for the inner spacer etch-back step of horizontal GAA transistor processing are described. Next, the simulated metrology sensitivity for the sacrificial (dummy) amorphous silicon etch-back step of vertical GAA transistor processing is discussed. Finally, we present the application of plasmonically active test structures for improving the sensitivity of metal linewidth measurements.
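
    The regression loop at the heart of scatterometry can be sketched with a far simpler forward model than RCWA (ours, with assumed round-number optical constants and geometry): predict the ellipsometric ratio rho = rp/rs for a single film from the Fresnel coefficients, then let least squares recover the film thickness from synthetic data.

```python
import numpy as np
from scipy.optimize import least_squares

# Hedged sketch of the model-based regression at the core of scatterometry,
# with a single-film Fresnel model standing in for RCWA; optical constants,
# angle, and wavelength grid are assumed round numbers, not real materials.
n0, n1, n2 = 1.0, 1.46, 3.88 + 0.02j    # ambient / film / substrate indices
theta0 = np.deg2rad(65.0)               # angle of incidence
wl = np.linspace(300.0, 800.0, 60)      # wavelengths, nm

def rho(d_nm):
    """Ellipsometric ratio rho = r_p / r_s for a film of thickness d_nm."""
    c0, s0 = np.cos(theta0), np.sin(theta0)
    c1 = np.sqrt(1 - (n0 * s0 / n1) ** 2 + 0j)   # cos(theta) in the film
    c2 = np.sqrt(1 - (n0 * s0 / n2) ** 2 + 0j)   # cos(theta) in the substrate
    beta = 2 * np.pi * n1 * d_nm * c1 / wl       # film phase thickness
    e = np.exp(-2j * beta)
    def airy(r01, r12):                          # film reflection coefficient
        return (r01 + r12 * e) / (1 + r01 * r12 * e)
    rp = airy((n1 * c0 - n0 * c1) / (n1 * c0 + n0 * c1),
              (n2 * c1 - n1 * c2) / (n2 * c1 + n1 * c2))
    rs = airy((n0 * c0 - n1 * c1) / (n0 * c0 + n1 * c1),
              (n1 * c1 - n2 * c2) / (n1 * c1 + n2 * c2))
    return rp / rs

measured = rho(120.0)                   # synthetic "measurement": a 120 nm film
fit = least_squares(lambda d: np.abs(rho(d[0]) - measured),
                    x0=[80.0], bounds=(1.0, 500.0))  # start assumed near the basin
print(f"recovered film thickness: {fit.x[0]:.1f} nm (true: 120 nm)")
```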

  16. A rigorous approach to investigating common assumptions about disease transmission: Process algebra as an emerging modelling methodology for epidemiology.

    PubMed

    McCaig, Chris; Begon, Mike; Norman, Rachel; Shankland, Carron

    2011-03-01

    Changing scale, for example, the ability to move seamlessly from an individual-based model to a population-based model, is an important problem in many fields. In this paper, we introduce process algebra as a novel solution to this problem in the context of models of infectious disease spread. Process algebra, a technique from computer science, allows us to describe a system in terms of the stochastic behaviour of individuals. We review the use of process algebra in biological systems and the variety of quantitative and qualitative analysis techniques available. The analysis illustrated here solves the changing scale problem: from the individual behaviour we can rigorously derive equations describing the mean behaviour of the system at the level of the population. The biological problem investigated is the transmission of infection, and how this relates to individual interactions.
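
    The changing-scale idea can be illustrated without the process algebra machinery (a hedged sketch with assumed rates, not the paper's formalism): simulate the individual-level SIR infection process with Gillespie's algorithm and compare it against the population-level mean-field ODEs of the kind such analysis derives.

```python
import numpy as np

# Hedged illustration of the changing-scale idea (not the paper's process-
# algebra code; rates and population size are assumed): an individual-level
# stochastic SIR simulation versus the population-level mean-field ODEs.
rng = np.random.default_rng(2)
beta, gamma, N = 0.3, 0.1, 1000   # contact rate, recovery rate, population size

def gillespie(S, I, T=200.0):
    """Exact stochastic simulation of the individual-level SIR process."""
    t = 0.0
    while t < T and I > 0:
        a_inf, a_rec = beta * S * I / N, gamma * I   # event propensities
        t += rng.exponential(1.0 / (a_inf + a_rec))
        if rng.random() < a_inf / (a_inf + a_rec):
            S, I = S - 1, I + 1    # infection event
        else:
            I -= 1                 # recovery event
    return S, I

# Mean-field ODEs describing the mean population-level behaviour:
#   dS/dt = -beta*S*I/N,  dI/dt = beta*S*I/N - gamma*I
S, I, dt = N - 5.0, 5.0, 0.01
for _ in range(int(200.0 / dt)):
    dS = -beta * S * I / N
    dI = beta * S * I / N - gamma * I
    S, I = S + dS * dt, I + dI * dt

print("final susceptibles - mean-field ODE:", round(S),
      "| one stochastic run:", gillespie(N - 5, 5)[0])
```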

  17. Agent-Centric Approach for Cybersecurity Decision-Support with Partial Observability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tipireddy, Ramakrishna; Chatterjee, Samrat; Paulson, Patrick R.

    Generating automated cyber resilience policies for real-world settings is a challenging research problem that must account for uncertainties in system state over time and dynamics between attackers and defenders. In addition to understanding attacker and defender motives and tools, and identifying “relevant” system and attack data, it is also critical to develop rigorous mathematical formulations representing the defender’s decision-support problem under uncertainty. Game-theoretic approaches involving cyber resource allocation optimization with Markov decision processes (MDP) have been previously proposed in the literature. Moreover, advancements in reinforcement learning approaches have motivated the development of partially observable stochastic games (POSGs) in various multi-agent problem domains with partial information. Recent advances in cyber-system state space modeling have also generated interest in the potential applicability of POSGs for cybersecurity. However, as is the case in strategic card games such as poker, research challenges in using game-theoretic approaches for practical cyber defense applications include: 1) solving for equilibrium and designing efficient algorithms for large-scale, general problems; 2) establishing mathematical guarantees that equilibrium exists; 3) handling the possible existence of multiple equilibria; and 4) exploitation of opponent weaknesses. Inspired by advances in solving strategic card games while acknowledging the practical challenges associated with the use of game-theoretic approaches in cyber settings, this paper proposes an agent-centric approach for cybersecurity decision-support with partial system state observability.
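
    As a concrete, much-simplified instance of the MDP formulation mentioned above (ours, with assumed states, costs, and transition probabilities), the sketch below solves a three-state defender problem by value iteration; a POSG/POMDP treatment would add partial observability on top of this.

```python
import numpy as np

# Hedged toy defender MDP (states, costs, and transition probabilities are all
# assumed numbers): three system-health states, two actions, solved by value
# iteration.  A POSG/POMDP treatment would add partial observability.
states = ["healthy", "degraded", "compromised"]
actions = ["monitor", "patch"]
P = {  # P[a][s, s']: transition probabilities under attacker pressure
    "monitor": np.array([[0.80, 0.20, 0.00],
                         [0.00, 0.60, 0.40],
                         [0.00, 0.00, 1.00]]),
    "patch":   np.array([[0.95, 0.05, 0.00],
                         [0.70, 0.30, 0.00],
                         [0.50, 0.30, 0.20]]),
}
R = {  # immediate reward (negative cost) of each state under an action
    "monitor": np.array([0.0, -2.0, -10.0]),
    "patch":   np.array([-1.0, -3.0, -11.0]),   # patching adds overhead
}
discount = 0.95

V = np.zeros(len(states))
for _ in range(500):   # value iteration: V <- max_a [ R_a + discount * P_a V ]
    V = np.max([R[a] + discount * P[a] @ V for a in actions], axis=0)

policy = {states[s]: max(actions, key=lambda a: R[a][s] + discount * P[a][s] @ V)
          for s in range(len(states))}
print(policy)   # e.g. patch once the system is degraded or compromised
```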

  18. Long persistence of rigor mortis at constant low temperature.

    PubMed

    Varetto, Lorenzo; Curto, Ombretta

    2005-01-06

    We studied the persistence of rigor mortis by physical manipulation, testing the mobility of the knee in 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 degrees C. We found complete rigor persisting for 10 days in all the cadavers under observation; in one case, rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete to partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor, and in the two cadavers kept under observation to the very end, absolute resolution of rigor mortis occurred on the 28th day. Our results prove that the persistence of rigor mortis can be much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. This must be considered when a corpse is found under such conditions, so that the long persistence of rigor mortis does not mislead the estimate of the time of death.

  19. A framework for characterizing eHealth literacy demands and barriers.

    PubMed

    Chan, Connie V; Kaufman, David R

    2011-11-17

    Consumer eHealth interventions are of growing importance in the individual management of health and health behaviors. However, a range of access, resource, and skill barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy denotes a set of skills and knowledge essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies and in communicating health concepts effectively. We propose a theoretical and methodological framework for characterizing the complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present results from an in-depth analysis of the task performance of a single user, as well as of 20 users on the same task, to illustrate both the detailed analysis and the aggregate measures obtained, and the potential analyses that can be performed using this method. The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase the accuracy of predictions. The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform the development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum.
