Sample records for rigorous theoretical framework

  1. Approximation Methods for Inverse Problems Governed by Nonlinear Parabolic Systems

    DTIC Science & Technology

    1999-12-17

    We present a rigorous theoretical framework for approximation of nonlinear parabolic systems with delays in the context of inverse least squares... Numerical results demonstrating the convergence are given for a model of dioxin uptake and elimination in a distributed liver model that is a special case of the general theoretical framework.
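
    As a toy illustration of the inverse least-squares idea, the sketch below fits rate constants of a hypothetical two-compartment uptake/elimination ODE to noisy observations; the model and all parameter values are illustrative stand-ins, not the authors' distributed liver model.

      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import least_squares

      def model(t, y, k_uptake, k_elim):
          blood, liver = y
          return [-k_uptake * blood, k_uptake * blood - k_elim * liver]

      def residuals(params, t_obs, y_obs):
          sol = solve_ivp(model, (0, t_obs[-1]), [1.0, 0.0],
                          t_eval=t_obs, args=tuple(params))
          return sol.y[1] - y_obs          # misfit in the "liver" compartment

      rng = np.random.default_rng(0)
      t_obs = np.linspace(0, 10, 25)
      true_params = (0.8, 0.3)
      y_obs = solve_ivp(model, (0, 10), [1.0, 0.0], t_eval=t_obs,
                        args=true_params).y[1] + 0.01 * rng.standard_normal(25)

      fit = least_squares(residuals, x0=[0.5, 0.5], args=(t_obs, y_obs),
                          bounds=(0, np.inf))
      print(fit.x)                         # estimates close to (0.8, 0.3)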

  2. Rigorous Measures of Implementation: A Methodological Framework for Evaluating Innovative STEM Programs

    ERIC Educational Resources Information Center

    Cassata-Widera, Amy; Century, Jeanne; Kim, Dae Y.

    2011-01-01

    The practical need for multidimensional measures of fidelity of implementation (FOI) of reform-based science, technology, engineering, and mathematics (STEM) instructional materials, combined with a theoretical need in the field for a shared conceptual framework that could support accumulating knowledge on specific enacted program elements across…

  3. Relevance and Rigor in International Business Teaching: Using the CSA-FSA Matrix

    ERIC Educational Resources Information Center

    Collinson, Simon C.; Rugman, Alan M.

    2011-01-01

    We advance three propositions in this paper. First, teaching international business (IB) at any level needs to be theoretically driven, using mainstream frameworks to organize thinking. Second, these frameworks need to be made relevant to the experiences of the students; for example, by using them in case studies. Third, these parameters of rigor…

  4. Testing Adaptive Toolbox Models: A Bayesian Hierarchical Approach

    ERIC Educational Resources Information Center

    Scheibehenne, Benjamin; Rieskamp, Jorg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox…

  5. Theory and applications of structured light single pixel imaging

    NASA Astrophysics Data System (ADS)

    Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.

    2018-02-01

    Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, they share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework for single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, and provide a foundation for developing more advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework for single-pixel imaging yields improved noise robustness and reduced acquisition time, and can take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential of single-pixel imaging.
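
    The frame-theoretic view can be sketched in a few lines: treat the illumination patterns as a frame, estimate the frame bounds from singular values, and reconstruct with the canonical dual frame (the pseudo-inverse). This minimal sketch uses random patterns of our own choosing, not the paper's measurement schemes.

      import numpy as np

      rng = np.random.default_rng(1)
      n, m = 64, 96                    # pixels; patterns (m > n: redundant frame)
      scene = rng.random(n)

      A = rng.standard_normal((m, n))                # rows = illumination patterns
      y = A @ scene + 0.01 * rng.standard_normal(m)  # single-pixel detector readings

      s = np.linalg.svd(A, compute_uv=False)
      frame_bounds = (s[-1] ** 2, s[0] ** 2)         # 0 < A <= B: rows form a frame
      recon = np.linalg.pinv(A) @ y                  # canonical dual-frame reconstruction
      print(frame_bounds,
            np.linalg.norm(recon - scene) / np.linalg.norm(scene))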

  6. Enhancing rigor and practice of scoping reviews in social policy research: considerations from a worked example on the Americans with Disabilities Act.

    PubMed

    Harris, Sarah Parker; Gould, Robert; Fujiura, Glenn

    2015-01-01

    There is increasing theoretical consideration about the use of systematic and scoping reviews of evidence in informing disability and rehabilitation research and practice. Indicative of this trend, this journal published a piece by Rumrill, Fitzgerald and Merchant in 2010 explaining the utility and process for conducting reviews of intervention-based research. There is still a need to consider how to apply such rigor when conducting more exploratory reviews of heterogeneous research. This article explores the challenges, benefits, and procedures for conducting rigorous exploratory scoping reviews of diverse evidence. The article expands upon Rumrill, Fitzgerald and Merchant's framework and considers its application to more heterogeneous evidence on the impact of social policy. A worked example of a scoping review of the Americans with Disabilities Act is provided with a procedural framework for conducting scoping reviews on the effects of a social policy. The need for more nuanced techniques for enhancing rigor became apparent during the review process. There are multiple methodological steps that can enhance the utility of exploratory scoping reviews. Systematic consideration during the exploratory review process is shown to be a viable method for enhancing rigor when reviewing diverse bodies of evidence.

  7. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

    One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. Previous research developed the theoretical basis and benefits of the hybrid approach; what is lacking, however, is a concrete experimental comparison of the hybrid framework with traditional fusion methods to demonstrate and quantify this benefit. The goal of this research, therefore, is to provide a statistical analysis comparing the accuracy and performance of hybrid network theory with pure Bayesian and Fuzzy systems, and with an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain-specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo simulation, against situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed to quantify the benefit of hybrid inference relative to other fusion tools.
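
    The evaluation loop described here can be mocked up generically: simulate ground truth, run competing inference models, and report accuracy with confidence intervals. The stand-in "models" below are simple noisy classifiers with assumed error rates, not the paper's Bayesian, Fuzzy, or Particle Filtering systems.

      import numpy as np

      rng = np.random.default_rng(2)
      n_trials = 10_000
      truth = rng.integers(0, 2, n_trials)        # simulated situational ground truth

      def noisy_model(truth, error_rate, rng):
          flip = rng.random(truth.size) < error_rate
          return np.where(flip, 1 - truth, truth)

      for name, err in [("model_A", 0.10), ("model_B", 0.15)]:
          pred = noisy_model(truth, err, rng)
          acc = np.mean(pred == truth)
          half = 1.96 * np.sqrt(acc * (1 - acc) / n_trials)   # normal approx. CI
          print(f"{name}: accuracy {acc:.3f} +/- {half:.3f}")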

  8. Rigorous theoretical framework for particle sizing in turbid colloids using light refraction.

    PubMed

    García-Valenzuela, Augusto; Barrera, Rubén G; Gutierrez-Reyes, Edahí

    2008-11-24

    Using a non-local effective-medium approach, we analyze the refraction of light in a colloidal medium. We discuss the theoretical grounds and all the necessary precautions to design and perform experiments to measure the effective refractive index in dilute colloids. As an application, we show that it is possible to retrieve the size of small dielectric particles in a colloid by measuring the complex effective refractive index and the volume fraction occupied by the particles.

  9. Changes in Residents' Self-Efficacy Beliefs in a Clinically Rich Graduate Teacher Education Program

    ERIC Educational Resources Information Center

    Reynolds, Heather M.; Wagle, A. Tina; Mahar, Donna; Yannuzzi, Leigh; Tramonte, Barbara; King, Joseph

    2016-01-01

    Increasing the clinical preparation of teachers in the United States to meet greater rigor in K-12 education has become a goal of institutions of higher education, especially since the publication of the National Council for the Accreditation of Teacher Education Blue Ribbon Panel Report on Clinical Practice. Using a theoretical framework grounded…

  10. A Theoretical Framework for Lagrangian Descriptors

    NASA Astrophysics Data System (ADS)

    Lopesino, C.; Balibrea-Iniesta, F.; García-Garrido, V. J.; Wiggins, S.; Mancho, A. M.

    This paper provides a theoretical background for Lagrangian Descriptors (LDs). The goal of achieving rigorous proofs that justify the ability of LDs to detect invariant manifolds is simplified by introducing an alternative definition for LDs. The definition is stated for n-dimensional systems with general time dependence; however, we rigorously prove that this method reveals the stable and unstable manifolds of hyperbolic points in four particular 2D cases: a hyperbolic saddle point for linear autonomous systems, a hyperbolic saddle point for nonlinear autonomous systems, a hyperbolic saddle point for linear nonautonomous systems, and a hyperbolic saddle point for nonlinear nonautonomous systems. We also discuss further rigorous results which show the ability of LDs to highlight additional invariant sets, such as n-tori. These results are a simple extension of ergodic partition theory, which we illustrate by applying the methodology to well-known examples such as the planar field of the harmonic oscillator and the 3D ABC flow. Finally, we provide a thorough discussion of the objectivity (frame-invariance) property required of tools designed to reveal phase space structures, and of its implications for Lagrangian descriptors.
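
    One common definition of an LD is the arc length traced by trajectories over a forward and a backward time window. The sketch below evaluates it for the linear saddle xdot = x, ydot = -y (the first of the four cases above); sharp features of the resulting field align with the stable and unstable manifolds, here the coordinate axes. The window length and grid are arbitrary choices.

      import numpy as np
      from scipy.integrate import solve_ivp

      def rhs(t, z, sign):
          x, y, _ = z                        # third state accumulates arc length
          return [sign * x, -sign * y, np.hypot(x, y)]

      def LD(x0, y0, tau=3.0):
          m = 0.0
          for sign in (+1, -1):              # forward plus backward-time parts
              sol = solve_ivp(rhs, (0, tau), [x0, y0, 0.0], args=(sign,), rtol=1e-8)
              m += sol.y[2, -1]
          return m

      xs = np.linspace(-1, 1, 41)
      M = np.array([[LD(x, y) for x in xs] for y in xs])
      # Ridges of M concentrate on x = 0 (stable) and y = 0 (unstable) manifolds.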

  11. Sociomateriality: a theoretical framework for studying distributed medical education.

    PubMed

    MacLeod, Anna; Kits, Olga; Whelan, Emma; Fournier, Cathy; Wilson, Keith; Power, Gregory; Mann, Karen; Tummons, Jonathan; Brown, Peggy Alexiadis

    2015-11-01

    Distributed medical education (DME) is a type of distance learning in which students participate in medical education from diverse geographic locations using Web conferencing, videoconferencing, e-learning, and similar tools. DME is becoming increasingly widespread in North America and around the world. Although relatively new to medical education, distance learning has a long history in the broader field of education and a related body of literature that speaks to the importance of engaging in rigorous and theoretically informed studies of distance learning. The existing DME literature is helpful, but it has been largely descriptive and lacks a critical "lens", that is, a theoretical perspective from which to rigorously conceptualize and interrogate DME's social (relationships, people) and material (technologies, tools) aspects. The authors describe DME and theories about distance learning and show that such theories focus on social, pedagogical, and cognitive considerations without adequately taking into account material factors. They address this gap by proposing sociomateriality as a theoretical framework allowing researchers and educators to study DME and (1) understand and consider previously obscured actors, infrastructure, and other factors that, on the surface, seem unrelated and even unimportant; (2) see clearly how the social and material components of learning are intertwined in fluid, messy, and often uncertain ways; and (3) perhaps think differently, even in ways that disrupt traditional approaches, as they explore DME. The authors conclude that DME brings with it substantial investments of social and material resources, and therefore needs careful study, using approaches that embrace its complexity.

  12. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    PubMed

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
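
    A stripped-down version of the inference problem can be written in a few lines: given each strategy's trial-by-trial predictions and an assumed application-error rate, compute a participant's posterior over toolbox strategies. This toy sketch (two artificial strategies, known error rate) only illustrates the Bayesian machinery, not the authors' hierarchical models.

      import numpy as np

      rng = np.random.default_rng(3)
      n_trials, eps = 50, 0.1
      pred_A = rng.integers(0, 2, n_trials)    # strategy A's predicted choices
      pred_B = 1 - pred_A                      # strategy B predicts the opposite

      errs = rng.random(n_trials) < eps        # participant uses A, errs w.p. eps
      data = np.where(errs, 1 - pred_A, pred_A)

      def log_lik(pred, data, eps):
          agree = np.sum(pred == data)
          return agree * np.log(1 - eps) + (len(data) - agree) * np.log(eps)

      ll = np.array([log_lik(p, data, eps) for p in (pred_A, pred_B)])
      post = np.exp(ll - ll.max())
      post /= post.sum()                       # posterior under a uniform prior
      print(dict(zip("AB", post)))             # mass should pile onto strategy A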

  13. Quantitative force measurements using frequency modulation atomic force microscopy—theoretical foundations

    NASA Astrophysics Data System (ADS)

    Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.

    2005-03-01

    Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.
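
    In the small-amplitude limit (a standard approximation, much simpler than the paper's globally valid formulae), the frequency shift is proportional to the force gradient, df/f0 = -F'(z)/(2k), so the force follows by integrating the shift. The sketch below inverts simulated data for a toy force law; the spring constant and resonance frequency are illustrative values.

      import numpy as np

      k, f0 = 40.0, 300e3                       # N/m, Hz (illustrative values)
      z = np.linspace(0.3e-9, 5e-9, 400)        # tip-sample distance (m)
      F_true = -1e-28 / z**2                    # toy attractive force law
      df = -f0 * np.gradient(F_true, z) / (2 * k)   # simulated frequency shift

      # Small-amplitude inversion: F(z) = (2k/f0) * integral_z^inf df(t) dt
      tail = np.cumsum((df * np.gradient(z))[::-1])[::-1]
      F_rec = (2 * k / f0) * tail
      print(np.max(np.abs(F_rec - F_true)) / np.max(np.abs(F_true)))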

  14. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.
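
    The core distinction (fixed preference plus response error versus wavering preferences) can be caricatured with a single binomial likelihood-ratio check, sketched below. This is only a toy stand-in for the paper's order-constrained inference and the QTest software; the tremble bound tau is an assumed value.

      import numpy as np
      from scipy.stats import binom

      n, x, tau = 40, 26, 0.25       # trials, choices of option "a", tremble bound
      p_hat = x / n                  # unconstrained MLE (wavering model)

      # Constrained MLE under "fixed preference + error": p in [0,tau] U [1-tau,1]
      candidates = [min(p_hat, tau), max(p_hat, 1 - tau)]
      p_fix = max(candidates, key=lambda p: binom.logpmf(x, n, p))

      ll_wav = binom.logpmf(x, n, p_hat)
      ll_fix = binom.logpmf(x, n, p_fix)
      print(2 * (ll_wav - ll_fix))   # large values favor wavering preferences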

  15. Reviews of theoretical frameworks: Challenges and judging the quality of theory application.

    PubMed

    Hean, Sarah; Anderson, Liz; Green, Chris; John, Carol; Pitt, Richard; O'Halloran, Cath

    2016-06-01

    Rigorous reviews of available information, from a range of resources, are required to support medical and health educators in their decision making. The aim of this article is to highlight the importance of a review of theoretical frameworks specifically as a supplement to reviews that focus on a synthesis of the empirical evidence alone. Establishing a shared understanding of theory as a concept is highlighted as a challenge and some practical strategies to achieving this are presented. This article also introduces the concept of theoretical quality, arguing that a critique of how theory is applied should complement the methodological appraisal of the literature in a review. We illustrate the challenge of establishing a shared meaning of theory through reference to experiences of an ongoing review of this kind conducted in the field of interprofessional education (IPE), and use a high-scoring paper selected in this review to illustrate how theoretical quality can be assessed. In reaching a shared understanding of theory as a concept, practical strategies that promote experiential and practical ways of knowing are required in addition to more propositional ways of sharing knowledge. Parsimony, testability, operational adequacy and empirical adequacy are explored as criteria that establish theoretical quality. Reviews of theoretical frameworks used in medical education are required to inform educational practice. Review teams should make time and effort to reach a shared understanding of the term theory. Theory reviews, and reviews more widely, should add an assessment of theory application to the protocol of their review method.

  16. Development of a theoretical framework for analyzing cerebrospinal fluid dynamics

    PubMed Central

    Cohen, Benjamin; Voorhees, Abram; Vedel, Søren; Wei, Timothy

    2009-01-01

    Background: To date, hydrocephalus researchers acknowledge the need for rigorous but utilitarian fluid mechanics understanding and methodologies in studying normal and hydrocephalic intracranial dynamics. Pressure-volume models and electric circuit analogs introduced pressure into volume conservation, but control volume analysis enforces independent conditions on pressure and volume. Previously, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Methods: Control volume analysis is presented to introduce the reader to the theoretical background of this foundational fluid mechanics technique for application to general control volumes. This approach is able to directly incorporate the diverse measurements obtained by clinicians to better elucidate intracranial dynamics and progression to disorder. Results: Several examples of meaningful intracranial control volumes and the particular measurement sets needed for the analysis are discussed. Conclusion: Control volume analysis provides a framework to guide the type and location of measurements and also a way to interpret the resulting data within a fundamental fluid physics analysis. PMID:19772652
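
    The bookkeeping at the heart of control volume analysis is volume conservation, dV/dt = Q_in - Q_out. The sketch below integrates toy flow waveforms over one cardiac cycle to get the intracranial volume excursion; the waveforms are synthetic, not clinical data.

      import numpy as np

      t = np.linspace(0, 1, 1000)                      # one cardiac cycle (s)
      Q_in = 10 + 2 * np.sin(2 * np.pi * t)            # arterial inflow (mL/s)
      Q_out = 10 + 2 * np.sin(2 * np.pi * t - 0.5)     # venous + CSF outflow (mL/s)

      dV = np.cumsum(Q_in - Q_out) * (t[1] - t[0])     # V(t) - V(0), crude quadrature
      print(dV.max() - dV.min())                       # volume stroke over the cycle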

  17. The thermodynamics of dense granular flow and jamming

    NASA Astrophysics Data System (ADS)

    Lu, Shih Yu

    The scope of the thesis is to propose, based on experimental evidence and theoretical validation, a quantifiable connection between systems that exhibit the jamming phenomenon. When jammed, some materials that normally flow are able to resist deformation, so that they appear solid-like on the laboratory scale. But unlike ordinary fusion, which has a sharply defined criterion in pressure and temperature, jamming occurs under a wide range of conditions. These conditions have been rigorously investigated, but at the moment no self-consistent framework applies to the grains, foams and colloids that may have suddenly ceased to flow. To quantify the jamming behavior, a constitutive model of dense granular flows is deduced from shear-flow experiments. The empirical equations are then generalized, via a thermodynamic approach, into an equation of state for jamming. Notably, the unifying theory also predicts the experimental data on the behavior of molecular glassy liquids. This analogy provides a crucial road map toward a unifying theoretical framework in condensed matter, ranging, for example, from sand to fire retardants to toothpaste.

  18. Late-Onset ADHD: Understanding the Evidence and Building Theoretical Frameworks.

    PubMed

    Caye, Arthur; Sibley, Margaret H; Swanson, James M; Rohde, Luis Augusto

    2017-11-13

    The traditional definition of Attention-Deficit/Hyperactivity Disorder (ADHD), assuming onset in childhood, has been challenged by evidence from four recent birth-cohort studies that reported most adults with ADHD lacked a childhood categorical ADHD diagnosis. Late onset of symptoms was evaluated in the long-term follow-up of the Multimodal Treatment study of ADHD (MTA). In most cases, other factors were present that discounted the late onset of ADHD symptoms and excluded the diagnosis of ADHD. We offer two theoretical frameworks for understanding the ADHD trajectory throughout the life cycle: (1) the complex phenotype model, and (2) the restricted phenotype model. We conclude that (a) late onset (after age 12) is a valid trajectory for ADHD symptoms, (b) the percentage of these cases with onset after adolescence is yet uncertain, and (c) the percentage meeting exclusion criteria for diagnosis of ADHD is influenced by the rigor of the methodology used to obtain evidence and whether or not DSM exclusionary criteria are applied.

  19. QTest: Quantitative Testing of Theories of Binary Choice

    PubMed Central

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  20. Effective Floquet Hamiltonian theory of multiple-quantum NMR in anisotropic solids involving quadrupolar spins: Challenges and Perspectives

    NASA Astrophysics Data System (ADS)

    Ganapathy, Vinay; Ramachandran, Ramesh

    2017-10-01

    The response of a quadrupolar nucleus (nuclear spin with I > 1/2) to an oscillating radio-frequency pulse/field is delicately dependent on the ratio of the quadrupolar coupling constant to the amplitude of the pulse in addition to its duration and oscillating frequency. Consequently, analytic description of the excitation process in the density operator formalism has remained less transparent within existing theoretical frameworks. As an alternative, the utility of the "concept of effective Floquet Hamiltonians" is explored in the present study to explicate the nuances of the excitation process in multilevel systems. Employing spin I = 3/2 as a case study, a unified theoretical framework for describing the excitation of multiple-quantum transitions in static isotropic and anisotropic solids is proposed within the framework of perturbation theory. The challenges resulting from the anisotropic nature of the quadrupolar interactions are addressed within the effective Hamiltonian framework. The possible role of the various interaction frames on the convergence of the perturbation corrections is discussed along with a proposal for a "hybrid method" for describing the excitation process in anisotropic solids. Employing suitable model systems, the validity of the proposed hybrid method is substantiated through a rigorous comparison between simulations emerging from exact numerical and analytic methods.

  1. A consortium-driven framework to guide the implementation of ICH M7 Option 4 control strategies.

    PubMed

    Barber, Chris; Antonucci, Vincent; Baumann, Jens-Christoph; Brown, Roland; Covey-Crump, Elizabeth; Elder, David; Elliott, Eric; Fennell, Jared W; Gallou, Fabrice; Ide, Nathan D; Jordine, Guido; Kallemeyn, Jeffrey M; Lauwers, Dirk; Looker, Adam R; Lovelle, Lucie E; McLaughlin, Mark; Molzahn, Robert; Ott, Martin; Schils, Didier; Oestrich, Rolf Schulte; Stevenson, Neil; Talavera, Pere; Teasdale, Andrew; Urquhart, Michael W; Varie, David L; Welch, Dennie

    2017-11-01

    The ICH M7 Option 4 control of (potentially) mutagenic impurities is based on the use of scientific principles in lieu of routine analytical testing. This approach can reduce the burden of analytical testing without compromising patient safety, provided a scientifically rigorous approach is taken which is backed up by sufficient theoretical and/or analytical data. This paper introduces a consortium-led initiative and offers a proposal on the supporting evidence that could be presented in regulatory submissions.

  2. The MIXED framework: A novel approach to evaluating mixed-methods rigor.

    PubMed

    Eckhardt, Ann L; DeVon, Holli A

    2017-10-01

    Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods which require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. It uses an experimental eight-item scale that allows for comprehensive, integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research, requiring researchers and consumers to address issues unique to MM, such as evaluation of rigor.

  3. Stochastic and Deterministic Models for the Metastatic Emission Process: Formalisms and Crosslinks.

    PubMed

    Gomez, Christophe; Hartung, Niklas

    2018-01-01

    Although the detection of metastases radically changes prognosis of and treatment decisions for a cancer patient, clinically undetectable micrometastases hamper a consistent classification into localized or metastatic disease. This chapter discusses mathematical modeling efforts that could help to estimate the metastatic risk in such a situation. We focus on two approaches: (1) a stochastic framework describing metastatic emission events at random times, formalized via Poisson processes, and (2) a deterministic framework describing the micrometastatic state through a size-structured density function in a partial differential equation model. Three aspects are addressed in this chapter. First, a motivation for the Poisson process framework is presented and modeling hypotheses and mechanisms are introduced. Second, we extend the Poisson model to account for secondary metastatic emission. Third, we highlight an inherent crosslink between the stochastic and deterministic frameworks and discuss its implications. For increased accessibility the chapter is split into an informal presentation of the results using a minimum of mathematical formalism and a rigorous mathematical treatment for more theoretically interested readers.
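
    Framework (1) can be sketched directly: emission times form an inhomogeneous Poisson process whose rate tracks primary tumour size, simulated here by thinning. The growth law, emission exponent, and all parameter values below are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(4)

      def tumour_size(t):                       # toy Gompertz-like primary growth
          return 1e3 * np.exp(3.0 * (1.0 - np.exp(-0.1 * t)))

      def emission_rate(t, mu=1e-3, alpha=0.6):
          return mu * tumour_size(t) ** alpha   # size-dependent emission intensity

      T = 50.0
      lam_max = emission_rate(T)                # the rate is increasing on [0, T]
      t_cand = np.cumsum(rng.exponential(1.0 / lam_max, size=2000))
      t_cand = t_cand[t_cand < T]               # homogeneous candidates at lam_max
      keep = rng.random(t_cand.size) < emission_rate(t_cand) / lam_max
      print(t_cand[keep].round(1))              # random metastatic seeding times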

  4. Toward an Accurate Theoretical Framework for Describing Ensembles for Proteins under Strongly Denaturing Conditions

    PubMed Central

    Tran, Hoang T.; Pappu, Rohit V.

    2006-01-01

    Our focus is on an appropriate theoretical framework for describing highly denatured proteins. In high concentrations of denaturants, proteins behave like polymers in a good solvent and ensembles for denatured proteins can be modeled by ignoring all interactions except excluded volume (EV) effects. To assay conformational preferences of highly denatured proteins, we quantify a variety of properties for EV-limit ensembles of 23 two-state proteins. We find that modeled denatured proteins can be best described as follows. Average shapes are consistent with prolate ellipsoids. Ensembles are characterized by large correlated fluctuations. Sequence-specific conformational preferences are restricted to local length scales that span five to nine residues. Beyond local length scales, chain properties follow well-defined power laws that are expected for generic polymers in the EV limit. The average available volume is filled inefficiently, and cavities of all sizes are found within the interiors of denatured proteins. All properties characterized from simulated ensembles match predictions from rigorous field theories. We use our results to resolve between conflicting proposals for structure in ensembles for highly denatured states. PMID:16766618

  5. MRF energy minimization and beyond via dual decomposition.

    PubMed

    Komodakis, Nikos; Paragios, Nikos; Tziritas, Georgios

    2011-03-01

    This paper introduces a new rigorous theoretical framework to address discrete MRF-based optimization in computer vision. Such a framework exploits the powerful technique of Dual Decomposition. It is based on a projected subgradient scheme that attempts to solve an MRF optimization problem by first decomposing it into a set of appropriately chosen subproblems, and then combining their solutions in a principled way. In order to determine the limits of this method, we analyze the conditions that these subproblems have to satisfy and demonstrate the extreme generality and flexibility of such an approach. We thus show that by appropriately choosing what subproblems to use, one can design novel and very powerful MRF optimization algorithms. For instance, in this manner we are able to derive algorithms that: 1) generalize and extend state-of-the-art message-passing methods, 2) optimize very tight LP-relaxations to MRF optimization, and 3) take full advantage of the special structure that may exist in particular MRFs, allowing the use of efficient inference techniques such as, e.g., graph-cut-based methods. Theoretical analysis of the bounds associated with the different algorithms derived from our framework, together with experimental results and comparisons using synthetic and real data for a variety of tasks in computer vision, demonstrates the potential of our approach.
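
    The mechanics of dual decomposition can be shown on a tiny example: split a 3-node, 2-label cyclic MRF into two tree slaves, solve each slave exactly (here by brute force rather than message passing), and run a projected subgradient on the duals until the slave minimizers agree. The costs are arbitrary toy numbers.

      import itertools
      import numpy as np

      unary = np.array([[0.0, 1.0], [0.5, 0.2], [1.0, 0.0]])   # 3 nodes, 2 labels
      def pair(a, b): return 0.0 if a == b else 0.6            # Potts edge cost

      slaves = [((0, 1), (1, 2)), ((2, 0),)]      # cycle split into two trees
      lam = np.zeros((2, 3, 2))                   # dual variables per slave

      def solve_slave(edges, lam_s):
          best, best_x = np.inf, None
          for x in itertools.product((0, 1), repeat=3):
              e = sum(unary[i, x[i]] / 2 + lam_s[i, x[i]] for i in range(3))
              e += sum(pair(x[i], x[j]) for i, j in edges)
              if e < best:
                  best, best_x = e, x
          return np.array(best_x)

      for it in range(100):
          xs = np.array([solve_slave(edges, lam[s])
                         for s, edges in enumerate(slaves)])
          if (xs[0] == xs[1]).all():              # slaves agree: done
              break
          onehot = np.eye(2)[xs]                  # (slave, node, label) indicators
          lam += 0.5 / (it + 1) * (onehot - onehot.mean(0))   # keeps sum_s lam = 0
      print(xs[0])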

  6. Grading Rigor in Counselor Education: A Specifications Grading Framework

    ERIC Educational Resources Information Center

    Bonner, Matthew W.

    2016-01-01

    According to accreditation and professional bodies, evaluation and grading are a high priority in counselor education. Specifications grading, an evaluative tool, can be used to increase grading rigor. This article describes the components of specifications grading and applies the framework of specifications grading to a counseling theories course.

  7. Culture and symptom reporting at menopause.

    PubMed

    Melby, Melissa K; Lock, Margaret; Kaufert, Patricia

    2005-01-01

    The purpose of the present paper is to review recent research on the relationship of culture and menopausal symptoms and propose a biocultural framework that makes use of both biological and cultural parameters in future research. Medline was searched for English-language articles published from 2000 to 2004 using the keyword 'menopause' in the following journals: Menopause, Maturitas, Climacteric, Social Science and Medicine, Medical Anthropology Quarterly, Journal of Women's Health, Journal of the American Medical Association, American Journal of Epidemiology, Lancet and British Medical Journal, excluding articles concerning small clinical samples, surgical menopause or HRT. Additionally, references of retrieved articles and reviews were hand-searched. Although a large number of studies and publications exist, methodological differences limit attempts at comparison or systematic review. We outline a theoretical framework in which relevant biological and cultural variables can be operationalized and measured, making rigorous comparisons possible in the future. Several studies carried out in Japan, North America and Australia, using similar methodology but different culture/ethnic groups, indicate that differences in symptom reporting are real and highlight the importance of biocultural research. We suggest that both biological variation and cultural differences contribute to the menopausal transition, and that more rigorous data collection is required to elucidate how biology and culture interact in female ageing.

  8. Nuclear magnetic relaxation by the dipolar EMOR mechanism: General theory with applications to two-spin systems.

    PubMed

    Chang, Zhiwei; Halle, Bertil

    2016-02-28

    In aqueous systems with immobilized macromolecules, including biological tissue, the longitudinal spin relaxation of water protons is primarily induced by exchange-mediated orientational randomization (EMOR) of intra- and intermolecular magnetic dipole-dipole couplings. We have embarked on a systematic program to develop, from the stochastic Liouville equation, a general and rigorous theory that can describe relaxation by the dipolar EMOR mechanism over the full range of exchange rates, dipole coupling strengths, and Larmor frequencies. Here, we present a general theoretical framework applicable to spin systems of arbitrary size with symmetric or asymmetric exchange. So far, the dipolar EMOR theory is only available for a two-spin system with symmetric exchange. Asymmetric exchange, when the spin system is fragmented by the exchange, introduces new and unexpected phenomena. Notably, the anisotropic dipole couplings of non-exchanging spins break the axial symmetry in spin Liouville space, thereby opening up new relaxation channels in the locally anisotropic sites, including longitudinal-transverse cross relaxation. Such cross-mode relaxation operates only at low fields; at higher fields it becomes nonsecular, leading to an unusual inverted relaxation dispersion that splits the extreme-narrowing regime into two sub-regimes. The general dipolar EMOR theory is illustrated here by a detailed analysis of the asymmetric two-spin case, for which we present relaxation dispersion profiles over a wide range of conditions as well as analytical results for integral relaxation rates and time-dependent spin modes in the zero-field and motional-narrowing regimes. The general theoretical framework presented here will enable a quantitative analysis of frequency-dependent water-proton longitudinal relaxation in model systems with immobilized macromolecules and, ultimately, will provide a rigorous link between relaxation-based magnetic resonance image contrast and molecular parameters.

  9. Nuclear magnetic relaxation by the dipolar EMOR mechanism: General theory with applications to two-spin systems

    NASA Astrophysics Data System (ADS)

    Chang, Zhiwei; Halle, Bertil

    2016-02-01

    In aqueous systems with immobilized macromolecules, including biological tissue, the longitudinal spin relaxation of water protons is primarily induced by exchange-mediated orientational randomization (EMOR) of intra- and intermolecular magnetic dipole-dipole couplings. We have embarked on a systematic program to develop, from the stochastic Liouville equation, a general and rigorous theory that can describe relaxation by the dipolar EMOR mechanism over the full range of exchange rates, dipole coupling strengths, and Larmor frequencies. Here, we present a general theoretical framework applicable to spin systems of arbitrary size with symmetric or asymmetric exchange. So far, the dipolar EMOR theory is only available for a two-spin system with symmetric exchange. Asymmetric exchange, when the spin system is fragmented by the exchange, introduces new and unexpected phenomena. Notably, the anisotropic dipole couplings of non-exchanging spins break the axial symmetry in spin Liouville space, thereby opening up new relaxation channels in the locally anisotropic sites, including longitudinal-transverse cross relaxation. Such cross-mode relaxation operates only at low fields; at higher fields it becomes nonsecular, leading to an unusual inverted relaxation dispersion that splits the extreme-narrowing regime into two sub-regimes. The general dipolar EMOR theory is illustrated here by a detailed analysis of the asymmetric two-spin case, for which we present relaxation dispersion profiles over a wide range of conditions as well as analytical results for integral relaxation rates and time-dependent spin modes in the zero-field and motional-narrowing regimes. The general theoretical framework presented here will enable a quantitative analysis of frequency-dependent water-proton longitudinal relaxation in model systems with immobilized macromolecules and, ultimately, will provide a rigorous link between relaxation-based magnetic resonance image contrast and molecular parameters.

  10. OCT Amplitude and Speckle Statistics of Discrete Random Media.

    PubMed

    Almasian, Mitra; van Leeuwen, Ton G; Faber, Dirk J

    2017-11-01

    Speckle, amplitude fluctuations in optical coherence tomography (OCT) images, contains information on sub-resolution structural properties of the imaged sample. Speckle statistics could therefore be utilized in the characterization of biological tissues. However, a rigorous theoretical framework relating OCT speckle statistics to structural tissue properties has yet to be developed. As a first step, we present a theoretical description of OCT speckle, relating the OCT amplitude variance to size and organization for samples of discrete random media (DRM). Starting the calculations from the size and organization of the scattering particles, we analytically find expressions for the OCT amplitude mean, amplitude variance, the backscattering coefficient and the scattering coefficient. We assume fully developed speckle and verify the validity of this assumption by experiments on controlled samples of silica microspheres suspended in water. We show that the OCT amplitude variance is sensitive to sub-resolution changes in size and organization of the scattering particles. Experimentally determined and theoretically calculated optical properties are compared and in good agreement.
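
    The fully developed speckle assumption tested above can be illustrated with generic random phasor sums (not the paper's discrete-random-media derivation): the amplitude in each voxel is the magnitude of a sum of scatterer phasors and, for many scatterers with uniform phases, approaches Rayleigh statistics.

      import numpy as np

      rng = np.random.default_rng(5)
      n_voxels, n_scatterers = 100_000, 50

      phases = rng.uniform(0, 2 * np.pi, (n_voxels, n_scatterers))
      amp = np.abs(np.exp(1j * phases).sum(axis=1))   # per-voxel OCT-like amplitude

      print(amp.var() / amp.mean() ** 2)   # Rayleigh predicts 4/pi - 1 ~ 0.273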

  11. Crisis in science: in search for new theoretical foundations.

    PubMed

    Schroeder, Marcin J

    2013-09-01

    Recognition of the need for theoretical biology more than half a century ago did not bring substantial progress in this direction. Recently, the need for new methods in science, including physics, became clear. The breakthrough should be sought in answering the question "What is life?", which can help to explain the mechanisms of consciousness and consequently give insight into the way we comprehend reality. This could help in the search for new methods in the study of both physical and biological phenomena. However, to achieve this, a new theoretical discipline will have to be developed, with a very general conceptual framework and the rigor of mathematical reasoning, allowing it to assume the leading role in science. Since its foundations are in the recognition of the role of life and consciousness in the epistemic process, it could be called biomathics. The prime candidates proposed here as the fundamental concepts for biomathics are 'information' and 'information integration', with an appropriately general mathematical formalism.

  12. DG-IMEX Stochastic Galerkin Schemes for Linear Transport Equation with Random Inputs and Diffusive Scalings

    DOE PAGES

    Chen, Zheng; Liu, Liu; Mu, Lin

    2017-05-03

    In this paper, we consider the linear transport equation under diffusive scaling and with random inputs. The method is based on the generalized polynomial chaos approach in the stochastic Galerkin framework. Several theoretical aspects are addressed: in particular, a uniform numerical stability with respect to the Knudsen number ϵ and a uniform-in-ϵ error estimate are given. For temporal and spatial discretizations, we apply the implicit-explicit scheme under the micro-macro decomposition framework and the discontinuous Galerkin method, as proposed in Jang et al. (SIAM J Numer Anal 52:2048-2072, 2014) for the deterministic problem. Lastly, we provide a rigorous proof of the stochastic asymptotic-preserving (sAP) property. Extensive numerical experiments that validate the accuracy and sAP of the method are conducted.
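
    The stochastic Galerkin ingredient can be demonstrated on a scalar random ODE rather than the transport equation: expand the solution in Legendre polynomials of the random input and evolve the coupled Galerkin coefficients. The decay-rate law and truncation order below are arbitrary choices for illustration.

      import numpy as np
      from numpy.polynomial import legendre

      K = 6                                    # gPC truncation order
      nodes, weights = legendre.leggauss(40)   # quadrature for xi ~ U(-1, 1)
      P = np.array([legendre.Legendre.basis(k)(nodes) for k in range(K)])
      norms = (weights * P * P).sum(axis=1) / 2          # <P_k^2>

      a = 1.0 + 0.5 * nodes                    # random decay rate a(xi) > 0
      # Galerkin coupling A[j, k] = <a P_k P_j> / <P_j^2>
      A = (weights * a * P[None, :, :] * P[:, None, :]).sum(axis=2) / 2
      A /= norms[:, None]

      u = np.zeros(K)
      u[0] = 1.0                               # deterministic initial datum u = 1
      dt, T = 1e-3, 1.0
      for _ in range(int(T / dt)):             # forward Euler on du/dt = -A u
          u = u - dt * (A @ u)

      exact_mean = 0.5 * (weights * np.exp(-a * T)).sum()   # E[exp(-a T)]
      print(u[0], exact_mean)                  # gPC mean vs exact mean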

  13. Orthogonal basis with a conicoid first mode for shape specification of optical surfaces.

    PubMed

    Ferreira, Chelo; López, José L; Navarro, Rafael; Sinusía, Ester Pérez

    2016-03-07

    A rigorous and powerful theoretical framework is proposed to obtain systems of orthogonal functions (or shape modes) to represent optical surfaces. The method is general so it can be applied to different initial shapes and different polynomials. Here we present results for surfaces with circular apertures when the first basis function (mode) is a conicoid. The system for aspheres with rotational symmetry is obtained applying an appropriate change of variables to Legendre polynomials, whereas the system for general freeform case is obtained applying a similar procedure to spherical harmonics. Numerical comparisons with standard systems, such as Forbes and Zernike polynomials, are performed and discussed.
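
    A generic version of the construction (plain Gram-Schmidt under the circular-aperture inner product, rather than the authors' change-of-variables route) is sketched below: orthonormalize a conicoid sag profile followed by even radial powers. The conic constant and apex radius are toy values.

      import numpy as np

      r = np.linspace(0, 1, 2001)              # normalized radial coordinate
      w = r * (r[1] - r[0])                    # area weight on a circular aperture

      def inner(f, g):
          return 2 * np.sum(w * f * g)         # <f, g> = 2 int_0^1 f g r dr

      c, R = -0.5, 2.0                         # conic constant, apex radius (toy)
      conicoid = r**2 / (R * (1 + np.sqrt(1 - (1 + c) * r**2 / R**2)))

      basis = []
      for f in [conicoid] + [r ** (2 * k) for k in range(1, 5)]:
          for b in basis:                      # strip projections on earlier modes
              f = f - inner(f, b) * b
          basis.append(f / np.sqrt(inner(f, f)))

      G = np.array([[inner(a, b) for a in basis] for b in basis])
      print(np.allclose(G, np.eye(len(basis))))   # orthonormality check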

  14. A generative, probabilistic model of local protein structure.

    PubMed

    Boomsma, Wouter; Mardia, Kanti V; Taylor, Charles C; Ferkinghoff-Borg, Jesper; Krogh, Anders; Hamelryck, Thomas

    2008-07-01

    Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state. Our method represents a significant theoretical and practical improvement over the widely used fragment assembly technique by avoiding the drawbacks associated with a discrete and nonprobabilistic approach.
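
    Generative sampling in continuous dihedral space can be caricatured with a crude mixture of independent von Mises components, far simpler than the paper's model; the component means, concentrations, and weights below are rough, illustrative stand-ins for helix- and sheet-like regions.

      import numpy as np

      rng = np.random.default_rng(9)
      # (mean_phi, mean_psi, concentration, weight) for two toy "states"
      components = [(-1.05, -0.79, 8.0, 0.6),   # roughly alpha-helical region
                    (-2.36,  2.36, 4.0, 0.4)]   # roughly beta-sheet region

      def sample_dihedrals(n):
          w = np.array([c[3] for c in components])
          idx = rng.choice(len(components), size=n, p=w / w.sum())
          phi = np.array([rng.vonmises(components[i][0], components[i][2]) for i in idx])
          psi = np.array([rng.vonmises(components[i][1], components[i][2]) for i in idx])
          return phi, psi

      phi, psi = sample_dihedrals(1000)         # plausible local conformations
      print(np.degrees(phi[:3]).round(), np.degrees(psi[:3]).round())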

  15. Development of a software safety process and a case study of its use

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1993-01-01

    The goal of this research is to continue the development of a comprehensive approach to software safety and to evaluate the approach with a case study. The case study is a major part of the project, and it involves the analysis of a specific safety-critical system from the medical equipment domain. The particular application being used was selected because of the availability of a suitable candidate system. We consider the results to be generally applicable and in no way particularly limited by the domain. The research is concentrating on issues raised by the specification and verification phases of the software lifecycle since they are central to our previously-developed rigorous definitions of software safety. The theoretical research is based on our framework of definitions for software safety. In the area of specification, the main topics being investigated are the development of techniques for building system fault trees that correctly incorporate software issues and the development of rigorous techniques for the preparation of software safety specifications. The research results are documented. Another area of theoretical investigation is the development of verification methods tailored to the characteristics of safety requirements. Verification of the correct implementation of the safety specification is central to the goal of establishing safe software. The empirical component of this research is focusing on a case study in order to provide detailed characterizations of the issues as they appear in practice, and to provide a testbed for the evaluation of various existing and new theoretical results, tools, and techniques. The Magnetic Stereotaxis System is summarized.

  16. Children facing a family member's acute illness: a review of intervention studies.

    PubMed

    Spath, Mary L

    2007-07-01

    A review of psycho-educational intervention studies to benefit children adapting to a close (parent, sibling, or grandparent) family member's serious illness was conducted, with the aims of reviewing the literature on studies addressing this topic, critiquing research methods, describing clinical outcomes, and making recommendations for future research efforts. Research citations from 1990 to 2005 from Medline, CINAHL, Health Source: Nursing/Academic Edition, PsycARTICLES, and PsycINFO databases were identified. Citations were reviewed and evaluated for sample, design, theoretical framework, intervention, threats to validity, and outcomes. Reviewed studies were limited to those that included statistical analysis to evaluate interventions and outcomes. Six studies were reviewed. Positive outcomes were reported for all of the interventional strategies used in the studies. Reviewed studies generally lacked a theoretical framework and a control group, were generally composed of small convenience samples, and primarily used non-tested investigator instruments. They were diverse in terms of intervention length and intensity, and measured short-term outcomes related to participant program satisfaction rather than participant cognitive and behavioral change. The paucity of interventional studies and the lack of systematic empirical precision to evaluate intervention effectiveness necessitate future studies that are methodologically rigorous.

  17. Asymptotic theory of time-varying social networks with heterogeneous activity and tie allocation.

    PubMed

    Ubaldi, Enrico; Perra, Nicola; Karsai, Márton; Vezzani, Alessandro; Burioni, Raffaella; Vespignani, Alessandro

    2016-10-24

    The dynamics of social networks are driven by the interplay between diverse mechanisms that still challenge our theoretical and modelling efforts. Amongst them, two are known to play a central role in shaping network evolution, namely the heterogeneous propensity of individuals to i) be socially active and ii) establish new social relationships with their alters. Here, we empirically characterise these two mechanisms in seven real networks describing temporal human interactions in three different settings: scientific collaborations, Twitter mentions, and mobile phone calls. We find that the individuals' social activity and their strategy in choosing the ties to which they allocate their social interactions can be quantitatively described and encoded in a simple stochastic network modelling framework. The Master Equation of the model can be solved in the asymptotic limit. The analytical solutions provide an explicit description of both the system dynamics and the dynamical scaling laws characterising crucial aspects of the evolution of the networks. The analytical predictions accurately match the empirical observations, thus validating the theoretical approach. Our results provide a rigorous dynamical-systems framework that can be extended to include other processes shaping social dynamics and to generate data-driven predictions for the asymptotic behaviour of social networks.
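
    The modelling framework can be sketched as an activity-driven network with tie reinforcement: at each step an active node either explores a new tie or reinforces an old one, with exploration probability decaying with its current degree. The functional form and all parameter values below are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(6)
      N, steps, beta, c = 500, 10_000, 0.5, 1.0
      activity = 10 ** rng.uniform(-3, -1, N)    # heterogeneous activity rates
      ties = [set() for _ in range(N)]

      for _ in range(steps):
          for i in np.flatnonzero(rng.random(N) < activity):
              n = len(ties[i])
              if n == 0 or rng.random() < (1 + n / c) ** (-beta):
                  j = int(rng.integers(N))       # explore: attach a new tie
                  if j != i:
                      ties[i].add(j)
                      ties[j].add(i)
              # else: reinforce an existing tie (no new edge created)

      degrees = np.array([len(t) for t in ties])
      print(degrees.mean(), degrees.max())       # broad degree distribution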

  18. Asymptotic theory of time-varying social networks with heterogeneous activity and tie allocation

    NASA Astrophysics Data System (ADS)

    Ubaldi, Enrico; Perra, Nicola; Karsai, Márton; Vezzani, Alessandro; Burioni, Raffaella; Vespignani, Alessandro

    2016-10-01

    The dynamics of social networks are driven by the interplay between diverse mechanisms that still challenge our theoretical and modelling efforts. Amongst them, two are known to play a central role in shaping network evolution, namely the heterogeneous propensity of individuals to i) be socially active and ii) establish new social relationships with their alters. Here, we empirically characterise these two mechanisms in seven real networks describing temporal human interactions in three different settings: scientific collaborations, Twitter mentions, and mobile phone calls. We find that the individuals' social activity and their strategy in choosing the ties to which they allocate their social interactions can be quantitatively described and encoded in a simple stochastic network modelling framework. The Master Equation of the model can be solved in the asymptotic limit. The analytical solutions provide an explicit description of both the system dynamics and the dynamical scaling laws characterising crucial aspects of the evolution of the networks. The analytical predictions accurately match the empirical observations, thus validating the theoretical approach. Our results provide a rigorous dynamical-systems framework that can be extended to include other processes shaping social dynamics and to generate data-driven predictions for the asymptotic behaviour of social networks.

  19. A combinatorial framework to quantify peak/pit asymmetries in complex dynamics.

    PubMed

    Hasson, Uri; Iacovacci, Jacopo; Davis, Ben; Flanagan, Ryan; Tagliazucchi, Enzo; Laufs, Helmut; Lacasa, Lucas

    2018-02-23

    We explore a combinatorial framework which efficiently quantifies the asymmetries between minima and maxima in local fluctuations of time series. We first showcase its performance by applying it to a battery of synthetic cases. We find rigorous results on some canonical dynamical models (stochastic processes with and without correlations, chaotic processes), complemented by extensive numerical simulations for a range of processes, which indicate that the methodology correctly distinguishes different complex dynamics and outperforms state-of-the-art metrics in several cases. Subsequently, we apply this methodology to real-world problems emerging across several disciplines, including cases in neurobiology, finance and climate science. We conclude that differences between the statistics of local maxima and local minima in time series are highly informative of the complex underlying dynamics, and that a graph-theoretic extraction procedure allows these features to be used for statistical learning purposes.
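
    One simple realization of the idea is to compare horizontal-visibility-graph degree statistics of a series against those of its sign-flipped copy; any divergence quantifies peak/pit asymmetry. This is a simplified stand-in for the paper's pipeline, with a toy truncation of the degree distribution.

      import numpy as np

      def hvg_degrees(x):
          n, deg = len(x), np.zeros(len(x), int)
          for i in range(n):
              for j in range(i + 1, n):          # horizontal visibility criterion
                  if np.all(x[i + 1:j] < min(x[i], x[j])):
                      deg[i] += 1
                      deg[j] += 1
                  if x[j] >= x[i]:
                      break                      # nothing beyond j can see i
          return deg

      def degree_dist(deg, kmax=12):             # truncated, smoothed distribution
          p = np.bincount(deg, minlength=kmax + 1)[: kmax + 1].astype(float)
          return (p + 1e-12) / (p + 1e-12).sum()

      rng = np.random.default_rng(7)
      x = np.cumsum(rng.standard_normal(2000))   # toy series: a Gaussian random walk
      p, q = degree_dist(hvg_degrees(x)), degree_dist(hvg_degrees(-x))
      print(np.sum(p * np.log(p / q)))           # KL asymmetry index, ~0 here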

  20. A systems-theoretical framework for health and disease: inflammation and preconditioning from an abstract modeling point of view.

    PubMed

    Voit, Eberhard O

    2009-01-01

    Modern advances in molecular biology have produced enormous amounts of data characterizing physiological and disease states in cells and organisms. While bioinformatics has facilitated the organizing and mining of these data, it is the task of systems biology to merge the available information into dynamic, explanatory and predictive models. This article takes a step in this direction. It proposes a conceptual approach toward formalizing health and disease and illustrates it in the context of inflammation and preconditioning. Instead of defining health and disease states, the emphasis is on simplexes in a high-dimensional biomarker space. These simplexes are bounded by physiological constraints and permit the quantitative characterization of personalized health trajectories, health risk profiles that change with age, and the efficacy of different treatment options. The article mainly focuses on concepts but also briefly describes how the proposed concepts might be formulated rigorously within a mathematical framework.

  1. Computational Approaches to the Chemical Equilibrium Constant in Protein-ligand Binding.

    PubMed

    Montalvo-Acosta, Joel José; Cecchini, Marco

    2016-12-01

    The physiological role played by protein-ligand recognition has motivated the development of several computational approaches to the ligand binding affinity. Some of them, termed rigorous, have a strong theoretical foundation but involve too much computation to be generally useful. Some others alleviate the computational burden by introducing strong approximations and/or empirical calibrations, which also limit their general use. Most importantly, there is no straightforward correlation between the predictive power and the level of approximation introduced. Here, we present a general framework for the quantitative interpretation of protein-ligand binding based on statistical mechanics. Within this framework, we re-derive self-consistently the fundamental equations of some popular approaches to the binding constant and pinpoint the inherent approximations. Our analysis represents a first step towards the development of variants with optimum accuracy/efficiency ratio for each stage of the drug discovery pipeline.
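
    The textbook starting point of the statistical-mechanics view can be written down in one dimension: the association constant is a configuration integral over the bound region, referenced to a standard state. The square-well potential and standard-state length below are toy assumptions, not a real protein-ligand system.

      import numpy as np

      kT = 0.593                                 # kcal/mol near 298 K
      x = np.linspace(-5, 5, 100_001)            # relative ligand coordinate (A)
      U = np.where(np.abs(x) < 1.0, -4.0, 0.0)   # 4 kcal/mol deep binding well

      bound = np.abs(x) < 1.0
      Z_bound = np.trapz(np.exp(-U[bound] / kT), x[bound])
      L0 = 1.0                                   # hypothetical standard-state length
      K = Z_bound / L0                           # 1D analogue of the binding constant
      print(K, -kT * np.log(K))                  # K and the binding free energy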

  2. Limit analysis of hollow spheres or spheroids with Hill orthotropic matrix

    NASA Astrophysics Data System (ADS)

    Pastor, Franck; Pastor, Joseph; Kondo, Djimedo

    2012-03-01

    Recent theoretical studies in the literature have addressed the hollow sphere or spheroid (confocal) problem with an orthotropic Hill-type matrix. They were developed in the framework of the limit analysis kinematical approach, using very simple trial velocity fields. The present Note provides, through numerical upper and lower bounds, a rigorous assessment of the approximate criteria derived in these theoretical works. To this end, existing static 3D codes for a von Mises matrix have been easily extended to the orthotropic case. Conversely, instead of a non-obvious extension of the existing kinematic codes, a new original mixed approach has been elaborated on the basis of the plane strain structure formulation earlier developed by F. Pastor (2007). Indeed, such a formulation does not need the expressions of the unit dissipated powers. Interestingly, it delivers a numerical code that is better conditioned and notably faster than the previous one, while preserving the rigorous upper-bound character of the corresponding numerical results. The efficiency of the whole approach is first demonstrated through comparisons of the results with the analytical upper bounds of Benzerga and Besson (2001) or Monchiet et al. (2008) in the case of spherical voids in the Hill matrix. Moreover, we provide upper- and lower-bound results for the hollow spheroid with the Hill matrix, which are compared to those of Monchiet et al. (2008).

  3. Rigorous Science: a How-To Guide.

    PubMed

    Casadevall, Arturo; Fang, Ferric C

    2016-11-08

    Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word "rigor" is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education.

  4. Density profiles in the Scrape-Off Layer interpreted through filament dynamics

    NASA Astrophysics Data System (ADS)

    Militello, Fulvio

    2017-10-01

    We developed a new theoretical framework to clarify the relation between radial Scrape-Off Layer density profiles and the fluctuations that generate them. The framework provides an interpretation of the experimental features of the profiles and of the turbulence statistics on the basis of simple properties of the filaments, such as their radial motion and their draining towards the divertor. L-mode and inter-ELM filaments are described as a Poisson process in which each event is independent and modelled with a wave function of amplitude and width statistically distributed according to experimental observations and evolving according to fluid equations. We will rigorously show that radially accelerating filaments, less efficient parallel exhaust, and a statistical distribution of the radial velocity can all contribute to flatter profiles in the far SOL and therefore enhance plasma-wall interactions. A quite general result of our analysis is the resiliency of this non-exponential nature of the profiles and the increase of the relative fluctuation amplitude towards the wall, as experimentally observed. According to the framework, profile broadening at high fueling rates can be caused by interactions with neutrals (e.g. charge exchange) in the divertor or by a significant radial acceleration of the filaments. The framework assumptions were tested with 3D numerical simulations of seeded SOL filaments based on a two-fluid model. In particular, filaments interact through the electrostatic field they generate only when they are in close proximity (separation comparable to their width in the drift plane), thus justifying our independence hypothesis. In addition, we will discuss how isolated filament motion responds to variations in the plasma conditions, and specifically divertor conditions. Finally, using the theoretical framework, we will reproduce and interpret experimental results obtained on JET, MAST and HL-2A.
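
    The profile-generation mechanism can be sketched by superposing filaments: each deposits density ~ exp(-r/(v*tau)) past the separatrix, and averaging over a distribution of radial velocities flattens the far-SOL profile relative to a single exponential. Units and distributions are toy choices.

      import numpy as np

      rng = np.random.default_rng(8)
      r = np.linspace(0, 5, 200)           # distance past the separatrix (a.u.)
      tau = 1.0                            # parallel draining time (a.u.)

      v = rng.exponential(1.0, 10_000)     # statistically distributed velocities
      profile = np.exp(-r[None, :] / (v[:, None] * tau)).mean(axis=0)

      single = np.exp(-r / tau)            # fixed-velocity reference (v = 1)
      print(profile[-1] / single[-1])      # >> 1: broadened far-SOL density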

  5. High-Contrast Gratings based Spoof Surface Plasmons

    NASA Astrophysics Data System (ADS)

    Li, Zhuo; Liu, Liangliang; Xu, Bingzheng; Ning, Pingping; Chen, Chen; Xu, Jia; Chen, Xinlei; Gu, Changqing; Qing, Quan

    2016-02-01

    In this work, we explore the existence of spoof surface plasmons (SSPs) supported by deep-subwavelength high-contrast gratings (HCGs) on a perfect electric conductor plane. The dispersion relation of the HCGs-based SSPs is derived analytically by combining multimode network theory with the rigorous mode matching method; it has nearly the same form as, and can be reduced to, that of the SSPs arising from deep-subwavelength metallic gratings (MGs). Numerical simulations validate the analytical dispersion relation, and an effective medium approximation is also presented that yields the same analytical dispersion formula. This work sets up a unified theoretical framework for SSPs and opens up new vistas in surface plasmon optics.
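    As the abstract notes, the HCG dispersion relation degenerates into that of SSPs on deep-subwavelength metallic gratings. That MG limit is the well-known relation kx = k0*sqrt(1 + (a/d)^2 * tan^2(k0*h)) for grooves of width a, period d and depth h in a PEC, and is easy to evaluate; the geometry values in this sketch are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Standard spoof-SPP dispersion for a 1D array of grooves in a PEC
# (width a, period d, depth h), valid for d much smaller than the
# wavelength:  kx = k0 * sqrt(1 + (a/d)**2 * tan(k0*h)**2).
# The mode becomes strongly bound as k0*h -> pi/2, i.e. near the
# "spoof plasma frequency" c/(4h). Geometry values are assumptions.
a, d, h = 0.4e-3, 1.0e-3, 2.0e-3     # metres
c = 299792458.0

for f in np.linspace(1e9, 35e9, 8):  # Hz, below c/(4h) ~ 37.5 GHz
    k0 = 2 * np.pi * f / c
    kx = k0 * np.sqrt(1 + (a / d) ** 2 * np.tan(k0 * h) ** 2)
    print(f"f = {f/1e9:5.1f} GHz   kx/k0 = {kx/k0:6.3f}")
```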

  6. Evaluating WHO Healthy Cities in Europe: issues and perspectives.

    PubMed

    de Leeuw, Evelyne

    2013-10-01

    In this introductory article, we situate the findings of the Phase IV evaluation effort of the WHO European Healthy Cities Network in its historic evolutionary development. We review each of the contributions to this supplement in terms of the theoretical and methodological frameworks applied. Although the findings of each are both relevant and generated with a scholarly rigor appropriate to the context in which the evaluation took place, we find that these contextual factors in particular have worked against optimal research quality. Any drawbacks in individual contributions cannot be attributed to their analysts and authors but relate to the complicated and evolving nature of the project. These factors are also reviewed.

  7. Quantum Approximate Methods for the Atomistic Modeling of Multicomponent Alloys. Chapter 7

    NASA Technical Reports Server (NTRS)

    Bozzolo, Guillermo; Garces, Jorge; Mosca, Hugo; Gargano, Pablo; Noebe, Ronald D.; Abel, Phillip

    2007-01-01

    This chapter describes the role of quantum approximate methods in the understanding of complex multicomponent alloys at the atomic level. The need to accelerate materials design programs based on economical and efficient modeling techniques provides the framework for the introduction of approximations and simplifications in otherwise rigorous theoretical schemes. As a promising example of the role that such approximate methods might have in the development of complex systems, the BFS method for alloys is presented and applied to Ru-rich Ni-base superalloys and also to the NiAl(Ti,Cu) system, highlighting the benefits that can be obtained from introducing simple modeling techniques to the investigation of such complex systems.

  8. Multi-Disciplinary Knowledge Synthesis for Human Health Assessment on Earth and in Space

    NASA Astrophysics Data System (ADS)

    Christakos, G.

    We discuss methodological developments in multi-disciplinary knowledge synthesis (KS) for human health assessment. A theoretical KS framework can provide the rational means for the assimilation of various information bases (general, site-specific, etc.) that are relevant to the life system of interest. KS-based techniques produce a realistic representation of the system, provide a rigorous assessment of the uncertainty sources, and generate informative health state predictions across space-time. The underlying epistemic cognition methodology is based on teleologic criteria and stochastic logic principles. The mathematics of KS involves a powerful and versatile spatiotemporal random field model that accounts rigorously for the uncertainty features of the life system and imposes no restriction on the shape of the probability distributions or the form of the predictors. KS theory is instrumental in understanding natural heterogeneities, assessing crucial human exposure correlations and laws of physical change, and explaining toxicokinetic mechanisms and dependencies in a spatiotemporal life system domain. It is hoped that a better understanding of KS fundamentals would generate multi-disciplinary models that are useful for the maintenance of human health on Earth and in Space.

  9. PRO development: rigorous qualitative research as the crucial foundation.

    PubMed

    Lasch, Kathryn Eilene; Marquis, Patrick; Vigneux, Marc; Abetz, Linda; Arnould, Benoit; Bayliss, Martha; Crawford, Bruce; Rosa, Kathleen

    2010-10-01

    Recently published articles have described criteria to assess qualitative research in the health field in general, but very few articles have delineated qualitative methods to be used in the development of Patient-Reported Outcomes (PROs). In fact, how PROs are developed with subject input through focus groups and interviews has been given relatively short shrift in the PRO literature when compared to the plethora of quantitative articles on the psychometric properties of PROs. If documented at all, most PRO validation articles give the reader little basis on which to evaluate the content validity of the measures or the credibility and trustworthiness of the methods used to develop them. Increasingly, however, scientists and authorities want to be assured that PRO items and scales have meaning and relevance to subjects. This article was developed by an international, interdisciplinary group of psychologists, psychometricians, regulatory experts, a physician, and a sociologist. It presents rigorous and appropriate qualitative research methods for developing PROs with content validity. The approach described combines an overarching phenomenological theoretical framework with grounded theory data collection and analysis methods to yield PRO items and scales that have content validity.

  10. PRO development: rigorous qualitative research as the crucial foundation

    PubMed Central

    Marquis, Patrick; Vigneux, Marc; Abetz, Linda; Arnould, Benoit; Bayliss, Martha; Crawford, Bruce; Rosa, Kathleen

    2010-01-01

    Recently published articles have described criteria to assess qualitative research in the health field in general, but very few articles have delineated qualitative methods to be used in the development of Patient-Reported Outcomes (PROs). In fact, how PROs are developed with subject input through focus groups and interviews has been given relatively short shrift in the PRO literature when compared to the plethora of quantitative articles on the psychometric properties of PROs. If documented at all, most PRO validation articles give the reader little basis on which to evaluate the content validity of the measures or the credibility and trustworthiness of the methods used to develop them. Increasingly, however, scientists and authorities want to be assured that PRO items and scales have meaning and relevance to subjects. This article was developed by an international, interdisciplinary group of psychologists, psychometricians, regulatory experts, a physician, and a sociologist. It presents rigorous and appropriate qualitative research methods for developing PROs with content validity. The approach described combines an overarching phenomenological theoretical framework with grounded theory data collection and analysis methods to yield PRO items and scales that have content validity. PMID:20512662

  11. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    PubMed

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
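    The learning mechanism described, inferring causal structure from statistical contingency data, can be made concrete with a toy Bayesian update. The sketch below is an illustration of the idea, not the authors' model: two hypotheses about a "blicket-detector" style machine are scored against a handful of observed trials.

```python
# Toy Bayesian causal inference over two hypotheses about a
# "blicket-detector" style experiment (illustrative numbers only).
# H1: object A causes activation (machine fires with prob. 0.9 given A)
# H0: A is causally inert (machine fires spontaneously with prob. 0.1)

trials = [("A", 1), ("A", 1), ("none", 0), ("A", 0), ("A", 1)]

def likelihood(h, obj, outcome):
    p_on = {"H1": {"A": 0.9, "none": 0.1},
            "H0": {"A": 0.1, "none": 0.1}}[h][obj]
    return p_on if outcome == 1 else 1.0 - p_on

posterior = {"H1": 0.5, "H0": 0.5}            # uniform prior
for obj, outcome in trials:
    for h in posterior:
        posterior[h] *= likelihood(h, obj, outcome)
    z = sum(posterior.values())
    posterior = {h: p / z for h, p in posterior.items()}
    print(f"after ({obj}, {outcome}): P(A is causal) = {posterior['H1']:.3f}")
```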

  12. The rank correlated SLW model of gas radiation in non-uniform media

    NASA Astrophysics Data System (ADS)

    Solovjov, Vladimir P.; Andre, Frederic; Lemonnier, Denis; Webb, Brent W.

    2017-08-01

    A comprehensive theoretical treatment of possible reference approaches to modelling radiation transfer in non-uniform gaseous media is developed within the framework of the Generalized SLW Model. The notion of absorption spectrum "correlation" adopted currently for global methods in gas radiation is critically revisited and replaced by a less restrictive concept of rank correlated spectrum. Within this framework it is shown that eight different reference approaches are possible, of which only three have been reported in the literature. Among the approaches presented is a novel Rank Correlated SLW Model, which is distinguished by the fact that (i) it does not require the specification of a reference gas thermodynamic state, and (ii) it preserves the emission term in the spectrally integrated Radiative Transfer Equation. Construction of this reference model requires only two absorption line blackbody distribution functions, and subdivision into gray gases can be performed using standard quadratures. Consequently, this new reference approach appears to have significant advantages over all other methods and is, in general, a significant improvement in the global modelling of gas radiation. All reference approaches are summarized in the present work, and their use in radiative transfer prediction is demonstrated for simple example cases. Further, a detailed, rigorous theoretical development of the improved methods is provided.
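    The gray-gas machinery underlying any SLW variant can be sketched in a few lines: subdivide the absorption cross-section domain, take a representative cross-section per interval, and obtain the gray-gas weights as differences of the absorption-line blackbody distribution function F. In the sketch below, F is a made-up analytic stand-in (real implementations tabulate it from spectroscopic databases), so only the structure of the computation, not the numbers, is meaningful.

```python
import numpy as np

def F_albdf(C):
    """Stand-in ALBDF: fraction of blackbody energy emitted at
    wavenumbers whose absorption cross-section is below C. A real F
    is tabulated from line-by-line databases; this analytic form is
    an assumption for illustration only."""
    return 1.0 / (1.0 + (0.1 / C) ** 0.5)

# Logarithmic subdivision of the cross-section domain into gray gases.
C_edges = np.logspace(-4, 1, 11)                 # assumed range
C_gas = np.sqrt(C_edges[:-1] * C_edges[1:])      # representative values
w = np.diff(F_albdf(C_edges))                    # gray-gas weights
w = np.append(F_albdf(C_edges[0]), w)            # prepend "clear gas"
C_gas = np.append(0.0, C_gas)

# Total emissivity of a uniform layer as a sum of gray-gas terms.
N, L = 40.0, 0.5                                 # molar density, path
eps = np.sum(w * (1.0 - np.exp(-N * C_gas * L)))
# Weights sum to F(C_max) < 1; the residual above C_max is neglected.
print(f"weights sum to {w.sum():.3f}, emissivity = {eps:.3f}")
```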

  13. Assessing collaborative computing: development of the Collaborative-Computing Observation Instrument (C-COI)

    NASA Astrophysics Data System (ADS)

    Israel, Maya; Wherfel, Quentin M.; Shehab, Saadeddine; Ramos, Evan A.; Metzger, Adam; Reese, George C.

    2016-07-01

    This paper describes the development, validation, and uses of the Collaborative Computing Observation Instrument (C-COI), a web-based analysis instrument that classifies individual and/or collaborative behaviors of students during computing problem-solving (e.g. coding, programming). The C-COI analyzes data gathered through video and audio screen recording software that captures students' computer screens as they program, and their conversations with their peers or adults. The instrument allows researchers to organize and quantify these data to track behavioral patterns that could be further analyzed for deeper understanding of persistence and/or collaborative interactions. The article provides a rationale for the C-COI including the development of a theoretical framework for measuring collaborative interactions in computer-mediated environments. This theoretical framework relied on the computer-supported collaborative learning literature related to adaptive help seeking, the joint problem-solving space in which collaborative computing occurs, and conversations related to outcomes and products of computational activities. Instrument development and validation also included ongoing advisory board feedback from experts in computer science, collaborative learning, and K-12 computing as well as classroom observations to test out the constructs in the C-COI. These processes resulted in an instrument with rigorous validation procedures and a high inter-rater reliability.

  14. An investigation into the use of recorded music as a surgical intervention: A systematic, critical review of methodologies used in recent adult controlled trials.

    PubMed

    Williams, Courtney; Hine, Trevor

    2018-04-01

    While music is being increasingly used as a surgical intervention, the types of music used and the reasons underlying their selection remain inconsistent, which makes empirical research into the efficacy of such interventions problematic. This review aims to provide clear guidelines, created through a synthesis of the literature, for the selection and use of music in surgical interventions: it examines how music is implemented in surgical situations and offers guidance for the selection and composition of music for future interventions. English language quantitative surgical intervention studies from Science Direct, ProQuest, and Sage Journals Online, all published within the last 10 years and featuring recorded music, were systematically reviewed. Variables investigated included: the time the intervention was performed, the intervention length, the outcomes targeted, music description (general and specific), theoretical frameworks underlying the selection of the music, whether or not a musical expert was involved, participant music history, and the participants' feedback on the chosen music. Several aspects contribute to the lack of scientific rigour regarding music selection in this field, including the absence of a theoretical framework or frameworks, no involvement of musical experts, failure to list the music tracks used, and the use of vague and subjective terms in general music descriptions. Patients are frequently allowed to select music (risking both choosing music that has an adverse effect and making study replication difficult), and patient music history and listening habits are rarely considered. Crucially, five primary theoretical frameworks underlying the effectiveness of music arose in the literature (distraction, relaxation, emotional shift, entrainment, and endogenous analgesia); however, music was rarely selected to enhance any of these mechanisms. Further research needs to ensure that music is selected according to a theoretical framework and with a more rigorous and replicable methodology. Music interventions can be made more effective at improving psychological states and reducing physiological arousal by selecting music conducive to specific mechanisms, and by considering at what point during the surgical experience the music would be most effective. Greater involvement of music experts in interventions would help to ensure that the most appropriate music is chosen and that it is clearly and precisely described. Copyright © 2018. Published by Elsevier Ltd.

  15. A Draft Conceptual Framework of Relevant Theories to Inform Future Rigorous Research on Student Service-Learning Outcomes

    ERIC Educational Resources Information Center

    Whitley, Meredith A.

    2014-01-01

    While the quality and quantity of research on service-learning has increased considerably over the past 20 years, researchers as well as governmental and funding agencies have called for more rigor in service-learning research. One key variable in improving rigor is using relevant existing theories to improve the research. The purpose of this…

  16. Rigorous Science: a How-To Guide

    PubMed Central

    Fang, Ferric C.

    2016-01-01

    Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. PMID:27834205

  17. Status of rates and rate equations for thermal leptogenesis

    NASA Astrophysics Data System (ADS)

    Biondini, S.; Bödeker, D.; Brambilla, N.; Garny, M.; Ghiglieri, J.; Hohenegger, A.; Laine, M.; Mendizabal, S.; Millington, P.; Salvio, A.; Vairo, A.

    2018-02-01

    In many realizations of leptogenesis, heavy right-handed neutrinos play the main role in the generation of an imbalance between matter and antimatter in the early Universe. Hence, it is relevant to address quantitatively their dynamics in a hot and dense environment by taking into account the various thermal aspects of the problem at hand. The strong washout regime offers an interesting framework to carry out calculations systematically and reduce theoretical uncertainties. Indeed, any matter-antimatter asymmetry generated when the temperature of the hot plasma T exceeds the right-handed neutrino mass scale M is efficiently erased, and one can focus on the temperature window T ≪ M. We review recent progress in the thermal field theoretic derivation of the key ingredients for the leptogenesis mechanism: the right-handed neutrino production rate, the CP asymmetry in the heavy-neutrino decays and the washout rates. The derivation of evolution equations for the heavy-neutrino and lepton-asymmetry number densities, their rigorous formulation and applicability are also discussed.

  18. Establishing a Research Agenda for Understanding the Role and Impact of Mental Health Peer Specialists.

    PubMed

    Chinman, Matthew; McInnes, D Keith; Eisen, Susan; Ellison, Marsha; Farkas, Marianne; Armstrong, Moe; Resnick, Sandra G

    2017-09-01

    Mental health peer specialists are individuals with serious mental illnesses who receive training to use their lived experiences to help others with serious mental illnesses in clinical settings. This Open Forum discusses the state of the research for mental health peer specialists and suggests a research agenda to advance the field. Studies have suggested that peer specialists vary widely in their roles, settings, and theoretical orientations. Theories of action have been proposed, but none have been tested. Outcome studies have shown benefits of peer specialists; however, many studies have methodological shortcomings. Qualitative descriptions of peer specialists are plentiful but lack grounding in implementation science frameworks. A research agenda advancing the field could include empirically testing theoretical mechanisms of peer specialists, developing a measure of peer specialist fidelity, conducting more rigorous outcomes studies, involving peer specialists in executing the research, and assessing various factors that influence implementing peer specialist services and testing strategies that could address those factors.

  19. Complex basis functions for molecular resonances: Methodology and applications

    NASA Astrophysics Data System (ADS)

    White, Alec; McCurdy, C. William; Head-Gordon, Martin

    The computation of positions and widths of metastable electronic states is a challenge for molecular electronic structure theory because, in addition to the difficulty of the many-body problem, such states obey scattering boundary conditions. These resonances cannot be addressed with naïve application of traditional bound-state electronic structure theory. Non-Hermitian electronic structure methods employing complex basis functions are one way that we may rigorously treat resonances within the framework of traditional electronic structure theory. In this talk, I will discuss our recent work in this area, including the methodological extension from single-determinant SCF-based approaches to highly correlated levels of wavefunction-based theory such as equation-of-motion coupled cluster and many-body perturbation theory. These approaches provide a hierarchy of theoretical methods for the computation of positions and widths of molecular resonances. Within this framework, we may also examine properties of resonances, including the dependence of these parameters on molecular geometry. Some applications of these methods to temporary anions and dianions will also be discussed.

  20. Pattern formation in mass conserving reaction-diffusion systems

    NASA Astrophysics Data System (ADS)

    Brauns, Fridtjof; Halatek, Jacob; Frey, Erwin

    We present a rigorous theoretical framework able to generalize and unify pattern formation for quantitative mass-conserving reaction-diffusion models. Mass redistribution controls chemical equilibria locally. Separation of diffusive mass redistribution on the level of conserved species provides a general mathematical procedure to decompose complex reaction-diffusion systems into effectively independent functional units, and to reveal the general underlying bifurcation scenarios. We apply this framework to Min protein pattern formation and identify the mechanistic roles of both protein species involved. MinD generates polarity through phase separation, whereas MinE takes the role of a control variable regulating the existence of MinD phases. Hence, polarization, not oscillation, is the generic core dynamics of Min proteins in vivo. This establishes an intrinsic mechanistic link between the Min system and a broad class of intracellular pattern-forming systems based on bistability and phase separation (wave-pinning). Oscillations are facilitated by MinE redistribution and can be understood mechanistically as relaxation oscillations of the polarization direction.
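    A minimal concrete member of this model class is the Mori-Jilkine-Edelstein-Keshet wave-pinning model: two diffusing forms of one protein, a reaction term that conserves total mass, and bistable local dynamics, polarizing via a stalled front. The sketch below integrates it with an explicit, exactly mass-conserving finite-difference scheme; parameters and initial data are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Mori-Jilkine-Edelstein-Keshet wave-pinning model: active form u
# (slow diffusion) and inactive form v (fast diffusion); the reaction
# f conserves u + v exactly. All parameters are assumptions.
L, n = 10.0, 100
dx = L / n
Du, Dv = 0.1, 10.0
k0, gamma, K, delta = 0.067, 1.0, 1.0, 1.0
dt, steps = 2e-4, 100000              # explicit Euler up to t = 20

def f(u, v):
    return v * (k0 + gamma * u**2 / (K**2 + u**2)) - delta * u

def lap(w):
    """Conservative no-flux Laplacian: divergence of face fluxes."""
    flux = np.zeros(len(w) + 1)
    flux[1:-1] = (w[1:] - w[:-1]) / dx    # zero flux at both ends
    return (flux[1:] - flux[:-1]) / dx

x = np.linspace(0.0, L, n)
u = np.where(x < 2.0, 2.0, 0.2)           # local stimulus seeds polarity
v = np.full(n, 2.0)
mass0 = (u + v).sum() * dx

for _ in range(steps):
    r = f(u, v)
    u = u + dt * (Du * lap(u) + r)
    v = v + dt * (Dv * lap(v) - r)

print("total mass conserved:", np.isclose((u + v).sum() * dx, mass0))
print(f"u range: {u.min():.2f} .. {u.max():.2f}")
# Persisting high/low plateaus in u indicate a pinned (polarized)
# front rather than a travelling wave or a uniform state.
```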

  1. Basis for paraxial surface-plasmon-polariton packets

    NASA Astrophysics Data System (ADS)

    Martinez-Herrero, Rosario; Manjavacas, Alejandro

    2016-12-01

    We present a theoretical framework for the study of surface-plasmon polariton (SPP) packets propagating along a lossy metal-dielectric interface within the paraxial approximation. Using a rigorous formulation based on the plane-wave spectrum formalism, we introduce a set of modes that constitute a complete basis set for the solutions of Maxwell's equations for a metal-dielectric interface in the paraxial approximation. The use of this set of modes allows us to fully analyze the evolution of the transversal structure of SPP packets beyond the single plane-wave approximation. As a paradigmatic example, we analyze the case of a Gaussian SPP mode, for which, exploiting the analogy with paraxial optical beams, we introduce a set of parameters that characterize its propagation.
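    The paraxial picture can be illustrated with the Gaussian case mentioned: the carrier is the complex SPP propagation constant k_spp = k0*sqrt(eps_m*eps_d/(eps_m + eps_d)), and the transverse envelope spreads like a 2D paraxial beam while the imaginary part of k_spp damps the field. The sketch below is a simplified illustration, not the paper's formalism; wavelength, permittivity, and waist are assumed round numbers.

```python
import numpy as np

# Gaussian SPP packet on a lossy metal-dielectric interface, treated
# as a 2D paraxial beam with a complex propagation constant.
lam = 800e-9                       # vacuum wavelength (assumed)
k0 = 2 * np.pi / lam
eps_m = -29.0 + 1.5j               # rough silver-like value (assumed)
eps_d = 1.0

k_spp = k0 * np.sqrt(eps_m * eps_d / (eps_m + eps_d))
L_prop = 1.0 / (2 * np.imag(k_spp))          # 1/e intensity length
print(f"propagation length = {L_prop*1e6:.1f} um")

w0 = 2e-6                           # initial transverse waist (assumed)
kr = np.real(k_spp)
zR = kr * w0**2 / 2                 # Rayleigh range of the 2D beam
for z in np.array([0.0, 5.0, 10.0, 20.0]) * 1e-6:
    w = w0 * np.sqrt(1 + (z / zR) ** 2)          # diffractive spreading
    # one transverse dimension -> amplitude scales as sqrt(w0/w)
    peak = np.sqrt(w0 / w) * np.exp(-np.imag(k_spp) * z)
    print(f"z = {z*1e6:4.0f} um   w = {w*1e6:5.2f} um   peak = {peak:.3f}")
```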

  2. Guidelines for preparing high school psychology teachers: course-based and standards-based approaches.

    PubMed

    2013-01-01

    Psychology is one of the most popular elective high school courses. The high school psychology course provides the foundation for students to benefit from psychological perspectives on personal and contemporary issues and learn the rules of evidence and theoretical frameworks of the discipline. The guidelines presented here constitute the second of two reports in this issue of the American Psychologist (January 2013) representing recent American Psychological Association (APA) policies that support high-quality instruction in the teaching of high school psychology. These guidelines, aligned to the standards presented in the preceding report, describe models for the preparation of preservice psychology teachers. The two reports together demonstrate the rigor and competency that should be expected in psychology instruction at the high school level.

  3. Dynamics of a Chlorophyll Dimer in Collective and Local Thermal Environments

    DOE PAGES

    Merkli, M.; Berman, Gennady Petrovich; Sayre, Richard Thomas; ...

    2016-01-30

    Here we present a theoretical analysis of exciton transfer and decoherence effects in a photosynthetic dimer interacting with collective (correlated) and local (uncorrelated) protein-solvent environments. Our approach is based on the framework of the spin-boson model. We derive explicitly the thermal relaxation and decoherence rates of the exciton transfer process, valid for arbitrary temperatures and for arbitrary (in particular, large) interaction constants between the dimer and the environments. We establish a generalization of the Marcus formula, giving reaction rates for dimer levels possibly individually and asymmetrically coupled to environments. We rigorously identify parameter regimes for the validity of the generalized Marcus formula. The existence of long-lived quantum coherences at ambient temperatures emerges naturally from our approach.
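    For orientation, the classical high-temperature Marcus rate that such generalizations reduce to can be evaluated directly. The sketch below uses illustrative values (not the paper's) for the coupling, reorganization energy, and driving force, and shows the inverted-region turnover at dG = -lambda.

```python
import numpy as np

# Classical nonadiabatic Marcus transfer rate:
#   k = (2*pi/hbar) * |V|^2 * (4*pi*lam*kB*T)**(-1/2)
#       * exp(-(dG + lam)**2 / (4*lam*kB*T))
# All parameter values below are illustrative assumptions.
hbar = 6.582119e-16        # eV*s
kB_T = 0.025852            # eV at ~300 K
lam = 0.20                 # reorganization energy, eV
V = 0.005                  # electronic coupling, eV

def marcus_rate(dG):
    pref = (2 * np.pi / hbar) * V**2 / np.sqrt(4 * np.pi * lam * kB_T)
    return pref * np.exp(-(dG + lam) ** 2 / (4 * lam * kB_T))

for dG in [0.0, -0.1, -0.2, -0.3, -0.4]:       # driving force, eV
    print(f"dG = {dG:+.1f} eV   k = {marcus_rate(dG):.3e} 1/s")
# The rate peaks at dG = -lam and falls again: the Marcus inverted region.
```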

  4. Academic Rigor or Academic Rigor Mortis? Supervising Dissertations Is Serious Business

    ERIC Educational Resources Information Center

    Wright, Robin Redmon

    2017-01-01

    This reflection considers the importance of and responsibility to graduate research supervision through an examination of a published dissertation that has had significant influence on the country's current immigration debate. The author exhorts both graduate students and adult education faculty to insist on clearly stated theoretical and…

  5. Development and utilization of complementary communication channels for treatment decision making and survivorship issues among cancer patients: The CIS Research Consortium Experience.

    PubMed

    Fleisher, Linda; Wen, Kuang Yi; Miller, Suzanne M; Diefenbach, Michael; Stanton, Annette L; Ropka, Mary; Morra, Marion; Raich, Peter C

    2015-11-01

    Cancer patients and survivors are assuming active roles in decision-making, and digital patient support tools are widely used to facilitate patient engagement. As part of the Cancer Information Service Research Consortium's randomized controlled trials on the efficacy of eHealth interventions to promote informed treatment decision-making for newly diagnosed prostate and breast cancer patients, and for post-treatment breast cancer, we conducted a rigorous process evaluation to examine the actual use and perceived benefits of two complementary communication channels: print and eHealth interventions. The three Virtual Cancer Information Service (V-CIS) interventions were developed through a rigorous developmental process, guided by self-regulatory theory, informed decision-making frameworks, and health communications best practices. Control arm participants received NCI print materials; experimental arm participants received the additional V-CIS patient support tool. Actual usage data from the web-based V-CIS were also obtained and reported. Print materials were highly used by all groups. About 60% of the experimental group reported using the V-CIS. Those who did use the V-CIS rated it highly on improvements in knowledge, patient-provider communication, and decision-making. The findings show that how patients actually use eHealth interventions, whether alone or in the context of other communication channels, is complex. Integrating rigorous best practices and theoretical foundations is essential, and multiple communication approaches should be considered to support patient preferences.

  6. Towards a rigorous mesoscale modeling of reactive flow and transport in an evolving porous medium and its applications to soil science

    NASA Astrophysics Data System (ADS)

    Ray, Nadja; Rupp, Andreas; Knabner, Peter

    2016-04-01

    Soil is arguably the most prominent example of a natural porous medium that is composed of a porous matrix and a pore space. Within this framework and in terms of soil's heterogeneity, we first consider transport and fluid flow at the pore scale. From there, we develop a mechanistic model and upscale it mathematically to transfer our model from the small scale to the mesoscale (laboratory scale). The mathematical framework of (periodic) homogenization (in principle) rigorously facilitates such processes by exactly computing the effective coefficients/parameters by means of the pore geometry and processes. In our model, various small-scale soil processes may be taken into account: molecular diffusion, convection, drift emerging from electric forces, and homogeneous reactions of chemical species in a solvent. Additionally, our model may consider heterogeneous reactions at the porous matrix, thus altering both the porosity and the matrix. Moreover, our model may additionally address biophysical processes, such as the growth of biofilms and how this affects the shape of the pore space. Both of the latter processes result in an intrinsically variable soil structure in space and time. Upscaling such models under the assumption of a locally periodic setting must be performed meticulously to preserve information regarding the complex coupling of processes in the evolving heterogeneous medium. Generally, a micro-macro model emerges that is comprised of several levels of couplings: macroscopic equations that describe the transport and fluid flow at the scale of the porous medium (mesoscale) include averaged time- and space-dependent coefficient functions. These functions may be explicitly computed by means of auxiliary cell problems (microscale). Finally, the pore space in which the cell problems are defined is time- and space-dependent, and its geometry inherits information from the transport equation's solutions. Numerical computations using mixed finite elements and potentially random initial data, e.g. that of porosity, complement our theoretical results. Our investigations contribute to the theoretical understanding of the link between soil formation and soil functions. This general framework may be applied to various problems in soil science across a range of scales, such as the formation and turnover of microaggregates or soil remediation.
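    The flavor of the upscaling step, effective coefficients obtained from auxiliary cell problems, can be shown in the simplest possible setting: one-dimensional periodic diffusion, where the cell-problem result must equal the harmonic mean of the microscale coefficient. The sketch below is a textbook illustration of that machinery, not the authors' model; it solves the discrete periodic cell problem and checks the effective coefficient against the harmonic mean.

```python
import numpy as np

# 1D periodic cell problem of homogenization:
#   d/dy [ D(y) (1 + dw/dy) ] = 0,  w periodic on [0, 1),
# with D_eff = integral of D(y)(1 + dw/dy). In 1D this must equal the
# harmonic mean of D, which gives an exact check on the machinery.
n = 200
h = 1.0 / n
D_face = 1.0 + 0.9 * np.sin(2 * np.pi * np.arange(n) * h)  # assumed D(y)

# Finite volumes: unknowns w_i at cell centers, periodic wrap-around.
A = np.zeros((n, n))
b = np.zeros(n)
for i in range(n):
    Dl, Dr = D_face[i], D_face[(i + 1) % n]
    A[i, (i - 1) % n] += Dl / h**2
    A[i, (i + 1) % n] += Dr / h**2
    A[i, i] -= (Dl + Dr) / h**2
    b[i] = (Dl - Dr) / h                 # forcing from the "+1" term
A[0, :] = 0.0; A[0, 0] = 1.0; b[0] = 0.0  # pin w_0 (w is defined up to a constant)
w = np.linalg.solve(A, b)

grad = (w - np.roll(w, 1)) / h           # dw/dy at the cell faces
D_eff = np.mean(D_face * (1.0 + grad))   # constant flux = D_eff
print(f"cell-problem D_eff = {D_eff:.4f}")
print(f"harmonic mean      = {1.0 / np.mean(1.0 / D_face):.4f}")  # matches
```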

  7. Evolution in Stage-Structured Populations

    PubMed Central

    Barfield, Michael; Holt, Robert D.; Gomulkiewicz, Richard

    2016-01-01

    For many organisms, stage is a better predictor of demographic rates than age. Yet no general theoretical framework exists for understanding or predicting evolution in stage-structured populations. Here, we provide a general modeling approach that can be used to predict evolution and demography of stage-structured populations. This advances our ability to understand evolution in stage-structured populations to a level previously available only for populations structured by age. We use this framework to provide the first rigorous proof that Lande’s theorem, which relates adaptive evolution to population growth, applies to stage-classified populations, assuming only normality and that evolution is slow relative to population dynamics. We extend this theorem to allow for different means or variances among stages. Our next major result is the formulation of Price’s theorem, a fundamental law of evolution, for stage-structured populations. In addition, we use data from Trillium grandiflorum to demonstrate how our models can be applied to a real-world population and thereby show their practical potential to generate accurate projections of evolutionary and population dynamics. Finally, we use our framework to compare rates of evolution in age- versus stage-structured populations, which shows how our methods can yield biological insights about evolution in stage-structured populations. PMID:21460563
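    The demographic backbone of such models is the stage projection matrix: its dominant eigenvalue is the population growth rate, and the associated right and left eigenvectors give the stable stage distribution and the stage-specific reproductive values. A minimal sketch with an invented three-stage matrix (not the Trillium grandiflorum data):

```python
import numpy as np

# Stage-structured (Lefkovitch) projection: n(t+1) = A @ n(t).
# Stages: seedling, juvenile, adult; entries are survival/transition
# probabilities and adult fecundity, invented for illustration.
A = np.array([
    [0.0, 0.0, 4.0],   # fecundity: adults produce seedlings
    [0.3, 0.5, 0.0],   # seedling -> juvenile, juvenile stasis
    [0.0, 0.3, 0.9],   # juvenile -> adult, adult survival
])

evals, evecs = np.linalg.eig(A)
i = np.argmax(evals.real)
lam = evals.real[i]                          # population growth rate
stable_stage = np.abs(evecs[:, i].real)
stable_stage /= stable_stage.sum()

# Left eigenvector of A = stage-specific reproductive values.
evalsL, evecsL = np.linalg.eig(A.T)
j = np.argmax(evalsL.real)
repro_value = np.abs(evecsL[:, j].real)
repro_value /= repro_value[0]                # normalize to stage 1

print(f"growth rate lambda        = {lam:.3f}")
print(f"stable stage distribution = {np.round(stable_stage, 3)}")
print(f"reproductive values       = {np.round(repro_value, 3)}")
```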

  8. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    PubMed Central

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists. PMID:22582739

  9. A Cluster-Randomized Trial of Restorative Practices: An Illustration to Spur High-Quality Research and Evaluation.

    PubMed

    Acosta, Joie D; Chinman, Matthew; Ebener, Patricia; Phillips, Andrea; Xenakis, Lea; Malone, Patrick S

    2016-01-01

    Restorative Practices in schools lack rigorous evaluation studies. As an example of rigorous school-based research, this paper describes the first randomized control trial of restorative practices to date, the Study of Restorative Practices. It is a 5-year, cluster-randomized controlled trial (RCT) of the Restorative Practices Intervention (RPI) in 14 middle schools in Maine to assess whether RPI impacts both positive developmental outcomes and problem behaviors and whether the effects persist during the transition from middle to high school. The two-year RPI intervention began in the 2014-2015 school year. The study's rationale and theoretical concerns are discussed along with methodological concerns including teacher professional development. The theoretical rationale and description of the methods from this study may be useful to others conducting rigorous research and evaluation in this area.

  10. Multiplicative Multitask Feature Learning

    PubMed Central

    Wang, Xin; Bi, Jinbo; Yu, Shipeng; Sun, Jiangwen; Song, Minghu

    2016-01-01

    We investigate a general framework of multiplicative multitask feature learning which decomposes each task's model parameters into a multiplication of two components. One of the components is shared across all tasks and the other is task-specific. Several previous methods can be proved to be special cases of our framework. We study the theoretical properties of this framework when different regularization conditions are applied to the two decomposed components. We prove that this framework is mathematically equivalent to the widely used multitask feature learning methods that are based on a joint regularization of all model parameters, but with a more general form of regularizers. Further, an analytical formula is derived for the across-task component as related to the task-specific component for all these regularizers, leading to a better understanding of the shrinkage effects of different regularizers. Study of this framework motivates new multitask learning algorithms. We propose two new learning formulations by varying the parameters in the proposed framework. An efficient blockwise coordinate descent algorithm is developed, suitable for solving the entire family of formulations, with rigorous convergence analysis. Simulation studies have identified the statistical properties of data that would favor the new formulations. Extensive empirical studies on various classification and regression benchmark data sets have revealed the relative advantages of the two new formulations by comparing with the state of the art, providing instructive insights into the feature learning problem with multiple tasks. PMID:28428735
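    The core decomposition is easy to prototype: each task's weight vector is the elementwise product of a shared factor and a task-specific factor, and with either factor fixed the other is obtained by ridge regression on rescaled features, so blockwise (alternating) minimization applies. The toy sketch below illustrates the idea under squared penalties on both factors; it is a simplified illustration, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Multiplicative multitask learning: w_t = c * v_t (elementwise), with
# shared factor c and task factors v_t. Illustrative objective:
#   sum_t ||X_t (c*v_t) - y_t||^2 + alpha ||c||^2 + beta sum_t ||v_t||^2
# Fixing one factor makes each subproblem ridge regression on rescaled
# features, so blockwise coordinate descent applies.
d, n, T = 10, 50, 3
support = np.array([1.0] * 4 + [0.0] * 6)        # shared relevant features
Xs = [rng.normal(size=(n, d)) for _ in range(T)]
ys = [X @ (support * rng.normal(size=d)) + 0.01 * rng.normal(size=n)
      for X in Xs]

def ridge(X, y, reg):
    return np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ y)

alpha, beta = 1.0, 1.0
c = np.ones(d)
for it in range(50):
    # v-step: for fixed c, each task's features are rescaled by c.
    vs = [ridge(X * c, y, beta) for X, y in zip(Xs, ys)]
    # c-step: stack all tasks, features rescaled by each task's v_t.
    Xc = np.vstack([X * v for X, v in zip(Xs, vs)])
    c = ridge(Xc, np.concatenate(ys), alpha)

print("learned shared factor |c|:", np.round(np.abs(c), 2))
# Coordinates outside the shared support are driven toward zero.
```

Squared penalties on the two factors act jointly like an l1-type penalty on their product, which is one instance of the equivalence to joint regularization that the abstract describes.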

  11. Assessing Sensitivity of Early Head Start Study Findings to Manipulated Randomization Threats

    ERIC Educational Resources Information Center

    Green, Sheridan

    2013-01-01

    Increasing demands for design rigor and an emphasis on evidence-based practice on a national level indicated a need for further guidance related to successful implementation of randomized studies in education. Rigorous and meaningful experimental research and its conclusions help establish a valid theoretical and evidence base for educational…

  12. Does McNemar's test compare the sensitivities and specificities of two diagnostic tests?

    PubMed

    Kim, Soeun; Lee, Woojoo

    2017-02-01

    McNemar's test is often used in practice to compare the sensitivities and specificities of two diagnostic tests. For correct evaluation of accuracy, an intuitive recommendation is to test the diseased and the non-diseased groups separately, so that the sensitivities are compared among the diseased and the specificities are compared among the healthy group. This paper provides a rigorous theoretical framework for this argument and studies the validity of McNemar's test regardless of the conditional independence assumption. We derive McNemar's test statistic under the null hypothesis considering both conditional independence and conditional dependence. We then perform power analyses to show how the result is affected by the amount of conditional dependence under the alternative hypothesis.
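    The recommended practice, applying McNemar's test separately within the diseased group (comparing sensitivities) and the non-diseased group (comparing specificities), needs only the discordant pair counts. A minimal sketch with made-up counts:

```python
from math import sqrt
from statistics import NormalDist

def mcnemar(b, c):
    """McNemar chi-square (1 df) from the discordant pair counts:
    b = test 1 correct / test 2 wrong, c = test 1 wrong / test 2 correct.
    Returns the statistic and a two-sided p-value via the identity
    P(chi2_1 > x) = 2 * (1 - Phi(sqrt(x)))."""
    stat = (b - c) ** 2 / (b + c)
    p = 2 * (1 - NormalDist().cdf(sqrt(stat)))
    return stat, p

# Made-up paired results of two diagnostic tests on the same subjects.
stat, p = mcnemar(b=25, c=10)       # diseased group: sensitivities
print(f"sensitivity comparison: chi2 = {stat:.2f}, p = {p:.4f}")
stat, p = mcnemar(b=8, c=12)        # non-diseased group: specificities
print(f"specificity comparison: chi2 = {stat:.2f}, p = {p:.4f}")
```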

  13. A Cluster-Randomized Trial of Restorative Practices: An Illustration to Spur High-Quality Research and Evaluation

    PubMed Central

    Acosta, Joie D.; Chinman, Matthew; Ebener, Patricia; Phillips, Andrea; Xenakis, Lea; Malone, Patrick S.

    2017-01-01

    Restorative Practices in schools lack rigorous evaluation studies. As an example of rigorous school-based research, this paper describes the first randomized control trial of restorative practices to date, the Study of Restorative Practices. It is a 5-year, cluster-randomized controlled trial (RCT) of the Restorative Practices Intervention (RPI) in 14 middle schools in Maine to assess whether RPI impacts both positive developmental outcomes and problem behaviors and whether the effects persist during the transition from middle to high school. The two-year RPI intervention began in the 2014–2015 school year. The study’s rationale and theoretical concerns are discussed along with methodological concerns including teacher professional development. The theoretical rationale and description of the methods from this study may be useful to others conducting rigorous research and evaluation in this area. PMID:28936104

  14. Theoretical rationale for music selection in oncology intervention research: an integrative review.

    PubMed

    Burns, Debra S

    2012-01-01

    Music-based interventions have helped patients with cancer improve their quality of life, decrease treatment related distress, and manage pain. However, quantitative findings from music intervention studies are inconsistent. The purpose of this review was to explore the theoretical underpinnings for the selection of the music stimuli used to influence targeted outcomes. It was hypothesized that disparate findings were due in part to the atheoretical nature of music selection and the resulting diversity in music stimuli between and within studies. A systematic research synthesis including a comprehensive database and reference list search resulted in 22 studies. Included studies were compiled into two tables cataloging intervention theory, intervention content, and outcomes. A majority of studies did not provide a rationale or intervention theory for the delivery of music or choice of outcomes. Recorded music was the most common delivery method, but the specific music was rarely included within the report. Only two studies that included a theoretical framework reported null results on at least some of the outcomes. Null results are partially explained by an incomplete or mismatch in intervention theory and music selection and delivery. While the inclusion of an intervention theory does not guarantee positive results, including a theoretical rationale for the use of music, particular therapeutic processes or mechanisms, and the specifics of how music is selected and delivered increases scientific rigor and the probability of clinical translation.

  15. Formalization of Generalized Constraint Language: A Crucial Prelude to Computing With Words.

    PubMed

    Khorasani, Elham S; Rahimi, Shahram; Calvert, Wesley

    2013-02-01

    The generalized constraint language (GCL), introduced by Zadeh, serves as a basis for computing with words (CW). It provides an agenda to express the imprecise and fuzzy information embedded in natural language and allows reasoning with perceptions. Despite its fundamental role, the definition of GCL has remained informal since its introduction by Zadeh, and to our knowledge, no attempt has been made to formulate a rigorous theoretical framework for GCL. Such formalization is necessary for further theoretical and practical advancement of CW for two important reasons. First, it provides the underlying infrastructure for the development of useful inference patterns based on sound theories. Second, it determines the scope of GCL and hence facilitates the translation of natural language expressions into GCL. This paper is a step in this direction, providing a formal syntax together with a compositional semantics for GCL. A soundness theorem is established, and Zadeh's deduction rules are proved to be valid in the defined semantics. Furthermore, a discussion is provided on how the proposed language may be used in practice.

  16. University Students' Strategies for Constructing Hypothesis when Tackling Paper-and-Pencil Tasks in Physics

    NASA Astrophysics Data System (ADS)

    Guisasola, Jenaro; Ceberio, Mikel; Zubimendi, José Luis

    2006-09-01

    The study we present explores how first-year engineering students formulate hypotheses in order to construct their own problem-solving structure when confronted with problems in physics. From the constructivist perspective on the teaching-learning process, the formulation of hypotheses plays a key role in checking the coherence of the students' ideas against the theoretical frame. The main research instrument used to identify students' reasoning is the students' written reports on how they attempted four problem-solving tasks in which they were asked explicitly to formulate hypotheses. The protocols used in the assessment of the solutions consisted of a semi-quantitative study based on grids designed for the analysis of written answers. In this paper we include two of the tasks used and the corresponding scheme for the categorisation of the answers; details of the other two tasks are also outlined. According to our findings, the majority of students judge a hypothesis to be plausible if it is congruent with their previous knowledge, without rigorously checking it against the theoretical framework explained in class.

  17. Theoretical modeling of PEB procedure on EUV resist using FDM formulation

    NASA Astrophysics Data System (ADS)

    Kim, Muyoung; Moon, Junghwan; Choi, Joonmyung; Lee, Byunghoon; Jeong, Changyoung; Kim, Heebom; Cho, Maenghyo

    2018-03-01

    The semiconductor manufacturing industry has continually shrunk device features for enhanced productivity and performance, and extreme ultraviolet (EUV) light sources are considered a promising solution for further downsizing. EUV lithography involves complex photochemical reactions in the photoresist, which makes it technically difficult to construct a theoretical framework that facilitates rigorous investigation of the underlying mechanism. We therefore formulated a finite difference method (FDM) model of the post exposure bake (PEB) process on a positive chemically amplified resist (CAR), coupling acid diffusion to the deprotection reaction. The model is based on Fick's second law for diffusion and a first-order chemical rate law for deprotection. Two kinetic parameters, the acid diffusion coefficient and the deprotection rate constant, obtained from experiment and atomic-scale simulation, were applied to the model. As a result, we obtained the time evolution of the protection ratio of each functional group in the resist monomer, which can be used to predict the resulting polymer morphology after the overall chemical reactions. This achievement will be a cornerstone of multiscale modeling that provides fundamental understanding of the factors important for EUV performance and the rational design of next-generation photoresists.
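    A minimal 1D version of the model described, Fickian acid diffusion coupled to first-order deprotection of protected sites, takes only a few lines as an explicit finite-difference scheme. All values below are illustrative assumptions, not the paper's calibrated parameters:

```python
import numpy as np

# 1D explicit FDM for post-exposure bake: acid A diffuses (Fick's
# second law) while protected sites M deprotect at rate k*A*M.
nx, Lx = 100, 100.0                 # grid points, domain in nm
dx = Lx / nx
D = 0.5                             # acid diffusivity, nm^2/s (assumed)
k = 0.1                             # deprotection rate, 1/s (assumed)
dt = 0.4 * dx**2 / (2 * D)          # stable explicit time step
steps = int(60.0 / dt)              # 60 s bake

x = np.linspace(0.0, Lx, nx)
A = np.exp(-((x - 50.0) / 10.0) ** 2)   # exposed line: initial acid
M = np.ones(nx)                         # fully protected resist

for _ in range(steps):
    lap = np.zeros(nx)
    lap[1:-1] = (A[:-2] - 2 * A[1:-1] + A[2:]) / dx**2
    lap[0] = 2 * (A[1] - A[0]) / dx**2       # no-flux boundaries
    lap[-1] = 2 * (A[-2] - A[-1]) / dx**2
    A = A + dt * D * lap            # acid acts catalytically here
    M = M - dt * k * A * M          # deprotection of protected sites

print(f"protection ratio at line center: {M[nx // 2]:.3f}")
print(f"protection ratio at edge:        {M[0]:.3f}")
```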

  18. Translational medicine in the Age of Big Data.

    PubMed

    Tatonetti, Nicholas P

    2017-10-12

    The ability to collect, store and analyze massive amounts of molecular and clinical data is fundamentally transforming the scientific method and its application in translational medicine. Collecting observations has always been a prerequisite for discovery, and great leaps in scientific understanding are accompanied by an expansion of this ability. Particle physics, astronomy and climate science, for example, have all greatly benefited from the development of new technologies enabling the collection of larger and more diverse data. Unlike medicine, however, each of these fields also has a mature theoretical framework on which new data can be evaluated and incorporated-to say it another way, there are no 'first principals' from which a healthy human could be analytically derived. The worry, and it is a valid concern, is that, without a strong theoretical underpinning, the inundation of data will cause medical research to devolve into a haphazard enterprise without discipline or rigor. The Age of Big Data harbors tremendous opportunity for biomedical advances, but will also be treacherous and demanding on future scientists. © The Author 2017. Published by Oxford University Press.

  19. Inelastic electron tunneling mediated by a molecular quantum rotator

    NASA Astrophysics Data System (ADS)

    Sugimoto, Toshiki; Kunisada, Yuji; Fukutani, Katsuyuki

    2017-12-01

    Inelastic electron tunneling (IET) accompanying nuclear motion is not only of fundamental physical interest but also has strong impacts on chemical and biological processes in nature. Although excitation of rotational motion plays an important role in enhancing electric conductance at low bias, the mechanism of rotational excitation remains veiled. Here, we present a basic theoretical framework for IET that explicitly takes quantum angular momentum into consideration, focusing on a molecular H2 rotator trapped in a nanocavity between two metallic electrodes as a model system. It is shown that orientationally anisotropic electrode-rotator coupling is the origin of angular-momentum exchange between the electron and the molecule; we found that the anisotropic coupling imposes rigorous selection rules on rotational excitation. In addition, rotational symmetry breaking induced by the anisotropic potential lifts the degeneracy of the rotational energy levels of the quantum rotator and tunes the threshold bias voltage that triggers rotational IET. Our theoretical results provide a paradigm for physical understanding of the rotational IET process and spectroscopy, as well as molecular-level design of electron-rotation coupling in nanoelectronics.
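    The level structure behind rotational IET thresholds can be estimated from the rigid-rotor formula E_J = B*J*(J+1), with Delta J = +/-2 transitions for the homonuclear molecule; the onset bias of a channel is the level spacing divided by e. The sketch below uses the free-H2 rotational constant (roughly 7.5 meV); in the junction, the anisotropic potential the paper analyzes shifts and splits these thresholds.

```python
# Rigid-rotor estimate of rotational IET threshold biases for H2.
# E_J = B * J * (J + 1); free-molecule B(H2) is about 7.5 meV (rounded).
# Nuclear-spin symmetry of the homonuclear molecule enforces
# Delta J = +/-2 rotational transitions.
B = 7.5e-3          # eV

def E(J):
    return B * J * (J + 1)

for J in range(3):
    dE = E(J + 2) - E(J)
    print(f"J = {J} -> {J + 2}:  threshold bias ~ {dE * 1e3:.0f} mV")
# J = 0 -> 2 gives ~45 mV, the scale reported for H2 junctions; the
# anisotropic junction potential shifts and splits these thresholds.
```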

  20. Psycho-educational strategies to promote fluid adherence in adult hemodialysis patients: a review of intervention studies.

    PubMed

    Welch, Janet L; Thomas-Hawkins, Charlotte

    2005-07-01

    We reviewed psycho-educational intervention studies that were designed to reduce interdialytic weight gain (IDWG) in adult hemodialysis patients. Our goals were to critique research methods, describe the effectiveness of tested interventions, and make recommendations for future research. Medline, PsycINFO, and the Cumulative Index to Nursing and Allied Health Literature (CINAHL) databases were searched to identify empirical work. Each study was evaluated in terms of sample, design, theoretical framework, intervention delivery, and outcome. Nine studies were reviewed. Self-monitoring appears to be a promising strategy for reducing IDWG. Theory was not usually used to guide interventions, designs generally had control groups, interventions were delivered individually, more than one intervention was delivered at a time, the duration of the intervention varied greatly, there was no long-term follow-up, IDWG was the only outcome, and IDWG was operationalized in different ways. Theoretical models and methodological rigor are needed to guide future research. Specific recommendations on design, measurement, and conceptual issues are offered to enhance the effectiveness of future research.

  1. Transcranial Electrical Stimulation

    PubMed Central

    Fertonani, Anna; Miniussi, Carlo

    2016-01-01

    In recent years, there has been remarkable progress in the understanding and practical use of transcranial electrical stimulation (tES) techniques. Nevertheless, to date, this experimental effort has not been accompanied by substantial reflections on the models and mechanisms that could explain the stimulation effects. Given these premises, the aim of this article is to provide an updated picture of what we know about the theoretical models of tES that have been proposed to date, contextualized in a more specific and unitary framework. We demonstrate that these models can explain the tES behavioral effects as distributed along a continuum from stimulation dependent to network activity dependent. In this framework, we also propose that stochastic resonance is a useful mechanism to explain the general online neuromodulation effects of tES. Moreover, we highlight the aspects that should be considered in future research. We emphasize that tES is not an “easy-to-use” technique; however, it may represent a very fruitful approach if applied within rigorous protocols, with deep knowledge of both the behavioral and cognitive aspects and the more recent advances in the application of stimulation. PMID:26873962

  2. Developing a theoretical framework for complex community-based interventions.

    PubMed

    Angeles, Ricardo N; Dolovich, Lisa; Kaczorowski, Janusz; Thabane, Lehana

    2014-01-01

    Applying existing theories to research, in the form of a theoretical framework, is necessary to advance knowledge from what is already known toward the next steps to be taken. This article proposes a guide on how to develop a theoretical framework for complex community-based interventions using the Cardiovascular Health Awareness Program as an example. Developing a theoretical framework starts with identifying the intervention's essential elements. Subsequent steps include the following: (a) identifying and defining the different variables (independent, dependent, mediating/intervening, moderating, and control); (b) postulating mechanisms how the independent variables will lead to the dependent variables; (c) identifying existing theoretical models supporting the theoretical framework under development; (d) scripting the theoretical framework into a figure or sets of statements as a series of hypotheses, if/then logic statements, or a visual model; (e) content and face validation of the theoretical framework; and (f) revising the theoretical framework. In our example, we combined the "diffusion of innovation theory" and the "health belief model" to develop our framework. Using the Cardiovascular Health Awareness Program as the model, we demonstrated a stepwise process of developing a theoretical framework. The challenges encountered are described, and an overview of the strategies employed to overcome these challenges is presented.

  3. The Importance of the C3 Framework

    ERIC Educational Resources Information Center

    Social Education, 2013

    2013-01-01

    "The C3 Framework for Social Studies State Standards will soon be released under the title "The College, Career, and Civic Life (C3) Framework for Social Studies State Standards: State Guidance for Enhancing the Rigor of K-12 Civics, Economics, Geography, and History." The C3 Project Director and Lead Writer was NCSS member Kathy…

  4. Systematic searching for theory to inform systematic reviews: is it feasible? Is it desirable?

    PubMed

    Booth, Andrew; Carroll, Christopher

    2015-09-01

    In recognising the potential value of theory in understanding how interventions work comes a challenge: how can the identification of theory be made less haphazard? To explore the feasibility of systematic identification of theory, we searched PubMed for published reviews (1998-2012) that had explicitly sought to identify theory. Systematic searching may be characterised by a structured question, methodological filters and an itemised search procedure. We constructed a template (BeHEMoTh - Behaviour of interest; Health context; Exclusions; Models or Theories) for use when systematically identifying theory, and tested it within two systematic reviews. Of 34 systematic reviews, only 12 (35%) reported a method for identifying theory; nineteen did not specify how they identified studies containing theory, and data were unavailable for three reviews. Candidate terms include concept(s)/conceptual, framework(s), model(s), and theory/theories/theoretical. Information professionals must overcome inadequate reporting and the use of theory out of context. The review team faces an additional concern in the lack of 'theory fidelity'. Based on experience with two systematic reviews, the BeHEMoTh template and procedure offers a feasible and useful approach for the identification of theory. Applications include realist synthesis, framework synthesis or review of complex interventions. The procedure requires rigorous evaluation. © 2015 Health Libraries Group.

  5. Comparison of Optimal Design Methods in Inverse Problems

    PubMed Central

    Banks, H. T.; Holm, Kathleen; Kappel, Franz

    2011-01-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher Information Matrix (FIM). A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criteria with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model [13], the standard harmonic oscillator model [13] and a popular glucose regulation model [16, 19, 29]. PMID:21857762
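    The shared ingredient of these design criteria is the Fisher Information Matrix assembled from model sensitivities at the candidate sampling times. The sketch below illustrates this for the Verhulst-Pearl logistic model: D-optimality maximizes det(FIM), and an SE-style score (a simplified reading of the standard-error idea, not the authors' code) minimizes the summed squared normalized standard errors from the FIM inverse. Parameter values and the candidate grid are assumptions for illustration.

```python
import numpy as np
from itertools import combinations

# FIM-based design scores for the Verhulst-Pearl logistic model.
# theta = (K, r, x0); all numbers are illustrative assumptions.
theta = np.array([17.5, 0.7, 0.1])
sigma = 0.5                              # observation noise std

def model(t, th):
    K, r, x0 = th
    return K * x0 * np.exp(r * t) / (K + x0 * (np.exp(r * t) - 1.0))

def sensitivities(ts, th, h=1e-6):
    """Forward-difference sensitivities d x(t) / d theta_j."""
    S = np.empty((len(ts), len(th)))
    base = model(ts, th)
    for j in range(len(th)):
        pert = th.copy()
        pert[j] += h * th[j]
        S[:, j] = (model(ts, pert) - base) / (h * th[j])
    return S

def scores(ts):
    S = sensitivities(np.asarray(ts), theta)
    fim = S.T @ S / sigma**2
    se2 = np.diag(np.linalg.inv(fim)) / theta**2   # squared normalized SEs
    return np.linalg.det(fim), se2.sum()

grid = np.linspace(0.5, 25.0, 15)                  # candidate times
designs = list(combinations(grid, 5))
d_opt = max(designs, key=lambda ts: scores(ts)[0])   # maximize det(FIM)
se_opt = min(designs, key=lambda ts: scores(ts)[1])  # minimize summed SEs
print("D-optimal 5-point design:", np.round(d_opt, 1))
print("SE-style 5-point design: ", np.round(se_opt, 1))
```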

  6. Mass, Momentum and Kinetic Energy of a Relativistic Particle

    ERIC Educational Resources Information Center

    Zanchini, Enzo

    2010-01-01

    A rigorous definition of mass in special relativity, proposed in a recent paper, is recalled and employed to obtain simple and rigorous deductions of the expressions of momentum and kinetic energy for a relativistic particle. The whole logical framework appears as the natural extension of the classical one. Only the first, second and third laws of…
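    For reference, the standard expressions the abstract refers to, with m the invariant mass and gamma the Lorentz factor:

```latex
% Standard relativistic momentum and energy expressions:
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad
\mathbf{p} = \gamma m \mathbf{v}, \qquad
E = \gamma m c^2, \qquad
K = E - m c^2 = (\gamma - 1)\, m c^2 .
```

    Expanding gamma for v much smaller than c recovers the classical K of mv^2/2, consistent with the classical limit the abstract describes.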

  7. Combining frozen-density embedding with the conductor-like screening model using Lagrangian techniques for response properties.

    PubMed

    Schieschke, Nils; Di Remigio, Roberto; Frediani, Luca; Heuser, Johannes; Höfener, Sebastian

    2017-07-15

    We present the explicit derivation of an approach to the multiscale description of molecules in complex environments that combines frozen-density embedding (FDE) with continuum solvation models, in particular the conductor-like screening model (COSMO). FDE provides an explicit atomistic description of molecule-environment interactions at reduced computational cost, while the outer continuum layer accounts for the effect of long-range isotropic electrostatic interactions. Our treatment is based on a variational Lagrangian framework, enabling rigorous derivations of ground- and excited-state response properties. As an example of the flexibility of the theoretical framework, we derive and discuss FDE + COSMO analytical molecular gradients for excited states within the Tamm-Dancoff approximation (TDA) and for ground states within second-order Møller-Plesset perturbation theory (MP2) and a second-order approximate coupled cluster with singles and doubles (CC2). It is shown how this method can be used to describe vertical electronic excitation (VEE) energies and Stokes shifts for uracil in water and carbostyril in dimethyl sulfoxide (DMSO), respectively. In addition, VEEs for some simplified protein models are computed, illustrating the performance of this method when applied to larger systems. The interaction terms between the FDE subsystem densities and the continuum can influence excitation energies up to 0.3 eV and, thus, cannot be neglected for general applications. We find that the net influence of the continuum in presence of the first FDE shell on the excitation energy amounts to about 0.05 eV for the cases investigated. The present work is an important step toward rigorously derived ab initio multilayer and multiscale modeling approaches. © 2017 Wiley Periodicals, Inc.

  8. A framework for characterizing eHealth literacy demands and barriers.

    PubMed

    Chan, Connie V; Kaufman, David R

    2011-11-17

    Consumer eHealth interventions are of growing importance in the individual management of health and health behaviors. However, a range of access, resource, and skill barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy denotes a set of skills and knowledge essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies and in communicating health concepts effectively. We propose a theoretical and methodological framework for characterizing the complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user, as well as of 20 users on the same task, to illustrate both the detailed analysis and the aggregate measures obtained, and the potential analyses that can be performed using this method. The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase the accuracy of predictions. The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform the development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum.

  9. A Framework for Characterizing eHealth Literacy Demands and Barriers

    PubMed Central

    Chan, Connie V

    2011-01-01

    Background Consumer eHealth interventions are of growing importance in the individual management of health and health behaviors. However, a range of access, resource, and skill barriers prevent health care consumers from fully engaging in and benefiting from the spectrum of eHealth interventions. Consumers may engage in a range of eHealth tasks, such as participating in health discussion forums and entering information into a personal health record. eHealth literacy denotes a set of skills and knowledge essential for productive interactions with technology-based health tools, such as proficiency in information retrieval strategies and in communicating health concepts effectively. Objective We propose a theoretical and methodological framework for characterizing the complexity of eHealth tasks, which can be used to diagnose and describe literacy barriers and inform the development of solution strategies. Methods We adapted and integrated two existing theoretical models relevant to the analysis of eHealth literacy into a single framework to systematically categorize and describe task demands and user performance on tasks needed by health care consumers in the information age. The method derived from the framework is applied to (1) code task demands using a cognitive task analysis, and (2) code user performance on tasks. The framework and method are applied to the analysis of a Web-based consumer eHealth task with information-seeking and decision-making demands. We present the results from the in-depth analysis of the task performance of a single user, as well as of 20 users on the same task, to illustrate both the detailed analysis and the aggregate measures obtained, and the potential analyses that can be performed using this method. Results The analysis shows that the framework can be used to classify task demands as well as the barriers encountered in user performance of the tasks. Our approach can be used to (1) characterize the challenges confronted by participants in performing the tasks, (2) determine the extent to which application of the framework to the cognitive task analysis can predict and explain the problems encountered by participants, and (3) inform revisions to the framework to increase the accuracy of predictions. Conclusions The results of this illustrative application suggest that the framework is useful for characterizing task complexity and for diagnosing and explaining barriers encountered in task completion. The framework and analytic approach can be a potentially powerful generative research platform to inform the development of rigorous eHealth examination and design instruments, such as to assess eHealth competence, to design and evaluate consumer eHealth tools, and to develop an eHealth curriculum. PMID:22094891

  10. The need for international nursing diagnosis research and a theoretical framework.

    PubMed

    Lunney, Margaret

    2008-01-01

    To describe the need for nursing diagnosis research and a theoretical framework for such research. A linguistics theory served as the foundation for the theoretical framework. Reasons for additional nursing diagnosis research are: (a) file names are needed for implementation of electronic health records, (b) international consensus is needed for an international classification, and (c) continuous changes occur in clinical practice. A theoretical framework used by the author is explained. Theoretical frameworks provide support for nursing diagnosis research. Linguistics theory served as an appropriate exemplar theory to support nursing research. Additional nursing diagnosis studies based upon a theoretical framework are needed and linguistics theory can provide an appropriate structure for this research.

  11. Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.

    PubMed

    Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P

    2018-03-03

    Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Q and A about the College, Career, and Civic Life (C3) Framework for Social Studies State Standards

    ERIC Educational Resources Information Center

    Herczog, Michelle

    2013-01-01

    The "College, Career, and Civic Life (C3) Framework for Social Studies State Standards: State Guidance for Enhancing the Rigor of K-12 Civics, Economics, Geography, and History" will soon be released. The C3 Framework was developed to serve two audiences: for states to upgrade their state social studies standards, and for…

  13. DESCQA: Synthetic Sky Catalog Validation Framework

    NASA Astrophysics Data System (ADS)

    Mao, Yao-Yuan; Uram, Thomas D.; Zhou, Rongpu; Kovacs, Eve; Ricker, Paul M.; Kalmbach, J. Bryce; Padilla, Nelson; Lanusse, François; Zu, Ying; Tenneti, Ananth; Vikraman, Vinu; DeRose, Joseph

    2018-04-01

    The DESCQA framework provides rigorous validation protocols for assessing the quality of simulated sky catalogs in a straightforward and comprehensive way. DESCQA enables the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. An interactive web interface is also available at portal.nersc.gov/project/lsst/descqa.

  14. The C3 Framework: One Year Later - an Interview with Kathy Swan

    ERIC Educational Resources Information Center

    Social Education, 2014

    2014-01-01

    On September 17, 2013 (Constitution Day), the C3 Framework was released under the title "The College, Career and Civic Life (C3) Framework for Social Studies State Standards: Guidance for Enhancing the Rigor of K-12 Civics, Economics, Geography, and History." The C3 Project Director and lead writer was NCSS member Kathy Swan, who is…

  15. Property-Based Software Engineering Measurement

    NASA Technical Reports Server (NTRS)

    Briand, Lionel; Morasca, Sandro; Basili, Victor R.

    1995-01-01

    Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysis, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. This framework defines several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalism and properties we introduce are convenient and intuitive. In addition, we have reviewed the literature on this subject and compared it with our work. This framework contributes constructively to a firmer theoretical ground of software measurement.

  16. Property-Based Software Engineering Measurement

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.

    1997-01-01

    Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts, regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysts, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. We use this framework to propose definitions of several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalisms and properties we introduce are convenient and intuitive. This framework contributes constructively to a firmer theoretical ground of software measurement.
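
    The flavor of such property-based definitions can be made concrete with a small sketch. The Python toy below is an illustration under assumed definitions, not the authors' formalism: it encodes a module as elements plus relationships and checks three commonly cited size axioms (nonnegativity, null value, and additivity over disjoint modules):

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Module:
        elements: frozenset        # e.g. statements or methods
        relationships: frozenset   # pairs of related elements

    def size(m: Module) -> int:
        """A candidate size measure: count of elements."""
        return len(m.elements)

    def check_size_axioms(m1: Module, m2: Module) -> None:
        """Check nonnegativity, null value, and additivity for disjoint
        modules, in the spirit of property-based measurement."""
        assert size(m1) >= 0 and size(m2) >= 0               # nonnegativity
        assert size(Module(frozenset(), frozenset())) == 0   # null value
        if m1.elements.isdisjoint(m2.elements):
            union = Module(m1.elements | m2.elements,
                           m1.relationships | m2.relationships)
            assert size(union) == size(m1) + size(m2)        # additivity
        print("candidate measure satisfies the checked size axioms")

    check_size_axioms(
        Module(frozenset({"a", "b"}), frozenset({("a", "b")})),
        Module(frozenset({"c"}), frozenset()))
    ```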

  17. Framework based on communicability and flow to analyze complex network dynamics

    NASA Astrophysics Data System (ADS)

    Gilson, M.; Kouvaris, N. E.; Deco, G.; Zamora-López, G.

    2018-05-01

    Graph theory constitutes a widely used and established field providing powerful tools for the characterization of complex networks. The intricate topology of networks can also be investigated by means of the collective dynamics observed in the interactions of self-sustained oscillations (synchronization patterns) or propagation-like processes such as random walks. However, networks are often inferred from real data generated by dynamic systems, which are different from the processes employed to reveal their topological characteristics. This stresses the necessity for a theoretical framework dedicated to the mutual relationship between structure and dynamics in complex networks, as two sides of the same coin. Here we propose a rigorous framework based on the network response over time (i.e., the Green function) to study interactions between nodes across time. For this purpose we define the flow that describes the interplay between the network connectivity and external inputs. This multivariate measure relates to the concepts of graph communicability and the map equation. We illustrate our theory using the multivariate Ornstein-Uhlenbeck process, which describes stable and non-conservative dynamics, but the formalism can be adapted to other local dynamics for which the Green function is known. We provide applications to classical network examples, such as small-world ring and hierarchical networks. Our theory defines a comprehensive framework that is canonically related to directed and weighted networks, thus paving the way to revising the standards for network analysis, from pairwise interactions between nodes to the global properties of networks, including community detection.
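
    For readers who want to experiment, here is a minimal sketch of the network-response idea under multivariate Ornstein-Uhlenbeck dynamics (the connectivity values and leakage time constant are illustrative assumptions, not taken from the paper): the Green function is the matrix exponential of the Jacobian, and its off-diagonal entries quantify pairwise responses over time.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Illustrative 4-node directed, weighted connectivity (a cycle).
    C = np.array([[0.0, 0.8, 0.0, 0.0],
                  [0.0, 0.0, 0.8, 0.0],
                  [0.0, 0.0, 0.0, 0.8],
                  [0.8, 0.0, 0.0, 0.0]])
    tau = 1.0
    J = -np.eye(4) / tau + C   # Jacobian of the MOU process dx/dt = J x + input

    # Green function G(t) = expm(J t): response of node i at time t to a
    # unit impulse injected at node j at time 0.
    for t in (0.5, 1.0, 2.0):
        G = expm(J * t)
        print(f"t={t}: response of node 2 to input at node 0 = {G[2, 0]:.4f}")
    ```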

  18. Comments in reply: new directions in migration research.

    PubMed

    Shaw, R P

    1986-01-01

    The author comments on a review of his recent book NEW DIRECTIONS IN MIGRATION RESEARCH and reflects on theory and model specification, problems of estimation and statistical inference, realities of temporal and spatial heterogeneity, choices of explanatory variables, and the importance of broader political issues in migration studies. A core hypothesis is that market forces have declined as influences on internal migration in Canada over the last 30 years. Theoretical underpinnings include declining relevance of wage considerations in the decision to migrate on the assumption that marginal utility of money diminishes and marginal utility of leisure increases as society becomes wealthier. The author perceives the human capital model to have limitations and is especially troubled by the "as if" clause--that all migrants behave "as if" they calculate benefits and risks with equal rigor. The author has "shadowed" and not quantified the costs involved. He implies that normative frameworks for future migration research and planning should be established.

  19. The Complex Nature of Bilinguals' Language Usage Modulates Task-Switching Outcomes

    PubMed Central

    Yang, Hwajin; Hartanto, Andree; Yang, Sujin

    2016-01-01

    In view of inconsistent findings regarding bilingual advantages in executive functions (EF), we reviewed the literature to determine whether bilinguals' different language usage causes measurable changes in the shifting aspects of EF. Drawing on the theoretical framework of the adaptive control hypothesis, which postulates a critical link between bilinguals' varying demands on language control and adaptive cognitive control (Green and Abutalebi, 2013), we examined three factors that characterize bilinguals' language-switching experience: (a) the interactional context of conversational exchanges, (b) frequency of language switching, and (c) typology of code-switching. We also examined whether methodological variations in previous task-switching studies modulate task-specific demands on control processing and lead to inconsistencies in the literature. Our review demonstrates that not only methodological rigor but also a more finely grained, theory-based approach will be required to understand the cognitive consequences of bilinguals' varied linguistic practices in shifting EF. PMID:27199800

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Michael B.; Ku, Jessie C.; Vaccarezza, Victoria M.

    The nanoscale manipulation of matter allows properties to be created in a material that would be difficult or even impossible to achieve in the bulk state. Progress towards such functional nanoscale architectures requires the development of methods to precisely locate nanoscale objects in three dimensions and for the formation of rigorous structure–function relationships across multiple size regimes (beginning from the nanoscale). Here, we use DNA as a programmable ligand to show that two- and three-dimensional mesoscale superlattice crystals with precisely engineered optical properties can be assembled from the bottom up. The superlattices can transition from exhibiting the properties of the constituent plasmonic nanoparticles to adopting the photonic properties defined by the mesoscale crystal (here a rhombic dodecahedron) by controlling the spacing between the gold nanoparticle building blocks. Furthermore, we develop a generally applicable theoretical framework that illustrates how crystal habit can be a design consideration for controlling far-field extinction and light confinement in plasmonic metamaterial superlattices.

  1. Standard representation and unified stability analysis for dynamic artificial neural network models.

    PubMed

    Kim, Kwang-Ki K; Patrón, Ernesto Ríos; Braatz, Richard D

    2018-02-01

    An overview is provided of dynamic artificial neural network models (DANNs) for nonlinear dynamical system identification and control problems, and convex stability conditions are proposed that are less conservative than past results. The three most popular classes of dynamic artificial neural network models are described, with their mathematical representations and architectures, followed by transformations based on their block diagrams that are convenient for stability and performance analyses. Classes of nonlinear dynamical systems that are universally approximated by such models are characterized, including rigorous upper bounds on the approximation errors. A unified framework and linear matrix inequality-based stability conditions are described for different classes of dynamic artificial neural network models that take into account additional information, such as local slope restrictions and whether the nonlinearities within the DANNs are odd. A theoretical example demonstrates the reduced conservatism obtained with these conditions. Copyright © 2017. Published by Elsevier Ltd.
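
    The classical Lyapunov LMI that such conditions generalize can be checked numerically in a few lines. The sketch below assumes the cvxpy package with an SDP-capable solver (SCS) and an illustrative system matrix; it certifies stability of a linear system only, whereas the paper's conditions additionally exploit slope restrictions and oddness of the nonlinearities.

    ```python
    import numpy as np
    import cvxpy as cp

    # Illustrative stable linear dynamics dx/dt = A x (not from the paper).
    A = np.array([[-1.0, 0.5],
                  [-0.3, -0.8]])
    n = A.shape[0]
    eps = 1e-6

    # Lyapunov LMI: find P > 0 with A^T P + P A < 0.
    P = cp.Variable((n, n), symmetric=True)
    constraints = [P >> eps * np.eye(n),
                   A.T @ P + P @ A << -eps * np.eye(n)]
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve(solver=cp.SCS)
    print("LMI feasible (stability certified):", prob.status == cp.OPTIMAL)
    ```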

  2. The renormalization group and the implicit function theorem for amplitude equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirkinis, Eleftherios

    2008-07-15

    This article lays down the foundations of the renormalization group (RG) approach for differential equations characterized by multiple scales. The renormalization of constants through an elimination process and the subsequent derivation of the amplitude equation [Chen et al., Phys. Rev. E 54, 376 (1996)] are given a rigorous but not abstract mathematical form whose justification is based on the implicit function theorem. Developing the theoretical framework that underlies the RG approach leads to a systematization of the renormalization process and to the derivation of explicit closed-form expressions for the amplitude equations that can be carried out with symbolic computation for both linear and nonlinear scalar differential equations and first-order systems, independently of their particular forms. Certain nonlinear singular perturbation problems are considered that illustrate the formalism and recover well-known results from the literature as special cases.
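
    To fix ideas, the weakly damped linear oscillator, a standard textbook case for this RG procedure (not an example specific to this article), shows the form such amplitude equations take: the naive expansion in epsilon contains a secular term proportional to t, and renormalizing the amplitude and phase removes it.

    ```latex
    \ddot{y} + \epsilon\,\dot{y} + y = 0
    \;\;\Longrightarrow\;\;
    \frac{dA}{dt} = -\frac{\epsilon}{2}\,A, \qquad
    \frac{d\phi}{dt} = O(\epsilon^{2}), \qquad
    y(t) \simeq A_{0}\, e^{-\epsilon t/2} \cos(t + \phi_{0}).
    ```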

  3. Information Superiority via Formal Concept Analysis

    NASA Astrophysics Data System (ADS)

    Koester, Bjoern; Schmidt, Stefan E.

    This chapter shows how to get more mileage out of information. To achieve that, we first start with an introduction to the fundamentals of Formal Concept Analysis (FCA). FCA is a highly versatile field of applied lattice theory that allows hidden relationships to be uncovered in relational data. Moreover, FCA provides a distinguished supporting framework for subsequently finding and filling information gaps in a systematic and rigorous way. In addition, we build bridges to other communities via a universal approach relating them to FCA, so that other research areas can benefit from a theory that has been elaborated for more than twenty years. Last but not least, the essential benefits of FCA are presented algorithmically as well as theoretically by investigating a real data set from the MIPT Terrorism Knowledge Base and by demonstrating an application in the field of Web Information Retrieval and Web Intelligence.

  4. Invited article: Neurology education research.

    PubMed

    Stern, Barney J; Lowenstein, Daniel H; Schuh, Lori A

    2008-03-11

    There is a need to rigorously study the neurologic education of medical students, neurology residents, and neurologists to determine the effectiveness of our educational efforts. We review the status of neurologic education research as it pertains to the groups of interest. We identify opportunities and impediments for education research. The introduction of the Accreditation Council for Graduate Medical Education core competencies, the Accreditation Council of Continuing Medical Education requirement to link continuing medical education to improved physician behavior and patient care, and the American Board of Medical Specialties/American Board of Psychiatry and Neurology-mandated maintenance of certification program represent research opportunities. Challenges include numerous methodologic issues such as definition of the theoretical framework of the study, adequate sample size ascertainment, and securing research funding. State-of-the-art education research will require multidisciplinary research teams and innovative funding strategies. The central goal of all concerned should be defining educational efforts that improve patient outcomes.

  5. A Rigorous Framework for Optimization of Expensive Functions by Surrogates

    NASA Technical Reports Server (NTRS)

    Booker, Andrew J.; Dennis, J. E., Jr.; Frank, Paul D.; Serafini, David B.; Torczon, Virginia; Trosset, Michael W.

    1998-01-01

    The goal of the research reported here is to develop rigorous optimization algorithms for engineering design problems in which direct application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approximations to the objective function and managing the use of these approximations as surrogates for optimization. The result is convergence to a minimizer of an expensive objective function subject to simple constraints. The approach is widely applicable because it does not require, or even explicitly approximate, derivatives of the objective. Numerical results are presented for a 31-variable helicopter rotor blade design example and for a standard optimization test example.
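
    The surrogate-management idea can be caricatured in a few lines of Python. This bare-bones sketch, with a stand-in objective rather than the framework analyzed in the paper, alternates between fitting a cheap quadratic surrogate, minimizing it, and verifying the candidate with the expensive function:

    ```python
    import numpy as np

    def expensive_f(x):
        """Stand-in for an expensive objective (e.g., a rotor-blade simulation)."""
        return (x - 1.3) ** 2 + 0.3 * np.sin(5 * x)

    lo, hi = -2.0, 3.0
    X = list(np.linspace(lo, hi, 4))       # initial design
    Y = [expensive_f(x) for x in X]

    for it in range(6):
        coeffs = np.polyfit(X, Y, deg=2)   # cheap quadratic surrogate
        grid = np.linspace(lo, hi, 1001)
        xs = grid[np.argmin(np.polyval(coeffs, grid))]   # minimize surrogate
        X.append(xs)
        Y.append(expensive_f(xs))          # verify candidate with the true f
        print(f"iter {it}: candidate x={xs:.4f}, f={Y[-1]:.4f}")

    best = X[int(np.argmin(Y))]
    print("best point found:", round(float(best), 4))
    ```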

  6. A study on the theoretical and practical accuracy of conoscopic holography-based surface measurements: toward image registration in minimally invasive surgery†

    PubMed Central

    Burgner, J.; Simpson, A. L.; Fitzpatrick, J. M.; Lathrop, R. A.; Herrell, S. D.; Miga, M. I.; Webster, R. J.

    2013-01-01

    Background Registered medical images can assist with surgical navigation and enable image-guided therapy delivery. In soft tissues, surface-based registration is often used and can be facilitated by laser surface scanning. Tracked conoscopic holography (which provides distance measurements) has recently been proposed as a minimally invasive way to obtain surface scans. Moving this technique from concept to clinical use requires a rigorous accuracy evaluation, which is the purpose of our paper. Methods We adapt recent non-homogeneous and anisotropic point-based registration results to provide a theoretical framework for predicting the accuracy of tracked distance measurement systems. Experiments are conducted on complex objects of defined geometry, an anthropomorphic kidney phantom, and a human cadaver kidney. Results Experiments agree with model predictions, producing point RMS errors consistently < 1 mm, surface-based registration with mean closest-point error < 1 mm in the phantom, and an RMS target registration error of 0.8 mm in the human cadaver kidney. Conclusions Tracked conoscopic holography is clinically viable; it enables minimally invasive surface scan accuracy comparable to current clinical methods that require open surgery. PMID:22761086
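
    Surface-based registration ultimately reduces to aligning point sets. The following minimal rigid (Kabsch/SVD) alignment sketch conveys the basic computation; it is isotropic and homogeneous, unlike the anisotropic weighting analyzed in the paper, and the data are synthetic:

    ```python
    import numpy as np

    def rigid_register(P, Q):
        """Least-squares rigid transform (R, t) mapping points P onto Q,
        via the SVD (Kabsch) solution."""
        Pc, Qc = P.mean(axis=0), Q.mean(axis=0)
        H = (P - Pc).T @ (Q - Qc)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T     # reflection-corrected rotation
        t = Qc - R @ Pc
        return R, t

    rng = np.random.default_rng(0)
    P = rng.normal(size=(100, 3))                  # "scanned" surface points
    theta = 0.3
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0, 0.0, 1.0]])
    Q = P @ R_true.T + np.array([1.0, -2.0, 0.5])  # rotated + translated copy
    Q += 0.001 * rng.normal(size=Q.shape)          # measurement noise

    R, t = rigid_register(P, Q)
    rms = np.sqrt(np.mean(np.sum((P @ R.T + t - Q) ** 2, axis=1)))
    print(f"post-registration RMS error: {rms:.5f}")
    ```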

  7. Engineering education as a complex system

    NASA Astrophysics Data System (ADS)

    Gattie, David K.; Kellam, Nadia N.; Schramski, John R.; Walther, Joachim

    2011-12-01

    This paper presents a theoretical basis for cultivating engineering education as a complex system that will prepare students to think critically and make decisions with regard to poorly understood, ill-structured issues. Integral to this theoretical basis is a solution space construct developed and presented as a benchmark for evaluating problem-solving orientations that emerge within students' thinking as they progress through an engineering curriculum. It is proposed that the traditional engineering education model, while analytically rigorous, is characterised by properties that, although necessary, are insufficient for preparing students to address complex issues of the twenty-first century. A Synthesis and Design Studio model for engineering education is proposed, which maintains the necessary rigor of analysis within a uniquely complex yet sufficiently structured learning environment.

  8. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    PubMed

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
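
    The emulator-plus-propagation pattern described here is broadly reusable. A toy sketch follows, using scikit-learn's Gaussian process regressor with a stand-in model and an assumed Gaussian posterior; it is not the nuclear DFT pipeline, only the shape of the computation:

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def model(theta):
        """Stand-in for an expensive model evaluation (e.g., a mass table)."""
        return np.sin(3 * theta[..., 0]) + 0.5 * theta[..., 1] ** 2

    # Train the emulator on a small design over parameter space.
    rng = np.random.default_rng(1)
    X_train = rng.uniform(-1, 1, size=(40, 2))
    y_train = model(X_train)
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
    gp.fit(X_train, y_train)

    # Propagate parameter uncertainty: draw from an (assumed) posterior and
    # push the samples through the cheap emulator instead of the full model.
    posterior_samples = rng.multivariate_normal(
        mean=[0.2, -0.1], cov=0.01 * np.eye(2), size=5000)
    pred = gp.predict(posterior_samples)
    print(f"predicted observable: {pred.mean():.3f} +/- {pred.std():.3f}")
    ```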

  9. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE PAGES

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative-free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. This computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.

  10. Evaluative criteria for qualitative research in health care: controversies and recommendations.

    PubMed

    Cohen, Deborah J; Crabtree, Benjamin F

    2008-01-01

    We wanted to review and synthesize published criteria for good qualitative research and develop a cogent set of evaluative criteria. We identified published journal articles discussing criteria for rigorous research using standard search strategies then examined reference sections of relevant journal articles to identify books and book chapters on this topic. A cross-publication content analysis allowed us to identify criteria and understand the beliefs that shape them. Seven criteria for good qualitative research emerged: (1) carrying out ethical research; (2) importance of the research; (3) clarity and coherence of the research report; (4) use of appropriate and rigorous methods; (5) importance of reflexivity or attending to researcher bias; (6) importance of establishing validity or credibility; and (7) importance of verification or reliability. General agreement was observed across publications on the first 4 quality dimensions. On the last 3, important divergent perspectives were observed in how these criteria should be applied to qualitative research, with differences based on the paradigm embraced by the authors. Qualitative research is not a unified field. Most manuscript and grant reviewers are not qualitative experts and are likely to embrace a generic set of criteria rather than those relevant to the particular qualitative approach proposed or reported. Reviewers and researchers need to be aware of this tendency and educate health care researchers about the criteria appropriate for evaluating qualitative research from within the theoretical and methodological framework from which it emerges.

  11. A user-centered model for designing consumer mobile health (mHealth) applications (apps).

    PubMed

    Schnall, Rebecca; Rojas, Marlene; Bakken, Suzanne; Brown, William; Carballo-Dieguez, Alex; Carry, Monique; Gelaude, Deborah; Mosley, Jocelyn Patterson; Travers, Jasmine

    2016-04-01

    Mobile technologies are a useful platform for the delivery of health behavior interventions, yet little work has been done to create a rigorous and standardized process for the design of mobile health (mHealth) apps. This project explored the use of the Information Systems Research (ISR) framework as a guide for the design of mHealth apps. Our work was guided by the ISR framework, which comprises three cycles: Relevance, Rigor, and Design. In the Relevance cycle, we conducted 5 focus groups with 33 targeted end users. In the Rigor cycle, we performed a review to identify technology-based interventions for meeting the health prevention needs of our target population. In the Design cycle, we employed usability evaluation methods to iteratively develop and refine mock-ups for a mHealth app. Through an iterative process, we identified barriers and facilitators to the use of mHealth technology for HIV prevention for high-risk MSM, developed 'use cases', and identified relevant functional content and features for inclusion in a design document to guide future app development. Findings from this work support the use of the ISR framework as a guide for designing future mHealth apps; they provide detailed descriptions of the user-centered design and system development, and have heuristic value for those venturing into the area of technology-based intervention work. Use of the ISR framework is a potentially useful approach for the design of a mobile app that incorporates end users' design preferences. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Optimal policies of non-cross-resistant chemotherapy on Goldie and Coldman's cancer model.

    PubMed

    Chen, Jeng-Huei; Kuo, Ya-Hui; Luh, Hsing Paul

    2013-10-01

    Mathematical models can be used to study chemotherapy's effects on tumor cells. Notably, in 1979, Goldie and Coldman proposed the first mathematical model to relate the drug sensitivity of tumors to their mutation rates. Many scientists have since referred to this pioneering work because of its simplicity and elegance, and its original idea has been extended and further investigated in massive follow-up studies of cancer modeling and optimal treatment. Goldie and Coldman, together with Guaduskas, later used their model to explain, with a simulation approach, why an alternating non-cross-resistant chemotherapy is optimal. Subsequently, in 1983, Goldie and Coldman proposed an extended stochastic model and provided a rigorous mathematical proof of their earlier simulation work when the extended model is approximated by its quasi-approximation. However, Goldie and Coldman's analytic study of optimal treatments focused mainly on a process with symmetrical parameter settings and presented few theoretical results for asymmetrical settings. In this paper, we recast and restate Goldie, Coldman, and Guaduskas' model as a multi-stage optimization problem. Under an asymmetrical assumption, the conditions under which a treatment policy can be optimal are derived. The proposed framework enables us to consider some optimal policies on the model analytically. In addition, Goldie, Coldman and Guaduskas' work with symmetrical settings can be treated as a special case of our framework. Based on the derived conditions, this study provides an alternative proof to Goldie and Coldman's work. In addition to the theoretical derivation, numerical results are included to justify the correctness of our work. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. A Theoretical Framework for Examining Geographical Variability in the Microphysical Mechanisms of Precipitation Development.

    DTIC Science & Technology

    1986-06-01

    Final report, SWS Contract Report 391: A Theoretical Framework for Examining Geographical Variability in the Microphysical Mechanisms of Precipitation Development. Only a fragment of the abstract is legible: "…concentration. Other key parameters include the degree of entrainment and the stability of the environment."

  14. Using Framework Analysis in nursing research: a worked example.

    PubMed

    Ward, Deborah J; Furber, Christine; Tierney, Stephanie; Swallow, Veronica

    2013-11-01

    To demonstrate Framework Analysis using a worked example and to illustrate how criticisms of qualitative data analysis including issues of clarity and transparency can be addressed. Critics of the analysis of qualitative data sometimes cite lack of clarity and transparency about analytical procedures; this can deter nurse researchers from undertaking qualitative studies. Framework Analysis is flexible, systematic, and rigorous, offering clarity, transparency, an audit trail, an option for theme-based and case-based analysis and for readily retrievable data. This paper offers further explanation of the process undertaken which is illustrated with a worked example. Data were collected from 31 nursing students in 2009 using semi-structured interviews. The data collected are not reported directly here but used as a worked example for the five steps of Framework Analysis. Suggestions are provided to guide researchers through essential steps in undertaking Framework Analysis. The benefits and limitations of Framework Analysis are discussed. Nurses increasingly use qualitative research methods and need to use an analysis approach that offers transparency and rigour which Framework Analysis can provide. Nurse researchers may find the detailed critique of Framework Analysis presented in this paper a useful resource when designing and conducting qualitative studies. Qualitative data analysis presents challenges in relation to the volume and complexity of data obtained and the need to present an 'audit trail' for those using the research findings. Framework Analysis is an appropriate, rigorous and systematic method for undertaking qualitative analysis. © 2013 Blackwell Publishing Ltd.

  15. Higher Order Thinking Skills: Challenging All Students to Achieve

    ERIC Educational Resources Information Center

    Williams, R. Bruce

    2007-01-01

    Explicit instruction in thinking skills must be a priority goal of all teachers. In this book, the author presents a framework of the five Rs: Relevancy, Richness, Relatedness, Rigor, and Recursiveness. The framework serves to illuminate instruction in critical and creative thinking skills for K-12 teachers across content areas. Each chapter…

  16. Analysis of Implicit Uncertain Systems. Part 1: Theoretical Framework

    DTIC Science & Technology

    1994-12-07

    Analysis of Implicit Uncertain Systems, Part I: Theoretical Framework, by Fernando Paganini and John Doyle, December 7, 1994. Only a fragment of the abstract is legible: "…model and a number of constraints relevant to the analysis problem under consideration. In Part I of this paper we propose a theoretical framework which…"

  17. A Holistic Theoretical Approach to Intellectual Disability: Going Beyond the Four Current Perspectives.

    PubMed

    Schalock, Robert L; Luckasson, Ruth; Tassé, Marc J; Verdugo, Miguel Angel

    2018-04-01

    This article describes a holistic theoretical framework that can be used to explain intellectual disability (ID) and organize relevant information into a usable roadmap to guide understanding and application. Developing the framework involved analyzing the four current perspectives on ID and synthesizing this information into a holistic theoretical framework. Practices consistent with the framework are described, and examples are provided of how multiple stakeholders can apply the framework. The article concludes with a discussion of the advantages and implications of a holistic theoretical approach to ID.

  18. Educational Technology: A Theoretical Discussion

    ERIC Educational Resources Information Center

    Andrews, Barbara; Hakken, David

    1977-01-01

    Views educational technology in relation to the pattern of technological change, argues that the new technology must be rigorously evaluated, and suggests it is best understood as a business approach to education. (DD)

  19. Use of theoretical and conceptual frameworks in qualitative research.

    PubMed

    Green, Helen Elise

    2014-07-01

    To debate the definition and use of theoretical and conceptual frameworks in qualitative research. There is a paucity of literature to help the novice researcher to understand what theoretical and conceptual frameworks are and how they should be used. This paper acknowledges the interchangeable usage of these terms and researchers' confusion about the differences between the two. It discusses how researchers have used theoretical and conceptual frameworks and the notion of conceptual models. Detail is given about how one researcher incorporated a conceptual framework throughout a research project, the purpose for doing so and how this led to a resultant conceptual model. Concepts from Abbott (1988) and Witz (1992) were used to provide a framework for research involving two case study sites. The framework was used to determine research questions and give direction to interviews and discussions to focus the research. Some research methods do not overtly use a theoretical framework or conceptual framework in their design, but this is implicit and underpins the method design, for example in grounded theory. Other qualitative methods use one or the other to frame the design of a research project or to explain the outcomes. An example is given of how a conceptual framework was used throughout a research project. Theoretical and conceptual frameworks are terms that are regularly used in research but rarely explained. Textbooks should discuss what they are and how they can be used, so novice researchers understand how they can help with research design. Theoretical and conceptual frameworks need to be more clearly understood by researchers and correct terminology used to ensure clarity for novice researchers.

  20. The Stability of Kindergarten Teachers' Effectiveness: A Generalizability Study Comparing the Framework for Teaching and the Classroom Assessment Scoring System

    ERIC Educational Resources Information Center

    Mantzicopoulos, Panayota; French, Brian F.; Patrick, Helen; Watson, J. Samuel; Ahn, Inok

    2018-01-01

    To meet recent accountability mandates, school districts are implementing assessment frameworks to document teachers' effectiveness. Observational assessments play a key role in this process, albeit without compelling evidence of their psychometric rigor. Using a sample of kindergarten teachers, we employed Generalizability theory to investigate…

  1. PISA 2015 Assessment and Analytical Framework: Science, Reading, Mathematic, Financial Literacy and Collaborative Problem Solving

    ERIC Educational Resources Information Center

    OECD Publishing, 2017

    2017-01-01

    What is important for citizens to know and be able to do? The OECD Programme for International Student Assessment (PISA) seeks to answer that question through the most comprehensive and rigorous international assessment of student knowledge and skills. The PISA 2015 Assessment and Analytical Framework presents the conceptual foundations of the…

  2. Towards a rigorous framework for studying 2-player continuous games.

    PubMed

    Shutters, Shade T

    2013-03-21

    The use of 2-player strategic games is one of the most common frameworks for studying the evolution of economic and social behavior. Games are typically played between two players, each given two choices that lie at the extremes of possible behavior (e.g. completely cooperate or completely defect). Recently there has been much interest in studying the outcome of games in which players may choose a strategy from the continuous interval between extremes, requiring that the set of two possible choices be replaced by a single continuous equation. This has led to confusion and even errors in the classification of the game being played. The issue is described here specifically in relation to the continuous prisoner's dilemma and the continuous snowdrift game. A case study is then presented demonstrating the misclassification that can result from the extension of discrete games into continuous space. The paper ends with a call for a more rigorous and clear framework for working with continuous games. Copyright © 2013 Elsevier Ltd. All rights reserved.
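
    The distinction at issue can be made concrete numerically. In the sketch below (with illustrative quadratic benefit and cost functions in the spirit of continuous-game models, not the paper's case study), the continuous prisoner's dilemma always yields a zero best response, while the continuous snowdrift game admits interior best responses, which is exactly what a correct classification must respect:

    ```python
    import numpy as np

    # Illustrative benefit of total investment z and cost of own investment x.
    B = lambda z: 5 * z - z ** 2
    C = lambda x: x ** 2

    def best_response(payoff, y, grid=np.linspace(0.0, 2.5, 2501)):
        """Numerically maximize own payoff against a partner playing y."""
        return grid[np.argmax(payoff(grid, y))]

    # Continuous prisoner's dilemma: benefit accrues only from the partner,
    # so the payoff decreases in one's own investment and x = 0 is optimal.
    pd_game = lambda x, y: B(y) - C(x)
    # Continuous snowdrift: benefit depends on joint investment, so an
    # interior best response satisfying B'(x + y) = C'(x) can exist.
    sd_game = lambda x, y: B(x + y) - C(x)

    for y in (0.0, 0.5, 1.0):
        print(f"partner y={y}: PD best response={best_response(pd_game, y):.2f}, "
              f"SD best response={best_response(sd_game, y):.2f}")
    ```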

  3. Testability of evolutionary game dynamics based on experimental economics data

    NASA Astrophysics Data System (ADS)

    Wang, Yijia; Chen, Xiaojie; Wang, Zhijian

    2017-11-01

    Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.
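
    A minimal sketch of the kind of measurement variable described here (not the authors' code; the payoff matrix, initial condition, and step size are illustrative) integrates replicator dynamics for RPS and accumulates the angular momentum of the trajectory about the interior rest point:

    ```python
    import numpy as np

    A = np.array([[0.0, -1.0, 1.0],
                  [1.0, 0.0, -1.0],
                  [-1.0, 1.0, 0.0]])   # standard zero-sum RPS payoffs

    def replicator(x):
        """Replicator dynamics: dx_i/dt = x_i ((Ax)_i - x.Ax)."""
        f = A @ x
        return x * (f - x @ f)

    center = np.ones(3) / 3
    x = np.array([0.5, 0.3, 0.2])
    dt, L = 1e-3, []
    for _ in range(20000):
        v = replicator(x)
        # z-component of angular momentum about the rest point, using the
        # first two simplex coordinates as a 2D projection.
        r = x - center
        L.append(r[0] * v[1] - r[1] * v[0])
        x = x + dt * v   # simple Euler step

    print(f"mean angular momentum: {np.mean(L):+.3e} (nonzero => cyclic motion)")
    ```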

  4. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE PAGES

    McDonnell, J. D.; Schunck, N.; Higdon, D.; ...

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squaresmore » optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.« less

  5. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonnell, J. D.; Schunck, N.; Higdon, D.

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squaresmore » optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.« less

  6. Dynamics of aging magnetic clouds. [interacted with solar wind

    NASA Technical Reports Server (NTRS)

    Osherovich, V. A.; Farrugia, C. J.; Burlaga, L. F.

    1993-01-01

    The dynamics of radially expanding magnetic clouds is rigorously analyzed within the framework of ideal MHD. The cloud is modelled as a cylindrically symmetric magnetic flux rope. In the force balance we include the gas pressure gradient and the Lorentz force. Interaction with the ambient solar wind due to expansion of the magnetic cloud is represented by a drag force proportional to the bulk velocity. We consider the self-similar expansion of a polytrope, and reduce the problem to an ordinary nonlinear differential equation for the evolution function. Analyzing the asymptotic behavior of the evolution function, we formulate theoretical expectations for the long-term behavior of cloud parameters. We focus on the temporal evolution of (1) the magnetic field strength; (2) the twist of the field lines; (3) the asymmetry of the total field profile; and (4) the bulk flow speed. We present data from two magnetic clouds observed at 1 AU and 2 AU, respectively, and find good agreement with theoretical expectations. For a peak magnetic field strength at 1 AU of 25 nT and a polytropic index of 0.5, we find that a magnetic cloud can be distinguished from the background interplanetary field up to a distance of about 5 AU. Taking larger magnetic fields and bigger polytropic indices this distance can double.

  7. Nucleation and growth of Y2O3 nanoparticles in a RF-ICTP reactor: a discrete sectional study based on CFD simulation supported with experiments

    NASA Astrophysics Data System (ADS)

    Dhamale, G. D.; Tak, A. K.; Mathe, V. L.; Ghorui, S.

    2018-06-01

    Synthesis of yttria (Y2O3) nanoparticles in an atmospheric pressure radiofrequency inductively coupled thermal plasma (RF-ICTP) reactor has been investigated using the discrete-sectional (DS) model of particle nucleation and growth, with argon as the plasma gas. The thermal and fluid dynamic information necessary for the investigation has been extracted through a rigorous computational fluid dynamic (CFD) study of the system with coupled electromagnetic equations under the extended field approach. The theoretical framework was benchmarked against published data first, and then applied to investigate the nucleation and growth of yttrium oxide nanoparticles in the plasma reactor using the DS model. While a variety of nucleation and growth mechanisms are suggested in the literature, the study finds that the theory of homogeneous nucleation fits well with the features observed experimentally. Significant influences of the feed rate and quench rate on the distribution of particle sizes are observed. The theoretically obtained size distribution of the particles agrees well with that observed in the experiment. The different thermo-fluid dynamic environments, with varied quench rates, encountered by the propagating vapor front inside the reactor under different operating conditions are found to be primarily responsible for variations in the width of the size distribution.
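
    Classical homogeneous nucleation theory, which the study finds consistent with experiment, is compact enough to sketch. The Python snippet below uses the standard CNT expressions for the critical radius, barrier, and rate; the material inputs are order-of-magnitude illustrations only, and the kinetic prefactor J0 in particular is an assumption:

    ```python
    import numpy as np

    kB = 1.380649e-23   # Boltzmann constant, J/K

    def cnt(sigma, v_m, T, S, J0=1e32):
        """Classical homogeneous nucleation: critical radius, barrier, rate.
        sigma: surface tension (J/m^2), v_m: molecular volume (m^3),
        T: temperature (K), S: supersaturation ratio, J0: kinetic prefactor."""
        r_star = 2 * sigma * v_m / (kB * T * np.log(S))
        dG_star = 16 * np.pi * sigma**3 * v_m**2 / (3 * (kB * T * np.log(S))**2)
        J = J0 * np.exp(-dG_star / (kB * T))
        return r_star, dG_star, J

    # Illustrative inputs for a refractory oxide vapor at 2500 K.
    T = 2500.0
    for S in (5, 20, 100):
        r, dG, J = cnt(sigma=1.0, v_m=2.5e-29, T=T, S=S)
        print(f"S={S:>3}: r*={r * 1e9:.2f} nm, "
              f"dG*={dG / (kB * T):.1f} kT, J={J:.2e} m^-3 s^-1")
    ```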

  8. The social development of children with severe learning difficulties: a case study of an inclusive education initiative between two primary schools in Oxfordshire, UK.

    PubMed

    Dew-Hughes, D; Blandford, S

    1999-08-01

    This case study of primary-age children in two linked Oxfordshire schools investigated the contribution of staff attitudes and practices to inequalities in education, and contrasted the socialisation of children with similar learning difficulties in different educational placements. Participant observation of a group of children and carers in a special school suggested areas for more rigorous inquiry. Structured observations compared this group with a matched sample of children with similar learning difficulties in a mainstream setting. Staff on both sites were invited to comment on findings arising from the analysed data in order to identify attitudes and policies which might account for the observed differences in practice. The study was prompted by experience of differences arising from educational placement. The theoretical stance arose from a review of previous work, predominantly the debate on inclusive education and the wider issues of human rights and equal opportunities embedded in the social development of people with disabilities. The theoretical framework underpinning this study is established in some depth. The project was designed to investigate issues of the wider social perspective by conducting a micro-study of one model of educational inclusion whose outcomes have direct relevance to those issues.

  9. Validation of the theoretical domains framework for use in behaviour change and implementation research.

    PubMed

    Cane, James; O'Connor, Denise; Michie, Susan

    2012-04-24

    An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): 'Knowledge', 'Skills', 'Social/Professional Role and Identity', 'Beliefs about Capabilities', 'Optimism', 'Beliefs about Consequences', 'Reinforcement', 'Intentions', 'Goals', 'Memory, Attention and Decision Processes', 'Environmental Context and Resources', 'Social Influences', 'Emotions', and 'Behavioural Regulation'. The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development.

  10. The Promise of Qualitative Research to Inform Theory to Address Health Equity.

    PubMed

    Shelton, Rachel C; Griffith, Derek M; Kegler, Michelle C

    2017-10-01

    Most public health researchers and practitioners agree that we need to accelerate our efforts to eliminate health disparities and promote health equity. The past two decades of research have provided a wealth of descriptive studies, both qualitative and quantitative, that describe the size, scale, and scope of health disparities, as well as the key determinants that affect disparities. We need, however, to shift more aggressively to action informed by this research and develop deeper understandings of how to shape multilevel interventions, influenced by theories across multiple levels of the social-ecologic framework. In this article, we discuss the promising opportunities for qualitative and health equity scholars to advance research and practice through the refinement, expansion, and application of rigorous, theoretically informed qualitative research. In particular, to advance work in the area of theory to inform health equity, we encourage researchers (a) to move toward thinking about mechanisms and theory-building and refining; (b) to explicitly incorporate theories at the social, organizational, community, and policy levels and consider how factors at these levels interact synergistically with factors at the individual and interpersonal levels; (c) to consider how the social dimensions that have implications for health equity intersect and interact; and (d) to develop and apply more community-engaged, assets-based, and action-oriented theories and frameworks.

  11. Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts

    DTIC Science & Technology

    1980-06-01

    A theoretical framework for an experimental program is described. The theory of one-dimensional wave propagation is used to show how data from instrumented long rods and targets may be fitted together to give a... the theoretical framework. In the final section the results to date are discussed.

  12. Cosmopolitanism: Extending Our Theoretical Framework for Transcultural Technical Communication Research and Teaching

    ERIC Educational Resources Information Center

    Palmer, Zsuzsanna Bacsa

    2013-01-01

    The effects of globalization on communication products and processes have resulted in document features and interactional practices that are sometimes difficult to describe within current theoretical frameworks of inter/transcultural technical communication. Although it has been recognized in our field that the old theoretical frameworks and…

  13. Navigation in Unfamiliar Cities: A Review of the Literature and a Theoretical Framework (Navigeren in Onbekende Steden: Een Literatuurstudie en een Theoretisch Kader)

    DTIC Science & Technology

    1989-10-02

    Report No.: IZF 1989-36. Title: Navigation in unfamiliar cities: a review of the literature and a theoretical framework. Author: J.M.C. Schraagen. ... The theoretical framework sketched above suggests that some people may be better at encoding spatial information than others. This may be because of their...

  14. The impact of cloud vertical profile on liquid water path retrieval based on the bispectral method: A theoretical study based on large-eddy simulations of shallow marine boundary layer clouds.

    PubMed

    Miller, Daniel J; Zhang, Zhibo; Ackerman, Andrew S; Platnick, Steven; Baum, Bryan A

    2016-04-27

    Passive optical retrievals of cloud liquid water path (LWP), like those implemented for Moderate Resolution Imaging Spectroradiometer (MODIS), rely on cloud vertical profile assumptions to relate optical thickness (τ) and effective radius (re) retrievals to LWP. These techniques typically assume that shallow clouds are vertically homogeneous; however, an adiabatic cloud model is plausibly more realistic for shallow marine boundary layer cloud regimes. In this study a satellite retrieval simulator is used to perform MODIS-like satellite retrievals, which in turn are compared directly to the large-eddy simulation (LES) output. This satellite simulator creates a framework for rigorous quantification of the impact that vertical profile features have on LWP retrievals, and it accomplishes this while also avoiding sources of bias present in previous observational studies. The cloud vertical profiles from the LES are often more complex than either of the two standard assumptions, and the favored assumption was found to be sensitive to cloud regime (cumuliform/stratiform). Confirming previous studies, drizzle and cloud top entrainment of dry air are identified as physical features that bias LWP retrievals away from adiabatic and toward homogeneous assumptions. The mean bias induced by drizzle-influenced profiles was shown to be on the order of 5-10 g/m². In contrast, the influence of cloud top entrainment was found to be smaller by about a factor of 2. A theoretical framework is developed to explain variability in LWP retrievals by introducing modifications to the adiabatic re profile. In addition to analyzing bispectral retrievals, we also compare results with the vertical profile sensitivity of passive polarimetric retrieval techniques.
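
    For orientation, the two vertical-profile assumptions discussed above correspond to simple closed-form relations between LWP and the retrieved pair (τ, re); these are the forms commonly used in the bispectral literature, not a quotation from this paper:

    \[
    \mathrm{LWP}_{\mathrm{hom}} = \tfrac{2}{3}\,\rho_{w}\,\tau\,r_{e},
    \qquad
    \mathrm{LWP}_{\mathrm{ad}} = \tfrac{5}{9}\,\rho_{w}\,\tau\,r_{e},
    \]

    where ρ_w is the density of liquid water and r_e is the cloud-top effective radius. The roughly 17% spread between the two prefactors is exactly the kind of retrieval difference the satellite simulator is used to arbitrate.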

  15. The impact of cloud vertical profile on liquid water path retrieval based on the bispectral method: A theoretical study based on large-eddy simulations of shallow marine boundary layer clouds

    PubMed Central

    Miller, Daniel J.; Zhang, Zhibo; Ackerman, Andrew S.; Platnick, Steven; Baum, Bryan A.

    2018-01-01

    Passive optical retrievals of cloud liquid water path (LWP), like those implemented for Moderate Resolution Imaging Spectroradiometer (MODIS), rely on cloud vertical profile assumptions to relate optical thickness (τ) and effective radius (re) retrievals to LWP. These techniques typically assume that shallow clouds are vertically homogeneous; however, an adiabatic cloud model is plausibly more realistic for shallow marine boundary layer cloud regimes. In this study a satellite retrieval simulator is used to perform MODIS-like satellite retrievals, which in turn are compared directly to the large-eddy simulation (LES) output. This satellite simulator creates a framework for rigorous quantification of the impact that vertical profile features have on LWP retrievals, and it accomplishes this while also avoiding sources of bias present in previous observational studies. The cloud vertical profiles from the LES are often more complex than either of the two standard assumptions, and the favored assumption was found to be sensitive to cloud regime (cumuliform/stratiform). Confirming previous studies, drizzle and cloud top entrainment of dry air are identified as physical features that bias LWP retrievals away from adiabatic and toward homogeneous assumptions. The mean bias induced by drizzle-influenced profiles was shown to be on the order of 5–10 g/m². In contrast, the influence of cloud top entrainment was found to be smaller by about a factor of 2. A theoretical framework is developed to explain variability in LWP retrievals by introducing modifications to the adiabatic re profile. In addition to analyzing bispectral retrievals, we also compare results with the vertical profile sensitivity of passive polarimetric retrieval techniques. PMID:29637042

  16. A conceptual framework for invasion in microbial communities.

    PubMed

    Kinnunen, Marta; Dechesne, Arnaud; Proctor, Caitlin; Hammes, Frederik; Johnson, David; Quintela-Baluja, Marcos; Graham, David; Daffonchio, Daniele; Fodelianakis, Stilianos; Hahn, Nicole; Boon, Nico; Smets, Barth F

    2016-12-01

    There is a growing interest in controlling (promoting or avoiding) the invasion of microbial communities by new community members. Resource availability and community structure have been reported as determinants of invasion success. However, most invasion studies do not adhere to a coherent and consistent terminology nor always include rigorous interpretations of the processes behind invasion. Therefore, we suggest that a consistent set of definitions and a rigorous conceptual framework are needed. We define invasion in a microbial community as the establishment of an alien microbial type in a resident community and argue how simple criteria to define aliens, residents, and alien establishment can be applied for a wide variety of communities. In addition, we suggest an adoption of the community ecology framework advanced by Vellend (2010) to clarify potential determinants of invasion. This framework identifies four fundamental processes that control community dynamics: dispersal, selection, drift and diversification. While selection has received ample attention in microbial community invasion research, the three other processes are often overlooked. Here, we elaborate on the relevance of all four processes and conclude that invasion experiments should be designed to elucidate the role of dispersal, drift and diversification, in order to obtain a complete picture of invasion as a community process.

  17. A conceptual framework for invasion in microbial communities

    PubMed Central

    Kinnunen, Marta; Dechesne, Arnaud; Proctor, Caitlin; Hammes, Frederik; Johnson, David; Quintela-Baluja, Marcos; Graham, David; Daffonchio, Daniele; Fodelianakis, Stilianos; Hahn, Nicole; Boon, Nico; Smets, Barth F

    2016-01-01

    There is a growing interest in controlling—promoting or avoiding—the invasion of microbial communities by new community members. Resource availability and community structure have been reported as determinants of invasion success. However, most invasion studies do not adhere to a coherent and consistent terminology nor always include rigorous interpretations of the processes behind invasion. Therefore, we suggest that a consistent set of definitions and a rigorous conceptual framework are needed. We define invasion in a microbial community as the establishment of an alien microbial type in a resident community and argue how simple criteria to define aliens, residents, and alien establishment can be applied for a wide variety of communities. In addition, we suggest an adoption of the community ecology framework advanced by Vellend (2010) to clarify potential determinants of invasion. This framework identifies four fundamental processes that control community dynamics: dispersal, selection, drift and diversification. While selection has received ample attention in microbial community invasion research, the three other processes are often overlooked. Here, we elaborate on the relevance of all four processes and conclude that invasion experiments should be designed to elucidate the role of dispersal, drift and diversification, in order to obtain a complete picture of invasion as a community process. PMID:27137125

  18. An ex post facto evaluation framework for place-based police interventions.

    PubMed

    Braga, Anthony A; Hureau, David M; Papachristos, Andrew V

    2011-12-01

    A small but growing body of research evidence suggests that place-based police interventions generate significant crime control gains. While place-based policing strategies have been adopted by a majority of U.S. police departments, very few agencies make a priori commitments to rigorous evaluations. Recent methodological developments were applied to conduct a rigorous ex post facto evaluation of the Boston Police Department's Safe Street Team (SST) hot spots policing program. A nonrandomized quasi-experimental design was used to evaluate the violent crime control benefits of the SST program at treated street segments and intersections relative to untreated street segments and intersections. Propensity score matching techniques were used to identify comparison places in Boston. Growth curve regression models were used to analyze violent crime trends at treatment places relative to control places. The units of analysis were micro-level places: using computerized mapping and database software, a database of violent index crimes at all street segments and intersections in Boston was created. Yearly counts of violent index crimes between 2000 and 2009 at the treatment and comparison street segments and intersections served as the key outcome measure. The SST program was associated with a statistically significant reduction in violent index crimes at the treatment places relative to the comparison places, without displacing crime into proximate areas. To overcome the challenges of evaluation in real-world settings, evaluators need to continuously develop innovative approaches that take advantage of new theoretical and methodological approaches.
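
    The matching step described above can be sketched in a few lines; this is a generic propensity score matching illustration, not the authors' implementation (the function name and the 1:1 nearest-neighbour rule are illustrative choices):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def match_on_propensity(X, treated):
    """Fit a propensity model P(treated | covariates), then pair each
    treated place with the untreated place of closest propensity score."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t_idx = np.flatnonzero(treated == 1)
    c_idx = np.flatnonzero(treated == 0)
    pairs = [(t, c_idx[np.argmin(np.abs(ps[c_idx] - ps[t]))]) for t in t_idx]
    return ps, pairs
```

    Outcome trends at the matched pairs can then be compared with growth curve models, as in the evaluation described above.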

  19. Social media and outbreaks of emerging infectious diseases: A systematic review of literature.

    PubMed

    Tang, Lu; Bie, Bijie; Park, Sung-Eun; Zhi, Degui

    2018-04-05

    The public often turn to social media for information during emerging infectious disease (EID) outbreaks. This study identified the major approaches and assessed the rigor of published research articles on EIDs and social media. We searched 5 databases for published journal articles on EIDs and social media. We then evaluated these articles in terms of EIDs studied, social media examined, theoretical frameworks, methodologic approaches, and research findings. Thirty articles were included in the analysis (published between January 1, 2010, and March 1, 2016). The EIDs that received the most scholarly attention were H1N1 (or swine flu, n = 15), Ebola virus (n = 10), and H7N9 (or avian flu/bird flu, n = 2). Twitter was the most often studied social media platform (n = 17), followed by YouTube (n = 6), Facebook (n = 6), and blogs (n = 6). Three major approaches in this area of inquiry are identified: (1) assessment of the public's interest in and responses to EIDs, (2) examination of organizations' use of social media in communicating EIDs, and (3) evaluation of the accuracy of EID-related medical information on social media. Although academic studies of EID communication on social media are on the rise, they still suffer from a lack of theorization and a need for more methodologic rigor.

  20. Evaluative Criteria for Qualitative Research in Health Care: Controversies and Recommendations

    PubMed Central

    Cohen, Deborah J.; Crabtree, Benjamin F.

    2008-01-01

    PURPOSE We wanted to review and synthesize published criteria for good qualitative research and develop a cogent set of evaluative criteria. METHODS We identified published journal articles discussing criteria for rigorous research using standard search strategies then examined reference sections of relevant journal articles to identify books and book chapters on this topic. A cross-publication content analysis allowed us to identify criteria and understand the beliefs that shape them. RESULTS Seven criteria for good qualitative research emerged: (1) carrying out ethical research; (2) importance of the research; (3) clarity and coherence of the research report; (4) use of appropriate and rigorous methods; (5) importance of reflexivity or attending to researcher bias; (6) importance of establishing validity or credibility; and (7) importance of verification or reliability. General agreement was observed across publications on the first 4 quality dimensions. On the last 3, important divergent perspectives were observed in how these criteria should be applied to qualitative research, with differences based on the paradigm embraced by the authors. CONCLUSION Qualitative research is not a unified field. Most manuscript and grant reviewers are not qualitative experts and are likely to embrace a generic set of criteria rather than those relevant to the particular qualitative approach proposed or reported. Reviewers and researchers need to be aware of this tendency and educate health care researchers about the criteria appropriate for evaluating qualitative research from within the theoretical and methodological framework from which it emerges. PMID:18626033

  1. Describing and understanding behavioral responses to multiple stressors and multiple stimuli.

    PubMed

    Hale, Robin; Piggott, Jeremy J; Swearer, Stephen E

    2017-01-01

    Understanding the effects of environmental change on natural ecosystems is a major challenge, particularly when multiple stressors interact to produce unexpected "ecological surprises" in the form of complex, nonadditive effects that can amplify or reduce their individual effects. Animals often respond behaviorally to environmental change, and multiple stressors can have both population-level and community-level effects. However, the individual, not combined, effects of stressors on animal behavior are commonly studied. There is a need to understand how animals respond to the more complex combinations of stressors that occur in nature, which requires a systematic and rigorous approach to quantify the various potential behavioral responses to the independent and interactive effects of stressors. We illustrate a robust, systematic approach for understanding behavioral responses to multiple stressors based on integrating schemes used to quantitatively classify interactions in multiple-stressor research and to qualitatively view interactions between multiple stimuli in behavioral experiments. We introduce and unify the two frameworks, highlighting their conceptual and methodological similarities, and use four case studies to demonstrate how this unification could improve our interpretation of interactions in behavioral experiments and guide efforts to manage the effects of multiple stressors. Our unified approach: (1) provides behavioral ecologists with a more rigorous and systematic way to quantify how animals respond to interactions between multiple stimuli, an important theoretical advance, (2) helps us better understand how animals behave when they encounter multiple, potentially interacting stressors, and (3) contributes more generally to the understanding of "ecological surprises" in multiple stressors research.

  2. Gender equity in STEM: The role of dual enrollment science courses in selecting a college major

    NASA Astrophysics Data System (ADS)

    Persons, Christopher Andrew

    A disproportionately low number of women, despite rigorous high school preparation and evidenced interest in STEM through voluntary participation in additional coursework, declare a STEM-related college major. The result of this drop in participation in STEM-related college majors is a job market flooded with men and the support of an incorrect stereotype: STEM is for men. This research seeks to assess the effects, if any, that Dual Enrollment (DE) science courses have on students' self-identified intent to declare a STEM-related college major, as well as the respective perceptions of male and female students. Self-Determination Theory and Gender Equity Framework were used respectively as the theoretical frames. High school students from six schools in two districts participated in an online survey and focus groups in this mixed-methods study. The results of the research identified the role the DE course played in their choice of college major, possible interventions to correct the underrepresentation, and societal causes for the stereotype.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angulo, J. C.

    Rigorous and universal relationships among radial expectation values of any D-dimensional quantum-mechanical system are obtained, using Renyi-like position-momentum inequalities in an information-theoretical framework. Although the results are expressed in terms of four moments (two in position space and two in momentum space), especially interesting are the cases that provide expressions of uncertainty in terms of products ⟨r^a⟩^{1/a}⟨p^b⟩^{1/b}, widely considered in the literature, including the famous Heisenberg relationship ⟨r²⟩⟨p²⟩ ≥ D²/4. Improved bounds for these products have recently been provided, but are always restricted to positive orders a, b > 0. The interesting part of this work is the inequalities for negative orders. A study of these relationships is carried out for atomic systems in their ground state. Some results are given in terms of relevant physical quantities, including the kinetic and electron-nucleus attraction energies, the diamagnetic susceptibility, and the height of the peak of the Compton profile, among others.
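
    Writing the moment products out explicitly (my reconstruction of the garbled OSTI markup, consistent with the "four moments" described above):

    \[
    P_{ab} \;=\; \langle r^{a}\rangle^{1/a}\,\langle p^{b}\rangle^{1/b},
    \qquad
    \langle r^{2}\rangle\,\langle p^{2}\rangle \;\ge\; \frac{D^{2}}{4},
    \]

    the second relation being the D-dimensional Heisenberg case a = b = 2; the novelty claimed in the record is the extension of such bounds to negative orders a, b < 0.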

  4. The FoReVer Methodology: A MBSE Framework for Formal Verification

    NASA Astrophysics Data System (ADS)

    Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald

    2013-08-01

    The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the space industry and has been addressed so far through rigorous system and software development processes and stringent verification and validation regimes. The Model Based Space System Engineering process (MBSSE), derived in the System and Software Functional Requirement Techniques study (SSFRT), focused on the application of model-based engineering technologies to support the space system and software development processes, from mission-level requirements to software implementation, through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, where we aim to develop methodological, theoretical and technological support for a systematic approach to space avionics system development in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.
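
    The contract-based refinement idea mentioned above can be made concrete with a toy sketch. This is not FoReVer's formalism, just a minimal assume/guarantee refinement check over sets of behaviours (the representation as Python sets is an illustrative assumption):

```python
def refines(abstract, concrete):
    """Assume/guarantee refinement: the concrete contract must tolerate
    at least the abstract assumptions (A1 <= A2) and promise no more
    behaviours than the abstract guarantee allows (G2 <= G1)."""
    A1, G1 = abstract
    A2, G2 = concrete
    return A1 <= A2 and G2 <= G1

# A system-level contract and a candidate software-level refinement.
system_contract = ({"nominal"}, {"safe", "degraded"})
software_contract = ({"nominal", "off-nominal"}, {"safe"})
print(refines(system_contract, software_contract))  # True
```

    Step-wise refinement then amounts to repeating this check at each level, from system down to software components.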

  5. Molecular polarizability of water from local dielectric response theory

    DOE PAGES

    Ge, Xiaochuan; Lu, Deyu

    2017-08-08

    Here, we propose a fully ab initio theory to compute the electron density response under a perturbation in the local field. This method is based on our recently developed local dielectric response theory [Phys. Rev. B 92, 241107(R), 2015], which provides a rigorous theoretical framework to treat local electronic excitations in both finite and extended systems, beyond the commonly employed dipole approximation. We have applied this method to study the electronic part of the molecular polarizability of water in ice Ih and liquid water. Our results reveal that the crystal field of the hydrogen-bond network has strong anisotropic effects, which significantly enhance the out-of-plane component and suppress the in-plane component perpendicular to the bisector direction. The contribution from charge transfer is equally important, increasing the isotropic molecular polarizability by 5-6%. Our study provides new insights into the dielectric properties of water, which form the basis for understanding electronic excitations in water and for developing accurate polarizable force fields of water.
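
    For readers outside the field, the quantity under discussion is the linear-response tensor relating the induced molecular dipole to an applied field (a standard definition, not a formula quoted from this paper):

    \[
    \mu_{i}^{\mathrm{ind}} = \sum_{j}\alpha_{ij}E_{j},
    \qquad
    \bar{\alpha} = \tfrac{1}{3}\bigl(\alpha_{xx}+\alpha_{yy}+\alpha_{zz}\bigr),
    \]

    so the anisotropy reported above corresponds to unequal diagonal components of α in the molecular frame, while the 5-6% charge-transfer effect shows up in the isotropic average ᾱ.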

  6. Microscopic optical model potential based on a Dirac Brueckner Hartree Fock approach and the relevant uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Xu, Ruirui; Ma, Zhongyu; Muether, Herbert; van Dalen, E. N. E.; Liu, Tinjin; Zhang, Yue; Zhang, Zhi; Tian, Yuan

    2017-09-01

    A relativistic microscopic optical model potential, named CTOM, for nucleon-nucleus scattering is investigated in the framework of the Dirac-Brueckner-Hartree-Fock (DBHF) approach. The microscopic character of CTOM is guaranteed by rigorously adopting the isospin-dependent DBHF calculation within the subtracted T-matrix scheme. To verify its predictive power, a global study of n, p + A scattering is carried out. The predicted scattering observables reproduce experimental data with good accuracy over a broad range of targets and a large region of energies, with only two free elements, namely the free-range factor t in the improved local density approximation and minor adjustments of the scalar and vector potentials in the low-density region. In addition, to estimate the uncertainty of the theoretical results, a deterministic simple least-squares approach is preliminarily employed to derive the covariance of the predicted angular distributions, which is also briefly presented in this paper.

  7. Psychiatry and humanism in Argentina.

    PubMed

    Niño Amieva, Alejandra

    2016-04-01

    The authors of the present selection of Latin American psychiatry texts were characterized by a common, deeply humanistic attitude. These prolific writers were able to establish or extend the scope of the discipline in which they chose to act, questioning the establishment of rigid boundaries within the framework of a rigorous epistemological reflection. Thus the systematizing spirit of José Ingenieros, in the context of positivist evolutionism, resulted in the founding of a discipline that integrated the biological and the social. In the case of Guillermo Vidal, his conception of mental health went beyond the biomedical to consider psychotherapies as emotional commitment, containment and empathic understanding; in the case of César Cabral, his training and extensive clinical practice resulted in a body of work defined by inquiry into the theoretical concepts underlying Psychiatry and Clinical Psychology. This brief selection does not exhaust the issues or the level of ideas and discussions of Psychiatry in Argentina, but it constitutes a textual corpus representative of a disciplinary conception understood as a scientific and humanistic endeavor.

  8. Turbine blade-tip clearance excitation forces

    NASA Technical Reports Server (NTRS)

    Martinez-Sanchez, M.; Greitzer, E. M.

    1985-01-01

    The results of an effort to assess existing knowledge and plan the required experimentation in the area of turbine blade-tip excitation forces are summarized. The work was carried out in three phases. The first was a literature search and evaluation, which served to highlight the state of the art and to expose the need for a coordinated theoretical and experimental effort to provide not only design data, but also a rational framework for their extrapolation to new configurations and regimes. The second phase was a start in this direction, in which several of the explicit or implicit assumptions contained in the usual formulations of the Alford force effect were removed and a rigorous linearized flow analysis of the behavior of a nonsymmetric actuator disc was carried out. In the third phase, a preliminary design was conducted for a turbine test facility that would be used to measure both the excitation forces themselves and the flow patterns responsible for them, over a realistic range of dimensionless parameters.
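
    For context, the Alford force effect named above is usually stated as a cross-coupled stiffness driven by the stage torque (the textbook estimate whose assumptions this work set out to relax):

    \[
    k_{xy} \;=\; \frac{\beta\,T}{D_{m}\,H},
    \qquad
    F_{y} = k_{xy}\,x,
    \]

    where T is the stage torque, D_m the mean blade diameter, H the blade height and β an empirical efficiency-sensitivity coefficient; a lateral rotor displacement x thus produces a tangential force F_y that can drive whirl instability.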

  9. Comparative thermodynamic studies of aqueous glutaric acid, ammonium sulfate and sodium chloride aerosol at high humidity.

    PubMed

    Hanford, Kate L; Mitchem, Laura; Reid, Jonathan P; Clegg, Simon L; Topping, David O; McFiggans, Gordon B

    2008-10-02

    Aerosol optical tweezers are used to simultaneously characterize and compare the hygroscopic properties of two aerosol droplets, one containing inorganic and organic solutes and the second, referred to as the control droplet, containing a single inorganic salt. The inorganic solute is either sodium chloride or ammonium sulfate and the organic component is glutaric acid. The time variation in the size of each droplet (3-7 microm in radius) is recorded with 1 s time resolution and with nanometre accuracy. The size of the control droplet is used to estimate the relative humidity with an accuracy of better than +/-0.09%. Thus, the Köhler curve of the multicomponent inorganic/organic droplet, which characterizes the variation in equilibrium droplet size with relative humidity, can be determined directly. The measurements presented here focus on high relative humidities, above 97%, in the limit of dilute solutes. The experimental data are compared with theoretical treatments that, while ignoring the interactions between the inorganic and organic components, are based upon accurate representations of the activity-concentration relationships of aqueous solutions of the individual salts. The organic component is treated by a parametrized fit to experimental data or by the UNIFAC model and the water activity of the equilibrium solution droplet is calculated using the approach suggested by Clegg, Seinfeld and Brimblecombe or the Zdanovskii-Stokes-Robinson approximation. It is shown that such an experimental strategy, comparing directly droplets of different composition, enables highly accurate measurements of the hygroscopic properties, allowing the theoretical treatments to be rigorously tested. Typical deviations of the experimental measurements from theoretical predictions are shown to be around 1% in equilibrium size, comparable to the variation between the theoretical frameworks considered.
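
    The Köhler curve referred to above has, in its textbook approximation, the form (the paper's actual activity models are more elaborate, as described):

    \[
    S(r) \;\approx\; 1 + \frac{A}{r} - \frac{B}{r^{3}},
    \qquad
    A = \frac{2\sigma_{w} M_{w}}{\rho_{w} R T},
    \]

    where S is the equilibrium saturation ratio over a droplet of radius r, the A/r (Kelvin) term accounts for surface curvature, and the B/r³ (Raoult) term is proportional to the moles of dissolved solute. Comparing a measured Köhler curve against such treatments at saturation ratios above 0.97 is what the paired-droplet experiment makes possible at the stated accuracy.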

  10. Rigor or Reliability and Validity in Qualitative Research: Perspectives, Strategies, Reconceptualization, and Recommendations.

    PubMed

    Cypress, Brigitte S

    Issues are still raised, even now in the 21st century, by the persistent concern with achieving rigor in qualitative research. There is also a continuing debate about the analogous terms reliability and validity in naturalistic inquiries as opposed to quantitative investigations. This article presents the concept of rigor in qualitative research, using a phenomenological study as an exemplar to illustrate the process. Elaborating on epistemological and theoretical conceptualizations by Lincoln and Guba, strategies congruent with the qualitative perspective for ensuring validity to establish the credibility of the study are described. A synthesis of the historical development of validity criteria evident in the literature over the years is explored. Recommendations are made for use of the term rigor instead of trustworthiness and for the reconceptualization and renewed use of the concepts of reliability and validity in qualitative research; strategies for ensuring rigor must be built into the qualitative research process rather than evaluated only after the inquiry, and qualitative researchers and students alike must be proactive and take responsibility for ensuring the rigor of a research study. The insights garnered here will move novice researchers and doctoral students to a better conceptual grasp of the complexity of reliability and validity and their ramifications for qualitative inquiry.

  11. A Social-Cognitive Theoretical Framework for Examining Music Teacher Identity

    ERIC Educational Resources Information Center

    McClellan, Edward

    2017-01-01

    The purpose of the study was to examine a diverse range of research literature to provide a social-cognitive theoretical framework as a foundation for definition of identity construction in the music teacher education program. The review of literature may reveal a theoretical framework based around tenets of commonly studied constructs in the…

  12. Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts

    DTIC Science & Technology

    1981-05-01

    ...program to begin probing the details of the interaction process. The theoretical framework underlying such a program is explained in detail. The theory of... of the time sequence of events during penetration. Data from one series of experiments, reported in detail elsewhere, are presented and discussed within the theoretical framework.

  13. Theoretical and Conceptual Frameworks Used in Research on Family-School Partnerships

    ERIC Educational Resources Information Center

    Yamauchi, Lois A.; Ponte, Eva; Ratliffe, Katherine T.; Traynor, Kevin

    2017-01-01

    This study investigated the theoretical frameworks used to frame research on family-school partnerships over a five-year period. Although many researchers have described their theoretical approaches, little has been written about the diversity of frameworks used and how they are applied. Coders analyzed 215 journal articles published from 2007 to…

  14. A rigorous and simpler method of image charges

    NASA Astrophysics Data System (ADS)

    Ladera, C. L.; Donoso, G.

    2016-07-01

    The method of image charges relies on the proven uniqueness of the solution of the Laplace differential equation for an electrostatic potential which satisfies some specified boundary conditions. Granted that uniqueness, the method of images is rightly described as nothing but shrewdly guessing which image charges are to be placed, and where, to solve the given electrostatics problem. Here we present an alternative image charges method that is based not on guessing but on rigorous and simpler theoretical grounds, namely the constant potential inside any conductor and the application of powerful geometric symmetries. The required uniqueness and, more importantly, the guessing are therefore both dispensed with altogether. Our two new theoretical fundaments also allow the image charges method to be introduced in earlier physics courses for engineering and science students, instead of its present and usual introduction in electromagnetic theory courses that demand familiarity with the Laplace differential equation and its boundary conditions.
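
    The canonical example of the symmetry argument is a point charge q at height d above a grounded plane z = 0. Placing an image charge -q at the mirror point gives

    \[
    V(\mathbf{r}) \;=\; \frac{1}{4\pi\varepsilon_{0}}
    \left(\frac{q}{\lvert\mathbf{r}-d\hat{\mathbf{z}}\rvert}
    -\frac{q}{\lvert\mathbf{r}+d\hat{\mathbf{z}}\rvert}\right),
    \]

    which vanishes everywhere on the plane because each of its points is equidistant from the charge and its mirror image; the conductor's boundary condition is met by symmetry alone, exactly the kind of geometric reasoning the authors propose as a substitute for the uniqueness argument.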

  15. Why so many "rigorous" evaluations fail to identify unintended consequences of development programs: How mixed methods can contribute.

    PubMed

    Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene

    2016-04-01

    Many widely-used impact evaluation designs, including randomized control trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences of development programs. This seems surprising, as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based or narrative) that programs have achieved their intended objectives, and the logic of many evaluation designs, even those considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs, as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and can also capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies 9 ways in which unintended consequences can occur, and we apply this framework to two of the case studies.

  16. Seven Basic Steps to Solving Ethical Dilemmas in Special Education: A Decision-Making Framework

    ERIC Educational Resources Information Center

    Stockall, Nancy; Dennis, Lindsay R.

    2015-01-01

    This article presents a seven-step framework for decision making to solve ethical issues in special education. The authors developed the framework from the existing literature and theoretical frameworks of justice, critique, care, and professionalism. The authors briefly discuss each theoretical framework and then describe the decision-making…

  17. Conducting Human Research

    DTIC Science & Technology

    2009-08-05

    Socio-cultural data acquisition, extraction, and management. First, the idea of a theoretical framework will be very briefly discussed, as well as... Subject terms: human behavior, theoretical framework, hypothesis development, experimental design, ethical research, statistical power, human laboratory...

  18. An Overview of a Theoretical Framework of Phenomenography in Qualitative Education Research: An Example from Physics Education Research

    ERIC Educational Resources Information Center

    Ornek, Funda

    2008-01-01

    One or more theoretical frameworks or orientations are used in qualitative education research. In this paper, the main tenets, the background and the appropriateness of phenomenography, which is one of the theoretical frameworks used in qualitative research, will be depicted. Further, the differences among phenomenography, phenomenology and…

  19. Using a Theoretical Framework of Institutional Culture to Analyse an Institutional Strategy Document

    ERIC Educational Resources Information Center

    Jacobs, Anthea Hydi Maxine

    2016-01-01

    This paper builds on a conceptual analysis of institutional culture in higher education. A theoretical framework was proposed to analyse institutional documents of two higher education institutions in the Western Cape, for the period 2002 to 2012 (Jacobs 2012). The elements of this theoretical framework are "shared values and beliefs",…

  20. Factors Influencing the Use of Learning Management System in Saudi Arabian Higher Education: A Theoretical Framework

    ERIC Educational Resources Information Center

    Asiri, Mohammed J. Sherbib; Mahmud, Rosnaini bt; Bakar, Kamariah Abu; Ayub, Ahmad Fauzi bin Mohd

    2012-01-01

    The purpose of this paper is to present the theoretical framework underlying a research on factors that influence utilization of the Jusur Learning Management System (Jusur LMS) in Saudi Arabian public universities. Development of the theoretical framework was done based on library research approach. Initially, the existing literature relevant to…

  1. Professional Development and Use of Digital Technologies by Science Teachers: a Review of Theoretical Frameworks

    NASA Astrophysics Data System (ADS)

    Fernandes, Geraldo W. Rocha; Rodrigues, António M.; Ferreira, Carlos Alberto

    2018-03-01

    This article aims to characterise research on science teachers' professional development programs that support the use of Information and Communication Technologies (ICTs), and the main trends in the theoretical frameworks (theoretical foundation, literature review or background) that underpin these studies. Through a systematic review of the literature, 76 articles were found and organised along two axes, concerning the training of science teachers and the use of digital technologies, each with its own categories. The first axis (characterisation of articles) presents the key features that characterise the selected articles (major subjects, training and actions for professional development, and major ICT tools and digital resources). The second axis (trends of theoretical frameworks) has three categories, covering theoretical frameworks that emphasise (a) digital technologies, (b) prospects of curricular renewal and (c) cognitive processes. It also characterises a group of articles whose theoretical frameworks contain multiple elements without deepening any of them, or that even lack a theoretical framework supporting the study. In this review, we found that many professional development programs for teachers still use inadequate strategies for bringing about change in teacher practices. New professional development proposals are emerging with the objective of minimising such difficulties, and this analysis could be a helpful tool for restructuring those proposals.

  2. A theoretical framework to support research of health service innovation.

    PubMed

    Fox, Amanda; Gardner, Glenn; Osborne, Sonya

    2015-02-01

    Health service managers and policy makers are increasingly concerned about the sustainability of innovations implemented in health care settings. The increasing demand on health services requires that innovations are both effective and sustainable; however, research in this field is limited, with multiple disciplines, approaches and paradigms influencing the field. These variations prevent a cohesive approach, and therefore the accumulation of research findings, in the development of a body of knowledge. The purpose of this paper is to provide a thorough examination of the research findings and an appropriate theoretical framework for examining the sustainability of health service innovation. This paper presents an integrative review of the literature on sustainability of health service innovation and develops a theoretical framework based on integration and synthesis of that literature. A theoretical framework serves to guide research, determine variables, influence data analysis and is central to the quest for ongoing knowledge development. This research outlines the sustainability of innovation framework, a theoretical framework suitable for examining the sustainability of health service innovation. If this gap is left unaddressed, health services research will continue in an ad hoc manner, preventing full utilisation of outcomes, recommendations and knowledge for effective provision of health services. The sustainability of innovation theoretical framework provides an operational basis upon which reliable future research can be conducted.

  3. [Reconsidering evaluation criteria regarding health care research: toward an integrative framework of quantitative and qualitative criteria].

    PubMed

    Miyata, Hiroaki; Kai, Ichiro

    2006-05-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confused, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. It is therefore very important to reconsider evaluation criteria regarding rigor in social science. As Lincoln & Guba have already compared quantitative paradigms (validity, reliability, neutrality, generalizability) with qualitative paradigms (credibility, dependability, confirmability, transferability), we discuss the use of evaluation criteria from a pragmatic perspective. Validity/credibility concerns the observational framework, while reliability/dependability refers to the range of stability in observations, neutrality/confirmability reflects influences between observers and subjects, and generalizability/transferability involves epistemological differences in the way findings are applied. Qualitative studies, however, do not always choose the qualitative paradigms. If we can assume stability to some extent, it is better to use the quantitative paradigm (reliability). Moreover, as a quantitative study cannot always guarantee a perfect observational framework, with stability in all phases of observation, it is useful to use qualitative paradigms to enhance the rigor of the study.

  4. Next Generation of Leaching Tests

    EPA Science Inventory

    A corresponding abstract has been cleared for this presentation. The four methods comprising the Leaching Environmental Assessment Framework are described along with the tools to support implementation of the more rigorous and accurate source terms that are developed using LEAF ...

  5. Measuring coherence with entanglement concurrence

    NASA Astrophysics Data System (ADS)

    Qi, Xianfei; Gao, Ting; Yan, Fengli

    2017-07-01

    Quantum coherence is a fundamental manifestation of the quantum superposition principle. Recently, Baumgratz et al (2014 Phys. Rev. Lett. 113 140401) presented a rigorous framework to quantify coherence from the viewpoint of physical resource theory. Here we propose a new valid quantum coherence measure, a convex roof measure for a quantum system of arbitrary dimension, constructed essentially from the generalized Gell-Mann matrices. A rigorous proof shows that the proposed coherence measure, coherence concurrence, fulfills all the requirements dictated by the resource theory of quantum coherence measures. Moreover, strong links between the resource frameworks of coherence concurrence and entanglement concurrence are derived, showing that any degree of coherence with respect to some reference basis can be converted to entanglement via incoherent operations. Our work provides a clear quantitative and operational connection between coherence and entanglement based on the two kinds of concurrence. This new coherence measure, coherence concurrence, may also be beneficial to the study of quantum coherence.
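
    To make the resource-theoretic setting concrete, here is the simplest coherence quantifier from the Baumgratz et al framework, the l1-norm of coherence. Note this is a companion illustration, not the coherence concurrence defined in the paper:

```python
import numpy as np

def l1_coherence(rho: np.ndarray) -> float:
    """l1-norm coherence: the sum of absolute values of the off-diagonal
    elements of the density matrix in the chosen reference basis."""
    off_diag = rho - np.diag(np.diag(rho))
    return float(np.sum(np.abs(off_diag)))

# The maximally coherent qubit state |+><+| has coherence 1.
plus = np.array([[0.5, 0.5],
                 [0.5, 0.5]])
print(l1_coherence(plus))  # 1.0
```

    Any valid measure, including the coherence concurrence, must vanish on incoherent (diagonal) states and be non-increasing under incoherent operations, which is what the "requirements dictated by the resource theory" above refer to.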

  6. Tactics for mechanized reasoning: a commentary on Milner (1984) ‘The use of machines to assist in rigorous proof’

    PubMed Central

    Gordon, M. J. C.

    2015-01-01

    Robin Milner's paper, ‘The use of machines to assist in rigorous proof’, introduces methods for automating mathematical reasoning that are a milestone in the development of computer-assisted theorem proving. His ideas, particularly his theory of tactics, revolutionized the architecture of proof assistants. His methodology for automating rigorous proof soundly, particularly his theory of type polymorphism in programming, led to major contributions to the theory and design of programming languages. His citation for the 1991 ACM A.M. Turing award, the most prestigious award in computer science, credits him with, among other achievements, ‘probably the first theoretically based yet practical tool for machine assisted proof construction’. This commentary was written to celebrate the 350th anniversary of the journal Philosophical Transactions of the Royal Society. PMID:25750147

  7. Interface Pattern Selection in Directional Solidification

    NASA Technical Reports Server (NTRS)

    Trivedi, Rohit; Tewari, Surendra N.

    2001-01-01

    The central focus of this research is to establish the key scientific concepts that govern the selection of cellular and dendritic patterns during the directional solidification of alloys. Ground-based studies have established that the conditions under which cellular and dendritic microstructures form are precisely those where convection effects are dominant in bulk samples. Thus, experimental data cannot be obtained terrestrially under a purely diffusive regime. Furthermore, reliable theoretical models that can quantitatively incorporate fluid flow in the pattern selection criterion are not yet possible. Consequently, microgravity experiments on cellular and dendritic growth are designed to obtain benchmark data under diffusive growth conditions that can be quantitatively analyzed and compared with a rigorous theoretical model, in order to establish the fundamental principles that govern the selection of a specific microstructure and its length scales. In the cellular structure, different cells in an array are strongly coupled, so that cellular pattern evolution is controlled by complex interactions between thermal diffusion, solute diffusion and interface effects. These interactions admit an infinity of solutions, and the system selects only a narrow band of them. The aim of this investigation is to obtain benchmark data and develop a rigorous theoretical model that will allow us to quantitatively establish the physics of this selection process.

  8. Validation of the theoretical domains framework for use in behaviour change and implementation research

    PubMed Central

    2012-01-01

    Background An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Methods Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. Results There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): ‘Knowledge’, ‘Skills’, ‘Social/Professional Role and Identity’, ‘Beliefs about Capabilities’, ‘Optimism’, ‘Beliefs about Consequences’, ‘Reinforcement’, ‘Intentions’, ‘Goals’, ‘Memory, Attention and Decision Processes’, ‘Environmental Context and Resources’, ‘Social Influences’, ‘Emotions’, and ‘Behavioural Regulation’. Conclusions The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development. PMID:22530986

  9. JACOB: an enterprise framework for computational chemistry.

    PubMed

    Waller, Mark P; Dresselhaus, Thomas; Yang, Jack

    2013-06-15

    Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob.

  10. Understanding Decision-Making in Specialized Domestic Violence Courts: Can Contemporary Theoretical Frameworks Help Guide These Decisions?

    PubMed

    Pinchevsky, Gillian M

    2016-05-22

    This study fills a gap in the literature by exploring the utility of contemporary courtroom theoretical frameworks (uncertainty avoidance, causal attribution, and focal concerns) for explaining decision-making in specialized domestic violence courts. Using data from two specialized domestic violence courts, this study explores the predictors of prosecutorial and judicial decision-making and the extent to which these factors are congruent with theoretical frameworks often used in studies of court processing. Findings suggest that these theoretical frameworks only partially help explain decision-making in the courts under study. A discussion of the findings and implications for future research is provided.

  11. Computation of elementary modes: a unifying framework and the new binary approach

    PubMed Central

    Gagneur, Julien; Klamt, Steffen

    2004-01-01

    Background Metabolic pathway analysis has been recognized as a central approach to the structural analysis of metabolic networks. The concept of elementary (flux) modes provides a rigorous formalism to describe and assess pathways and has proven to be valuable for many applications. However, computing elementary modes is a hard computational task. Recent years have seen a multiplication of algorithms dedicated to it, and a unifying point of view and continued improvement of the current methods are required. Results We show that computing the set of elementary modes is equivalent to computing the set of extreme rays of a convex cone. This standard mathematical representation provides a unified framework that encompasses the most prominent algorithmic methods that compute elementary modes and allows a clear comparison between them. Taking lessons from this benchmark, we here introduce a new method, the binary approach, which computes the elementary modes as binary patterns of participating reactions from which the respective stoichiometric coefficients can be computed in a post-processing step. We implemented the binary approach in FluxAnalyzer 5.1, software that is free for academics. The binary approach decreases the memory demand by up to 96% without loss of speed, giving the most efficient method available for computing elementary modes to date. Conclusions The equivalence between elementary modes and extreme ray computations offers opportunities for employing tools from polyhedral computation for metabolic pathway analysis. The new binary approach introduced herein was derived from this general theoretical framework and facilitates the computation of elementary modes in considerably larger networks. PMID:15527509
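
    A minimal way to see the binary-pattern idea is that an elementary mode is a steady-state flux vector whose support (its set of participating reactions) is minimal. The following toy filter, an illustration rather than the paper's binary algorithm, keeps the support-minimal vectors among a set of candidates:

```python
import numpy as np

def support(v, tol=1e-9):
    """Binary participation pattern: the set of reactions carrying flux."""
    return frozenset(np.flatnonzero(np.abs(v) > tol).tolist())

def support_minimal(candidates):
    """Keep flux vectors whose support contains no other candidate's
    support as a strict subset (the defining property of elementary modes)."""
    sups = [support(v) for v in candidates]
    return [v for v, s in zip(candidates, sups)
            if not any(t < s for t in sups)]
```

    In the binary approach described above, these supports are what get enumerated first, with the stoichiometric coefficients recovered in a post-processing step.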

  12. Texas M-E flexible pavement design system: literature review and proposed framework.

    DOT National Transportation Integrated Search

    2012-04-01

    Recent developments over the last several decades have offered an opportunity for more rational and rigorous pavement design procedures. Substantial work has already been completed in Texas, nationally, and internationally, in all aspects of modeling, ma...

  13. From screening to synthesis: using nvivo to enhance transparency in qualitative evidence synthesis.

    PubMed

    Houghton, Catherine; Murphy, Kathy; Meehan, Ben; Thomas, James; Brooker, Dawn; Casey, Dympna

    2017-03-01

    Aim: to explore the experiences and perceptions of healthcare staff caring for people with dementia in the acute setting. This article focuses on the methodological process of conducting a framework synthesis using nvivo for each stage of the review: screening, data extraction, synthesis and critical appraisal. Qualitative evidence synthesis brings together many research findings in a meaningful way that can be used to guide practice and policy development. For this purpose, synthesis must be conducted in a comprehensive and rigorous way. There has been previous discussion of how using nvivo can assist in enhancing and illustrating the rigorous processes involved. Design: qualitative framework synthesis. Twelve documents, or research reports, based on nine studies, were included for synthesis. The benefits of using nvivo are outlined in terms of facilitating teams of researchers to systematically and rigorously synthesise findings. nvivo functions were used to conduct a sensitivity analysis. Some valuable lessons were learned, and these are presented to assist and guide researchers who wish to use similar methods in future. Ultimately, good qualitative evidence synthesis will provide practitioners and policymakers with significant information that will guide decision-making on many aspects of clinical practice. The example provided explores how people with dementia are cared for in acute settings.

  14. Tackling complexities in understanding the social determinants of health: the contribution of ethnographic research.

    PubMed

    Bandyopadhyay, Mridula

    2011-11-25

    The complexities inherent in understanding the social determinants of health are often not well-served by quantitative approaches. My aim is to show that well-designed and well-conducted ethnographic studies have an important contribution to make in this regard. Ethnographic research designs are a difficult but rigorous approach to research questions that require us to understand the complexity of people's social and cultural lives. I draw on an ethnographic study to describe the complexities of studying maternal health in a rural area in India. I then show how the lessons learnt in that setting and context can be applied to studies done in very different settings. I show how ethnographic research depends for rigour on a theoretical framework for sample selection; why immersion in the community under study, and rapport building with research participants, is important to ensure rich and meaningful data; and how flexible approaches to data collection lead to the gradual emergence of an analysis based on intense cross-referencing with community views and thus a conclusion that explains the similarities and differences observed. When using ethnographic research design it can be difficult to specify in advance the exact details of the study design. Researchers can encounter issues in the field that require them to change what they planned on doing. In rigorous ethnographic studies, the researcher in the field is the research instrument and needs to be well trained in the method. Ethnographic research is challenging, but nevertheless provides a rewarding way of researching complex health problems that require an understanding of the social and cultural determinants of health.

  15. A unified statistical approach to non-negative matrix factorization and probabilistic latent semantic indexing

    PubMed Central

    Wang, Guoli; Ebrahimi, Nader

    2014-01-01

    Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H, such that V ∼ W H. It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Renyi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H. In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data. PMID:25821345
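
    The Poisson-likelihood special case of these updates can be sketched concretely. The snippet below shows the classical multiplicative updates for the KL/Poisson objective (the Lee-Seung form), which the Renyi-divergence family described above generalizes; it is an illustrative sketch under those assumptions, not the authors' code, and the function name and defaults are ours.

        import numpy as np

        def nmf_poisson(V, rank, n_iter=200, eps=1e-10, seed=0):
            # Multiplicative updates for V ~ W H under the Poisson (KL) likelihood.
            # Each sweep is non-increasing in the divergence -- the kind of
            # monotonicity result the paper proves for the general Renyi case.
            rng = np.random.default_rng(seed)
            n, m = V.shape
            W = rng.random((n, rank)) + eps
            H = rng.random((rank, m)) + eps
            for _ in range(n_iter):
                H *= W.T @ (V / (W @ H + eps)) / (W.sum(axis=0)[:, None] + eps)
                W *= (V / (W @ H + eps)) @ H.T / (H.sum(axis=1)[None, :] + eps)
            return W, H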

  16. A unified statistical approach to non-negative matrix factorization and probabilistic latent semantic indexing.

    PubMed

    Devarajan, Karthik; Wang, Guoli; Ebrahimi, Nader

    2015-04-01

    Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H, such that V ∼ W H. It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Renyi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H. In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data.

  17. Characterizing and Discovering Spatiotemporal Social Contact Patterns for Healthcare.

    PubMed

    Yang, Bo; Pei, Hongbin; Chen, Hechang; Liu, Jiming; Xia, Shang

    2017-08-01

    During an epidemic, the spatial, temporal and demographic patterns of disease transmission are determined by multiple factors. In addition to the physiological properties of the pathogens and hosts, the social contact of the host population, which characterizes the reciprocal exposures of individuals to infection according to their demographic structure and various social activities, is also pivotal to understanding and predicting the prevalence of infectious diseases. How social contact is measured will affect the extent to which we can forecast the dynamics of infections in the real world. Most current work focuses on modeling the spatial patterns of static social contact. In this work, we use a novel perspective to address the problem of how to characterize and measure dynamic social contact during an epidemic. We propose an epidemic-model-based tensor deconvolution framework in which the spatiotemporal patterns of social contact are represented by the factors of the tensors. These factors can be discovered using a tensor deconvolution procedure with the integration of epidemic models based on rich types of data, mainly heterogeneous outbreak surveillance data, socio-demographic census data and physiological data from medical reports. Using compartmental models including SIR/SIS/SEIR/SEIS as case studies, the efficacy and applications of the proposed framework are theoretically analyzed, empirically validated and demonstrated through a set of rigorous experiments using both synthetic and real-world data.
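
    For readers unfamiliar with the compartmental models used as case studies, a minimal homogeneous-mixing SIR integrator is sketched below. It deliberately omits the spatiotemporal contact tensors that are the paper's actual contribution; all parameter names and values are illustrative.

        import numpy as np

        def sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, days=160, dt=0.1):
            # Forward-Euler integration of the classic SIR equations:
            #   dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I
            s, i, r = s0, i0, 1.0 - s0 - i0
            traj = [(s, i, r)]
            for _ in range(int(days / dt)):
                new_inf = beta * s * i * dt
                new_rec = gamma * i * dt
                s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
                traj.append((s, i, r))
            return np.array(traj)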

  18. Reconstructing Dewey: Dialectics and Democratic Education

    ERIC Educational Resources Information Center

    Jackson, Jeff

    2012-01-01

    This essay aims to demonstrate the theoretical purchase offered by linking Dewey's educational theory with a rigorous account of dialectical development. Drawing on recent literature which emphasizes the continuing influence of Hegel on Dewey's thought throughout the latter's career, this essay reconstructs Dewey's argument regarding the…

  19. Imaginary-frequency polarizability and van der Waals force constants of two-electron atoms, with rigorous bounds

    NASA Technical Reports Server (NTRS)

    Glover, R. M.; Weinhold, F.

    1977-01-01

    Variational functionals of Braunn and Rebane (1972) for the imaginary-frequency polarizability (IFP) have been generalized by the method of Gramian inequalities to give rigorous upper and lower bounds, valid even when the true (but unknown) unperturbed wavefunction must be represented by a variational approximation. Using these formulas in conjunction with flexible variational trial functions, tight error bounds are computed for the IFP and the associated two- and three-body van der Waals interaction constants of the ground 1(1S) and metastable 2(1,3S) states of He and Li(+). These bounds generally establish the ground-state properties to within a fraction of a per cent and metastable properties to within a few per cent, permitting a comparative assessment of competing theoretical methods at this level of accuracy. Unlike previous 'error bounds' for these properties, the present results have a completely a priori theoretical character, with no empirical input data.
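
    For orientation, the standard Casimir-Polder relation (a textbook formula, assumed here rather than quoted from the paper) makes explicit why bounds on the IFP translate directly into bounds on the two-body van der Waals constant:

        C_6^{AB} = \frac{3}{\pi} \int_0^{\infty} \alpha_A(i\omega)\, \alpha_B(i\omega)\, d\omega

    so rigorous upper and lower bounds on \alpha(i\omega) propagate through the integral to rigorous bounds on C_6.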

  20. A Theoretically Consistent Framework for Modelling Lagrangian Particle Deposition in Plant Canopies

    NASA Astrophysics Data System (ADS)

    Bailey, Brian N.; Stoll, Rob; Pardyjak, Eric R.

    2018-06-01

    We present a theoretically consistent framework for modelling Lagrangian particle deposition in plant canopies. The primary focus is on describing the probability of particles encountering canopy elements (i.e., potential deposition); the framework provides a consistent means for including the effects of imperfect deposition through any appropriate sub-model for deposition efficiency. Some aspects of the framework draw upon an analogy to radiation propagation through a turbid medium to develop the model theory. The present method is compared against one of the most commonly used heuristic Lagrangian frameworks, namely that originally developed by Legg and Powell (Agricultural Meteorology, 1979, Vol. 20, 47-67), which is shown to be theoretically inconsistent. A recommendation is made to discontinue the use of this heuristic approach in favour of the theoretically consistent framework developed herein, which is no more difficult to apply under equivalent assumptions. The proposed framework has the additional advantage that it can be applied to arbitrary canopy geometries given readily measurable parameters describing vegetation structure.
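
    In the turbid-medium analogy invoked above, the probability that a particle traverses a path through the canopy without encountering an element decays exponentially with the accumulated projected leaf area, in direct analogy to the Beer-Lambert law for radiation. A generic form (our notation; the paper's exact formulation may differ) is

        P(\text{no encounter along } \gamma) = \exp\left( -\int_{\gamma} G\, a(\mathbf{x})\, \mathrm{d}s \right)

    where a(\mathbf{x}) is the one-sided leaf area density and G the mean projection coefficient of the canopy elements onto the travel direction.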

  1. A Rigorous Treatment of Energy Extraction from a Rotating Black Hole

    NASA Astrophysics Data System (ADS)

    Finster, F.; Kamran, N.; Smoller, J.; Yau, S.-T.

    2009-05-01

    The Cauchy problem is considered for the scalar wave equation in the Kerr geometry. We prove that by choosing a suitable wave packet as initial data, one can extract energy from the black hole, thereby putting superradiance, the wave analogue of the Penrose process, into a rigorous mathematical framework. We quantify the maximal energy gain. We also compute the infinitesimal change of mass and angular momentum of the black hole, in agreement with Christodoulou’s result for the Penrose process. The main mathematical tool is our previously derived integral representation of the wave propagator.

  2. A Rigorous Test of the Fit of the Circumplex Model to Big Five Personality Data: Theoretical and Methodological Issues and Two Large Sample Empirical Tests.

    PubMed

    DeGeest, David Scott; Schmidt, Frank

    2015-01-01

    Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.

  3. Relationships among Classical Test Theory and Item Response Theory Frameworks via Factor Analytic Models

    ERIC Educational Resources Information Center

    Kohli, Nidhi; Koran, Jennifer; Henn, Lisa

    2015-01-01

    There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…

  4. Intersubjectivity in Theoretical and Practical Online Courses

    ERIC Educational Resources Information Center

    Lim, Janine; Hall, Barbara M.

    2015-01-01

    Rigorous interaction between peers has been an elusive goal in online asynchronous discussions. Intersubjectivity, the goal of peer-to-peer interaction, is a representation of a higher quality of synthesis. It is the representation of knowledge construction achieved through a synergistic progression from individual contributions to sequences of…

  5. An Exemplar for Teaching and Learning Qualitative Research

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Leech, Nancy L.; Slate, John R.; Stark, Marcella; Sharma, Bipin; Frels, Rebecca; Harris, Kristin; Combs, Julie P.

    2012-01-01

    In this article, we outline a course wherein the instructors teach students how to conduct rigorous qualitative research. We discuss the four major distinct, but overlapping, phases of the course: conceptual/theoretical, technical, applied, and emergent scholar. Students write several qualitative reports, called qualitative notebooks, which…

  6. Separation Kernel Protection Profile Revisited: Choices and Rationale

    DTIC Science & Technology

    2010-12-01

    “…provide the most stringent protection and rigorous security countermeasures” [IATF]. In other words, robustness is not the same as assurance. (Reference: IATF, Information Assurance Technical Framework, Chapter 4, Release 3.1, National Security Agency, September 2002.)

  7. Removing Preconceptions with a "Learning Cycle."

    ERIC Educational Resources Information Center

    Gang, Su

    1995-01-01

    Describes a teaching experiment that uses the Learning Cycle to achieve the reorientation of physics students' conceptual frameworks away from commonsense perspectives toward scientifically rigorous outlooks. Uses Archimedes' principle as the content topic while using the Learning Cycle to remove students' nonscientific preconceptions. (JRH)

  8. A computational framework for automation of point defect calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei

    We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.
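
    For context, the corrections listed above enter the defect formation energy as it is conventionally defined in the first-principles literature (standard notation; not quoted from this package's documentation):

        E_f[X^q] = E_{\text{tot}}[X^q] - E_{\text{tot}}[\text{host}] - \sum_i n_i \mu_i + q\,(E_F + \varepsilon_{\text{VBM}} + \Delta V) + E_{\text{corr}}

    where \Delta V is the potential-alignment term, E_corr collects the image-charge and band-filling corrections, n_i counts atoms of species i added (n_i > 0) or removed (n_i < 0), and \mu_i are the corresponding chemical potentials.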

  9. A computational framework for automation of point defect calculations

    DOE PAGES

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei; ...

    2017-01-13

    We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.

  10. The Community-First Land-Centred Theoretical Framework: Bringing a "Good Mind" to Indigenous Education Research?

    ERIC Educational Resources Information Center

    Styres, Sandra D.; Zinga, Dawn M.

    2013-01-01

    This article introduces an emergent research theoretical framework, the community-first Land-centred research framework. Carefully examining the literature within Indigenous educational research, we noted the limited approaches for engaging in culturally aligned and relevant research within Indigenous communities. The community-first Land-centred…

  11. An e-Learning Theoretical Framework

    ERIC Educational Resources Information Center

    Aparicio, Manuela; Bacao, Fernando; Oliveira, Tiago

    2016-01-01

    E-learning systems have witnessed a usage and research increase in the past decade. This article presents the e-learning concepts ecosystem. It summarizes the various scopes on e-learning studies. Here we propose an e-learning theoretical framework. This framework is based upon three principal dimensions: users, technology, and services…

  12. Threshold Capabilities: Threshold Concepts and Knowledge Capability Linked through Variation Theory

    ERIC Educational Resources Information Center

    Baillie, Caroline; Bowden, John A.; Meyer, Jan H. F.

    2013-01-01

    The Threshold Capability Integrated Theoretical Framework (TCITF) is presented as a framework for the design of university curricula, aimed at developing graduates' capability to deal with previously unseen situations in their professional, social, and personal lives. The TCITF is a new theoretical framework derived from, and heavily dependent…

  13. Hyperspherical Sparse Approximation Techniques for High-Dimensional Discontinuity Detection

    DOE PAGES

    Zhang, Guannan; Webster, Clayton G.; Gunzburger, Max; ...

    2016-08-04

    This work proposes a hyperspherical sparse approximation framework for detecting jump discontinuities in functions in high-dimensional spaces. The need for a novel approach results from the theoretical and computational inefficiencies of well-known approaches, such as adaptive sparse grids, for discontinuity detection. Our approach constructs the hyperspherical coordinate representation of the discontinuity surface of a function. Then sparse approximations of the transformed function are built in the hyperspherical coordinate system, with values at each point estimated by solving a one-dimensional discontinuity detection problem. Due to the smoothness of the hypersurface, the new technique can identify jump discontinuities with significantly reduced computational cost, compared to existing methods. Several approaches are used to approximate the transformed discontinuity surface in the hyperspherical system, including adaptive sparse grid and radial basis function interpolation, discrete least squares projection, and compressed sensing approximation. Moreover, hierarchical acceleration techniques are also incorporated to further reduce the overall complexity. In conclusion, rigorous complexity analyses of the new methods are provided, as are several numerical examples that illustrate the effectiveness of our approach.
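
    The coordinate map underlying the framework is easy to sketch. Below is a minimal, illustrative conversion of a point in R^d to hyperspherical coordinates (ours, not the authors' code); once in these coordinates, a discontinuity surface r = g(phi_1, ..., phi_{d-1}) reduces detection to one-dimensional problems along rays.

        import numpy as np

        def to_hyperspherical(x):
            # Map x in R^d to (r, phi_1..phi_{d-1}) with
            # phi_1..phi_{d-2} in [0, pi] and phi_{d-1} in (-pi, pi].
            x = np.asarray(x, dtype=float)
            r = np.linalg.norm(x)
            phis = []
            for k in range(x.size - 2):
                denom = np.linalg.norm(x[k:])
                phis.append(np.arccos(x[k] / denom) if denom > 0 else 0.0)
            phis.append(np.arctan2(x[-1], x[-2]))
            return r, np.array(phis)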

  14. Ologs: a categorical framework for knowledge representation.

    PubMed

    Spivak, David I; Kent, Robert E

    2012-01-01

    In this paper we introduce the olog, or ontology log, a category-theoretic model for knowledge representation (KR). Grounded in formal mathematics, ologs can be rigorously formulated and cross-compared in ways that other KR models (such as semantic networks) cannot. An olog is similar to a relational database schema; in fact an olog can serve as a data repository if desired. Unlike database schemas, which are generally difficult to create or modify, ologs are designed to be user-friendly enough that authoring or reconfiguring an olog is a matter of course rather than a difficult chore. It is hoped that learning to author ologs is much simpler than learning a database definition language, despite their similarity. We describe ologs carefully and illustrate with many examples. As an application we show that any primitive recursive function can be described by an olog. We also show that ologs can be aligned or connected together into a larger network using functors. The various methods of information flow and institutions can then be used to integrate local and global world-views. We finish by providing several different avenues for future research.
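
    As a flavor of what an olog encodes, here is a toy rendering of one as a labeled graph whose arrows are functional "aspects". This is an informal data-structure sketch with invented types and labels, not the authors' category-theoretic formalism.

        # Boxes are types; each arrow is a functional aspect between types.
        olog = {
            "types": ["a person", "a city", "a country"],
            "aspects": [
                ("a person", "lives in", "a city"),
                ("a city", "is in", "a country"),
            ],
        }

        def follow(olog, start, labels):
            # Compose aspects along a path, mirroring composition of arrows.
            t = start
            for lab in labels:
                t = next(dst for (src, l, dst) in olog["aspects"]
                         if src == t and l == lab)
            return t

        # follow(olog, "a person", ["lives in", "is in"]) == "a country"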

  15. The Westermarck Hypothesis and the Israeli Kibbutzim: Reconciling Contrasting Evidence.

    PubMed

    Shor, Eran

    2015-11-01

    The case of the communal education system in the Israeli kibbutzim is often considered to provide conclusive support for Westermarck's (1891) assertion regarding the existence of evolutionary inbreeding avoidance mechanisms in humans. However, recent studies that have gone back to the kibbutzim seem to provide contrasting evidence and reopen the discussion regarding the case of the kibbutzim and inbreeding avoidance more generally (Lieberman & Lobel, 2012; Shor & Simchai, 2009). In this article, I reassess the case of the kibbutzim, reevaluating the findings and conclusions of these recent research endeavors. I argue that the differences between recent research reports largely result from conceptual and methodological differences and that, in fact, these studies provide insights that are more similar than first meets the eye. I also suggest that we must reexamine the common assumption that the kibbutzim serve as an ideal natural experiment for examining the sources of incest avoidance and the incest taboo. Finally, I discuss the implications of these studies to the longstanding debate over the Westermarck hypothesis and call for a synthetic theoretical framework that produces more precise predictions and more rigorous empirical research designs.

  16. The Second Life Researcher Toolkit - An Exploration of Inworld Tools, Methods and Approaches for Researching Educational Projects in Second Life

    NASA Astrophysics Data System (ADS)

    Moschini, Elena

    Academics are beginning to explore the educational potential of Second LifeTM (SL) by setting up inworld educational activities and projects. Given the relative novelty of the use of virtual world environments in higher education many such projects are still at pilot stage. However the initial pilot and experimentation stage will have to be followed by a rigorous evaluation process as for more traditional teaching projects. The chapter addresses issues about SL research tools and research methods. It introduces a "researcher toolkit" that includes: the various stages in the evaluation of SL educational projects and the theoretical framework that can inform such projects; an outline of the inworld tools that can be utilised or customised for academic research purposes; a review of methods for collecting feedback from participants and of the main ethical issues involved in researching virtual world environments; a discussion on the technical skills required to operate a research project in SL. The chapter also offers an indication of the inworld opportunities for the dissemination of SL research findings.

  17. A combinatorial model for dentate gyrus sparse coding

    DOE PAGES

    Severa, William; Parekh, Ojas; James, Conrad D.; ...

    2016-12-29

    The dentate gyrus forms a critical link between the entorhinal cortex and CA3 by providing a sparse version of the signal. Concurrent with this increase in sparsity, a widely accepted theory suggests the dentate gyrus performs pattern separation—similar inputs yield decorrelated outputs. Although an active region of study and theory, few logically rigorous arguments detail the dentate gyrus’s (DG) coding. We suggest a theoretically tractable, combinatorial model for this action. The model provides formal methods for a highly redundant, arbitrarily sparse, and decorrelated output signal. To explore the value of this model framework, we assess how suitable it is for two notable aspects of DG coding: how it can handle the highly structured grid cell representation in the input entorhinal cortex region and the presence of adult neurogenesis, which has been proposed to produce a heterogeneous code in the DG. We find tailoring the model to grid cell input yields expansion parameters consistent with the literature. In addition, the heterogeneous coding reflects activity gradation observed experimentally. Lastly, we connect this approach with more conventional binary threshold neural circuit models via a formal embedding.
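
    A minimal stand-in for the expansion-plus-sparsification idea (not the authors' combinatorial construction) is a random expansion followed by a k-winners-take-all threshold:

        import numpy as np

        def dg_code(x, n_out=500, k=25, seed=1):
            # Randomly expand the input to a larger population, then keep only
            # the k most active units; the result is a sparse, high-dimensional
            # binary code, in the spirit of binary threshold circuit models.
            x = np.asarray(x, dtype=float)
            rng = np.random.default_rng(seed)
            W = rng.standard_normal((n_out, x.size))
            code = np.zeros(n_out, dtype=int)
            code[np.argsort(W @ x)[-k:]] = 1
            return code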

  18. Ologs: A Categorical Framework for Knowledge Representation

    PubMed Central

    Spivak, David I.; Kent, Robert E.

    2012-01-01

    In this paper we introduce the olog, or ontology log, a category-theoretic model for knowledge representation (KR). Grounded in formal mathematics, ologs can be rigorously formulated and cross-compared in ways that other KR models (such as semantic networks) cannot. An olog is similar to a relational database schema; in fact an olog can serve as a data repository if desired. Unlike database schemas, which are generally difficult to create or modify, ologs are designed to be user-friendly enough that authoring or reconfiguring an olog is a matter of course rather than a difficult chore. It is hoped that learning to author ologs is much simpler than learning a database definition language, despite their similarity. We describe ologs carefully and illustrate with many examples. As an application we show that any primitive recursive function can be described by an olog. We also show that ologs can be aligned or connected together into a larger network using functors. The various methods of information flow and institutions can then be used to integrate local and global world-views. We finish by providing several different avenues for future research. PMID:22303434

  19. Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data.

    PubMed

    Carmichael, Owen; Sakhanenko, Lyudmila

    2015-05-15

    We develop statistical methodology for a popular brain imaging technique, HARDI, based on the high-order tensor model of Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves or fibers which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense as it blends linear algebra concepts from high-order tensors with asymptotic statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3] to several fibers per voxel. It is also a pioneering statistical work on tractography from HARDI data. It avoids all the typical limitations of the deterministic tractography methods and it delivers the same information as probabilistic tractography methods. Our method is computationally cheap and it provides a well-founded mathematical and statistical framework in which diverse functionals on fibers, directions and tensors can be studied in a systematic and rigorous way.
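
    Tracing an integral curve of an estimated direction field, the object the paper equips with confidence ellipsoids, amounts to integrating dx/dt = v(x). A bare forward-Euler sketch (ours, purely illustrative) is:

        import numpy as np

        def trace_fiber(v, x0, step=0.5, n_steps=200):
            # v: callable R^3 -> R^3 giving the estimated fiber direction at x.
            xs = [np.asarray(x0, dtype=float)]
            for _ in range(n_steps):
                d = v(xs[-1])
                xs.append(xs[-1] + step * d / (np.linalg.norm(d) + 1e-12))
            return np.array(xs)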

  20. Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data

    PubMed Central

    Carmichael, Owen; Sakhanenko, Lyudmila

    2015-01-01

    We develop statistical methodology for a popular brain imaging technique, HARDI, based on the high-order tensor model of Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves or fibers which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense as it blends linear algebra concepts from high-order tensors with asymptotic statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3] to several fibers per voxel. It is also a pioneering statistical work on tractography from HARDI data. It avoids all the typical limitations of the deterministic tractography methods and it delivers the same information as probabilistic tractography methods. Our method is computationally cheap and it provides a well-founded mathematical and statistical framework in which diverse functionals on fibers, directions and tensors can be studied in a systematic and rigorous way. PMID:25937674

  1. Dynamics of two-phase interfaces and surface tensions: A density-functional theory perspective

    NASA Astrophysics Data System (ADS)

    Yatsyshin, Petr; Sibley, David N.; Duran-Olivencia, Miguel A.; Kalliadasis, Serafim

    2016-11-01

    Classical density functional theory (DFT) is a statistical mechanical framework for the description of fluids at the nanoscale, where the inhomogeneity of the fluid structure needs to be carefully accounted for. By expressing the grand free-energy of the fluid as a functional of the one-body density, DFT offers a theoretically consistent and computationally accessible way to obtain two-phase interfaces and respective interfacial tensions in a ternary solid-liquid-gas system. The dynamic version of DFT (DDFT) can be rigorously derived from the Smoluchowski picture of the dynamics of colloidal particles in a solvent. It is generally agreed that DDFT can capture the diffusion-driven evolution of many soft-matter systems. In this context, we use DDFT to investigate the dynamic behaviour of two-phase interfaces in both equilibrium and dynamic wetting and discuss the possibility of defining a time-dependent surface tension, which still remains in debate. We acknowledge financial support from the European Research Council via Advanced Grant No. 247031 and from the Engineering and Physical Sciences Research Council of the UK via Grants No. EP/L027186 and EP/L020564.
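
    The DDFT evolution equation alluded to here takes, in its standard overdamped form (textbook notation, not reproduced from this abstract), the form

        \frac{\partial \rho(\mathbf{r},t)}{\partial t} = \Gamma\, \nabla \cdot \left[ \rho(\mathbf{r},t)\, \nabla \frac{\delta F[\rho]}{\delta \rho(\mathbf{r},t)} \right]

    where F[\rho] is the free-energy functional and \Gamma a mobility coefficient; stationary states satisfy \delta F / \delta \rho = \mu, recovering equilibrium DFT.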

  2. MOE vs. M&E: considering the difference between measuring strategic effectiveness and monitoring tactical evaluation.

    PubMed

    Diehl, Glen; Major, Solomon

    2015-01-01

    Measuring the effectiveness of military Global Health Engagements (GHEs) has become an area of increasing interest to the military medical field. As a result, there have been efforts to more logically and rigorously evaluate GHE projects and programs; many of these have been based on the Logic and Results Frameworks. However, while these Frameworks are apt and appropriate planning tools, they are not ideally suited to measuring programs' effectiveness. This article introduces military medicine professionals to the Measures of Effectiveness for Defense Engagement and Learning (MODEL) program, which implements a new method of assessment, one that seeks to rigorously use Measures of Effectiveness (vs. Measures of Performance) to gauge programs' and projects' success and fidelity to Theater Campaign goals. While the MODEL method draws on the Logic and Results Frameworks where appropriate, it goes beyond their planning focus by using the latest social scientific and econometric evaluation methodologies to link on-the-ground GHE "lines of effort" to the realization of national and strategic goals and end-states. It is hoped these methods will find use beyond the MODEL project itself, and will catalyze a new body of rigorous, empirically based work, which measures the effectiveness of a broad spectrum of GHE and security cooperation activities. "We based our strategies on the principle that it is much more cost-effective to prevent conflicts than it is to stop one once it's started. I cannot overstate the importance of our theater security cooperation programs as the centerpiece to securing our Homeland from the irregular and catastrophic threats of the 21st Century." -GEN James L. Jones, USMC (Ret.). Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.

  3. First-Order Frameworks for Managing Models in Engineering Optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Approximation/model management optimization (AMMO) is a rigorous methodology for attaining solutions of high-fidelity optimization problems with minimal expense in high-fidelity function and derivative evaluation. First-order AMMO frameworks allow for a wide variety of models and underlying optimization algorithms. Recent demonstrations with aerodynamic optimization achieved three-fold savings in terms of high-fidelity function and derivative evaluation in the case of variable-resolution models and five-fold savings in the case of variable-fidelity physics models. The savings are problem dependent but certain trends are beginning to emerge. We give an overview of the first-order frameworks, current computational results, and an idea of the scope of the first-order framework applicability.
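
    The first-order consistency at the heart of such frameworks is simple to state: the corrected low-fidelity model must match the high-fidelity value and gradient at the current iterate. A schematic additive correction is sketched below; the names and signature are ours, for illustration only, not NASA's implementation.

        import numpy as np

        def first_order_correction(f_lo, g_lo, f_hi_k, g_hi_k, xk):
            # Returns f_tilde with f_tilde(xk) = f_hi(xk) and
            # grad f_tilde(xk) = g_hi(xk), the consistency conditions
            # first-order model management requires.
            xk = np.asarray(xk, dtype=float)
            df = f_hi_k - f_lo(xk)
            dg = np.asarray(g_hi_k) - np.asarray(g_lo(xk))
            def f_tilde(x):
                x = np.asarray(x, dtype=float)
                return f_lo(x) + df + dg @ (x - xk)
            return f_tilde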

  4. Evidence - competence - discourse: the theoretical framework of the multi-centre clinical ethics support project METAP.

    PubMed

    Reiter-Theil, Stella; Mertz, Marcel; Schürmann, Jan; Stingelin Giles, Nicola; Meyer-Zehnder, Barbara

    2011-09-01

    In this paper we assume that 'theory' is important for Clinical Ethics Support Services (CESS). We will argue that the underlying implicit theory should be reflected. Moreover, we suggest that the theoretical components on which any clinical ethics support (CES) relies should be explicitly articulated in order to enhance the quality of CES. A theoretical framework appropriate for CES will be necessarily complex and should include ethical (both descriptive and normative), metaethical and organizational components. The various forms of CES that exist in North-America and in Europe show their underlying theory more or less explicitly, with most of them referring to some kind of theoretical components including 'how-to' questions (methodology), organizational issues (implementation), problem analysis (phenomenology or typology of problems), and related ethical issues such as end-of-life decisions (major ethical topics). In order to illustrate and explain the theoretical framework that we are suggesting for our own CES project METAP, we will outline this project which has been established in a multi-centre context in several healthcare institutions. We conceptualize three 'pillars' as the major components of our theoretical framework: (1) evidence, (2) competence, and (3) discourse. As a whole, the framework is aimed at developing a foundation of our CES project METAP. We conclude that this specific integration of theoretical components is a promising model for the fruitful further development of CES. © 2011 Blackwell Publishing Ltd.

  5. Culturally Sensitive Risk Behavior Prevention Programs for African American Adolescents: A Systematic Analysis

    ERIC Educational Resources Information Center

    Metzger, Isha; Cooper, Shauna M.; Zarrett, Nicole; Flory, Kate

    2013-01-01

    The current review conducted a systematic assessment of culturally sensitive risk prevention programs for African American adolescents. Prevention programs meeting the inclusion and exclusion criteria were evaluated across several domains: (1) theoretical orientation and foundation; (2) methodological rigor; (3) level of cultural integration; (4)…

  6. Help Seeking in Academic Settings: Goals, Groups, and Contexts

    ERIC Educational Resources Information Center

    Karabenick, Stuart A., Ed.; Newman, Richard S., Ed.

    2006-01-01

    Building on Karabenick's earlier volume on this topic and maintaining its high standards of scholarship and intellectual rigor, this book brings together contemporary work that is theoretically as well as practically important. It highlights current trends in the area and gives expanded attention to applications to teaching and learning. The…

  7. Accumulating Knowledge: When Are Reading Intervention Results Meaningful?

    ERIC Educational Resources Information Center

    Fletcher, Jack M.; Wagner, Richard K.

    2014-01-01

    The three target articles provide examples of intervention studies that are excellent models for the field. They rely on rigorous and elegant designs, the interventions are motivated by attention to underlying theoretical mechanisms, and longitudinal designs are used to examine the duration of effects of interventions that occur. When studies are…

  8. Researching the Study Abroad Experience

    ERIC Educational Resources Information Center

    McLeod, Mark; Wainwright, Philip

    2009-01-01

    The authors propose a paradigm for rigorous scientific assessment of study abroad programs, with the focus being on how study abroad experiences affect psychological constructs as opposed to looking solely at study-abroad-related outcomes. Social learning theory is used as a possible theoretical basis for making testable hypotheses and guiding…

  9. Theoretical Framework of Leadership in Higher Education of England and Wales

    ERIC Educational Resources Information Center

    Mukan, Nataliya; Havrylyuk, Marianna; Stolyarchuk, Lesia

    2015-01-01

    In the article the theoretical framework of leadership in higher education of England and Wales has been studied. The main objectives of the article are defined as analysis of scientific and pedagogical literature, which highlights different aspects of the problem under research; characteristic of the theoretical fundamentals of educational…

  10. Towards Developing a Theoretical Framework for Measuring Public Sector Managers' Career Success

    ERIC Educational Resources Information Center

    Rasdi, Roziah Mohd; Ismail, Maimunah; Uli, Jegak; Noah, Sidek Mohd

    2009-01-01

    Purpose: The purpose of this paper is to develop a theoretical framework for measuring public sector managers' career success. Design/methodology/approach: The theoretical foundation used in this study is social cognitive career theory. To conduct a literature search, several keywords were identified, i.e. career success, objective and subjective…

  11. The Importance of Theoretical Frameworks and Mathematical Constructs in Designing Digital Tools

    ERIC Educational Resources Information Center

    Trinter, Christine

    2016-01-01

    The increase in availability of educational technologies over the past few decades has not only led to new practice in teaching mathematics but also to new perspectives in research, methodologies, and theoretical frameworks within mathematics education. Hence, the amalgamation of theoretical and pragmatic considerations in digital tool design…

  12. Research on nursing practice. Stress.

    PubMed

    Lyon, B L; Werner, J S

    1987-01-01

    Clearly, there is not agreement among nurse researchers regarding a definitional orientation to stress that best fits nursing's orientation to human experiences. Varying theoretical orientations are used to explain stress or stress-related phenomena, for example, stress as a stimulus, stress as a response, and stress as a transaction. The studies are fairly evenly distributed among the four definitional categories. The various approaches do not represent expanding theoretical explanations of stress, but rather are incompatible approaches to explaining stress. More disconcerting than the lack of direction in research efforts, however, is that all too commonly the measurement of the variables and the methodology were not "linked" or consistent with the theoretical framework. For the most part the research efforts reviewed fell short of theory testing. Even for those studies that were designed to contribute to theory development, it was rare to find research reports that included implications regarding theory in the discussion sections. Additionally, discussion sections of the reports typically did not identify alternative explanations for the findings. Quasi-experimental, ex post facto, and causal comparative studies typically were flawed with validity problems. If nursing is to strengthen its contribution to knowledge in the area of stress, more emphasis will need to be placed on congruence between design and measurement, and on issues of statistical rigor, validity, and reliability. Although some might argue that it is too early to expect a coalescing of definitional orientations, it is important to point out that considerable confusion regarding stress phenomena results from a nonsystematic or nondeliberative mixture of incompatible orientations to or definitions of stress. It is little wonder that the vast number of opinion articles that appear in the nursing literature include varied definitions of stress, often making conflicting recommendations regarding the nursing assessment of stress and nursing intervention strategies to assist a person in stress management efforts.

  13. Theoretical Studies of Elementary Hydrocarbon Species and Their Reactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, Wesley D.; Schaefer, Henry F.

    The research program supported by this DOE grant carried out both methodological development and computational applications of first-principles theoretical chemistry based on quantum mechanical wavefunctions, as directed toward understanding and harnessing the fundamental chemical physics of combustion. To build and refine the world’s database of thermochemistry, spectroscopy, and chemical kinetics, predictive and definitive computational methods are needed that push the envelope of modern electronic structure theory. The application of such methods has been made to gain comprehensive knowledge of the paradigmatic reaction networks by which the n- and i-propyl, t-butyl, and n-butyl radicals are oxidized by O2. Numerous ROO and QOOH intermediates in these R + O2 reaction systems have been characterized along with the interconnecting isomerization transition states and the barriers leading to fragmentation. Other combustion-related intermediates have also been studied, including methylsulfinyl radical, cyclobutylidene, and radicals derived from acetaldehyde and vinyl alcohol. Theoretical advances have been achieved and made available to the scientific community by implementation into PSI4, an open-source electronic structure computer package emphasizing automation, advanced libraries, and interoperability. We have pursued the development of universal explicitly correlated methods applicable to general electronic wavefunctions, as well as a framework that allows multideterminant reference functions to be expressed as a single determinant from quasiparticle operators. Finally, a rigorous analytical tool for correlated wavefunctions has been created to elucidate dispersion interactions, which play essential roles in many areas of chemistry, but whose effects are often masked and enigmatic. Our research decomposes and analyzes the coupled-cluster electron correlation energy in molecular systems as a function of interelectronic distance. Concepts are emerging that can be used to explain the influence of dispersion on the thermochemistry of large hydrocarbons, including fuels important to combustion technologies.

  14. Allometric Convergence in Savanna Trees and Implications for the Use of Plant Scaling Models in Variable Ecosystems

    PubMed Central

    Tredennick, Andrew T.; Bentley, Lisa Patrick; Hanan, Niall P.

    2013-01-01

    Theoretical models of allometric scaling provide frameworks for understanding and predicting how and why the morphology and function of organisms vary with scale. It remains unclear, however, if the predictions of ‘universal’ scaling models for vascular plants hold across diverse species in variable environments. Phenomena such as competition and disturbance may drive allometric scaling relationships away from theoretical predictions based on an optimized tree. Here, we use a hierarchical Bayesian approach to calculate tree-specific, species-specific, and ‘global’ (i.e. interspecific) scaling exponents for several allometric relationships using tree- and branch-level data harvested from three savanna sites across a rainfall gradient in Mali, West Africa. We use these exponents to provide a rigorous test of three plant scaling models (Metabolic Scaling Theory (MST), Geometric Similarity, and Stress Similarity) in savanna systems. For the allometric relationships we evaluated (diameter vs. length, aboveground mass, stem mass, and leaf mass) the empirically calculated exponents broadly overlapped among species from diverse environments, except for the scaling exponents for length, which increased with tree cover and density. When we compare empirical scaling exponents to the theoretical predictions from the three models we find MST predictions are most consistent with our observed allometries. In those situations where observations are inconsistent with MST we find that departure from theory corresponds with expected tradeoffs related to disturbance and competitive interactions. We hypothesize savanna trees have greater length-scaling exponents than predicted by MST due to an evolutionary tradeoff between fire escape and optimization of mechanical stability and internal resource transport. Future research on the drivers of systematic allometric variation could reconcile the differences between observed scaling relationships in variable ecosystems and those predicted by ideal models such as MST. PMID:23484003
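
    Empirical scaling exponents of the kind compared here are, at their simplest, slopes of log-log regressions. The study itself uses a hierarchical Bayesian fit; the snippet below is a deliberately simple stand-in to show what the exponent b means.

        import numpy as np

        def scaling_exponent(diameter, mass):
            # Fit mass ~ a * diameter**b by least squares on log-transformed
            # data; b is the allometric exponent compared against model
            # predictions (e.g., MST predicts mass scaling near diameter**(8/3)).
            b, log_a = np.polyfit(np.log(diameter), np.log(mass), 1)
            return b, np.exp(log_a)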

  15. A Theoretically Grounded Framework for Integrating the Scholarship of Teaching and Learning

    ERIC Educational Resources Information Center

    Walls, Jill K.

    2016-01-01

    SoTL scholars have written about the importance and utility of teaching from a guiding theoretical framework. In this paper, ecological theory and specifically Bronfenbrenner's bioecological model, is examined as a potential framework for synthesizing SoTL research findings to inform teaching and learning scholarship at the college level. A…

  16. Extensions to regret-based decision curve analysis: an application to hospice referral for terminal patients.

    PubMed

    Tsalatsanis, Athanasios; Barnes, Laura E; Hozo, Iztok; Djulbegovic, Benjamin

    2011-12-23

    Despite the well documented advantages of hospice care, most terminally ill patients do not reap the maximum benefit from hospice services, with the majority of them receiving hospice care either prematurely or delayed. Decision systems to improve the hospice referral process are sorely needed. We present a novel theoretical framework that is based on well-established methodologies of prognostication and decision analysis to assist with the hospice referral process for terminally ill patients. We linked the SUPPORT statistical model, widely regarded as one of the most accurate models for prognostication of terminally ill patients, with the recently developed regret-based decision curve analysis (regret DCA). We extend the regret DCA methodology to consider harms associated with the prognostication test as well as harms and effects of the management strategies. To enable patients and physicians to make these complex decisions in real time, we developed an easily accessible web-based decision support system available at the point of care. The web-based decision support system facilitates the hospice referral process in three steps. First, the patient or surrogate is interviewed to elicit his/her personal preferences regarding the continuation of life-sustaining treatment vs. palliative care. Then, regret DCA is employed to identify the best strategy for the particular patient in terms of the threshold probability at which he/she is indifferent between continuation of treatment and hospice referral. Finally, if necessary, the probabilities of survival and death for the particular patient are computed based on the SUPPORT prognostication model and contrasted with the patient's threshold probability. The web-based design of the CDSS enables patients, physicians, and family members to participate in the decision process from anywhere internet access is available. We present a theoretical framework to facilitate the hospice referral process. Further rigorous clinical evaluation, including testing in a prospective randomized controlled trial, is required and planned.
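
    The threshold probability elicited in the second step follows the classical decision-threshold relation of Pauker and Kassirer; the snippet below is our simplification of the regret formulation, with 'harm' and 'benefit' standing in for the patient-elicited (dis)utilities, not the system's actual elicitation code.

        def threshold_probability(harm, benefit):
            # At the threshold the expected regrets of the two actions are
            # equal, giving P_t / (1 - P_t) = harm / benefit, i.e.
            # P_t = harm / (harm + benefit).
            return harm / (harm + benefit)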

  17. Extensions to Regret-based Decision Curve Analysis: An application to hospice referral for terminal patients

    PubMed Central

    2011-01-01

    Background Despite the well documented advantages of hospice care, most terminally ill patients do not reap the maximum benefit from hospice services, with the majority of them receiving hospice care either prematurely or delayed. Decision systems to improve the hospice referral process are sorely needed. Methods We present a novel theoretical framework that is based on well-established methodologies of prognostication and decision analysis to assist with the hospice referral process for terminally ill patients. We linked the SUPPORT statistical model, widely regarded as one of the most accurate models for prognostication of terminally ill patients, with the recently developed regret-based decision curve analysis (regret DCA). We extend the regret DCA methodology to consider harms associated with the prognostication test as well as harms and effects of the management strategies. To enable patients and physicians to make these complex decisions in real time, we developed an easily accessible web-based decision support system available at the point of care. Results The web-based decision support system facilitates the hospice referral process in three steps. First, the patient or surrogate is interviewed to elicit his/her personal preferences regarding the continuation of life-sustaining treatment vs. palliative care. Then, regret DCA is employed to identify the best strategy for the particular patient in terms of the threshold probability at which he/she is indifferent between continuation of treatment and hospice referral. Finally, if necessary, the probabilities of survival and death for the particular patient are computed based on the SUPPORT prognostication model and contrasted with the patient's threshold probability. The web-based design of the CDSS enables patients, physicians, and family members to participate in the decision process from anywhere internet access is available. Conclusions We present a theoretical framework to facilitate the hospice referral process. Further rigorous clinical evaluation, including testing in a prospective randomized controlled trial, is required and planned. PMID:22196308

  18. On the Mass of Atoms in Molecules: Beyond the Born-Oppenheimer Approximation

    NASA Astrophysics Data System (ADS)

    Scherrer, Arne; Agostini, Federica; Sebastiani, Daniel; Gross, E. K. U.; Vuilleumier, Rodolphe

    2017-07-01

    Describing the dynamics of nuclei in molecules requires a potential energy surface, which is traditionally provided by the Born-Oppenheimer or adiabatic approximation. However, we also need to assign masses to the nuclei. There, the Born-Oppenheimer picture does not account for the inertia of the electrons, and only bare nuclear masses are considered. Nowadays, experimental accuracy challenges the theoretical predictions of rotational and vibrational spectra and requires the participation of electrons in the internal motion of the molecule. More than 80 years after the original work of Born and Oppenheimer, this issue has still not been solved, in general. Here, we present a theoretical and numerical framework to address this problem in a general and rigorous way. Starting from the exact factorization of the electron-nuclear wave function, we include electronic effects beyond the Born-Oppenheimer regime in a perturbative way via position-dependent corrections to the bare nuclear masses. This maintains an adiabaticlike point of view: The nuclear degrees of freedom feel the presence of the electrons via a single potential energy surface, whereas the inertia of electrons is accounted for and the total mass of the system is recovered. This constitutes a general framework for describing the mass acquired by slow degrees of freedom due to the inertia of light, bound particles; thus, it is applicable not only in electron-nuclear systems but in light-heavy nuclei or ions as well. We illustrate this idea with a model of proton transfer, where the light particle is the proton and the heavy particles are the oxygen atoms to which the proton is bound. Inclusion of the light-particle inertia allows us to gain orders of magnitude in accuracy. The electron-nuclear perspective is adopted, instead, to calculate position-dependent mass corrections using density functional theory for a few polyatomic molecules at their equilibrium geometry. These data can serve as input for the computation of high-precision molecular spectra.

  19. Patient and Healthcare Provider Barriers to Hypertension Awareness, Treatment and Follow Up: A Systematic Review and Meta-Analysis of Qualitative and Quantitative Studies

    PubMed Central

    Khatib, Rasha; Schwalm, Jon-David; Yusuf, Salim; Haynes, R. Brian; McKee, Martin; Khan, Maheer; Nieuwlaat, Robby

    2014-01-01

    Background Although the importance of detecting, treating, and controlling hypertension has been recognized for decades, the majority of patients with hypertension remain uncontrolled. The path from evidence to practice contains many potential barriers, but their role has not been reviewed systematically. This review aimed to synthesize and identify important barriers to hypertension control as reported by patients and healthcare providers. Methods Electronic databases MEDLINE, EMBASE and Global Health were searched systematically up to February 2013. Two reviewers independently selected eligible studies. Two reviewers categorized barriers based on a theoretical framework of behavior change. The theoretical framework suggests that a change in behavior requires a strong commitment to change [intention], the necessary skills and abilities to adopt the behavior [capability], and an absence of health system and support constraints. Findings Twenty-five qualitative studies and 44 quantitative studies met the inclusion criteria. In qualitative studies, health system barriers were most commonly discussed in studies of patients and health care providers. Quantitative studies identified disagreement with clinical recommendations as the most common barrier among health care providers. Quantitative studies of patients yielded different results: lack of knowledge was the most common barrier to hypertension awareness. Stress, anxiety and depression were most commonly reported as barriers that hindered or delayed adoption of a healthier lifestyle. In terms of hypertension treatment adherence, patients mostly reported forgetting to take their medication. Finally, priority setting barriers were most commonly reported by patients in terms of following up with their health care providers. Conclusions This review identified a wide range of barriers facing patients and health care providers pursuing hypertension control, indicating the need for targeted multi-faceted interventions. More methodologically rigorous studies that encompass the range of barriers and that include low- and middle-income countries are required in order to inform policies to improve hypertension control. PMID:24454721

  20. Use of software engineering techniques in the design of the ALEPH data acquisition system

    NASA Astrophysics Data System (ADS)

    Charity, T.; McClatchey, R.; Harvey, J.

    1987-08-01

    The SASD methodology is being used to provide a rigorous design framework for various components of the ALEPH data acquisition system. The Entity-Relationship data model is used to describe the layout and configuration of the control and acquisition systems and detector components. State Transition Diagrams are used to specify control applications such as run control and resource management and Data Flow Diagrams assist in decomposing software tasks and defining interfaces between processes. These techniques encourage rigorous software design leading to enhanced functionality and reliability. Improved documentation and communication ensures continuity over the system life-cycle and simplifies project management.
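
    A State Transition Diagram of the kind described translates directly into a transition table. The toy run-control example below is ours; the states and events are invented for illustration and are not taken from the ALEPH system.

        # Allowed (state, event) -> next-state transitions for a toy run controller.
        TRANSITIONS = {
            ("idle", "configure"): "configured",
            ("configured", "start_run"): "running",
            ("running", "pause"): "paused",
            ("paused", "resume"): "running",
            ("running", "stop_run"): "idle",
        }

        def step(state, event):
            try:
                return TRANSITIONS[(state, event)]
            except KeyError:
                raise ValueError(f"event {event!r} not allowed in state {state!r}")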

  1. Systematic review of the application of the plan–do–study–act method to improve quality in healthcare

    PubMed Central

    Taylor, Michael J; McNicholas, Chris; Nicolay, Chris; Darzi, Ara; Bell, Derek; Reed, Julie E

    2014-01-01

    Background Plan–do–study–act (PDSA) cycles provide a structure for iterative testing of changes to improve quality of systems. The method is widely accepted in healthcare improvement; however there is little overarching evaluation of how the method is applied. This paper proposes a theoretical framework for assessing the quality of application of PDSA cycles and explores the consistency with which the method has been applied in peer-reviewed literature against this framework. Methods NHS Evidence and Cochrane databases were searched by three independent reviewers. Empirical studies were included that reported application of the PDSA method in healthcare. Application of PDSA cycles was assessed against key features of the method, including documentation characteristics, use of iterative cycles, prediction-based testing of change, initial small-scale testing and use of data over time. Results 73 of 409 individual articles identified met the inclusion criteria. Of the 73 articles, 47 documented PDSA cycles in sufficient detail for full analysis against the whole framework. Many of these studies reported application of the PDSA method that failed to accord with primary features of the method. Less than 20% (14/73) fully documented the application of a sequence of iterative cycles. Furthermore, a lack of adherence to the notion of small-scale change is apparent and only 15% (7/47) reported the use of quantitative data at monthly or more frequent data intervals to inform progression of cycles. Discussion To progress the development of the science of improvement, a greater understanding of the use of improvement methods, including PDSA, is essential to draw reliable conclusions about their effectiveness. This would be supported by the development of systematic and rigorous standards for the application and reporting of PDSAs. PMID:24025320

  2. Systematic Review of Methods in Low-Consensus Fields: Supporting Commensuration through `Construct-Centered Methods Aggregation' in the Case of Climate Change Vulnerability Research.

    PubMed

    Delaney, Aogán; Tamás, Peter A; Crane, Todd A; Chesterman, Sabrina

    2016-01-01

    There is increasing interest in using systematic review to synthesize evidence on the social and environmental effects of and adaptations to climate change. Use of systematic review for evidence in this field is complicated by the heterogeneity of methods used and by uneven reporting. In order to facilitate synthesis of results and the design of subsequent research, a method, construct-centered methods aggregation, was designed to 1) provide a transparent, valid and reliable description of research methods, 2) support comparability of primary studies and 3) contribute to a shared empirical basis for improving research practice. Rather than taking research reports at face value, research designs are reviewed through inductive analysis. This involves bottom-up identification of constructs, definitions and operationalizations; assessment of concepts' commensurability through comparison of definitions; identification of theoretical frameworks through patterns of construct use; and integration of transparently reported and valid operationalizations into ideal-type research frameworks. Through the integration of reliable bottom-up inductive coding from operationalizations and top-down coding driven from stated theory with expert interpretation, construct-centered methods aggregation enabled both resolution of heterogeneity within identically named constructs and merging of differently labeled but identical constructs. These two processes allowed transparent, rigorous and contextually sensitive synthesis of the research presented in an uneven set of reports undertaken in a heterogeneous field. If adopted more broadly, construct-centered methods aggregation may contribute to the emergence of a valid, empirically grounded description of methods used in primary research. These descriptions may function as a set of expectations that improves the transparency of reporting and as an evolving comprehensive framework that supports both interpretation of existing and design of future research.

  4. Comparison of optimal design methods in inverse problems

    NASA Astrophysics Data System (ADS)

    Banks, H. T.; Holm, K.; Kappel, F.

    2011-07-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher information matrix. A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate the ideas: the Verhulst-Pearl logistic population model and the standard harmonic oscillator model (Banks H T and Tran H T 2009 Mathematical and Experimental Modeling of Physical and Biological Processes (Boca Raton, FL: Chapman and Hall/CRC)) and a popular glucose regulation model (Bergman R N, Ider Y Z, Bowden C R and Cobelli C 1979 Am. J. Physiol. 236 E667-77; De Gaetano A and Arino O 2000 J. Math. Biol. 40 136-68; Toffolo G, Bergman R N, Finegood D T, Bowden C R and Cobelli C 1980 Diabetes 29 979-90).
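    As a rough illustration of FIM-based design criteria, the hedged Python sketch below compares three candidate sampling schedules for the Verhulst-Pearl logistic model. The parameter values, noise level and schedules are assumptions made for the demo, not those of the paper, and sensitivities are taken by finite differences rather than through the paper's approximation framework.

        # Compare D-, E- and SE-type design criteria built on the Fisher
        # information matrix (FIM) for x(t) = K*x0*e^{rt} / (K + x0*(e^{rt}-1)).
        import numpy as np

        def logistic(t, theta):
            K, r, x0 = theta
            e = np.exp(r * t)
            return K * x0 * e / (K + x0 * (e - 1.0))

        def fim(times, theta, sigma=1.0, h=1e-6):
            """FIM F = S^T S / sigma^2, sensitivities S by central differences."""
            S = np.empty((len(times), len(theta)))
            for j in range(len(theta)):
                tp, tm = np.array(theta, float), np.array(theta, float)
                tp[j] += h * theta[j]
                tm[j] -= h * theta[j]
                S[:, j] = (logistic(times, tp) - logistic(times, tm)) / (2 * h * theta[j])
            return S.T @ S / sigma**2

        theta = (17.5, 0.7, 0.1)  # illustrative (K, r, x0)
        designs = {
            "uniform":     np.linspace(0.5, 20, 10),
            "early-heavy": np.linspace(0.5, 8, 10),
            "late-heavy":  np.linspace(10, 20, 10),
        }
        for name, times in designs.items():
            F = fim(times, theta)
            C = np.linalg.inv(F)                    # asymptotic covariance
            d_opt = np.linalg.det(F)                # D-optimal: maximize det(F)
            e_opt = np.linalg.eigvalsh(F)[0]        # E-optimal: maximize min eigenvalue
            se_cost = np.sum(np.diag(C) / np.square(theta))  # SE-type: minimize
                                                             # summed squared relative SEs
            print(f"{name:12s} det(F)={d_opt:.3e}  lam_min={e_opt:.3e}  SE-cost={se_cost:.3e}")

    Running this shows, for instance, that a schedule concentrated after the population saturates carries little information about r and x0, which is exactly what the criteria are designed to expose.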

  5. A theoretical Gaussian framework for anomalous change detection in hyperspectral images

    NASA Astrophysics Data System (ADS)

    Acito, Nicola; Diani, Marco; Corsini, Giovanni

    2017-10-01

    Exploitation of temporal series of hyperspectral images is a relatively new discipline that has a wide variety of possible applications in fields like remote sensing, area surveillance, defense and security, and search and rescue. In this work, we discuss how images taken at two different times can be processed to detect changes caused by insertion, deletion or displacement of small objects in the monitored scene. This problem is known in the literature as anomalous change detection (ACD) and it can be viewed as the extension, to the multitemporal case, of the well-known anomaly detection problem in a single image. In fact, in both cases, the hyperspectral images are processed blindly in an unsupervised manner and without a priori knowledge about the target spectrum. We introduce the ACD problem using an approach based on statistical decision theory and we derive a common framework including different ACD approaches. In particular, we clearly define the observation space, the statistical distribution of the data conditioned on the two competing hypotheses, and the procedure followed to arrive at the solution. The proposed overview places emphasis on techniques based on the multivariate Gaussian model, which allows a formal presentation of the ACD problem and the rigorous derivation of the possible solutions in a way that is both mathematically more tractable and easier to interpret. We also discuss practical problems related to the application of the detectors in the real world and present affordable solutions. Namely, we describe the ACD processing chain, including the strategies that are commonly adopted to compensate for pervasive radiometric changes, caused by different illumination/atmospheric conditions, and to mitigate residual geometric image co-registration errors. Results obtained on real, freely available data are discussed in order to test and compare the methods within the proposed general framework.
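    The sketch below illustrates one member of the Gaussian ACD family on synthetic data: the joint Mahalanobis term flags pixel pairs that are jointly unusual, while subtracting the marginal terms suppresses pervasive, scene-wide differences. The data generation and the particular score are demo assumptions, not the authors' exact processing chain.

        # Gaussian anomalous change detection on a synthetic co-registered pair.
        import numpy as np

        rng = np.random.default_rng(0)

        def mahalanobis_sq(Z, mu, cov):
            d = Z - mu
            return np.einsum("ij,jk,ik->i", d, np.linalg.inv(cov), d)

        # N pixels, B bands per image; Y is a pervasively changed version of X.
        N, B = 5000, 8
        X = rng.normal(size=(N, B))
        Y = 0.9 * X + 0.3 * rng.normal(size=(N, B))  # scene-wide relation
        Y[:10] += 3.0                                # a few genuine anomalous changes

        Z = np.hstack([X, Y])
        A = (mahalanobis_sq(Z, Z.mean(0), np.cov(Z, rowvar=False))
             - mahalanobis_sq(X, X.mean(0), np.cov(X, rowvar=False))
             - mahalanobis_sq(Y, Y.mean(0), np.cov(Y, rowvar=False)))

        # The 10 implanted changes should dominate the top of the score list.
        print("indices of 10 highest scores:", np.sort(np.argsort(A)[-10:]))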

  6. Applying Sociocultural Theory to Teaching Statistics for Doctoral Social Work Students

    ERIC Educational Resources Information Center

    Mogro-Wilson, Cristina; Reeves, Michael G.; Charter, Mollie Lazar

    2015-01-01

    This article describes the development of two doctoral-level multivariate statistics courses utilizing sociocultural theory, an integrative pedagogical framework. In the first course, the implementation of sociocultural theory helps to support the students through a rigorous introduction to statistics. The second course involves students…

  7. Evidence-based decision making : developing a knowledge base for successful program outcomes in transportation asset management.

    DOT National Transportation Integrated Search

    2015-12-01

    MAP-21 and AASHTO's framework for transportation asset management (TAM) offer opportunities to use more rigorous approaches to collect and apply evidence within a TAM context. This report documents the results of a study funded by the Georgia D...

  8. "No Excuses" in New Orleans: The Silent Passivity of Neoliberal Schooling

    ERIC Educational Resources Information Center

    Sondel, Beth

    2016-01-01

    Drawing on ethnographic data, this article critically analyzes pedagogy in "no excuses" charter schools in New Orleans. Employing Ladson-Billings's framework for culturally relevant pedagogy, the author describes the level of academic rigor, cultural competence, and critical consciousness development across classrooms. This study…

  9. Accessing Social Capital through the Academic Mentoring Process

    ERIC Educational Resources Information Center

    Smith, Buffy

    2007-01-01

    This article explores how mentors and mentees create and maintain social capital during the mentoring process. I employ a sociological conceptual framework and rigorous qualitative analytical techniques to examine how students of color and first-generation college students access social capital through mentoring relationships. The findings…

  10. Towards Culturally Relevant Classroom Science: A Theoretical Framework Focusing on Traditional Plant Healing

    ERIC Educational Resources Information Center

    Mpofu, Vongai; Otulaja, Femi S.; Mushayikwa, Emmanuel

    2014-01-01

    A theoretical framework is an important component of a research study. It grounds the study and guides the methodological design. It also forms a reference point for the interpretation of the research findings. This paper conceptually examines the process of constructing a multi-focal theoretical lens for guiding studies that aim to accommodate…

  11. Model calibration for ice sheets and glaciers dynamics: a general theory of inverse problems in glaciology

    NASA Astrophysics Data System (ADS)

    Giudici, Mauro; Baratelli, Fulvia; Vassena, Chiara; Cattaneo, Laura

    2014-05-01

    Numerical modelling of the dynamic evolution of ice sheets and glaciers requires the solution of discrete equations which are based on physical principles (e.g. conservation of mass, linear momentum and energy) and phenomenological constitutive laws (e.g. Glen's and Fourier's laws). These equations must be accompanied by information on the forcing term and by initial and boundary conditions (IBC) on ice velocity, stress and temperature; on the other hand, the constitutive laws involve many physical parameters, which possibly depend on the ice thermodynamical state. The proper forecast of the dynamics of ice sheets and glaciers (forward problem, FP) requires a precise knowledge of several quantities which appear in the IBCs, in the forcing terms and in the phenomenological laws, and which cannot be easily measured in the field at the study scale. Therefore these quantities can be obtained through model calibration, i.e. by the solution of an inverse problem (IP). Roughly speaking, the IP aims at finding the optimal values of the model parameters that yield the best agreement of the model output with the field observations and data. The practical application of IPs is usually formulated as a generalised least squares approach, which can be cast in the framework of Bayesian inference. IPs are well developed in several areas of science and geophysics, and several applications have been proposed also in glaciology. The objective of this paper is to provide a further step towards a thorough and rigorous theoretical framework in cryospheric studies. Although the IP is often claimed to be ill-posed, this is rigorously true for continuous-domain models, whereas for numerical models, which require the solution of algebraic equations, the properties of the IP must be analysed with more care. First of all, it is necessary to clarify the role of experimental and monitoring data in determining the calibration targets and the values of the parameters that can be considered fixed, whereas only the model output should depend on the subset of the parameters that can be identified with the calibration procedure, i.e. the solution to the IP. It is actually difficult to guarantee the existence and uniqueness of a solution to the IP for complex non-linear models. Also identifiability, a property related to the solution to the FP, and resolution should be carefully considered. Moreover, instability of the IP should not be confused with ill-conditioning or with the properties of the method applied to compute a solution. Finally, sensitivity analysis is of paramount importance to assess the reliability of the estimated parameters and of the model output, but it is often based on the one-at-a-time approach, through the application of the adjoint-state method, to compute local sensitivity, i.e. the uncertainty on the model output due to small variations of the input parameters, whereas first-order approaches that consider the whole possible variability of the model parameters should be considered. This theoretical framework and the relevant properties are illustrated by means of a simple numerical example of isothermal ice flow, based on the shallow ice approximation.
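    The following hedged sketch illustrates the kind of calibration problem described above in its simplest form: least-squares estimation of Glen's-law rate factor A for an isothermal slab under the shallow ice approximation, where the surface velocity is u_s = 2A/(n+1) (rho g sin(alpha))^n H^(n+1). Synthetic observations stand in for field data; all values are assumptions made for the demo.

        # Least-squares calibration of Glen's rate factor A (shallow ice slab).
        import numpy as np
        from scipy.optimize import least_squares

        rho, g, n = 917.0, 9.81, 3.0        # ice density [kg/m^3], gravity, Glen exponent
        alpha = np.deg2rad(3.0)             # hypothetical surface slope
        H = np.linspace(100.0, 400.0, 25)   # thicknesses where velocity is "observed"

        def u_surface(A, H):
            return 2.0 * A / (n + 1.0) * (rho * g * np.sin(alpha))**n * H**(n + 1.0)

        A_true = 2.4e-24                    # Pa^-3 s^-1, a typical temperate-ice value
        rng = np.random.default_rng(1)
        u_obs = u_surface(A_true, H) * (1.0 + 0.05 * rng.normal(size=H.size))  # 5% noise

        # Ordinary least squares on log(A), which keeps the estimate positive.
        res = least_squares(lambda logA: u_surface(np.exp(logA), H) - u_obs,
                            x0=np.log(1e-24))
        print(f"estimated A = {np.exp(res.x[0]):.3e}  (true {A_true:.1e})")

    Even this toy case shows the issues the paper raises: with observations only from the saturated part of the velocity-thickness relation, the estimate becomes poorly conditioned.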

  12. Integrated primary care, the collaboration imperative inter-organizational cooperation in the integrated primary care field: a theoretical framework

    PubMed Central

    Valentijn, Pim P; Bruijnzeels, Marc A; de Leeuw, Rob J; Schrijvers, Guus J.P

    2012-01-01

    Purpose Capacity problems and political pressures have led to a rapid change in the organization of primary care, from monodisciplinary small businesses to complex inter-organizational relationships. It is assumed that inter-organizational collaboration is the driving force to achieve integrated (primary) care. Despite the importance of collaboration and integration of services in primary care, there is no unambiguous definition for either concept. The purpose of this study is to examine and link the conceptualisation and validation of the terms inter-organizational collaboration and integrated primary care using a theoretical framework. Theory The theoretical framework is based on the complex collaboration process of negotiation among multiple stakeholder groups in primary care. Methods A literature review of health sciences and business databases, and targeted grey literature sources. Based on the literature review we operationalized the constructs of inter-organizational collaboration and integrated primary care in a theoretical framework. The framework is being validated in an explorative study of 80 primary care projects in the Netherlands. Results and conclusions Integrated primary care is considered a multidimensional construct based on a continuum of integration, extending from segregation to integration. The synthesis of current theories and concepts of inter-organizational collaboration is insufficient to deal with the complexity of collaborative issues in primary care. One coherent and integrated theoretical framework was found that could make the complex collaboration process in primary care transparent. The theoretical framework presented in this study is a first step towards understanding the patterns of successful collaboration and integration in primary care services. These patterns can give insights into the organizational forms needed to create a well-functioning integrated (primary) care system that fits the local needs of a population. Preliminary data on the patterns of collaboration and integration will be presented.

  13. IEP goals for school-age children with speech sound disorders.

    PubMed

    Farquharson, Kelly; Tambyraja, Sherine R; Justice, Laura M; Redle, Erin E

    2014-01-01

    The purpose of the current study was to describe the current state of practice for writing Individualized Education Program (IEP) goals for children with speech sound disorders (SSDs). IEP goals for 146 children receiving services for SSDs within public school systems across two states were coded for their dominant theoretical framework and overall quality. A dichotomous scheme was used for theoretical framework coding: cognitive-linguistic or sensory-motor. Goal quality was determined by examining 7 specific indicators outlined by an empirically tested rating tool. In total, 147 long-term and 490 short-term goals were coded. The results revealed no dominant theoretical framework for long-term goals, whereas short-term goals largely reflected a sensory-motor framework. In terms of quality, the majority of speech production goals were functional and generalizable in nature, but were not able to be easily targeted during common daily tasks or by other members of the IEP team. Short-term goals were consistently rated higher in quality domains when compared to long-term goals. The current state of practice for writing IEP goals for children with SSDs indicates that the theoretical framework may be eclectic in nature and likely written to support the individual needs of children with speech sound disorders. Further investigation is warranted to determine the relations between goal quality and child outcomes. (1) Identify two predominant theoretical frameworks and discuss how they apply to IEP goal writing. (2) Discuss quality indicators as they relate to IEP goals for children with speech sound disorders. (3) Discuss the relationship between long-term goals' level of quality and related theoretical frameworks. (4) Identify the areas in which business-as-usual IEP goals exhibit strong quality.

  14. The Dynamics of Germinal Centre Selection as Measured by Graph-Theoretical Analysis of Mutational Lineage Trees

    PubMed Central

    Dunn-Walters, Deborah K.; Belelovsky, Alex; Edelman, Hanna; Banerjee, Monica; Mehr, Ramit

    2002-01-01

    We have developed a rigorous graph-theoretical algorithm for quantifying the shape properties of mutational lineage trees. We show that information about the dynamics of hypermutation and antigen-driven clonal selection during the humoral immune response is contained in the shape of mutational lineage trees deduced from the responding clones. Age and tissue related differences in the selection process can be studied using this method. Thus, tree shape analysis can be used as a means of elucidating humoral immune response dynamics in various situations. PMID:15144020
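    As a hedged illustration of graph-theoretical tree-shape analysis, the Python sketch below computes a few generic shape statistics (leaf count, depth, mean root-to-leaf path length, branching degree) on a small hypothetical lineage tree using networkx. These are illustrative metrics in the spirit of the approach, not the authors' specific algorithm.

        # Generic shape statistics of a (hypothetical) mutational lineage tree.
        import networkx as nx

        edges = [("root", "m1"), ("root", "m2"), ("m1", "m3"), ("m1", "m4"),
                 ("m3", "m5"), ("m3", "m6"), ("m2", "m7")]
        T = nx.DiGraph(edges)

        leaves = [v for v in T if T.out_degree(v) == 0]
        depths = nx.single_source_shortest_path_length(T, "root")
        internal_degrees = [d for _, d in T.out_degree() if d > 0]

        print("leaves:", len(leaves))
        print("tree depth:", max(depths.values()))
        print("mean root-to-leaf path length:",
              sum(depths[v] for v in leaves) / len(leaves))
        print("mean outgoing degree of internal nodes:",
              sum(internal_degrees) / len(internal_degrees))

    Differences in such statistics between trees deduced from different tissues or age groups are the kind of signal the authors use to compare selection dynamics.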

  15. Aerodynamic Design of a Propeller for High-Altitude Balloon Trajectory Control

    NASA Technical Reports Server (NTRS)

    Eppler, Richard; Somers, Dan M.

    2012-01-01

    The aerodynamic design of a propeller for the trajectory control of a high-altitude, scientific balloon has been performed using theoretical methods developed especially for such applications. The methods are described. Optimum, nonlinear chord and twist distributions have been developed in conjunction with the design of a family of airfoils, the SE403, SE404, and SE405, for the propeller. The very low Reynolds numbers along the propeller blade fall in a range that has yet to be rigorously investigated, either experimentally or theoretically.

  16. Tackling complexities in understanding the social determinants of health: the contribution of ethnographic research

    PubMed Central

    2011-01-01

    Objective The complexities inherent in understanding the social determinants of health are often not well-served by quantitative approaches. My aim is to show that well-designed and well-conducted ethnographic studies have an important contribution to make in this regard. Ethnographic research designs are a difficult but rigorous approach to research questions that require us to understand the complexity of people’s social and cultural lives. Approach I draw on an ethnographic study to describe the complexities of studying maternal health in a rural area in India. I then show how the lessons learnt in that setting and context can be applied to studies done in very different settings. Results I show how ethnographic research depends for rigour on a theoretical framework for sample selection; why immersion in the community under study, and rapport building with research participants, is important to ensure rich and meaningful data; and how flexible approaches to data collection lead to the gradual emergence of an analysis based on intense cross-referencing with community views and thus a conclusion that explains the similarities and differences observed. Conclusion When using ethnographic research design it can be difficult to specify in advance the exact details of the study design. Researchers can encounter issues in the field that require them to change what they planned on doing. In rigorous ethnographic studies, the researcher in the field is the research instrument and needs to be well trained in the method. Implication Ethnographic research is challenging, but nevertheless provides a rewarding way of researching complex health problems that require an understanding of the social and cultural determinants of health. PMID:22168509

  17. The role of a posteriori mathematics in physics

    NASA Astrophysics Data System (ADS)

    MacKinnon, Edward

    2018-05-01

    The calculus that co-evolved with classical mechanics relied on definitions of functions and differentials that accommodated physical intuitions. In the early nineteenth century mathematicians began the rigorous reformulation of calculus and eventually succeeded in putting almost all of mathematics on a set-theoretic foundation. Physicists traditionally ignore this rigorous mathematics. Physicists often rely on a posteriori math, a practice of using physical considerations to determine mathematical formulations. This is illustrated by examples from classical and quantum physics. A justification of such practice stems from a consideration of the role of phenomenological theories in classical physics and effective theories in contemporary physics. This relates to the larger question of how physical theories should be interpreted.

  18. The Transfer of Resonance Line Polarization with Partial Frequency Redistribution in the General Hanle–Zeeman Regime

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballester, E. Alsina; Bueno, J. Trujillo; Belluzzi, L., E-mail: ealsina@iac.es

    2017-02-10

    The spectral line polarization encodes a wealth of information about the thermal and magnetic properties of the solar atmosphere. Modeling the Stokes profiles of strong resonance lines is, however, a complex problem both from a theoretical and computational point of view, especially when partial frequency redistribution (PRD) effects need to be taken into account. In this work, we consider a two-level atom in the presence of magnetic fields of arbitrary intensity (Hanle–Zeeman regime) and orientation, both deterministic and micro-structured. Working within the framework of a rigorous PRD theoretical approach, we have developed a numerical code that solves the full non-LTE radiative transfer problem for polarized radiation, in one-dimensional models of the solar atmosphere, accounting for the combined action of the Hanle and Zeeman effects, as well as for PRD phenomena. After briefly discussing the relevant equations, we describe the iterative method of solution of the problem and the numerical tools that we have developed and implemented. We finally present some illustrative applications to two resonance lines that form at different heights in the solar atmosphere, and provide a detailed physical interpretation of the calculated Stokes profiles. We find that magneto-optical effects have a strong impact on the linear polarization signals that PRD effects produce in the wings of strong resonance lines. We also show that the weak-field approximation has to be used with caution when PRD effects are considered.

  19. Implementation of a Goal-Based Systems Engineering Process Using the Systems Modeling Language (SysML)

    NASA Technical Reports Server (NTRS)

    Patterson, Jonathan D.; Breckenridge, Jonathan T.; Johnson, Stephen B.

    2013-01-01

    Building upon the purpose, theoretical approach, and use of a Goal-Function Tree (GFT) being presented by Dr. Stephen B. Johnson, described in a related Infotech 2013 ISHM abstract titled "Goal-Function Tree Modeling for Systems Engineering and Fault Management", this paper will describe the core framework used to implement the GFT-based systems engineering process using the Systems Modeling Language (SysML). These two papers are ideally accepted and presented together in the same Infotech session. Statement of problem: SysML, as a tool, is currently not capable of implementing the theoretical approach described within the "Goal-Function Tree Modeling for Systems Engineering and Fault Management" paper cited above. More generally, SysML's current capabilities to model functional decompositions in the rigorous manner required by the GFT approach are limited. The GFT is a new Model-Based Systems Engineering (MBSE) approach to the development of goals and requirements, functions, and their linkage to design. As SysML is a growing standard for systems engineering, it is important to develop methods to implement GFT in it. Proposed Method of Solution: Many of the central concepts of the SysML language are needed to implement a GFT for large complex systems. In the implementation of those central concepts, the following will be described in detail: changes to the nominal SysML process, model view definitions and examples, diagram definitions and examples, and detailed SysML construct and stereotype definitions.

  20. A Dynamic Connectome Supports the Emergence of Stable Computational Function of Neural Circuits through Reward-Based Learning.

    PubMed

    Kappel, David; Legenstein, Robert; Habenschuss, Stefan; Hsieh, Michael; Maass, Wolfgang

    2018-01-01

    Synaptic connections between neurons in the brain are dynamic because of continuously ongoing spine dynamics, axonal sprouting, and other processes. In fact, it was recently shown that the spontaneous synapse-autonomous component of spine dynamics is at least as large as the component that depends on the history of pre- and postsynaptic neural activity. These data are inconsistent with common models for network plasticity and raise the following questions: how can neural circuits maintain a stable computational function in spite of these continuously ongoing processes, and what could be functional uses of these ongoing processes? Here, we present a rigorous theoretical framework for these seemingly stochastic spine dynamics and rewiring processes in the context of reward-based learning tasks. We show that spontaneous synapse-autonomous processes, in combination with reward signals such as dopamine, can explain the capability of networks of neurons in the brain to configure themselves for specific computational tasks, and to compensate automatically for later changes in the network or task. Furthermore, we show theoretically and through computer simulations that stable computational performance is compatible with continuously ongoing synapse-autonomous changes. After good computational performance has been reached, these changes cause primarily a slow drift of network architecture and dynamics in task-irrelevant dimensions, as observed for neural activity in motor cortex and other areas. On the more abstract level of reinforcement learning, the resulting model gives rise to an understanding of reward-driven network plasticity as continuous sampling of network configurations.
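    The flavor of "sampling network configurations" can be conveyed with a tiny Langevin-dynamics sketch: parameters drift toward high-reward configurations while constant noise keeps them fluctuating, so good performance coexists with ongoing parameter turnover. The two-parameter "network" and Gaussian reward landscape below are toy assumptions, not the paper's model.

        # Reward-modulated stochastic parameter dynamics (Langevin sampling).
        import numpy as np

        rng = np.random.default_rng(2)
        TARGET = np.array([1.0, -1.0])   # hypothetical high-reward configuration

        def grad_log_reward(theta):
            # Gradient of a smooth toy reward landscape peaked at TARGET.
            return -(theta - TARGET)

        beta, T, dt, steps = 0.1, 0.05, 0.01, 20000
        theta, trace = np.zeros(2), []
        for _ in range(steps):
            # Drift toward high reward + constant stochastic turnover.
            theta = theta + beta * grad_log_reward(theta) * dt \
                    + np.sqrt(2.0 * beta * T * dt) * rng.normal(size=2)
            trace.append(theta.copy())

        trace = np.array(trace)
        print("mean configuration after learning:", trace[steps // 2:].mean(axis=0))
        print("ongoing fluctuation (std):        ", trace[steps // 2:].std(axis=0))

    The stationary distribution of this process concentrates near the reward optimum with spread set by the temperature T, mirroring the paper's picture of a drifting but functionally stable connectome.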

  2. Intellect: a theoretical framework for personality traits related to intellectual achievements.

    PubMed

    Mussel, Patrick

    2013-05-01

    The present article develops a theoretical framework for the structure of personality traits related to intellectual achievements. We postulate a 2-dimensional model, differentiating between 2 processes (Seek and Conquer) and 3 operations (Think, Learn, and Create). The framework was operationalized by a newly developed measure, which was validated based on 2 samples. Subsequently, in 3 studies (overall N = 1,478), the 2-dimensional structure of the Intellect framework was generally supported. Additionally, subdimensions of the Intellect framework specifically predicted conceptually related criteria, including scholastic performance, vocational interest, and leisure activities. Furthermore, results from multidimensional scaling and higher order confirmatory factor analyses show that the framework allows for the incorporation of several constructs that have been proposed on different theoretical backgrounds, such as need for cognition, typical intellectual engagement, curiosity, intrinsic motivation, goal orientation, and openness to ideas. It is concluded that based on the Intellect framework, these constructs, which have been researched separately in the literature, can be meaningfully integrated.

  3. Nursing management of sensory overload in psychiatry – Theoretical densification and modification of the framework model

    PubMed

    Scheydt, Stefan; Needham, Ian; Behrens, Johann

    2017-01-01

    Background: Within the scope of a research project on sensory overload and stimulus regulation, a theoretical framework model of the nursing care of patients with sensory overload in psychiatry was developed. In a second step, this theoretical model should now be theoretically densified and, if necessary, modified. Aim: Empirical verification as well as modification, enhancement and theoretical densification of the framework model of the nursing care of patients with sensory overload in psychiatry. Method: Analysis of 8 expert interviews using summarizing and structuring content analysis methods based on Meuser and Nagel (2009) and Mayring (2010). Results: The developed framework model (Scheydt et al., 2016b) could be empirically verified, theoretically densified and extended by one category (perception modulation). Thus, four categories of the nursing care of patients with sensory overload in inpatient psychiatry can be described: removal from stimuli, modulation of environmental factors, perceptual modulation, and support for self-help and coping. Conclusions: Based on the methodological approach, a relatively well-saturated, credible conceptualization of a theoretical model describing the nursing care of patients with sensory overload in inpatient psychiatry could be worked out. In further steps, these measures have to be further developed, implemented and evaluated with regard to their efficacy.

  4. Testing Theoretical Models of Magnetic Damping Using an Air Track

    ERIC Educational Resources Information Center

    Vidaurre, Ana; Riera, Jaime; Monsoriu, Juan A.; Gimenez, Marcos H.

    2008-01-01

    Magnetic braking is a long-established application of Lenz's law. A rigorous analysis of the laws governing this problem involves solving Maxwell's equations in a time-dependent situation. Approximate models have been developed to describe different experimental results related to this phenomenon. In this paper we present a new method for the…

  5. Topics in Computational Learning Theory and Graph Algorithms.

    ERIC Educational Resources Information Center

    Board, Raymond Acton

    This thesis addresses problems from two areas of theoretical computer science. The first area is that of computational learning theory, which is the study of the phenomenon of concept learning using formal mathematical models. The goal of computational learning theory is to investigate learning in a rigorous manner through the use of techniques…

  6. Comparing an annual and daily time-step model for predicting field-scale P loss

    USDA-ARS?s Scientific Manuscript database

    Several models with varying degrees of complexity are available for describing P movement through the landscape. The complexity of these models is dependent on the amount of data required by the model, the number of model parameters needed to be estimated, the theoretical rigor of the governing equa...

  7. Multiple Imputation of Multilevel Missing Data-Rigor versus Simplicity

    ERIC Educational Resources Information Center

    Drechsler, Jörg

    2015-01-01

    Multiple imputation is widely accepted as the method of choice to address item-nonresponse in surveys. However, research on imputation strategies for the hierarchical structures that are typically found in the data in educational contexts is still limited. While a multilevel imputation model should be preferred from a theoretical point of view if…

  8. The Theoretical and Empirical Basis for Meditation as an Intervention for PTSD

    ERIC Educational Resources Information Center

    Lang, Ariel J.; Strauss, Jennifer L.; Bomyea, Jessica; Bormann, Jill E.; Hickman, Steven D.; Good, Raquel C.; Essex, Michael

    2012-01-01

    In spite of the existence of good empirically supported treatments for posttraumatic stress disorder (PTSD), consumers and providers continue to ask for more options for managing this common and often chronic condition. Meditation-based approaches are being widely implemented, but there is minimal research rigorously assessing their effectiveness.…

  9. Complexity, Representation and Practice: Case Study as Method and Methodology

    ERIC Educational Resources Information Center

    Miles, Rebecca

    2015-01-01

    While case study is considered a common approach to examining specific and particular examples in research disciplines such as law, medicine and psychology, in the social sciences case study is often treated as a lesser, flawed or undemanding methodology which is less valid, reliable or theoretically rigorous than other methodologies. Building on…

  10. Critical Mentoring Practices to Support Diverse Students in Higher Education: Chicana/Latina Faculty Perspectives

    ERIC Educational Resources Information Center

    Figueroa, Julie López; Rodriguez, Gloria M.

    2015-01-01

    This chapter outlines critical practices that emerged from utilizing social justice frameworks to mentor first-generation, underrepresented minority students at the undergraduate to doctoral levels. The mentoring strategies include helping students to reframe instances when faculty and peers unconsciously conflate academic rigor with color-blind…

  11. Transforming America: Cultural Cohesion, Educational Achievement, and Global Competitiveness. Educational Psychology. Volume 7

    ERIC Educational Resources Information Center

    DeVillar, Robert A.; Jiang, Binbin

    2011-01-01

    Creatively and rigorously blending historical research and contemporary data from various disciplines, this book cogently and comprehensively illustrates the problems and opportunities the American nation faces in education, economics, and the global arena. The authors propose a framework of transformation that would render American culture no…

  12. Bioinformatic genome comparisons for taxonomic and phylogenic assignments using Aeromonas as a test case

    USDA-ARS?s Scientific Manuscript database

    Prokaryotic taxonomy is the underpinning of microbiology, providing a framework for the proper identification and naming of organisms. The 'gold standard' of bacterial species delineation is the overall genome similarity as determined by DNA-DNA hybridization (DDH), a technically rigorous yet someti...

  13. Soil quality evaluation using Soil Management Assessment Framework (SMAF) in Brazilian oxisols with contrasting texture

    USDA-ARS?s Scientific Manuscript database

    To ensure current land use strategies and management practices are economically, environmentally, and socially sustainable, tools and techniques for assessing and quantifying changes in soil quality/health (SQ) need to be developed through rigorous research and potential use by consultants, and othe...

  14. Cash on Demand: A Framework for Managing a Cash Liquidity Position.

    ERIC Educational Resources Information Center

    Augustine, John H.

    1995-01-01

    A well-run college or university will seek to accumulate and maintain an appropriate cash reserve or liquidity position. A rigorous analytic process for estimating the size and cost of a liquidity position, based on judgments about the institution's operating risks and opportunities, is outlined. (MSE)

  15. New Assessments, New Rigor

    ERIC Educational Resources Information Center

    Joan Herman; Robert Linn

    2014-01-01

    Researching. Synthesizing. Reasoning with evidence. The PARCC and Smarter Balanced assessments are clearly setting their sights on complex thinking skills. Researchers Joan Herman and Robert Linn look at the new assessments to see how they stack up against Norman Webb's depth of knowledge framework as well as against current state tests. The…

  16. Evaluating Computer-Related Incidents on Campus

    ERIC Educational Resources Information Center

    Rothschild, Daniel; Rezmierski, Virginia

    2004-01-01

    The Computer Incident Factor Analysis and Categorization (CIFAC) Project at the University of Michigan began in September 2003 with grants from EDUCAUSE and the National Science Foundation (NSF). The project's primary goal is to create a best-practices security framework for colleges and universities based on rigorous quantitative analysis of…

  17. Patterns of Control over the Teaching-Studying-Learning Process and Classrooms as Complex Dynamic Environments: A Theoretical Framework

    ERIC Educational Resources Information Center

    Harjunen, Elina

    2012-01-01

    In this theoretical paper the role of power in classroom interactions is examined in terms of a dominance continuum to advance a theoretical framework justifying the emergence of three ways of distributing power when it comes to dealing with control over the teaching-studying-learning (TSL) process: "pattern of teacher domination," "pattern of…

  18. Interpreting quantum coherence through a quantum measurement process

    NASA Astrophysics Data System (ADS)

    Yao, Yao; Dong, G. H.; Xiao, Xing; Li, Mo; Sun, C. P.

    2017-11-01

    Recently, there has been a renewed interest in the quantification of coherence or other coherence-like concepts within the framework of quantum resource theory. However, rigorously defined or not, the notion of coherence or decoherence has already been used by the community for decades, since the advent of quantum theory. Intuitively, the definitions of coherence and decoherence should be two sides of the same coin. Therefore, a natural question is raised: how can conventional decoherence processes, such as the von Neumann-Lüders (projective) measurement postulate or partially dephasing channels, fit into the bigger picture of the recently established theoretical framework? Here we show that the state collapse rules of the von Neumann or Lüders-type measurements, as special cases of genuinely incoherent operations (GIOs), are consistent with the resource theories of quantum coherence. New hierarchical measures of coherence are proposed for the Lüders-type measurement, and their relationship with measurement-dependent discord is addressed. Moreover, utilizing fixed-point theory for C*-algebras, we prove that GIOs indeed represent a particular type of partially dephasing (phase-damping) channels which have a matrix representation based on the Schur product. By virtue of the Stinespring dilation theorem, the physical realizations of incoherent operations are investigated in detail, and we find that GIOs in fact constitute the core of strictly incoherent operations and generally incoherent operations, and that the unspeakable notion of coherence induced by GIOs can be transferred to theories of speakable coherence by the corresponding permutation or relabeling operators.
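    The Schur-product picture admits a short numerical check, sketched below under illustrative choices of state and channel: a dephasing channel acting as an entrywise product rho ∘ M, with M positive semidefinite and of unit diagonal, leaves populations untouched while the l1-norm of coherence can only decrease.

        # Schur (Hadamard) product dephasing channel acting on a density matrix.
        import numpy as np

        rng = np.random.default_rng(3)

        def l1_coherence(rho):
            """Sum of absolute off-diagonal elements."""
            return np.abs(rho).sum() - np.abs(np.diag(rho)).sum()

        d = 4
        psi = rng.normal(size=d) + 1j * rng.normal(size=d)
        psi /= np.linalg.norm(psi)
        rho = np.outer(psi, psi.conj())            # a coherent pure state

        A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
        G = A @ A.conj().T                          # positive semidefinite
        Dinv = np.diag(1.0 / np.sqrt(np.diag(G).real))
        M = Dinv @ G @ Dinv                         # psd with unit diagonal

        rho_out = rho * M                           # entrywise (Schur) product

        print("diagonal preserved:", np.allclose(np.diag(rho_out), np.diag(rho)))
        print("trace preserved:   ", np.isclose(np.trace(rho_out).real, 1.0))
        print("coherence before/after: %.4f -> %.4f"
              % (l1_coherence(rho), l1_coherence(rho_out)))

    Since positive semidefiniteness with unit diagonal forces |M_ij| <= 1, every off-diagonal element shrinks or stays fixed, which is the channel's dephasing action in matrix form.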

  19. A Mathematical Framework for the Analysis of Cyber-Resilient Control Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melin, Alexander M; Ferragut, Erik M; Laska, Jason A

    2013-01-01

    The increasingly recognized vulnerability of industrial control systems to cyber-attacks has inspired a considerable amount of research into techniques for cyber-resilient control systems. The majority of this effort involves the application of well-known information security (IT) techniques to control system networks. While these efforts are important to protect the control systems that operate critical infrastructure, they are never perfectly effective. Little research has focused on the design of closed-loop dynamics that are resilient to cyber-attack. The majority of control system protection measures are concerned with how to prevent unauthorized access and protect data integrity. We believe that the ability to analyze how an attacker can affect the closed-loop dynamics of a control system configuration once they have access is just as important to the overall security of a control system. To begin to analyze this problem, consistent mathematical definitions of concepts within resilient control need to be established so that a mathematical analysis of the vulnerabilities and resiliencies of a particular control system design methodology and configuration can be made. In this paper, we propose rigorous definitions for state awareness, operational normalcy, and resiliency as they relate to control systems. We will also discuss some mathematical consequences that arise from the proposed definitions. The goal is to begin to develop a mathematical framework and testable conditions for resiliency that can be used to build a sound theoretical foundation for resilient control research.
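    A toy simulation makes the motivation concrete: even a stabilizing feedback loop can be quietly steered by an attacker who biases the sensor channel. The plant, gain, and attack below are invented for illustration and are not a framework from the paper.

        # Closed-loop LTI system with a sensor-bias attack.
        import numpy as np

        A = np.array([[1.0, 0.1],
                      [0.0, 1.0]])        # double-integrator-like plant
        B = np.array([[0.005], [0.1]])
        K = np.array([[4.0, 3.0]])        # stabilizing state-feedback gain

        def simulate(sensor_bias, steps=300):
            x = np.array([[1.0], [0.0]])
            for _ in range(steps):
                y = x + sensor_bias        # attacker corrupts the measurement
                u = -K @ y
                x = A @ x + B @ u
            return x.ravel()

        print("no attack, final state:    ", simulate(np.zeros((2, 1))))
        print("biased sensor, final state:", simulate(np.array([[0.5], [0.0]])))

    The loop remains stable in both runs, yet the bias shifts the steady state away from the origin: the attacker changes where the closed loop settles without triggering any instability, which is precisely why closed-loop analysis of attacker influence matters.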

  20. Statistical testing and power analysis for brain-wide association study.

    PubMed

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

    The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, multiple-comparison correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons; thus, it can efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study: our approach can identify altered functional connectivities in a major depressive disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS.
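    A back-of-envelope simulation shows why exploiting spatial smoothness beats Bonferroni, in the spirit of the random-field framework above. The field size, smoothing width and simulation count are arbitrary demo choices; this is not the BWAS package itself.

        # Empirical max-statistic threshold vs. Bonferroni for smooth Gaussian noise.
        import numpy as np
        from scipy.ndimage import gaussian_filter
        from scipy.stats import norm

        rng = np.random.default_rng(4)
        shape, smooth_sigma, n_sims = (64, 64), 3.0, 500

        max_stats = []
        for _ in range(n_sims):
            field = gaussian_filter(rng.normal(size=shape), sigma=smooth_sigma)
            field /= field.std()                 # re-standardize to unit variance
            max_stats.append(field.max())

        empirical_u = np.quantile(max_stats, 0.95)          # controls FWER at 5%
        bonferroni_u = norm.ppf(1 - 0.05 / np.prod(shape))  # ignores smoothness

        print(f"empirical 5% max-threshold: {empirical_u:.2f}")
        print(f"Bonferroni threshold:       {bonferroni_u:.2f}")

    Because smoothing leaves far fewer effective independent elements than pixels, the empirical threshold sits well below the Bonferroni one, so a smoothness-aware correction retains power while still controlling the FWER.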

  1. Integrating Quality Improvement and Continuing Professional Development: A Model From the Mental Health Care System.

    PubMed

    Sockalingam, Sanjeev; Tehrani, Hedieh; Lin, Elizabeth; Lieff, Susan; Harris, Ilene; Soklaridis, Sophie

    2016-04-01

    To explore the perspectives of leaders in psychiatry and continuing professional development (CPD) regarding the relationship, opportunities, and challenges in integrating quality improvement (QI) and CPD. In 2013-2014, the authors interviewed 18 participants in Canada: 10 psychiatrists-in-chief, 6 CPD leaders in psychiatry, and 2 individuals with experience integrating these domains in psychiatry who were identified through snowball sampling. Questions were designed to identify participants' perspectives about the definition, relationship, and integration of QI and CPD in psychiatry. Interviews were recorded and transcribed. An iterative, inductive method was used to thematically analyze the transcripts. To ensure the rigor of the analysis, the authors performed member checking and sampling until theoretical saturation was achieved. Participants defined QI as a concept measured at the individual, hospital, and health care system levels and CPD as a concept measured predominantly at the individual and hospital levels. Four themes related to the relationship between QI and CPD were identified: challenges with QI training, adoption of QI into the mental health care system, implementation of QI in CPD, and practice improvement outcomes. Despite participants describing QI and CPD as mutually beneficial, they expressed uncertainty about the appropriateness of aligning these domains within a mental health care context because of the identified challenges. This study identified challenges with aligning QI and CPD in psychiatry and yielded a framework to inform future integration efforts. Further research is needed to determine the generalizability of this framework to other specialties and health care professions.

  2. Delayed bet-hedging resilience strategies under environmental fluctuations

    NASA Astrophysics Data System (ADS)

    Ogura, Masaki; Wakaiki, Masashi; Rubin, Harvey; Preciado, Victor M.

    2017-05-01

    Many biological populations, such as bacterial colonies, have developed through evolution a protection mechanism, called bet hedging, to increase their probability of survival under stressful environmental fluctuations. In this context, the concept of preadaptation refers to a common type of bet-hedging protection strategy in which a relatively small number of individuals in a population stochastically switch their phenotypes to a dormant metabolic state in which they increase their probability of survival against potential environmental shocks. Hence, if an environmental shock took place at some point in time, preadapted organisms would be better adapted to survive and proliferate once the shock is over. In many biological populations, the mechanisms of preadaptation and proliferation present delays whose influence on the fitness of the population is not well understood. In this paper, we propose a rigorous mathematical framework to analyze the role of delays in both preadaptation and proliferation mechanisms in the survival of biological populations, with an emphasis on bacterial colonies. Our theoretical framework allows us to analytically quantify, with arbitrary precision, the average growth rate of a bet-hedging bacterial colony with stochastically delayed reactions. We verify the accuracy of the proposed method by numerical simulations and conclude that the growth rate of a bet-hedging population shows a nontrivial dependency on the preadaptation and proliferation delays. Contrary to current belief, our results show that faster reactions do not, in general, increase the overall fitness of a biological population.
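    A crude Monte Carlo sketch conveys how a switching delay interacts with shocks: when the lag before dormancy takes effect exceeds the shock duration, the protective switch arrives too late and the average growth rate collapses. All rates, delays and shock statistics below are invented for the demo; the paper's analysis is exact, this is simulation.

        # Average log growth rate of a delayed bet-hedging population.
        import numpy as np

        rng = np.random.default_rng(5)

        def avg_log_growth(delay, steps=50000, p_shock=0.005, shock_len=30):
            # Random shock schedule: each trigger stresses the environment
            # for `shock_len` consecutive steps.
            shock = np.zeros(steps, dtype=bool)
            t = 0
            while t < steps:
                if rng.random() < p_shock:
                    shock[t:t + shock_len] = True
                    t += shock_len
                else:
                    t += 1
            # Desired dormant fraction tracks the environment, but the switching
            # machinery only realizes it `delay` steps later.
            desired = np.where(shock, 0.95, 0.02)
            dormant = np.roll(desired, delay)
            dormant[:delay] = 0.02
            grow_active = np.where(shock, 0.05, 1.10)  # active cells die under shock
            factor = (1 - dormant) * grow_active + dormant * 0.99
            return np.log(factor).mean()

        for d in [0, 5, 20, 60]:
            print(f"switching delay {d:3d}: average growth rate {avg_log_growth(d):+.4f}")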

  3. Upping the "Anti-": The Value of an Anti-Racist Theoretical Framework in Music Education

    ERIC Educational Resources Information Center

    Hess, Juliet

    2015-01-01

    In a time that some have argued is "postracial" following the election and reelection of Barack Obama (see Wise 2010, for discussion), this paper argues that antiracism is a crucial theoretical framework for music education. I explore three areas of music education, in which such a framework can push toward change. The first area speaks…

  4. Testing the Consolidated Framework for Implementation Research on health care innovations from South Yorkshire.

    PubMed

    Ilott, Irene; Gerrish, Kate; Booth, Andrew; Field, Becky

    2013-10-01

    There is an international imperative to implement research into clinical practice to improve health care. Understanding the dynamics of change requires knowledge from theoretical and empirical studies. This paper presents a novel approach to testing a new meta-theoretical framework: the Consolidated Framework for Implementation Research. The utility of the Framework was evaluated using a post hoc, deductive analysis of 11 narrative accounts of innovation in health care services and practice from England, collected in 2010. A matrix, comprising the five domains and 39 constructs of the Framework, was developed to examine the coherence of the terminology, to compare results across contexts and to identify new theoretical developments. The Framework captured the complexity of implementation across 11 diverse examples, offering theoretically informed, comprehensive coverage. The Framework drew attention to relevant points in individual cases together with patterns across cases; for example, all were internally developed innovations that brought direct or indirect patient advantage. In 10 cases, the change was led by clinicians. Most initiatives had been maintained for several years and there was evidence of spread in six examples. Areas for further development within the Framework include sustainability and patient/public engagement in implementation. Our analysis suggests that this conceptual framework has the potential to offer useful insights, whether as part of a situational analysis or by developing context-specific propositions for hypothesis testing. Such studies are vital now that innovation is being promoted as core business for health care.

  5. Fraying connections of caring women: an exemplar of including difference in the development of explanatory frameworks.

    PubMed

    Wuest, J

    1997-01-01

    While research exploring diverse groups enhances understanding of their unique perspectives and experiences, it also contributes to the exclusion of such groups from mainstream frameworks and solutions. The feminist grounded theory method allows for inclusion of marginalized groups through theoretical sensitivity to feminist theory and theoretical sampling. This paper demonstrates how this approach results in an explanatory framework that accounts for diverse realities in a study of women's caring. Fraying connections were identified as women's initial response to competing and changing caring demands. The range of dimensions and properties of fraying connections was identified through theoretical sampling guided by the emerging themes and theoretical sensitivity to issues of gender, culture, age, ability, class, and sexual orientation.

  6. Theoretical framework to study exercise motivation for breast cancer risk reduction.

    PubMed

    Wood, Maureen E

    2008-01-01

    To identify an appropriate theoretical framework to study exercise motivation for breast cancer risk reduction among high-risk women. An extensive review of the literature was conducted to gather relevant information pertaining to the Health Promotion Model, self-determination theory, social cognitive theory, Health Belief Model, Transtheoretical Model, theory of planned behavior, and protection motivation theory. An iterative approach was used to summarize the literature related to exercise motivation within each theoretical framework. Protection motivation theory could be used to examine the effects of perceived risk and self-efficacy in motivating women to exercise to facilitate health-related behavioral change. Evidence-based research within a chosen theoretical model can aid practitioners when making practical recommendations to reduce breast cancer risk.

  7. Innovation value chain capability in Malaysian-owned company: A theoretical framework

    NASA Astrophysics Data System (ADS)

    Abidin, Norkisme Zainal; Suradi, Nur Riza Mohd

    2014-09-01

    Good-quality products or services are no longer adequate to guarantee the sustainability of a company in today's competitive business environment. Prior research has developed various innovation models in the hope of better understanding the innovativeness of companies. Owing to the countless definitions, indicators, factors, parameters and approaches in the study of innovation, it is difficult to determine which will best suit the innovativeness of Malaysian-owned companies. This paper aims to provide a theoretical background to support the framework of innovation value chain capability in Malaysian-owned companies. The theoretical framework was based on literature reviews, expert interviews and a focus group study. The framework will be used to predict and assess innovation value chain capability in Malaysian-owned companies.

  8. A framework for biodynamic feedthrough analysis--part I: theoretical foundations.

    PubMed

    Venrooij, Joost; van Paassen, Marinus M; Mulder, Mark; Abbink, David A; Mulder, Max; van der Helm, Frans C T; Bulthoff, Heinrich H

    2014-09-01

    Biodynamic feedthrough (BDFT) is a complex phenomenon, which has been studied for several decades. However, there is little consensus on how to approach the BDFT problem in terms of definitions, nomenclature, and mathematical descriptions. In this paper, a framework for biodynamic feedthrough analysis is presented. The goal of this framework is two-fold. First, it provides some common ground between the seemingly large range of different approaches existing in the BDFT literature. Second, the framework itself allows for gaining new insights into BDFT phenomena. It will be shown how relevant signals can be obtained from measurement, how different BDFT dynamics can be derived from them, and how these different dynamics are related. Using the framework, BDFT can be dissected into several dynamical relationships, each relevant in understanding BDFT phenomena in more detail. The presentation of the BDFT framework is divided into two parts. This paper, Part I, addresses the theoretical foundations of the framework. Part II, which is also published in this issue, addresses the validation of the framework. The work is presented in two separate papers to allow for a detailed discussion of both the framework's theoretical background and its validation.

  9. The Roy Adaptation Model: A Theoretical Framework for Nurses Providing Care to Individuals With Anorexia Nervosa.

    PubMed

    Jennings, Karen M

    Using a nursing theoretical framework to understand, elucidate, and propose nursing research is fundamental to knowledge development. This article presents the Roy Adaptation Model as a theoretical framework for better understanding individuals with anorexia nervosa during acute treatment, and for understanding the role of nursing assessments and interventions in the promotion of weight restoration. Nursing assessments and interventions situated within the Roy Adaptation Model take into consideration how weight restoration does not occur in isolation but rather reflects an adaptive process within external and internal environments, and they have the potential to support more holistic care.

  10. National Strategic Planning: Linking DIMEFIL/PMESII to a Theory of Victory

    DTIC Science & Technology

    2009-04-01

    theoretical and one practical, and both are interlinked. The theoretical problem is the lack of a mental framework tying the desired end state (usually broadly stated) to the activities undertaken with the instruments of national power.

  11. Understanding the Theoretical Framework of Technological Pedagogical Content Knowledge: A Collaborative Self-Study to Understand Teaching Practice and Aspects of Knowledge

    ERIC Educational Resources Information Center

    Fransson, Goran; Holmberg, Jorgen

    2012-01-01

    This paper describes a self-study research project that focused on our experiences when planning, teaching, and evaluating a course in initial teacher education. The theoretical framework of technological pedagogical content knowledge (TPACK) was used as a conceptual structure for the self-study. Our understanding of the framework in relation to…

  12. A Critical Review of the Use of Wenger's Community of Practice (CoP) Theoretical Framework in Online and Blended Learning Research, 2000-2014

    ERIC Educational Resources Information Center

    Smith, Sedef Uzuner; Hayes, Suzanne; Shea, Peter

    2017-01-01

    After presenting a brief overview of the key elements that underpin Etienne Wenger's communities of practice (CoP) theoretical framework, one of the most widely cited and influential conceptions of social learning, this paper reviews extant empirical work grounded in this framework to investigate online/blended learning in higher education and in…

  13. A Theoretical Framework for Studying Adolescent Contraceptive Use.

    ERIC Educational Resources Information Center

    Urberg, Kathryn A.

    1982-01-01

    Presents a theoretical framework for viewing adolescent contraceptive usage. The problem-solving process is used for developmentally examining the competencies that must be present for effective contraceptive use, including: problem recognition, motivation, generation of alternatives, decision making and implementation. Each aspect is discussed…

  14. A 10 year (2000–2010) systematic review of interventions to improve quality of care in hospitals

    PubMed Central

    2012-01-01

    Background Against a backdrop of rising healthcare costs, variability in care provision and an increased emphasis on patient satisfaction, the need for effective interventions to improve quality of care has come to the fore. This is the first ten-year (2000–2010) systematic review of interventions which sought to improve quality of care in a hospital setting. This review moves beyond a broad assessment of outcome significance levels and makes recommendations for future effective and accessible interventions. Methods Two researchers independently screened a total of 13,195 English language articles from the databases PsycINFO, MEDLINE, PubMed, Embase and CINAHL. There were 120 potentially relevant full-text articles examined and 20 of those articles met the inclusion criteria. Results Included studies were heterogeneous in terms of approach and scientific rigour and varied in scope from small scale improvements for specific patient groups to large scale quality improvement programmes across multiple settings. Interventions were broadly categorised as either technical (n = 11) or interpersonal (n = 9). Technical interventions were in the main implemented by physicians and concentrated on improving care for patients with heart disease or pneumonia. Interpersonal interventions focused on patient satisfaction and tended to be implemented by nursing staff. Technical interventions had a tendency to achieve more substantial improvements in quality of care. Conclusions The rigorous application of inclusion criteria to studies established that despite the very large volume of literature on quality of care improvements, there is a paucity of hospital interventions with a theoretically based design or implementation. The screening process established that intervention studies to date have largely failed to identify their position along the quality of care spectrum. It is suggested that this lack of theoretical grounding may partly explain the minimal transfer of health research to date into policy. It is recommended that future interventions are established within a theoretical framework and that selected quality of care outcomes are assessed using this framework. Future interventions to improve quality of care will be most effective when they use a collaborative approach, involve multidisciplinary teams, utilise available resources, involve physicians and recognise the unique requirements of each patient group. PMID:22925835

  15. Addressing the Wicked Problem of Quality in Higher Education: Theoretical Approaches and Implications

    ERIC Educational Resources Information Center

    Krause, Kerri-Lee

    2012-01-01

    This article explores the wicked problem of quality in higher education, arguing for a more robust theorising of the subject at national, institutional and local department level. The focus of the discussion rests on principles for theorising in more rigorous ways about the multidimensional issue of quality. Quality in higher education is proposed…

  16. Mexican Educational Ethnography and the Work of the DIE: Crossing the Border and Finding the Historical Everyday [book review].

    ERIC Educational Resources Information Center

    Levinson, Bradley A.

    1998-01-01

    The theoretical insight and ethnographic rigor of this collection of essays from participants at Departamento de Investigaciones Educativas (DIE) of the National Polytechnic Institute about the role of the public school in Mexican social and political life promote understanding of educational processes in different contexts, including rural and…

  17. Polytechnic Engineering Mathematics: Assessing Its Relevance to the Productivity of Industries in Uganda

    ERIC Educational Resources Information Center

    Jehopio, Peter J.; Wesonga, Ronald

    2017-01-01

    Background: The main objective of the study was to examine the relevance of engineering mathematics to the emerging industries. The level of abstraction, the standard of rigor, and the depth of theoretical treatment expected of a graduate engineering technician are skills that must be derived from mathematical knowledge. The question of whether…

  18. A Very Short, Fairly Interesting and Reasonably Cheap Book about Qualitative Research. Very Short, Fairly Interesting & Reasonably Cheap Books

    ERIC Educational Resources Information Center

    Silverman, David

    2007-01-01

    In this book, the author shows how good research can be methodologically inventive, empirically rigorous, theoretically alive and practically relevant. Using materials ranging from photographs to novels and newspaper stories, this book demonstrates that getting to grips with these issues means asking fundamental questions about how we are…

  19. The Problem of Bio-Concepts: Biopolitics, Bio-Economy and the Political Economy of Nothing

    ERIC Educational Resources Information Center

    Birch, Kean

    2017-01-01

    Scholars in science and technology studies--and no doubt other fields--have increasingly drawn on Michel Foucault's concept of biopolitics to theorize a variety of new "bio-concepts." While there might be some theoretical value in such exercises, many of these bio-concepts have simply replaced more rigorous--and therefore…

  20. Beyond the Quantitative and Qualitative Divide: Research in Art Education as Border Skirmish.

    ERIC Educational Resources Information Center

    Sullivan, Graeme

    1996-01-01

    Analyzes a research project that utilizes a coherent conceptual model of art education research incorporating the demand for empirical rigor and providing for diverse interpretive frameworks. Briefly profiles the NUD*IST (Non-numerical Unstructured Data Indexing Searching and Theorizing) software system that can organize and retrieve complex…

  1. An Ex Post Facto Evaluation Framework for Place-Based Police Interventions

    ERIC Educational Resources Information Center

    Braga, Anthony A.; Hureau, David M.; Papachristos, Andrew V.

    2011-01-01

    Background: A small but growing body of research evidence suggests that place-based police interventions generate significant crime control gains. While place-based policing strategies have been adopted by a majority of U.S. police departments, very few agencies make a priori commitments to rigorous evaluations. Objective: Recent methodological…

  2. Memory Hazard Functions: A Vehicle for Theory Development and Test

    ERIC Educational Resources Information Center

    Chechile, Richard A.

    2006-01-01

    A framework is developed to rigorously test an entire class of memory retention functions by examining hazard properties. Evidence is provided that the memory hazard function is not monotonically decreasing. Yet most of the proposals for retention functions, which have emerged from the psychological literature, imply that memory hazard is…
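    For orientation (standard survival-analysis notation, ours rather than the article's): if S(t) is the probability that an item is still retained after delay t, the memory hazard is

```latex
h(t) \;=\; \lim_{\Delta t \to 0}
  \frac{\Pr\left(t \le T < t + \Delta t \mid T \ge t\right)}{\Delta t}
  \;=\; \frac{f(t)}{S(t)}
  \;=\; -\frac{d}{dt} \ln S(t),
```

    that is, the instantaneous risk of forgetting given survival to delay t. A monotonically decreasing h(t), the property the evidence above argues against, would mean the risk of losing a memory only ever falls as the memory ages.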

  3. Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory

    ERIC Educational Resources Information Center

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…

  4. The Mauritian Education System: Was There a Will to Anglicize it?

    ERIC Educational Resources Information Center

    Tirvsssen, Rada

    2007-01-01

    Clive Whitehead (2005: 315-329) makes an indisputable claim that British colonial education is a controversial topic in the history of education. The macro study of educational systems undertaken within a framework that guarantees a systematic and rigorous approach can offer answers to many disputed issues, but researchers should not underestimate…

  5. Considerations for Designing Group Randomized Trials of Professional Development with Teacher Knowledge Outcomes

    ERIC Educational Resources Information Center

    Kelcey, Ben; Phelps, Geoffrey

    2013-01-01

    Despite recent shifts in research emphasizing the value of carefully designed experiments, the number of studies of teacher professional development with rigorous designs has lagged behind its student outcome counterparts. We outline a framework for the design of group randomized trials (GRTs) with teachers' knowledge as the outcome and…

  6. Further Iterations on Using the Problem-Analysis Framework

    ERIC Educational Resources Information Center

    Annan, Michael; Chua, Jocelyn; Cole, Rachel; Kennedy, Emma; James, Robert; Markusdottir, Ingibjorg; Monsen, Jeremy; Robertson, Lucy; Shah, Sonia

    2013-01-01

    A core component of applied educational and child psychology practice is the skilfulness with which practitioners are able to rigorously structure and conceptualise complex real world human problems. This is done in such a way that when they (with others) jointly work on them, there is an increased likelihood of positive outcomes being achieved…

  7. Integrative taxonomy: Where we are now, with a focus on the resolution of three tropical fruit fly species complexes

    USDA-ARS?s Scientific Manuscript database

    Accurate species delimitation underpins good taxonomy. Formalisation of integrative taxonomy in the last decade has provided a framework for using multidisciplinary data to increase rigor in species delimitation hypotheses. We address the state of integrative taxonomy by using an international proje...

  8. International Service Learning: Conceptual Frameworks and Research. IUPUI Series on Service Learning Research 1

    ERIC Educational Resources Information Center

    Bringle, Robert G., Ed.; Hatcher, Julie A., Ed.; Jones, Steven G., Ed.

    2010-01-01

    This book focuses on conducting research on International Service Learning (ISL), which includes developing and evaluating hypotheses about ISL outcomes and measuring its impact on students, faculty, and communities. The book argues that rigorous research is essential to improving the quality of ISL's implementation and delivery, and providing the…

  9. School Governor Regulation in England's Changing Education Landscape

    ERIC Educational Resources Information Center

    Baxter, Jacqueline

    2017-01-01

    The changing education landscape in England, combined with a more rigorous form of governor regulation in the form of the Ofsted 2012 Inspection Framework, is placing more demands than ever before on the 300,000 volunteer school governors in England. These school governors are, in many cases, directly accountable to the Secretary of…

  10. Evaluating Interactive Policy Making on Biotechnology: The Case of the Dutch Ministry of Health, Welfare and Sport

    ERIC Educational Resources Information Center

    Broerse, Jacqueline E. W.; de Cock Buning, Tjard; Roelofsen, Anneloes; Bunders, Joske F. G.

    2009-01-01

    Public engagement is increasingly advocated and applied in the development and implementation of technological innovations. However, initiatives so far are rarely considered effective. There is a need for more methodological rigor and insight into conducive conditions. The authors developed an evaluative framework and assessed accordingly the…

  11. 75 FR 47573 - Career and Technical Education Program-Promoting Rigorous Career and Technical Education Programs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-06

    ... program is to use 10 key components based on the ``Program of Study Design Framework''..., the States' Career Clusters..., and offer students the opportunities to earn postsecondary credits... extent to which students are attaining necessary knowledge and skills, we agree that administrators...

  12. David crighton, 1942-2000: a commentary on his career and his influence on aeroacoustic theory

    NASA Astrophysics Data System (ADS)

    Ffowcs Williams, John E.

    David Crighton, a greatly admired figure in fluid mechanics, Head of the Department of Applied Mathematics and Theoretical Physics at Cambridge, and Master of Jesus College, Cambridge, died at the peak of his career. He had made important contributions to the theory of waves generated by unsteady flow. Crighton's work was always characterized by the application of rigorous mathematical approximations to fluid mechanical idealizations of practically relevant problems. At the time of his death, he was certainly the most influential figure in British applied mathematics, and his former collaborators and students form a strong school that continues his special style of mathematical application. Rigorous analysis of well-posed aeroacoustical problems was transformed by David Crighton.

  13. The problem with the phrase women and minorities: intersectionality-an important theoretical framework for public health.

    PubMed

    Bowleg, Lisa

    2012-07-01

    Intersectionality is a theoretical framework that posits that multiple social categories (e.g., race, ethnicity, gender, sexual orientation, socioeconomic status) intersect at the micro level of individual experience to reflect multiple interlocking systems of privilege and oppression at the macro, social-structural level (e.g., racism, sexism, heterosexism). Public health's commitment to social justice makes it a natural fit with intersectionality's focus on multiple historically oppressed populations. Yet despite a plethora of research focused on these populations, public health studies that reflect intersectionality in their theoretical frameworks, designs, analyses, or interpretations are rare. Accordingly, I describe the history and central tenets of intersectionality, address some theoretical and methodological challenges, and highlight the benefits of intersectionality for public health theory, research, and policy.

  14. The Problem With the Phrase Women and Minorities: Intersectionality—an Important Theoretical Framework for Public Health

    PubMed Central

    2012-01-01

    Intersectionality is a theoretical framework that posits that multiple social categories (e.g., race, ethnicity, gender, sexual orientation, socioeconomic status) intersect at the micro level of individual experience to reflect multiple interlocking systems of privilege and oppression at the macro, social-structural level (e.g., racism, sexism, heterosexism). Public health’s commitment to social justice makes it a natural fit with intersectionality’s focus on multiple historically oppressed populations. Yet despite a plethora of research focused on these populations, public health studies that reflect intersectionality in their theoretical frameworks, designs, analyses, or interpretations are rare. Accordingly, I describe the history and central tenets of intersectionality, address some theoretical and methodological challenges, and highlight the benefits of intersectionality for public health theory, research, and policy. PMID:22594719

  15. Adopting Health Behavior Change Theory throughout the Clinical Practice Guideline Process

    ERIC Educational Resources Information Center

    Ceccato, Natalie E.; Ferris, Lorraine E.; Manuel, Douglas; Grimshaw, Jeremy M.

    2007-01-01

    Adopting a theoretical framework throughout the clinical practice guideline (CPG) process (development, dissemination, implementation, and evaluation) can be useful in systematically identifying, addressing, and explaining behavioral influences impacting CPG uptake and effectiveness. This article argues that using a theoretical framework should…

  16. [A framework for evaluating ethical issues of public health initiatives: practical aspects and theoretical implications].

    PubMed

    Petrini, Carlo

    2015-01-01

    The "Framework for the Ethical Conduct of Public Health Initiatives", developed by Public Health Ontario, is a practical guide for assessing the ethical implications of evidence-generating public health initiatives, whether research or non-research activities, involving people, their biological materials or their personal information. The Framework is useful not only to those responsible for determining the ethical acceptability of an initiative, but also to investigators planning new public health initiatives. It is informed by a theoretical approach that draws on widely shared bioethical principles. Two considerations emerge from both the theoretical framework and its practical application: the line between practice and research is often blurred; public health ethics and biomedical research ethics are based on the same common heritage of values.

  17. The role of language in learning physics

    NASA Astrophysics Data System (ADS)

    Brookes, David T.

    Many studies in PER suggest that language poses a serious difficulty for students learning physics. These difficulties are mostly attributed to misunderstanding of specialized terminology. This terminology often assigns new meanings to everyday terms used to describe physical models and phenomena. In this dissertation I present a novel approach to analyzing the role of language in learning physics. This approach is based on the analysis of the historical development of physics ideas, the language of modern physicists, and students' difficulties in the areas of quantum mechanics, classical mechanics, and thermodynamics. These data are analyzed using linguistic tools borrowed from cognitive linguistics and systemic functional grammar. Specifically, I combine the ideas of conceptual metaphor and grammar to build a theoretical framework that accounts for: (1) the role and function that language serves for physicists when they speak and reason about physical ideas and phenomena, (2) specific features of students' reasoning and difficulties that may be related to or derived from language that students read or hear. The theoretical framework is developed using the methodology of a grounded theoretical approach. The theoretical framework allows us to make predictions about the relationship between students' discourse and their conceptual and problem-solving difficulties. Tests of the theoretical framework are presented in the context of "heat" in thermodynamics and "force" in dynamics. In each case the language that students use to reason about the concepts of "heat" and "force" is analyzed using the theoretical framework. The results of this analysis show that language is very important in students' learning. In particular, students are (1) using features of physicists' conceptual metaphors to reason about physical phenomena, often overextending and misapplying these features, (2) drawing cues from the grammar of physicists' speech and writing to categorize physics concepts; this categorization of physics concepts plays a key role in students' ability to solve physics problems. In summary, I present a theoretical framework that provides a possible explanation of the role that language plays in learning physics. The framework also attempts to account for how and why physicists' language influences students in the way that it does.

  18. Understanding the Role of Numeracy in Health: Proposed Theoretical Framework and Practical Insights

    PubMed Central

    Lipkus, Isaac M.; Peters, Ellen

    2009-01-01

    Numeracy, that is, how facile people are with mathematical concepts and their applications, is gaining importance in medical decision making and risk communication. This paper proposes six critical functions of health numeracy. These functions are integrated into a theoretical framework on health numeracy that has implications for risk-communication and medical-decision-making processes. We examine practical underpinnings for targeted interventions aimed at improving such processes as a function of health numeracy. It is hoped that the proposed functions and theoretical framework will spur more research to determine how an understanding of health numeracy can lead to more effective communication and decision outcomes. PMID:19834054

  19. Towards a Theoretical Framework for Educational Simulations.

    ERIC Educational Resources Information Center

    Winer, Laura R.; Vazquez-Abad, Jesus

    1981-01-01

    Discusses the need for a sustained and systematic effort toward establishing a theoretical framework for educational simulations, proposes the adaptation of models borrowed from the natural and applied sciences, and describes three simulations based on such a model adapted using Brunerian learning theory. Sixteen references are listed. (LLS)

  20. Evolution or Revolution: Mobility Requirements for the AirLand Battle Future Concept

    DTIC Science & Technology

    1991-02-20

    analysis and the model a theoretical framework for tactical mobility is established. The considerations for tactical mobility on the future battlefield are...examined in the context of the theoretical framework . Finally, using the criteria of sufficiency, feasibility, and the time/space continuum, the

  1. Formulating a Theoretical Framework for Assessing Network Loads for Effective Deployment in Network-Centric Operations and Warfare

    DTIC Science & Technology

    2008-11-01

    is particularly important in order to design a network that is realistically deployable. The goal of this project is the design of a theoretical framework to assess and predict the effectiveness and performance of networks and their loads.

  2. School District Organization and Student Dropout.

    ERIC Educational Resources Information Center

    Engelhard, George, Jr.

    The purpose of this study was to develop and test a theoretical framework that would examine the structural relationships between select organizational and environmental variables and school district effectiveness in Michigan. The theoretical framework was derived from organizational theory and represents a social-ecological approach to the study…

  3. Educational Communities of Inquiry: Theoretical Framework, Research and Practice

    ERIC Educational Resources Information Center

    Akyol, Zehra; Garrison, D. Randy

    2013-01-01

    Communications technologies have been continuously integrated into learning and training environments, which has revealed the need for a clear understanding of the process. The Community of Inquiry (COI) theoretical framework has a philosophical foundation which provides planned guidelines and principles for developing useful learning environments…

  4. Exploring How Globalization Shapes Education: Methodology and Theoretical Framework

    ERIC Educational Resources Information Center

    Pan, Su-Yan

    2010-01-01

    This is a commentary on some major issues raised in Carter and Dediwalage's "Globalisation and science education: The case of "Sustainability by the bay"" (this issue), particularly their methodology and theoretical framework for understanding how globalisation shapes education (including science education). While acknowledging the authors'…

  5. Chaotic attractors and physical measures for some density dependent Leslie population models

    NASA Astrophysics Data System (ADS)

    Ugarcovici, Ilie; Weiss, Howard

    2007-12-01

    Following ecologists' discoveries, mathematicians have begun studying extensions of the ubiquitous age structured Leslie population model that allow some survival probabilities and/or fertility rates to depend on population densities. These nonlinear extensions commonly exhibit very complicated dynamics: through computer studies, some authors have discovered robust Hénon-like strange attractors in several families. Population biologists and demographers frequently wish to average a function over many generations and conclude that the average is independent of the initial population distribution. This type of 'ergodicity' seems to be a fundamental tenet in population biology. In this paper we develop the first rigorous ergodic theoretic framework for density dependent Leslie population models. We study two generation models with Ricker and Hassell (recruitment type) fertility terms. We prove that for some parameter regions these models admit a chaotic (ergodic) attractor which supports a unique physical probability measure. This physical measure, having full Lebesgue measure basin, satisfies in the strongest possible sense the population biologist's requirement for ergodicity in their population models. We use the celebrated work of Wang and Young 2001 Commun. Math. Phys. 218 1-97, and our results are the first applications of their method to biology, ecology or demography.
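    To make the ergodicity requirement concrete, the sketch below (our illustration, with hypothetical parameter values rather than those for which the paper proves the existence of an attractor) iterates a two-generation Leslie-type model with Ricker recruitment and computes a long-run time average from two different initial populations. Under a unique physical measure whose basin has full Lebesgue measure, such averages agree for almost every starting point.

```python
import numpy as np

# Minimal sketch: two-generation density-dependent Leslie model with a
# Ricker-type recruitment term. Parameter values are illustrative only.
def ricker_leslie(x, y, b=20.0, c=1.0, s=0.7):
    """One generation step; x = juveniles, y = adults."""
    total = x + y
    x_next = b * total * np.exp(-c * total)  # density-dependent recruitment
    y_next = s * x                           # juvenile survival to adulthood
    return x_next, y_next

def time_average(f, x0, y0, burn_in=1_000, n=100_000):
    """Average an observable f over many generations after a burn-in."""
    x, y = x0, y0
    for _ in range(burn_in):
        x, y = ricker_leslie(x, y)
    acc = 0.0
    for _ in range(n):
        x, y = ricker_leslie(x, y)
        acc += f(x, y)
    return acc / n

# With a unique physical measure, the two averages should nearly coincide.
print(time_average(lambda x, y: x + y, 0.5, 0.5))
print(time_average(lambda x, y: x + y, 3.0, 0.1))
```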

  6. Finding the way with a noisy brain.

    PubMed

    Cheung, Allen; Vickerstaff, Robert

    2010-11-11

    Successful navigation is fundamental to the survival of nearly every animal on earth, and achieved by nervous systems of vastly different sizes and characteristics. Yet surprisingly little is known of the detailed neural circuitry from any species which can accurately represent space for navigation. Path integration is one of the oldest and most ubiquitous navigation strategies in the animal kingdom. Despite a plethora of computational models, from equational to neural network form, there is currently no consensus, even in principle, of how this important phenomenon occurs neurally. Recently, all path integration models were examined according to a novel, unifying classification system. Here we combine this theoretical framework with recent insights from directed walk theory, and develop an intuitive yet mathematically rigorous proof that only one class of neural representation of space can tolerate noise during path integration. This result suggests many existing models of path integration are not biologically plausible due to their intolerance to noise. This surprising result imposes significant computational limitations on the neurobiological spatial representation of all successfully navigating animals, irrespective of species. Indeed, noise-tolerance may be an important functional constraint on the evolution of neuroarchitectural plans in the animal kingdom.
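    As a rough illustration of why noise matters (a generic Cartesian path integrator of our own devising, not one of the neural model classes classified in the paper), the sketch below accumulates displacements inferred from a noisy compass and reports the resulting homing error:

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_path_integration(n_steps=500, compass_noise=0.05):
    """Integrate unit steps along a meandering path; the estimate uses a
    noisy heading reading at every step, so its error accumulates."""
    true_pos = np.zeros(2)
    est_pos = np.zeros(2)
    heading = 0.0
    for _ in range(n_steps):
        heading += rng.normal(0.0, 0.3)       # true heading drifts randomly
        true_pos += np.array([np.cos(heading), np.sin(heading)])
        perceived = heading + rng.normal(0.0, compass_noise)
        est_pos += np.array([np.cos(perceived), np.sin(perceived)])
    return np.linalg.norm(est_pos - true_pos)  # homing error

print(noisy_path_integration())
```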

  7. The Convoy Model: Explaining Social Relations From a Multidisciplinary Perspective

    PubMed Central

    Antonucci, Toni C.

    2014-01-01

    Purpose of the Study: Social relations are a key aspect of aging and the life course. In this paper, we trace the scientific origins of the study of social relations, focusing in particular on research grounded in the convoy model. Design and Methods: We first briefly review and critique influential historical studies to illustrate how the scientific study of social relations developed. Next, we highlight early and current findings grounded in the convoy model that have provided key insights into theory, method, policy, and practice in the study of aging. Results: Early social relations research, while influential, lacked the combined approach of theoretical grounding and methodological rigor. Nevertheless, previous research findings, especially from anthropology, suggested the importance of social relations in the achievement of positive outcomes. Considering both life span and life course perspectives and grounded in a multidisciplinary perspective, the convoy model was developed to unify and consolidate scattered evidence while at the same time directing future empirical and applied research. Early findings are summarized, current evidence presented, and future directions projected. Implications: The convoy model has provided a useful framework in the study of aging, especially for understanding predictors and consequences of social relations across the life course. PMID:24142914

  8. The logic of counterfactual analysis in case-study explanation.

    PubMed

    Mahoney, James; Barrenechea, Rodrigo

    2017-12-19

    In this paper, we develop a set-theoretic and possible worlds approach to counterfactual analysis in case-study explanation. Using this approach, we first consider four kinds of counterfactuals: necessary condition counterfactuals, SUIN condition counterfactuals, sufficient condition counterfactuals, and INUS condition counterfactuals. We explore the distinctive causal claims entailed in each, and conclude that necessary condition and SUIN condition counterfactuals are the most useful types for hypothesis assessment in case-study research. We then turn attention to the development of a rigorous understanding of the 'minimal-rewrite' rule, linking this rule to insights from set theory about the relative importance of necessary conditions. We show why, logically speaking, a comparative analysis of two necessary condition counterfactuals will tend to favour small events and contingent happenings. A third section then presents new tools for specifying the level of generality of the events in a counterfactual. We show why and how the goals of formulating empirically important versus empirically plausible counterfactuals stand in tension with one another. Finally, we use our framework to link counterfactual analysis to causal sequences, which in turn provides advantages for conducting counterfactual projections. © London School of Economics and Political Science 2017.
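    A minimal sketch of the underlying set-theoretic checks (hypothetical cases; the paper's possible-worlds machinery goes far beyond this): a condition X is necessary for an outcome Y when the Y-cases form a subset of the X-cases, and sufficient when the X-cases form a subset of the Y-cases.

```python
# Each hypothetical case records whether condition X and outcome Y held.
cases = [
    {"X": True,  "Y": True},
    {"X": True,  "Y": False},
    {"X": False, "Y": False},
]

def is_necessary(cases):
    # X is necessary for Y iff Y never occurs without X (Y subset of X).
    return all(c["X"] for c in cases if c["Y"])

def is_sufficient(cases):
    # X is sufficient for Y iff X never occurs without Y (X subset of Y).
    return all(c["Y"] for c in cases if c["X"])

print(is_necessary(cases))   # True: every Y-case is also an X-case
print(is_sufficient(cases))  # False: one X-case lacks the outcome
```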

  9. Social learning and the replication process: an experimental investigation.

    PubMed

    Derex, Maxime; Feron, Romain; Godelle, Bernard; Raymond, Michel

    2015-06-07

    Human cultural traits typically result from a gradual process that has been described as analogous to biological evolution. This observation has led pioneering scholars to draw inspiration from population genetics to develop a rigorous and successful theoretical framework of cultural evolution. Social learning, the mechanism allowing information to be transmitted between individuals, has thus been described as a simple replication mechanism. Although useful, the extent to which this idealization appropriately describes the actual social learning events has not been carefully assessed. Here, we used a specifically developed computer task to evaluate (i) the extent to which social learning leads to the replication of an observed behaviour and (ii) the consequences it has for fitness landscape exploration. Our results show that social learning does not lead to a dichotomous choice between disregarding and replicating social information. Rather, it appeared that individuals combine and transform information coming from multiple sources to produce new solutions. As a consequence, landscape exploration was promoted by the use of social information. These results invite us to rethink the way social learning is commonly modelled and could question the validity of predictions coming from models considering this process as replicative. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  10. Convective heat transfer in a measurement cell for scanning electrochemical microscopy.

    PubMed

    Novev, Javor K; Compton, Richard G

    2016-11-21

    Electrochemical experiments, especially those performed with scanning electrochemical microscopy (SECM), are often carried out without taking special care to thermostat the solution; it is usually assumed that its temperature is homogeneous and equal to the ambient. The present study aims to test this assumption via numerical simulations of the heat transfer in a particular system - the typical measurement cell for SECM. It is assumed that the temperature of the solution is initially homogeneous but different from that of its surroundings; convective heat transfer in the solution and the surrounding air is taken into account within the framework of the Boussinesq approximation. The theoretical treatment presented here indicates that an initial temperature difference of the order of 1 K dissipates with a characteristic time scale of ∼1000 s; the thermal equilibration is accompanied by convective flows with a maximum velocity of ∼10⁻⁴ m s⁻¹; furthermore, the temporal evolution of the temperature profile is influenced by the sign of the initial difference. These results suggest that, unless the temperature of the solution is rigorously controlled, convection may significantly compromise the interpretation of data from SECM and other electrochemical techniques, which is usually done on the basis of diffusion-only models.
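    For orientation, the Boussinesq approximation invoked above is standardly written as follows (our notation; the paper's exact formulation, geometry and boundary conditions are not reproduced here). Density variations are neglected everywhere except in the buoyancy term:

```latex
\nabla \cdot \mathbf{u} = 0, \qquad
\rho_0 \left( \frac{\partial \mathbf{u}}{\partial t}
  + (\mathbf{u} \cdot \nabla)\,\mathbf{u} \right)
  = -\nabla p + \mu \nabla^{2} \mathbf{u}
  - \rho_0 \beta \,(T - T_0)\, \mathbf{g}, \qquad
\frac{\partial T}{\partial t} + \mathbf{u} \cdot \nabla T
  = \alpha \nabla^{2} T,
```

    where beta is the thermal expansion coefficient and alpha the thermal diffusivity, so temperature differences drive exactly the convective flows described in the abstract.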

  11. Adaptive tracking control for active suspension systems with non-ideal actuators

    NASA Astrophysics Data System (ADS)

    Pan, Huihui; Sun, Weichao; Jing, Xingjian; Gao, Huijun; Yao, Jianyong

    2017-07-01

    As a critical component of transportation vehicles, active suspension systems are instrumental in the improvement of ride comfort and maneuverability. However, practical active suspensions commonly suffer from parameter uncertainties (e.g., the variations of payload mass and suspension component parameters), external disturbances and especially the unknown non-ideal actuators (i.e., dead-zone and hysteresis nonlinearities), which always significantly deteriorate the control performance in practice. To overcome these issues, this paper synthesizes an adaptive tracking control strategy for vehicle suspension systems to achieve suspension performance improvements. The proposed control algorithm is formulated by developing a unified framework of non-ideal actuators rather than a separate way, which is a simple yet effective approach to remove the unexpected nonlinear effects. From the perspective of practical implementation, the advantages of the presented controller for active suspensions include that the assumptions on the measurable actuator outputs, the prior knowledge of nonlinear actuator parameters and the uncertain parameters within a known compact set are not required. Furthermore, the stability of the closed-loop suspension system is theoretically guaranteed by rigorous mathematical analysis. Finally, the effectiveness of the presented adaptive control scheme is confirmed using comparative numerical simulation validations.
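    For concreteness, one common model of a dead-zone actuator nonlinearity from the adaptive-control literature is shown below (an illustrative assumption on our part; the paper develops a unified framework for non-ideal actuators rather than this stand-alone form). With commanded input v and delivered output u,

```latex
u(v) =
\begin{cases}
  m_r \,(v - b_r), & v \ge b_r, \\[2pt]
  0,               & b_l < v < b_r, \\[2pt]
  m_l \,(v - b_l), & v \le b_l,
\end{cases}
```

    where b_l < 0 < b_r bound the dead band and m_l, m_r are the slopes outside it; inside the band the actuator delivers no force at all, which is one way such non-idealities degrade tracking performance.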

  12. Quantum Discord Determines the Interferometric Power of Quantum States

    NASA Astrophysics Data System (ADS)

    Girolami, Davide; Souza, Alexandre M.; Giovannetti, Vittorio; Tufarelli, Tommaso; Filgueiras, Jefferson G.; Sarthour, Roberto S.; Soares-Pinto, Diogo O.; Oliveira, Ivan S.; Adesso, Gerardo

    2014-05-01

    Quantum metrology exploits quantum mechanical laws to improve the precision in estimating technologically relevant parameters such as phase, frequency, or magnetic fields. Probe states are usually tailored to the particular dynamics whose parameters are being estimated. Here we consider a novel framework where quantum estimation is performed in an interferometric configuration, using bipartite probe states prepared when only the spectrum of the generating Hamiltonian is known. We introduce a figure of merit for the scheme, given by the worst-case precision over all suitable Hamiltonians, and prove that it amounts exactly to a computable measure of discord-type quantum correlations for the input probe. We complement our theoretical results with a metrology experiment, realized in a highly controllable room-temperature nuclear magnetic resonance setup, which provides a proof-of-concept demonstration for the usefulness of discord in sensing applications. Discordant probes are shown to guarantee a nonzero phase sensitivity for all the chosen generating Hamiltonians, while classically correlated probes are unable to accomplish the estimation in a worst-case setting. This work establishes a rigorous and direct operational interpretation for general quantum correlations, shedding light on their potential for quantum technology.
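    In outline (our notation, following the structure described in the abstract), the figure of merit is a worst-case quantum Fisher information F over the local Hamiltonians H_A with the known spectrum:

```latex
\mathcal{P}^{A}(\rho_{AB})
  \;=\; \frac{1}{4} \,\min_{H_A} F\!\left(\rho_{AB};\, H_A \otimes \mathbb{1}_B\right),
\qquad
\operatorname{Var}(\hat{\varphi}) \;\ge\; \frac{1}{\nu \, F},
```

    where the second relation is the quantum Cramér–Rao bound for nu repetitions; a strictly positive minimum therefore guarantees nonzero phase sensitivity for every admissible generating Hamiltonian.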

  13. Multi-Source Multi-Target Dictionary Learning for Prediction of Cognitive Decline.

    PubMed

    Zhang, Jie; Li, Qingyang; Caselli, Richard J; Thompson, Paul M; Ye, Jieping; Wang, Yalin

    2017-06-01

    Alzheimer's Disease (AD) is the most common type of dementia. Identifying correct biomarkers may determine pre-symptomatic AD subjects and enable early intervention. Recently, multi-task sparse feature learning has been successfully applied to many computer vision and biomedical informatics research problems. It aims to improve the generalization performance by exploiting the shared features among different tasks. However, most of the existing algorithms are formulated as a supervised learning scheme. Their drawback is either insufficient feature numbers or missing label information. To address these challenges, we formulate an unsupervised framework for multi-task sparse feature learning based on a novel dictionary learning algorithm. To solve the unsupervised learning problem, we propose a two-stage Multi-Source Multi-Target Dictionary Learning (MMDL) algorithm. In stage 1, we propose a multi-source dictionary learning method to utilize the common and individual sparse features in different time slots. In stage 2, supported by a rigorous theoretical analysis, we develop a multi-task learning method to solve the missing label problem. Empirical studies on an N = 3970 longitudinal brain image data set, which involves 2 sources and 5 targets, demonstrate the improved prediction accuracy and speed efficiency of MMDL in comparison with other state-of-the-art algorithms.
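    For readers unfamiliar with the dictionary-learning machinery, here is a generic single-source sketch (ours; it is not the two-stage MMDL algorithm itself), alternating a soft-thresholded sparse-coding step with a gradient update of the dictionary:

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(z, lam):
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def dictionary_learning(X, n_atoms=10, lam=0.1, n_iter=50, lr=0.01):
    """Learn a dictionary D and sparse codes A such that X ~ D @ A."""
    d, n = X.shape
    D = rng.normal(size=(d, n_atoms))
    D /= np.linalg.norm(D, axis=0)
    A = np.zeros((n_atoms, n))
    for _ in range(n_iter):
        # Sparse coding: one ISTA step on 0.5*||X - DA||^2 + lam*||A||_1.
        step = 1.0 / np.linalg.norm(D.T @ D, 2)
        A = soft_threshold(A - step * D.T @ (D @ A - X), step * lam)
        # Dictionary update: gradient step, then renormalize the atoms.
        D -= lr * (D @ A - X) @ A.T
        D /= np.linalg.norm(D, axis=0) + 1e-12
    return D, A

X = rng.normal(size=(20, 100))
D, A = dictionary_learning(X)
print(D.shape, A.shape, float(np.mean(np.abs(A) > 0)))  # sparsity check
```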

  14. Automated inference procedure for the determination of cell growth parameters

    NASA Astrophysics Data System (ADS)

    Harris, Edouard A.; Koh, Eun Jee; Moffat, Jason; McMillen, David R.

    2016-01-01

    The growth rate and carrying capacity of a cell population are key to the characterization of the population's viability and to the quantification of its responses to perturbations such as drug treatments. Accurate estimation of these parameters necessitates careful analysis. Here, we present a rigorous mathematical approach for the robust analysis of cell count data, in which all the experimental stages of the cell counting process are investigated in detail with the machinery of Bayesian probability theory. We advance a flexible theoretical framework that permits accurate estimates of the growth parameters of cell populations and of the logical correlations between them. Moreover, our approach naturally produces an objective metric of avoidable experimental error, which may be tracked over time in a laboratory to detect instrumentation failures or lapses in protocol. We apply our method to the analysis of cell count data in the context of a logistic growth model by means of a user-friendly computer program that automates this analysis, and present some samples of its output. Finally, we note that a traditional least squares fit can provide misleading estimates of parameter values, because it ignores available information with regard to the way in which the data have actually been collected.
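    For reference, the logistic growth model named above is standardly written (notation ours) as

```latex
\frac{dN}{dt} \;=\; r N \left(1 - \frac{N}{K}\right)
\quad \Longrightarrow \quad
N(t) \;=\; \frac{K}{1 + \left(K/N_0 - 1\right) e^{-r t}},
```

    where N_0 is the initial count, r the growth rate and K the carrying capacity, the two parameters whose joint posterior the inference procedure targets.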

  15. Directly Measuring the Degree of Quantum Coherence using Interference Fringes

    NASA Astrophysics Data System (ADS)

    Wang, Yi-Tao; Tang, Jian-Shun; Wei, Zhi-Yuan; Yu, Shang; Ke, Zhi-Jin; Xu, Xiao-Ye; Li, Chuan-Feng; Guo, Guang-Can

    2017-01-01

    Quantum coherence is the most distinguished feature of quantum mechanics. It lies at the heart of the quantum-information technologies as the fundamental resource and is also related to other quantum resources, including entanglement. It plays a critical role in various fields, even in biology. Nevertheless, the rigorous and systematic resource-theoretic framework of coherence has just been developed recently, and several coherence measures have been proposed. Experimentally, the usual method to measure coherence is to perform state tomography and use mathematical expressions. Here, we alternatively develop a method to measure coherence directly using its most essential behavior—the interference fringes. The ancilla states are mixed into the target state with various ratios, and the minimal ratio that makes the interference fringes of the "mixed state" vanish is taken as the quantity of coherence. We also use the witness observable to witness coherence, and the optimal witness constitutes another direct method to measure coherence. For comparison, we perform tomography and calculate the l1 norm of coherence, which coincides with the results of the other two methods in our situation. Our methods are explicit and robust, providing a nice alternative to the tomographic technique.
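    For comparison, the l1 norm of coherence computed from the tomographic reconstruction has a standard closed form in a fixed reference basis:

```latex
C_{l_1}(\rho) \;=\; \sum_{i \neq j} \left\lvert \rho_{ij} \right\rvert,
```

    the sum of the moduli of the off-diagonal density-matrix elements, which vanishes exactly for incoherent (diagonal) states.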

  16. Directly Measuring the Degree of Quantum Coherence using Interference Fringes.

    PubMed

    Wang, Yi-Tao; Tang, Jian-Shun; Wei, Zhi-Yuan; Yu, Shang; Ke, Zhi-Jin; Xu, Xiao-Ye; Li, Chuan-Feng; Guo, Guang-Can

    2017-01-13

    Quantum coherence is the most distinguished feature of quantum mechanics. It lies at the heart of the quantum-information technologies as the fundamental resource and is also related to other quantum resources, including entanglement. It plays a critical role in various fields, even in biology. Nevertheless, the rigorous and systematic resource-theoretic framework of coherence has just been developed recently, and several coherence measures have been proposed. Experimentally, the usual method to measure coherence is to perform state tomography and use mathematical expressions. Here, we alternatively develop a method to measure coherence directly using its most essential behavior—the interference fringes. The ancilla states are mixed into the target state with various ratios, and the minimal ratio that makes the interference fringes of the "mixed state" vanish is taken as the quantity of coherence. We also use the witness observable to witness coherence, and the optimal witness constitutes another direct method to measure coherence. For comparison, we perform tomography and calculate the l1 norm of coherence, which coincides with the results of the other two methods in our situation. Our methods are explicit and robust, providing a nice alternative to the tomographic technique.

  17. Measurement of students' perceptions of nursing as a career.

    PubMed

    Matutina, Robin E; Newman, Susan D; Jenkins, Carolyn M

    2010-09-01

    Middle school has been identified as the prime age group to begin nursing recruitment efforts because students have malleable perceptions about nursing as a future career choice. The purpose of this integrative review is to present a brief overview of research processes related to middle school students' perceptions of nursing as a future career choice and to critically evaluate the current instruments used to measure middle and high school students' perceptions of nursing as a career choice. An integrative review of the years 1989 to 2009 was conducted searching Cumulative Index to Nursing and Allied Health Literature (CINAHL), National Library of Medicine PubMed service (PubMed), and Ovid MEDLINE databases using the key words career, choice, future, ideal, nursing, and perception. Reference lists of retrieved studies were hand searched, yielding a total of 22 studies. Inclusion criteria were (a) sample of middle school students, (b) sample of high school students, (c) mixed sample including middle or high school students, and (d) samples other than middle or high school students if the instrument was tested with middle or high school students in a separate study. Ten studies met these criteria. Of the 10 studies, samples were 30% middle school students; 40% high school students; 10% mixed, including school-aged students; and 20% college students with an instrument tested in middle school students. Eighty percent of participants were White females. Overall, participants' socioeconomic status was not identified. A single study included a theoretical framework. Five instruments were identified and each could be completed in 15 to 30 min. The most commonly used instrument is available free of charge. Seventy percent of the studies used Cronbach's alpha to report instrument reliability (0.63 to 0.93), whereas 30% failed to report reliability. Fifty percent of the studies established validity via a "panel of experts," with three of those studies further describing the panel of experts. Samples of White females may hinder generalization. Socioeconomic status was not consistently reported and may be an important factor with regard to perceptions of nursing as a career choice. An overall absence of a theoretical framework hinders empirical data from being applied to nursing theories that in turn may support nursing concepts. The reporting of reliability and validity may be improved by further defining panel of experts and expanding the number of experts (more than seven). More in-depth evaluation of the psychometric properties of the instruments with more diverse populations is needed. Rigorously tested instruments may be useful in determining middle school students' perceptions about nursing. Therefore, future researchers should consider testing existing instruments in the middle school population, adhering to theoretical frameworks, diversifying the sample population, and clearly reporting reliability and validity to gain knowledge about middle school students' perceptions about a nursing career.

  18. An Expanded Theoretical Framework of Care Coordination Across Transitions in Care Settings.

    PubMed

    Radwin, Laurel E; Castonguay, Denise; Keenan, Carolyn B; Hermann, Cherice

    2016-01-01

    For many patients, high-quality, patient-centered, and cost-effective health care requires coordination among multiple clinicians and settings. Ensuring optimal care coordination requires a clear understanding of how clinician activities and continuity during transitions affect patient-centeredness and quality outcomes. This article describes an expanded theoretical framework to better understand care coordination. The framework provides clear articulation of concepts. Examples are provided of ways to measure the concepts.

  19. Intentional research design in implementation science: implications for the use of nomothetic and idiographic assessment.

    PubMed

    Lyon, Aaron R; Connors, Elizabeth; Jensen-Doss, Amanda; Landes, Sara J; Lewis, Cara C; McLeod, Bryce D; Rutt, Christopher; Stanick, Cameo; Weiner, Bryan J

    2017-09-01

    The advancement of implementation science is dependent on identifying assessment strategies that can address implementation and clinical outcome variables in ways that are valid, relevant to stakeholders, and scalable. This paper presents a measurement agenda for implementation science that integrates the previously disparate assessment traditions of idiographic and nomothetic approaches. Although idiographic and nomothetic approaches are both used in implementation science, a review of the literature on this topic suggests that their selection can be indiscriminate, driven by convenience, and not explicitly tied to research study design. As a result, they are not typically combined deliberately or effectively. Thoughtful integration may simultaneously enhance both the rigor and relevance of assessments across multiple levels within health service systems. Background on nomothetic and idiographic assessment is provided as well as their potential to support research in implementation science. Drawing from an existing framework, seven structures (of various sequencing and weighting options) and five functions (Convergence, Complementarity, Expansion, Development, Sampling) for integrating conceptually distinct research methods are articulated as they apply to the deliberate, design-driven integration of nomothetic and idiographic assessment approaches. Specific examples and practical guidance are provided to inform research consistent with this framework. Selection and integration of idiographic and nomothetic assessments for implementation science research designs can be improved. The current paper argues for the deliberate application of a clear framework to improve the rigor and relevance of contemporary assessment strategies.

  20. Geometry of behavioral spaces: A computational approach to analysis and understanding of agent based models and agent behaviors

    NASA Astrophysics Data System (ADS)

    Cenek, Martin; Dahl, Spencer K.

    2016-11-01

    Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.
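    A minimal sketch of one ingredient of such a framework (our illustration, not the authors' implementation): given per-agent sequences of behavior-pattern labels produced by some prior clustering step, the probability of agents changing from one pattern to another can be estimated by counting and row-normalizing transitions.

```python
import numpy as np

def transition_matrix(label_seqs, n_patterns):
    """Estimate P(next pattern | current pattern) from label sequences."""
    counts = np.zeros((n_patterns, n_patterns))
    for seq in label_seqs:
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts),
                     where=rows > 0)

# Three agents, each a sequence of behavior-pattern labels in {0, 1, 2}.
seqs = [[0, 0, 1, 2, 2], [1, 1, 2, 0], [2, 2, 2, 1]]
print(transition_matrix(seqs, 3))
```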

  1. Geometry of behavioral spaces: A computational approach to analysis and understanding of agent based models and agent behaviors.

    PubMed

    Cenek, Martin; Dahl, Spencer K

    2016-11-01

    Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.

  2. A model for rigorously applying the Exploration, Preparation, Implementation, Sustainment (EPIS) framework in the design and measurement of a large scale collaborative multi-site study.

    PubMed

    Becan, Jennifer E; Bartkowski, John P; Knight, Danica K; Wiley, Tisha R A; DiClemente, Ralph; Ducharme, Lori; Welsh, Wayne N; Bowser, Diana; McCollister, Kathryn; Hiller, Matthew; Spaulding, Anne C; Flynn, Patrick M; Swartzendruber, Andrea; Dickson, Megan F; Fisher, Jacqueline Horan; Aarons, Gregory A

    2018-04-13

    This paper describes the means by which a United States National Institute on Drug Abuse (NIDA)-funded cooperative, Juvenile Justice-Translational Research on Interventions for Adolescents in the Legal System (JJ-TRIALS), utilized an established implementation science framework in conducting a multi-site, multi-research center implementation intervention initiative. The initiative aimed to bolster the ability of juvenile justice agencies to address unmet client needs related to substance use while enhancing inter-organizational relationships between juvenile justice and local behavioral health partners. The EPIS (Exploration, Preparation, Implementation, Sustainment) framework was selected and utilized as the guiding model from inception through project completion, including the mapping of implementation strategies to EPIS stages, articulation of research questions, and selection, content, and timing of measurement protocols. Among other key developments, the project led to a reconceptualization of its governing implementation science framework into cyclical form as the EPIS Wheel. The EPIS Wheel is more consistent with rapid-cycle testing principles and permits researchers to track both progressive and recursive movement through EPIS. Moreover, because this randomized controlled trial was predicated on a bundled strategy method, JJ-TRIALS was designed to rigorously test progress through the EPIS stages as promoted by facilitation of data-driven decision making principles. The project extended EPIS by (1) elucidating the role and nature of recursive activity in promoting change (yielding the circular EPIS Wheel), (2) expanding the applicability of the EPIS framework beyond a single evidence-based practice (EBP) to address varying process improvement efforts (representing varying EBPs), and (3) disentangling outcome measures of progression through EPIS stages from the a priori established study timeline. The utilization of EPIS in JJ-TRIALS provides a model for practical and applied use of implementation frameworks in real-world settings that span outer service system and inner organizational contexts in improving care for vulnerable populations. NCT02672150. Retrospectively registered on 22 January 2016.

  3. Thematic Processes in the Comprehension of Technical Prose.

    DTIC Science & Technology

    1982-02-20

    theoretical framework for this process is that the important content of a passage is constructed by the reader based on the semantic content of the...against actual reader behavior. These models represent the general theoretical framework in a highly specific way, and thus summarize the major results of the project. (Author)

  4. Optimizing the Long-Term Retention of Skills: Structural and Analytic Approaches to Skill Maintenance

    DTIC Science & Technology

    1990-08-01

    evidence for a surprising degree of long-term skill retention. We formulated a theoretical framework, focusing on the importance of procedural reinstatement...considerable forgetting over even relatively short retention intervals. We have been able to place these studies in the same general theoretical framework developed

  5. Time, Space, and Mass at the Operational Level of War: The Dynamics of the Culminating Point,

    DTIC Science & Technology

    1988-04-28

    theoretical framework for operational culmination and then examining the theory as reflected in recent history. This paper focuses on the concept of...the paper first examines key definitions and provides a theoretical framework for understanding culmination. Next, it considers the application of the

  6. Strategic Innovation in HE: The Roles of Academic Middle Managers

    ERIC Educational Resources Information Center

    Kallenberg, Ton

    2007-01-01

    This article explains the development of, and presents a theoretical framework for, harnessing the roles of the academic middle manager in strategic innovation in Dutch higher education, thereby increasing higher education's ability to learn, innovate and develop a competitive advantage. The framework is developed from theoretical models of role…

  7. Implicit Theoretical Leadership Frameworks of Higher Education Administrators.

    ERIC Educational Resources Information Center

    Lees, Kimberly; And Others

    Colleges and universities have a unique organizational culture that influences the decision-making processes used by leaders of higher education. This paper presents findings of a study that attempted to identify the theoretical frameworks that administrators of higher education use to guide their decision-making processes. The following…

  8. NLPIR: A Theoretical Framework for Applying Natural Language Processing to Information Retrieval.

    ERIC Educational Resources Information Center

    Zhou, Lina; Zhang, Dongsong

    2003-01-01

    Proposes a theoretical framework called NLPIR that integrates natural language processing (NLP) into information retrieval (IR) based on the assumption that there exists representation distance between queries and documents. Discusses problems in traditional keyword-based IR, including relevance, and describes some existing NLP techniques.…

  9. A New Theoretical Approach to Postsecondary Student Disability: Disability-Diversity (Dis)Connect Model

    ERIC Educational Resources Information Center

    Aquino, Katherine C.

    2016-01-01

    Disability is often viewed as an obstacle to postsecondary inclusion, but not a characteristic of student diversity. Additionally, current theoretical frameworks isolate disability from other student diversity characteristics. In response, a new conceptual framework, the Disability-Diversity (Dis)Connect Model (DDDM), was created to address…

  10. A Theoretical Framework towards Understanding of Emotional and Behavioural Difficulties

    ERIC Educational Resources Information Center

    Poulou, Maria S.

    2014-01-01

    Children's emotional and behavioural difficulties are the result of multiple individual, social and contextual factors working in concert. The current paper proposes a theoretical framework to interpret students' emotional and behavioural difficulties in schools, by taking into consideration teacher-student relationships, students'…

  11. Couples coping with cancer: exploration of theoretical frameworks from dyadic studies.

    PubMed

    Regan, Tim W; Lambert, Sylvie D; Kelly, Brian; Falconier, Mariana; Kissane, David; Levesque, Janelle V

    2015-12-01

    A diagnosis of cancer and subsequent treatment are distressing not only for the person directly affected, but also for their intimate partner. The aim of this review is to (a) identify the main theoretical frameworks underpinning research addressing dyadic coping among couples affected by cancer, (b) summarise the evidence supporting the concepts described in these theoretical frameworks, and (c) examine the similarities and differences between these theoretical perspectives. A literature search was undertaken to identify descriptive studies published between 1990 and 2013 (English and French) that examined the interdependence of patients' and partners' coping, and the impact of coping on psychosocial outcomes. Data were extracted using a standardised form and reviewed by three of the authors. Twenty-three peer-reviewed manuscripts were identified, from which seven theoretical perspectives were derived: Relationship-Focused Coping, Transactional Model of Stress and Coping, Systemic-Transactional Model (STM) of dyadic coping, Collaborative Coping, Relationship Intimacy model, Communication models, and Coping Congruence. Although these theoretical perspectives emphasised different aspects of coping, a number of conceptual commonalities were noted. This review identified key theoretical frameworks of dyadic coping used in cancer. Evidence indicates that responses within the couple that inhibit open communication between partner and patient are likely to have an adverse impact on psychosocial outcomes. Models that incorporate the interdependence of emotional responses and coping behaviours within couples have an emerging evidence base in psycho-oncology and may have greatest validity and clinical utility in this setting. Copyright © 2015 John Wiley & Sons, Ltd.

  12. Theories of behaviour change synthesised into a set of theoretical groupings: introducing a thematic series on the theoretical domains framework.

    PubMed

    Francis, Jill J; O'Connor, Denise; Curran, Janet

    2012-04-24

    Behaviour change is key to increasing the uptake of evidence into healthcare practice. Designing behaviour-change interventions first requires problem analysis, ideally informed by theory. Yet the large number of partly overlapping theories of behaviour makes it difficult to select the most appropriate theory. The need for an overarching theoretical framework of behaviour change was addressed in research in which 128 explanatory constructs from 33 theories of behaviour were identified and grouped. The resulting Theoretical Domains Framework (TDF) appears to be a helpful basis for investigating implementation problems. Research groups in several countries have conducted TDF-based studies. It seems timely to bring together the experience of these teams in a thematic series to demonstrate further applications and to report key developments. This overview article describes the TDF, provides a brief critique of the framework, and introduces this thematic series. In a brief review to assess the extent of TDF-based research, we identified 133 papers that cite the framework. Of these, 17 used the TDF as the basis for empirical studies to explore health professionals' behaviour. The identified papers provide evidence of the impact of the TDF on implementation research. Two major strengths of the framework are its theoretical coverage and its capacity to elicit beliefs that could signify key mediators of behaviour change. The TDF provides a useful conceptual basis for assessing implementation problems, designing interventions to enhance healthcare practice, and understanding behaviour-change processes. We discuss limitations and research challenges and introduce papers in this series.

  13. Theories of behaviour change synthesised into a set of theoretical groupings: introducing a thematic series on the theoretical domains framework

    PubMed Central

    2012-01-01

    Behaviour change is key to increasing the uptake of evidence into healthcare practice. Designing behaviour-change interventions first requires problem analysis, ideally informed by theory. Yet the large number of partly overlapping theories of behaviour makes it difficult to select the most appropriate theory. The need for an overarching theoretical framework of behaviour change was addressed in research in which 128 explanatory constructs from 33 theories of behaviour were identified and grouped. The resulting Theoretical Domains Framework (TDF) appears to be a helpful basis for investigating implementation problems. Research groups in several countries have conducted TDF-based studies. It seems timely to bring together the experience of these teams in a thematic series to demonstrate further applications and to report key developments. This overview article describes the TDF, provides a brief critique of the framework, and introduces this thematic series. In a brief review to assess the extent of TDF-based research, we identified 133 papers that cite the framework. Of these, 17 used the TDF as the basis for empirical studies to explore health professionals’ behaviour. The identified papers provide evidence of the impact of the TDF on implementation research. Two major strengths of the framework are its theoretical coverage and its capacity to elicit beliefs that could signify key mediators of behaviour change. The TDF provides a useful conceptual basis for assessing implementation problems, designing interventions to enhance healthcare practice, and understanding behaviour-change processes. We discuss limitations and research challenges and introduce papers in this series. PMID:22531601

  14. Retrieval Induces Forgetting, but Only When Nontested Items Compete for Retrieval: Implication for Interference, Inhibition, and Context Reinstatement

    ERIC Educational Resources Information Center

    Chan, Jason C. K.; Erdman, Matthew R.; Davis, Sara D.

    2015-01-01

    The mechanism responsible for retrieval-induced forgetting has been the subject of rigorous theoretical debate, with some researchers postulating that retrieval-induced forgetting can be explained by interference (J. G .W. Raaijmakers & E. Jakab, 2013) or context reinstatement (T. R. Jonker, P. Seli, & C. M. MacLeod, 2013), whereas others…

  15. Implementation of rigorous renormalization group method for ground space and low-energy states of local Hamiltonians

    NASA Astrophysics Data System (ADS)

    Roberts, Brenden; Vidick, Thomas; Motrunich, Olexei I.

    2017-12-01

    The success of polynomial-time tensor network methods for computing ground states of certain quantum local Hamiltonians has recently been given a sound theoretical basis by Arad et al. [Commun. Math. Phys. 356, 65 (2017), 10.1007/s00220-017-2973-z]. The convergence proof, however, relies on "rigorous renormalization group" (RRG) techniques which differ fundamentally from existing algorithms. We introduce a practical adaptation of the RRG procedure which, while no longer theoretically guaranteed to converge, finds matrix product state ansatz approximations to the ground spaces and low-lying excited spectra of local Hamiltonians in realistic situations. In contrast to other schemes, RRG does not utilize variational methods on tensor networks. Rather, it operates on subsets of the system Hilbert space by constructing approximations to the global ground space in a treelike manner. We evaluate the algorithm numerically, finding similar performance to density matrix renormalization group (DMRG) in the case of a gapped nondegenerate Hamiltonian. Even in challenging situations of criticality, large ground-state degeneracy, or long-range entanglement, RRG remains able to identify candidate states having large overlap with ground and low-energy eigenstates, outperforming DMRG in some cases.
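
    The tree-like construction can be illustrated with a deliberately simplified toy: diagonalize each block, keep a few low-energy states, then merge neighboring blocks by projecting the joint Hamiltonian into the tensor product of the kept subspaces and truncating again. This conveys only the flavor of the procedure; actual RRG builds its subspaces from approximate ground-state projectors (AGSPs) rather than from block-Hamiltonian eigenstates, and the model below (a transverse-field Ising chain) is an illustrative assumption.

    ```python
    import numpy as np

    sx = np.array([[0., 1.], [1., 0.]])
    sz = np.array([[1., 0.], [0., -1.]])
    id2 = np.eye(2)

    def kron_all(ops):
        out = np.array([[1.0]])
        for op in ops:
            out = np.kron(out, op)
        return out

    def ising_hamiltonian(n, h=1.0):
        """Transverse-field Ising model on an open chain of n sites."""
        H = np.zeros((2**n, 2**n))
        for i in range(n - 1):              # -sz.sz nearest-neighbor couplings
            ops = [id2] * n
            ops[i], ops[i + 1] = sz, sz
            H -= kron_all(ops)
        for i in range(n):                  # -h.sx transverse fields
            ops = [id2] * n
            ops[i] = sx
            H -= h * kron_all(ops)
        return H

    def lowest_states(H, k):
        """Orthonormal basis for the k lowest-energy eigenstates."""
        return np.linalg.eigh(H)[1][:, :k]

    n_block, k = 3, 4
    left = lowest_states(ising_hamiltonian(n_block), k)
    right = lowest_states(ising_hamiltonian(n_block), k)
    V = np.kron(left, right)                 # candidate subspace for both blocks
    H_full = ising_hamiltonian(2 * n_block)  # includes the inter-block coupling
    vals, vecs = np.linalg.eigh(V.T @ H_full @ V)
    print("merged-tree estimate:", vals[0])
    print("exact ground energy: ", np.linalg.eigvalsh(H_full)[0])
    ```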

  16. On the probability density function and characteristic function moments of image steganalysis in the log prediction error wavelet subband

    NASA Astrophysics Data System (ADS)

    Bao, Zhenkun; Li, Xiaolong; Luo, Xiangyang

    2017-01-01

    Extracting informative statistical features is the most essential technical issue of steganalysis. Among various steganalysis methods, probability density function (PDF) and characteristic function (CF) moments are two important types of features due to their excellent ability to distinguish cover images from stego ones. The two types of features are quite similar in definition; the only difference is that the PDF moments are computed in the spatial domain, while the CF moments are computed in the Fourier-transformed domain. The comparison between PDF and CF moments is therefore an interesting question in steganalysis. Several theoretical results have been derived, and CF moments have been proved better than PDF moments in some cases. However, in the log prediction error wavelet subband of wavelet decomposition, some experiments show the opposite result, which has lacked a rigorous explanation. To solve this problem, a comparison result based on rigorous proof is presented: the first-order PDF moment is proved better than the CF moment, while the second-order CF moment is better than the PDF moment. This work aims to open the theoretical discussion on steganalysis and the question of finding suitable statistical features.
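
    To make the two feature families concrete, the sketch below computes PDF and CF moments from a coefficient histogram in one common steganalysis convention (absolute PDF moments in the histogram domain; frequency-weighted magnitudes of the histogram's DFT for CF moments). The normalization, moment orders, and the synthetic stand-in for a log prediction-error subband are assumptions, not the paper's exact definitions.

    ```python
    import numpy as np

    def pdf_moments(coeffs, orders=(1, 2), bins=256):
        """Absolute PDF moments: sum over the histogram of |x|^n * p(x)."""
        hist, edges = np.histogram(coeffs, bins=bins)
        p = hist / hist.sum()
        centers = 0.5 * (edges[:-1] + edges[1:])
        return [float(np.sum(np.abs(centers) ** n * p)) for n in orders]

    def cf_moments(coeffs, orders=(1, 2), bins=256):
        """CF moments: frequency-weighted magnitude of the histogram's DFT."""
        hist, _ = np.histogram(coeffs, bins=bins)
        cf = np.abs(np.fft.fft(hist / hist.sum()))[1:bins // 2]  # skip DC term
        k = np.arange(1, bins // 2)
        return [float(np.sum(k ** n * cf) / np.sum(cf)) for n in orders]

    # Synthetic stand-in for log prediction-error wavelet coefficients.
    rng = np.random.default_rng(0)
    coeffs = np.log1p(np.abs(rng.laplace(scale=2.0, size=100_000)))
    print("PDF moments:", pdf_moments(coeffs))
    print("CF moments: ", cf_moments(coeffs))
    ```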

  17. Culpability and pain management/control in peripheral vascular disease using the ethics of principles and care.

    PubMed

    Omery, A

    1991-09-01

    The purposes of this article were to provide insight into the process of ethics and ethical inquiry and to explore the ethical issues of culpability and pain management/control. Critical care nurses who currently care for vascular patients identified these issues as occurring frequently in their practice. Authors in critical care nursing generally have limited the process of ethical inquiry to a theoretical framework built around an ethic of principles. The message many critical care nurses heard was that this one type of theoretical ethical framework was the totality of ethics. The application of these principles was ethical inquiry. For some nurses, the ethic of principles is sufficient. For others, an ethic of principles is either incomplete or foreign. This second group of nurses may believe that they have no moral voice if the language of ethics is only the language of principles. The language of principles, however, is not the only theoretical framework available. There is also the ethic of care, and ethical inquiry can include the application of that framework. Indeed, the language of the ethic of care may give a voice to nurses who previously felt morally mute. In fact, these two theoretical frameworks are not the only frameworks available to nurses. There is also virtue ethics, a framework not discussed in this article. A multiplicity of ethical frameworks is available for nurses to use in analyzing their professional and personal dilemmas. Recognizing that multiplicity, nurses can analyze their ethical dilemmas more comprehensively and effectively. Applying differing ethical frameworks can result in the same conclusions. This was the case for the issue of culpability.(ABSTRACT TRUNCATED AT 250 WORDS)

  18. The functional-cognitive meta-theoretical framework: Reflections, possible clarifications and how to move forward.

    PubMed

    Barnes-Holmes, Dermot; Hussey, Ian

    2016-02-01

    The functional-cognitive meta-theoretical framework has been offered as a conceptual basis for facilitating greater communication and cooperation between the functional/behavioural and cognitive traditions within psychology, thus leading to benefits for both scientific communities. The current article is written from the perspective of two functional researchers, who are also proponents of the functional-cognitive framework, and attended the "Building Bridges between the Functional and Cognitive Traditions" meeting at Ghent University in the summer of 2014. The article commences with a brief summary of the functional approach to theory, followed by our reflections upon the functional-cognitive framework in light of that meeting. In doing so, we offer three ways in which the framework could be clarified: (a) effective communication between the two traditions is likely to be found at the level of behavioural observations rather than effects or theory, (b) not all behavioural observations will be deemed to be of mutual interest to both traditions, and (c) observations of mutual interest will be those that serve to elaborate and extend existing theorising in the functional and/or cognitive traditions. The article concludes with a summary of what we perceive to be the strengths and weaknesses of the framework, and a suggestion that there is a need to determine if the framework is meta-theoretical or is in fact a third theoretical approach to doing psychological science. © 2015 International Union of Psychological Science.

  19. Origin of the spike-timing-dependent plasticity rule

    NASA Astrophysics Data System (ADS)

    Cho, Myoung Won; Choi, M. Y.

    2016-08-01

    A biological synapse changes its efficacy depending on the difference between pre- and post-synaptic spike timings. Formulating spike-timing-dependent interactions in terms of the path integral, we establish a neural-network model, which makes it possible to predict relevant quantities rigorously by means of standard methods in statistical mechanics and field theory. In particular, the biological synaptic plasticity rule is shown to emerge as the optimal form for minimizing the free energy. It is further revealed that maximization of the entropy of neural activities gives rise to the competitive behavior of biological learning. This demonstrates that statistical mechanics helps to understand rigorously key characteristic behaviors of a neural network, thus providing the possibility of physics serving as a useful and relevant framework for probing life.
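
    For reference, the empirical rule that such derivations aim to recover is the classic exponentially windowed STDP curve. The parametrization below is the standard textbook form (with amplitudes A± and time constants τ±), not necessarily the exact expression obtained in the paper:

    ```latex
    \Delta w(\Delta t) =
    \begin{cases}
      A_{+}\, e^{-\Delta t/\tau_{+}}, & \Delta t > 0 \ \text{(pre before post: potentiation)}\\
      -A_{-}\, e^{\Delta t/\tau_{-}}, & \Delta t < 0 \ \text{(post before pre: depression)}
    \end{cases}
    \qquad \Delta t = t_{\mathrm{post}} - t_{\mathrm{pre}}
    ```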

  20. Specifying the behavior of concurrent systems

    NASA Technical Reports Server (NTRS)

    Furtek, F. C.

    1984-01-01

    A framework for rigorously specifying the behavior of concurrent systems is proposed. It is based on the view of a concurrent system as a collection of interacting processes but no assumptions are made about the mechanisms for process synchronization and communication. A formal language is described that permits the expression of a broad range of logical and timing dependencies.
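
    As a toy illustration of the kind of combined logical and timing dependency such a language might express, the check below verifies a hypothetical bounded-response property over an event trace ("every request is answered within 10 time units"); the property and its encoding are illustrative assumptions, not the paper's formalism.

    ```python
    def bounded_response(trace, max_delay=10):
        """Check: every 'request' is followed by a 'response' within max_delay."""
        pending = []                      # timestamps of unanswered requests
        for t, event in trace:
            if event == "request":
                pending.append(t)
            elif event == "response" and pending:
                if t - pending.pop(0) > max_delay:
                    return False          # response arrived too late
        return not pending                # an unanswered request is a violation

    print(bounded_response([(0, "request"), (4, "response"),
                            (6, "request"), (9, "response")]))   # True
    ```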

  1. Climate Change: Creating an Integrated Framework for Improving School Climate

    ERIC Educational Resources Information Center

    Alliance for Excellent Education, 2013

    2013-01-01

    This report from the Alliance finds that schools that struggle most with providing a positive school climate more often disproportionately serve students of color and low-income students. It also confirms that students of color and students from low-income families are less likely to have access to rigorous course work and experienced teachers,…

  2. Education in Emergencies: A Review of Theory and Research

    ERIC Educational Resources Information Center

    Burde, Dana; Kapit, Amy; Wahl, Rachel L.; Guven, Ozen; Skarpeteig, Margot Igland

    2017-01-01

    In this article, we conduct an integrative and rigorous review of theory and research on education in emergencies programs and interventions as international agencies implement them in areas of armed conflict. We ask several questions. How did this subfield emerge and what are the key conceptual frameworks that shape it today? How do education in…

  3. Key Considerations When Measuring Teacher Effectiveness: A Framework for Validating Teachers' Professional Practices. AACC Report

    ERIC Educational Resources Information Center

    Gallagher, Carole; Rabinowitz, Stanley; Yeagley, Pamela

    2011-01-01

    Researchers recommend that policymakers use data from multiple sources when making decisions that have high-stakes consequences (Herman, Baker, & Linn, 2004; Linn, 2007; Stone & Lane, 2003). For this reason, a fair but rigorous teacher-effectiveness rating process relies on evidence collected from different sources (Goe, Bell, & Little, 2008;…

  4. Integrated model development for liquid fueled rocket propulsion systems

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1993-01-01

    As detailed in the original statement of work, the objective of phase two of this research effort was to develop a general framework for rocket engine performance prediction that integrates physical principles, a rigorous mathematical formalism, component level test data, system level test data, and theory-observation reconciliation. Specific phase two development tasks are defined.

  5. Gathering Evidence on an After-School Supplemental Instruction Program: Design Challenges and Early Findings in Light of NCLB

    ERIC Educational Resources Information Center

    Chatterji, Madhabi; Kwon, Young Ae; Sng, Clarice

    2006-01-01

    The No Child Left Behind (NCLB) Act of 2001 requires that public schools adopt research-supported programs and practices, with a strong recommendation for randomized controlled trials (RCTs) as the "gold standard" for scientific rigor in empirical research. Within that policy framework, this paper compares the relative utility of…

  6. Beyond Academics: A Holistic Framework for Enhancing Education and Workplace Success. ACT Research Report Series. 2015 (4)

    ERIC Educational Resources Information Center

    Camara, Wayne, Ed.; O'Connor, Ryan, Ed.; Mattern, Krista, Ed.; Hanson, Mary Ann, Ed.

    2015-01-01

    Colleges have long recognized the importance of multiple domains. Admissions officers look to high school grades as indicators of persistence and achievement; student statements and letters of recommendation as indicators of character, behavior, and adaptability; the rigor of courses completed in high school as evidence of effort, motivation, and…

  7. Model-theoretic framework for sensor data fusion

    NASA Astrophysics Data System (ADS)

    Zavoleas, Kyriakos P.; Kokar, Mieczyslaw M.

    1993-09-01

    The main goal of our research in sensory data fusion (SDF) is the development of a systematic approach (a methodology) to designing systems for interpreting sensory information and for reasoning about the situation based upon this information and upon available databases and knowledge bases. To achieve such a goal, two kinds of subgoals have been set: (1) develop a theoretical framework in which rational design/implementation decisions can be made, and (2) design a prototype SDF system along the lines of the framework. Our initial design of the framework has been described in our previous papers. In this paper we concentrate on the model-theoretic aspects of this framework. We postulate that data are embedded in data models, and information processing mechanisms are embedded in model operators. The paper is devoted to analyzing the classes of model operators and their significance in SDF. We investigate transformation, abstraction, and fusion operators. A prototype SDF system, fusing data from range and intensity sensors, is presented, exemplifying the structures introduced. Our framework is justified by the fact that it provides modularity, traceability of information flow, and a basis for a specification language for SDF.
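
    The data-model/model-operator idea can be sketched as follows; the model classes, the fusion rule, and the material heuristic are hypothetical stand-ins for the paper's range-and-intensity prototype.

    ```python
    from dataclasses import dataclass

    @dataclass
    class RangeModel:            # data model for a range sensor
        depth: list

    @dataclass
    class IntensityModel:        # data model for an intensity sensor
        brightness: list

    @dataclass
    class SurfaceModel:          # richer model produced by fusion
        depth: list
        material_hint: list

    def transform(r: RangeModel, scale: float) -> RangeModel:
        """Transformation operator: same model class, re-calibrated units."""
        return RangeModel([d * scale for d in r.depth])

    def fuse(r: RangeModel, i: IntensityModel) -> SurfaceModel:
        """Fusion operator: combines two model classes into a new one."""
        hints = ["specular" if b > 0.7 else "diffuse" for b in i.brightness]
        return SurfaceModel(r.depth, hints)

    surface = fuse(transform(RangeModel([1.20, 1.35]), 0.99),
                   IntensityModel([0.9, 0.4]))
    print(surface)
    ```

    One point worth noting about this structure: information flow is traceable by construction, since each operator's type signature records which models it consumes and produces.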

  8. Decision support frameworks and tools for conservation

    USGS Publications Warehouse

    Schwartz, Mark W.; Cook, Carly N.; Pressey, Robert L.; Pullin, Andrew S.; Runge, Michael C.; Salafsky, Nick; Sutherland, William J.; Williamson, Matthew A.

    2018-01-01

    The practice of conservation occurs within complex socioecological systems fraught with challenges that require transparent, defensible, and often socially engaged project planning and management. Planning and decision support frameworks are designed to help conservation practitioners increase planning rigor, project accountability, stakeholder participation, transparency in decisions, and learning. We describe and contrast five common frameworks within the context of six fundamental questions (why, who, what, where, when, how) at each of three planning stages of adaptive management (project scoping, operational planning, learning). We demonstrate that decision support frameworks provide varied and extensive tools for conservation planning and management. However, using any framework in isolation risks diminishing potential benefits since no one framework covers the full spectrum of potential conservation planning and decision challenges. We describe two case studies that have effectively deployed tools from across conservation frameworks to improve conservation actions and outcomes. Attention to the critical questions for conservation project planning should allow practitioners to operate within any framework and adapt tools to suit their specific management context. We call on conservation researchers and practitioners to regularly use decision support tools as standard practice for framing both practice and research.

  9. Quantifying heterogeneity attributable to polythetic diagnostic criteria: theoretical framework and empirical application.

    PubMed

    Olbert, Charles M; Gala, Gary J; Tupler, Larry A

    2014-05-01

    Heterogeneity within psychiatric disorders is both theoretically and practically problematic: For many disorders, it is possible for 2 individuals to share very few or even no symptoms in common yet share the same diagnosis. Polythetic diagnostic criteria have long been recognized to contribute to this heterogeneity, yet no unified theoretical understanding of the coherence of symptom criteria sets currently exists. A general framework for analyzing the logical and mathematical structure, coherence, and diversity of Diagnostic and Statistical Manual diagnostic categories (DSM-5 and DSM-IV-TR) is proposed, drawing from combinatorial mathematics, set theory, and information theory. Theoretical application of this framework to 18 diagnostic categories indicates that in most categories, 2 individuals with the same diagnosis may share no symptoms in common, and that any 2 theoretically possible symptom combinations will share on average less than half their symptoms. Application of this framework to 2 large empirical datasets indicates that patients who meet symptom criteria for major depressive disorder and posttraumatic stress disorder tend to share approximately three-fifths of symptoms in common. For both disorders in each of the datasets, pairs of individuals who shared no common symptoms were observed. Any 2 individuals with either diagnosis were unlikely to exhibit identical symptomatology. The theoretical and empirical results stemming from this approach have substantive implications for etiological research into, and measurement of, psychiatric disorders.
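
    The combinatorial point can be checked directly for a single polythetic rule. DSM major depressive disorder requires at least 5 of 9 symptom criteria; enumerating only the minimal 5-symptom profiles (an illustrative simplification of the paper's full analysis) gives the average and minimum overlap between any two qualifying presentations.

    ```python
    from itertools import combinations

    n, m = 9, 5                                   # "at least m of n" criterion
    profiles = list(combinations(range(n), m))    # minimal qualifying profiles

    overlaps = [len(set(a) & set(b)) for a, b in combinations(profiles, 2)]
    print("qualifying minimal profiles:", len(profiles))          # 126
    print("mean shared symptoms:", sum(overlaps) / len(overlaps))
    print("min shared symptoms:", min(overlaps))  # 5 + 5 > 9 forces overlap >= 1

    # Zero-overlap diagnoses, as reported for most DSM categories, arise where
    # the criteria structure differs (e.g., larger symptom pools or disjunctive
    # sub-criteria), not under this simple 5-of-9 rule.
    ```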

  10. Organ and tissue donation in clinical settings: a systematic review of the impact of interventions aimed at health professionals

    PubMed Central

    2014-01-01

    In countries where presumed consent for organ donation does not apply, health professionals (HP) are key players for identifying donors and obtaining their consent. This systematic review was designed to verify the efficacy of interventions aimed at HPs to promote organ and tissue donation in clinical settings. CINAHL (1982 to 2012), COCHRANE LIBRARY, EMBASE (1974 to 2012), MEDLINE (1966 to 2012), PsycINFO (1960 to 2012), and ProQuest Dissertations and Theses were searched for papers published in French or English until September 2012. Studies were considered if they met the following criteria: aimed at improving HPs’ practices regarding the donation process or at increasing donation rates; HPs working in clinical settings; and interventions with a control group or pre-post assessments. Intervention behavioral change techniques were analyzed using a validated taxonomy. A risk ratio was computed for each study having a control group. A total of 15 studies were identified, of which only 5 had a control group. Interventions were either educational, organizational or a combination of both, and had a weak theoretical basis. The most common behavior change technique was providing instruction. Two sets of interventions showed a significant risk ratio. However, most studies did not report the information needed to compute their efficacy. Therefore, interventions aimed at improving the donation process or at increasing donation rates should be based on sound theoretical frameworks. They would benefit from more rigorous evaluation methods to ensure good knowledge translation and appropriate organizational decisions to improve professional practices. PMID:24628967

  11. Spiritual AIM and the work of the chaplain: a model for assessing spiritual needs and outcomes in relationship.

    PubMed

    Shields, Michele; Kestenbaum, Allison; Dunn, Laura B

    2015-02-01

    Distinguishing the unique contributions and roles of chaplains as members of healthcare teams requires the fundamental step of articulating and critically evaluating conceptual models that guide practice. However, there is a paucity of well-described spiritual assessment models. Even fewer of the extant models prescribe interventions and describe desired outcomes corresponding to spiritual assessments. This article describes the development, theoretical underpinnings, and key components of one model, called the Spiritual Assessment and Intervention Model (Spiritual AIM). Three cases are presented that illustrate Spiritual AIM in practice. Spiritual AIM was developed over the past 20 years to address the limitations of existing models. The model evolved based in part on observing how different people respond to a health crisis and what kinds of spiritual needs appear to emerge most prominently during a health crisis. Spiritual AIM provides a conceptual framework for the chaplain to diagnose an individual's primary unmet spiritual need, devise and implement a plan for addressing this need through embodiment/relationship, and articulate and evaluate the desired and actual outcome of the intervention. Spiritual AIM's multidisciplinary theory is consistent with the goals of professional chaplaincy training and practice, which emphasize the integration of theology, recognition of interpersonal dynamics, cultural humility and competence, ethics, and theories of human development. Further conceptual and empirical work is needed to systematically refine, evaluate, and disseminate well-articulated spiritual assessment models such as Spiritual AIM. This foundational work is vital to advancing chaplaincy as a theoretically grounded and empirically rigorous healthcare profession.

  12. Application of Intervention Mapping to the Development of a Complex Physical Therapist Intervention.

    PubMed

    Jones, Taryn M; Dear, Blake F; Hush, Julia M; Titov, Nickolai; Dean, Catherine M

    2016-12-01

    Physical therapist interventions, such as those designed to change physical activity behavior, are often complex and multifaceted. In order to facilitate rigorous evaluation and implementation of these complex interventions into clinical practice, the development process must be comprehensive, systematic, and transparent, with a sound theoretical basis. Intervention Mapping is designed to guide an iterative and problem-focused approach to the development of complex interventions. The purpose of this case report is to demonstrate the application of an Intervention Mapping approach to the development of a complex physical therapist intervention, a remote self-management program aimed at increasing physical activity after acquired brain injury. Intervention Mapping consists of 6 steps to guide the development of complex interventions: (1) needs assessment; (2) identification of outcomes, performance objectives, and change objectives; (3) selection of theory-based intervention methods and practical applications; (4) organization of methods and applications into an intervention program; (5) creation of an implementation plan; and (6) generation of an evaluation plan. The rationale and detailed description of this process are presented using an example of the development of a novel and complex physical therapist intervention, myMoves-a program designed to help individuals with an acquired brain injury to change their physical activity behavior. The Intervention Mapping framework may be useful in the development of complex physical therapist interventions, ensuring the development is comprehensive, systematic, and thorough, with a sound theoretical basis. This process facilitates translation into clinical practice and allows for greater confidence and transparency when the program efficacy is investigated. © 2016 American Physical Therapy Association.

  13. Implementing health promotion tools in Australian Indigenous primary health care.

    PubMed

    Percival, Nikki A; McCalman, Janya; Armit, Christine; O'Donoghue, Lynette; Bainbridge, Roxanne; Rowley, Kevin; Doyle, Joyce; Tsey, Komla

    2018-02-01

    In Australia, significant resources have been invested in producing health promotion best-practice guidelines, frameworks and tools (herein referred to as health promotion tools) as a strategy to improve Indigenous health promotion programmes. Yet, there has been very little rigorous implementation research about whether or how health promotion tools are implemented. This paper theorizes the complex processes of health promotion tool implementation in Indigenous comprehensive primary healthcare services. Data were derived from published and grey literature about the development and implementation of four Indigenous health promotion tools. Tools were theoretically sampled to account for the key implementation types described in the literature. Data were analysed using the grounded-theory methods of coding and constant comparison to construct a theoretical implementation model. An Indigenous Health Promotion Tool Implementation Model was developed. Implementation is a social process, whereby researchers, practitioners and community members collectively interacted in creating culturally responsive health promotion toward the common purpose of facilitating empowerment. The implementation of health promotion tools was influenced by the presence of change agents, a commitment to reciprocity, and organizational governance and resourcing. The Indigenous Health Promotion Tool Implementation Model assists in explaining how health promotion tools are implemented and the conditions that influence these actions. Rather than simply developing more health promotion tools, our study suggests that continuous investment in developing conditions that support empowering implementation processes is required to maximize the beneficial impacts and effectiveness of health promotion tools. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  14. Health information systems: a survey of frameworks for developing countries.

    PubMed

    Marcelo, A B

    2010-01-01

    The objective of this paper is to perform a survey of excellent research on health information systems (HIS) analysis and design, and their underlying theoretical frameworks. It classifies these frameworks along major themes, and analyzes the different approaches to HIS development that are practical in resource-constrained environments. Literature review based on PubMed citations and conference proceedings, as well as Internet searches on information systems in general, and health information systems in particular. The field of health information systems development has been studied extensively. Despite this, failed implementations are still common. Theoretical frameworks for HIS development are available that can guide implementers. As awareness, acceptance, and demand for health information systems increase globally, the variety of approaches and strategies will also follow. For developing countries with scarce resources, a trial-and-error approach can be very costly. Lessons from the successes and failures of initial HIS implementations have been abstracted into theoretical frameworks. These frameworks organize complex HIS concepts into methodologies that standardize techniques in implementation. As globalization continues to impact healthcare in the developing world, demand for more responsive health systems will become urgent. More comprehensive frameworks and practical tools to guide HIS implementers will be imperative.

  15. An Uncertainty Quantification Framework for Prognostics and Condition-Based Monitoring

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Goebel, Kai

    2014-01-01

    This paper presents a computational framework for uncertainty quantification in prognostics in the context of condition-based monitoring of aerospace systems. The different sources of uncertainty and the various uncertainty quantification activities in condition-based prognostics are outlined in detail, and it is demonstrated that the Bayesian subjective approach is suitable for interpreting uncertainty in online monitoring. A state-space model-based framework for prognostics, that can rigorously account for the various sources of uncertainty, is presented. Prognostics consists of two important steps. First, the state of the system is estimated using Bayesian tracking, and then, the future states of the system are predicted until failure, thereby computing the remaining useful life of the system. The proposed framework is illustrated using the power system of a planetary rover test-bed, which is being developed and studied at NASA Ames Research Center.
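
    A minimal sketch of the two steps follows, under a toy linear degradation model with an uncertain decay rate. The model, the numbers, and the simple multinomial-resampling particle filter are illustrative assumptions, not the rover test-bed's actual models.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_particles, h_fail = 5000, 0.2

    # State: health h and uncertain decay rate r; h[t+1] = h[t] - r + noise.
    h = np.full(n_particles, 1.0)
    r = rng.normal(0.01, 0.003, n_particles)

    def assimilate(h, r, z, meas_std=0.02):
        """One Bayesian tracking step: propagate, weight by likelihood, resample."""
        h = h - r + rng.normal(0.0, 0.005, n_particles)
        w = np.exp(-0.5 * ((z - h) / meas_std) ** 2)
        idx = rng.choice(n_particles, n_particles, p=w / w.sum())
        return h[idx], r[idx]

    # Step 1: estimate the state from noisy health measurements.
    for z in [0.99, 0.97, 0.96, 0.95, 0.94]:
        h, r = assimilate(h, r, z)

    # Step 2: predict remaining useful life (RUL) per particle, i.e. the number
    # of steps until h crosses h_fail (future process noise neglected here).
    rul = np.maximum((h - h_fail) / np.maximum(r, 1e-6), 0.0)
    print("RUL median:", np.median(rul))
    print("90% credible interval:", np.percentile(rul, [5, 95]))
    ```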

  16. Conceptual models for cumulative risk assessment.

    PubMed

    Linder, Stephen H; Sexton, Ken

    2011-12-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects.

  17. Conceptual Models for Cumulative Risk Assessment

    PubMed Central

    Sexton, Ken

    2011-01-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive “family” of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects. PMID:22021317

  18. Do behavioral scientists really understand HIV-related sexual risk behavior? A systematic review of longitudinal and experimental studies predicting sexual behavior.

    PubMed

    Huebner, David M; Perry, Nicholas S

    2015-10-01

    Behavioral interventions to reduce sexual risk behavior depend on strong health behavior theory. By identifying the psychosocial variables that lead causally to sexual risk, theories provide interventionists with a guide for how to change behavior. However, empirical research is critical to determining whether a particular theory adequately explains sexual risk behavior. A large body of cross-sectional evidence, which has been reviewed elsewhere, supports the notion that certain theory-based constructs (e.g., self-efficacy) are correlates of sexual behavior. However, given the limitations of inferring causality from correlational research, it is essential that we review the evidence from more methodologically rigorous studies (i.e., longitudinal and experimental designs). This systematic review identified 44 longitudinal studies in which investigators attempted to predict sexual risk from psychosocial variables over time. We also found 134 experimental studies (i.e., randomized controlled trials of HIV interventions), but of these only 9 (6.7 %) report the results of mediation analyses that might provide evidence for the validity of health behavior theories in predicting sexual behavior. Results show little convergent support across both types of studies for most traditional, theoretical predictors of sexual behavior. This suggests that the field must expand the body of empirical work that utilizes the most rigorous study designs to test our theoretical assumptions. The inconsistent results of existing research would indicate that current theoretical models of sexual risk behavior are inadequate, and may require expansion or adaptation.

  19. Bringing scientific rigor to community-developed programs in Hong Kong.

    PubMed

    Fabrizio, Cecilia S; Hirschmann, Malia R; Lam, Tai Hing; Cheung, Teresa; Pang, Irene; Chan, Sophia; Stewart, Sunita M

    2012-12-31

    This paper describes efforts to generate evidence for community-developed programs to enhance family relationships in the Chinese culture of Hong Kong, within the framework of community-based participatory research (CBPR). The CBPR framework was applied to help maximize the development of the intervention and the public health impact of the studies, while enhancing the capabilities of the social service sector partners. Four academic-community research teams explored the process of designing and implementing randomized controlled trials in the community. In addition to the expected cultural barriers between teams of academics and community practitioners, with their different outlooks, concerns and languages, the team navigated issues in utilizing the principles of CBPR unique to this Chinese culture. Eventually the team developed tools for adaptation, such as an emphasis on building the relationship while respecting role delineation and an iterative process of defining the non-negotiable parameters of research design while maintaining scientific rigor. Lessons learned include the risk of underemphasizing the size of the operational and skills shift between usual agency practices and research studies, the importance of minimizing non-negotiable parameters in implementing rigorous research designs in the community, and the need to view community capacity enhancement as a long-term process. The four pilot studies under the FAMILY Project demonstrated that nuanced design adaptations, such as wait-list controls and shorter assessments, better served the needs of the community and led to the successful development and rigorous evaluation of a series of preventive, family-oriented interventions in the Chinese culture of Hong Kong.

  20. Optimal Experimental Design of Borehole Locations for Bayesian Inference of Past Ice Sheet Surface Temperatures

    NASA Astrophysics Data System (ADS)

    Davis, A. D.; Huan, X.; Heimbach, P.; Marzouk, Y.

    2017-12-01

    Borehole data are essential for calibrating ice sheet models. However, field expeditions for acquiring borehole data are often time-consuming, expensive, and dangerous. It is thus essential to plan the best sampling locations that maximize the value of data while minimizing costs and risks. We present an uncertainty quantification (UQ) workflow based on a rigorous probabilistic framework to achieve these objectives. First, we employ an optimal experimental design (OED) procedure to compute borehole locations that yield the highest expected information gain. We take into account practical considerations of location accessibility (e.g., proximity to research sites, terrain, and ice velocity may affect feasibility of drilling) and robustness (e.g., real-time constraints such as weather may force researchers to drill at sub-optimal locations near those originally planned) by incorporating a penalty reflecting accessibility as well as sensitivity to deviations from the optimal locations. Next, we extract vertical temperature profiles from these boreholes and formulate a Bayesian inverse problem to reconstruct past surface temperatures. Using a model of temperature advection/diffusion, the top boundary condition (corresponding to surface temperatures) is calibrated via efficient Markov chain Monte Carlo (MCMC). The overall procedure can then be iterated to choose new optimal borehole locations for the next expeditions. Through this work, we demonstrate powerful UQ methods for designing experiments, calibrating models, making predictions, and assessing sensitivity, all performed under an uncertain environment. We develop a theoretical framework as well as practical software within an intuitive workflow, and illustrate their usefulness for combining data and models for environmental and climate research.
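
    In the special case of a linear forward model with Gaussian prior and noise, the expected information gain of a candidate design d has the closed form EIG(d) = 0.5 log det(I + σ⁻² G(d) Σ G(d)ᵀ), which makes the location-selection step easy to sketch. The forward model, the accessibility penalty, and all parameter values below are hypothetical placeholders, not the study's actual ice sheet physics.

    ```python
    import numpy as np

    n_t, sigma, lam = 20, 0.1, 0.5
    # Squared-exponential prior covariance over past surface temperatures.
    t = np.arange(n_t)
    prior_cov = np.exp(-0.5 * (np.subtract.outer(t, t) / 3.0) ** 2)

    def forward_row(d):
        """Hypothetical linear observation: deeper sites average more history."""
        w = np.exp(-t / (1.0 + 5.0 * d))
        return w / w.sum()

    def expected_information_gain(d):
        G = forward_row(d)[None, :]                        # one borehole reading
        M = np.eye(1) + (G @ prior_cov @ G.T) / sigma**2
        return 0.5 * np.linalg.slogdet(M)[1]

    def accessibility_penalty(d):
        return d ** 2              # placeholder: remoter/deeper sites cost more

    candidates = np.linspace(0.0, 1.0, 21)
    utility = [expected_information_gain(d) - lam * accessibility_penalty(d)
               for d in candidates]
    print("best candidate location:", candidates[int(np.argmax(utility))])
    ```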

  1. Towards a Grand Unified Theory of sports performance.

    PubMed

    Glazier, Paul S

    2017-12-01

    Sports performance is generally considered to be governed by a range of interacting physiological, biomechanical, and psychological variables, amongst others. Despite sports performance being multi-factorial, however, the majority of performance-oriented sports science research has been monodisciplinary in nature, presumably due, at least in part, to the lack of a unifying theoretical framework required to integrate the various subdisciplines of sports science. In this target article, I propose a Grand Unified Theory (GUT) of sports performance (and, by elaboration, of sports science) based around the constraints framework introduced originally by Newell (1986). A central tenet of this GUT is that, at both the intra- and inter-individual levels of analysis, patterns of coordination and control, which directly determine the performance outcome, emerge from the confluence of interacting organismic, environmental, and task constraints via the formation and self-organisation of coordinative structures. It is suggested that this GUT could be used to: foster interdisciplinary research collaborations; break down the silos that have developed in sports science and restore greater disciplinary balance to the field; promote a more holistic understanding of sports performance across all levels of analysis; increase the explanatory power of applied research work; provide a stronger rationale for data collection and variable selection; and direct the development of integrated performance monitoring technologies. This GUT could also provide a scientifically rigorous basis for integrating the subdisciplines of sports science in applied sports science support programmes adopted by high-performance agencies and national governing bodies for various individual and team sports. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Robust flow stability: Theory, computations and experiments in near wall turbulence

    NASA Astrophysics Data System (ADS)

    Bobba, Kumar Manoj

    Helmholtz established the field of hydrodynamic stability with his pioneering work in 1868. From then on, hydrodynamic stability became an important tool in understanding various fundamental fluid flow phenomena in engineering (mechanical, aeronautics, chemical, materials, civil, etc.) and science (astrophysics, geophysics, biophysics, etc.), and turbulence in particular. However, there are many discrepancies between classical hydrodynamic stability theory and experiments. In this thesis, the limitations of traditional hydrodynamic stability theory are shown and a framework for robust flow stability theory is formulated. A host of new techniques, such as gramians, singular values, and operator norms, is introduced to understand the role of various kinds of uncertainty. An interesting feature of this framework is the close interplay between theory and computations. It is shown that a subset of the Navier-Stokes equations is globally, nonlinearly stable for all Reynolds numbers. Yet, invoking this new theory, it is shown that these equations produce structures (vortices and streaks) as seen in the experiments. The experiments are done in a zero-pressure-gradient transitional boundary layer on a flat plate in a free-surface tunnel. Digital particle image velocimetry and MEMS-based laser Doppler velocimeters and shear stress sensors have been used to make quantitative measurements of the flow. Various theoretical and computational predictions are in excellent agreement with the experimental data. A closely related topic, the modeling, simulation, and complexity reduction of large mechanics problems with multiple spatial and temporal scales, is also studied. A method is introduced that rigorously quantifies the important scales and automatically gives models of the problem at various levels of accuracy.

  3. Learning Physical Domains: Toward a Theoretical Framework.

    ERIC Educational Resources Information Center

    Forbus, Kenneth D.; Gentner, Dedre

    People use and extend their knowledge of the physical world constantly. Understanding how this fluency is achieved would be an important milestone in understanding human learning and intelligence, as well as a useful guide for constructing machines that learn. This paper presents a theoretical framework that is being developed in an attempt to…

  4. Memory and the Self in Autism: A Review and Theoretical Framework

    ERIC Educational Resources Information Center

    Lind, Sophie E.

    2010-01-01

    This article reviews research on (a) autobiographical episodic and semantic memory, (b) the self-reference effect, (c) memory for the actions of self versus other (the self-enactment effect), and (d) non-autobiographical episodic memory in autism spectrum disorder (ASD), and provides a theoretical framework to account for the bidirectional…

  5. A general theoretical framework for decoherence in open and closed systems

    NASA Astrophysics Data System (ADS)

    Castagnino, Mario; Fortin, Sebastian; Laura, Roberto; Lombardi, Olimpia

    2008-08-01

    A general theoretical framework for decoherence is proposed, which encompasses formalisms originally devised to deal just with open or closed systems. The conditions for decoherence are clearly stated and the relaxation and decoherence times are compared. Finally, the spin-bath model is developed in detail from the new perspective.

  6. 21st Century Pedagogical Content Knowledge and Science Teaching and Learning

    ERIC Educational Resources Information Center

    Slough, Scott; Chamblee, Gregory

    2017-01-01

    Technological Pedagogical Content Knowledge (TPACK) is a theoretical framework that has enjoyed widespread applications as it applies to the integration of technology in the teaching and learning process. This paper reviews the background for TPACK, discusses some of its limitations, and reviews and introduces a new theoretical framework, 21st…

  7. Analysing Theoretical Frameworks of Moral Education through Lakatos's Philosophy of Science

    ERIC Educational Resources Information Center

    Han, Hyemin

    2014-01-01

    The structure of studies of moral education is basically interdisciplinary; it includes moral philosophy, psychology, and educational research. This article systematically analyses the structure of studies of moral educational from the vantage points of philosophy of science. Among the various theoretical frameworks in the field of philosophy of…

  8. Applying the Grossman et al. Theoretical Framework: The Case of Reading

    ERIC Educational Resources Information Center

    Kucan, Linda; Palincsar, Annemarie Sullivan; Busse, Tracy; Heisey, Natalie; Klingelhofer, Rachel; Rimbey, Michelle; Schutz, Kristine

    2011-01-01

    Background/Context: This article describes the application of the theoretical framework proposed by Grossman and her colleagues to a research effort focusing on text-based discussion as a context for comprehension instruction. According to Grossman and her colleagues, a useful way to consider the teaching of complex practices to candidates is to…

  9. Unifying Different Theories of Learning: Theoretical Framework and Empirical Evidence

    ERIC Educational Resources Information Center

    Phan, Huy Phuong

    2008-01-01

    The main aim of this research study was to test out a conceptual model encompassing the theoretical frameworks of achievement goals, study processing strategies, effort, and reflective thinking practice. In particular, it was postulated that the causal influences of achievement goals on academic performance are direct and indirect through study…

  10. Internet Use and Cognitive Development: A Theoretical Framework

    ERIC Educational Resources Information Center

    Johnson, Genevieve

    2006-01-01

    The number of children and adolescents accessing the Internet as well as the amount of time online are steadily increasing. The most common online activities include playing video games, accessing web sites, and communicating via chat rooms, email, and instant messaging. A theoretical framework for understanding the effects of Internet use on…

  11. Growth in Mathematical Understanding While Learning How To Teach: A Theoretical Perspective.

    ERIC Educational Resources Information Center

    Cavey, Laurie O.

    This theoretical paper outlines a conceptual framework for examining growth in prospective teachers' mathematical understanding as they engage in thinking about and planning for the mathematical learning of others. The framework is based on the Pirie-Kieren (1994) Dynamical Theory for the Growth of Mathematical Understanding and extends into the…

  12. Design-Based Research: Case of a Teaching Sequence on Mechanics

    ERIC Educational Resources Information Center

    Tiberghien, Andree; Vince, Jacques; Gaidioz, Pierre

    2009-01-01

    Design-based research, and particularly its theoretical status, is a subject of debate in the science education community. In the first part of this paper, a theoretical framework drawn up to develop design-based research will be presented. This framework is mainly based on epistemological analysis of physics modelling, learning and teaching…

  13. Proposing a Theoretical Framework for Digital Age Youth Information Behavior Building upon Radical Change Theory

    ERIC Educational Resources Information Center

    Koh, Kyungwon

    2011-01-01

    Contemporary young people are engaged in a variety of information behaviors, such as information seeking, using, sharing, and creating. The ways youth interact with information have transformed in the shifting digital information environment; however, relatively little empirical research exists and no theoretical framework adequately explains…

  14. The Influence of the Pedagogical Content Knowledge Theoretical Framework on Research on Preservice Teacher Education

    ERIC Educational Resources Information Center

    Mecoli, Storey

    2013-01-01

    Pedagogical Content Knowledge, Lee S. Shulman's theoretical framework, has had a substantial influence on research in preservice teacher education, and consequently, schools of education. This review builds from Grossman's case studies that concluded that beginning teachers provided with excellent teacher education developed more substantial PCK…

  15. "Theorizing Teacher Mobility": A Critical Review of Literature

    ERIC Educational Resources Information Center

    Vagi, Robert; Pivovarova, Margarita

    2017-01-01

    In this critical review of literature, we summarize the major theoretical frameworks that have been used to study teacher mobility. In total we identified 40 teacher mobility studies that met our inclusion criteria. We conclude that relatively few theoretical frameworks have been used to study teacher mobility and those that have been used are…

  16. Utilizing the Theoretical Framework of Collective Identity to Understand Processes in Youth Programs

    ERIC Educational Resources Information Center

    Futch, Valerie A.

    2016-01-01

    This article explores collective identity as a useful theoretical framework for understanding social and developmental processes that occur in youth programs. Through narrative analysis of past participant interviews (n = 21) from an after-school theater program, known as "The SOURCE", it was found that participants very clearly describe…

  17. A Holistic Theoretical Approach to Intellectual Disability: Going beyond the Four Current Perspectives

    ERIC Educational Resources Information Center

    Schalock, Robert L.; Luckasson, Ruth; Tassé, Marc J.; Verdugo, Miguel Angel

    2018-01-01

    This article describes a holistic theoretical framework that can be used to explain intellectual disability (ID) and organize relevant information into a usable roadmap to guide understanding and application. Developing the framework involved analyzing the four current perspectives on ID and synthesizing this information into a holistic…

  18. Theoretical frameworks used to discuss ethical issues in private physiotherapy practice and proposal of a new ethical tool.

    PubMed

    Drolet, Marie-Josée; Hudon, Anne

    2015-02-01

    In the past, several researchers in the field of physiotherapy have asserted that physiotherapy clinicians rarely use ethical knowledge to solve ethical issues raised by their practice. Does this assertion still hold true? Do the theoretical frameworks used by researchers and clinicians allow them to analyze thoroughly the ethical issues they encounter in their everyday practice? In our quest for answers, we conducted a literature review and analyzed the ethical theoretical frameworks used by physiotherapy researchers and clinicians to discuss the ethical issues raised by private physiotherapy practice. Our final analysis corpus consisted of thirty-nine texts. Our main finding is that researchers and clinicians in physiotherapy rarely use ethical knowledge to analyze the ethical issues raised in their practice and that gaps exist in the theoretical frameworks currently used to analyze these issues. Consequently, we developed, for ethical analysis, a four-part prism which we have called the Quadripartite Ethical Tool (QET). This tool can be incorporated into existing theoretical frameworks to enable professionals to integrate ethical knowledge into their ethical analyses. The innovative particularity of the QET is that it encompasses three ethical theories (utilitarianism, deontologism, and virtue ethics) and axiological ontology (professional values) and also draws on both deductive and inductive approaches. It is our hope that this new tool will help researchers and clinicians integrate ethical knowledge into their analysis of ethical issues and contribute to fostering ethical analyses that are grounded in relevant philosophical and axiological foundations.

  19. Can we Build on Social Movement Theories to Develop and Improve Community-Based Participatory Research? A Framework Synthesis Review.

    PubMed

    Tremblay, Marie-Claude; Martin, Debbie H; Macaulay, Ann C; Pluye, Pierre

    2017-06-01

    A long-standing challenge in community-based participatory research (CBPR) has been to anchor practice and evaluation in a relevant and comprehensive theoretical framework of community change. This study describes the development of a multidimensional conceptual framework that builds on social movement theories to identify key components of CBPR processes. Framework synthesis was used as a general literature search and analysis strategy. An initial conceptual framework was developed from the theoretical literature on social movement. A literature search performed to identify illustrative CBPR projects yielded 635 potentially relevant documents, from which eight projects (corresponding to 58 publications) were retained after record and full-text screening. Framework synthesis was used to code and organize data from these projects, ultimately providing a refined framework. The final conceptual framework maps key concepts of CBPR mobilization processes, such as the pivotal role of the partnership; resources and opportunities as necessary components feeding the partnership's development; the importance of framing processes; and a tight alignment between the cause (partnership's goal), the collective action strategy, and the system changes targeted. The revised framework provides a context-specific model to generate a new, innovative understanding of CBPR mobilization processes, drawing on existing theoretical foundations.

  20. Optimality conditions for the numerical solution of optimization problems with PDE constraints :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilo Valentin, Miguel Alejandro; Ridzal, Denis

    2014-03-01

    A theoretical framework for the numerical solution of partial differential equation (PDE) constrained optimization problems is presented in this report. This theoretical framework embodies the fundamental infrastructure required to efficiently implement and solve this class of problems. Detailed derivations of the optimality conditions required to accurately solve several parameter identification and optimal control problems are also provided in this report. This will allow the reader to further understand how the theoretical abstraction presented in this report translates to the application.
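
    As context for the record above, the first-order optimality system that such frameworks typically derive can be sketched in generic notation (standard textbook material, not taken from the report). For a state u, a control or parameter z, an objective J, and a PDE constraint written abstractly as c(u, z) = 0, the Lagrangian is

        L(u, z, \lambda) = J(u, z) + \langle \lambda, c(u, z) \rangle,

    and setting its partial derivatives to zero yields the coupled optimality system

        c(u, z) = 0                               (state equation)
        J_u(u, z) + c_u(u, z)^{*} \lambda = 0     (adjoint equation)
        J_z(u, z) + c_z(u, z)^{*} \lambda = 0     (design equation),

    which parameter identification and optimal control solvers then attack either all at once or through nested state and adjoint solves.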

  1. A cognitive framework for analyzing and describing introductory students' use and understanding of mathematics in physics

    NASA Astrophysics Data System (ADS)

    Tuminaro, Jonathan

    Many introductory, algebra-based physics students perform poorly on mathematical problem solving tasks in physics. There are at least two possible, distinct reasons for this poor performance: (1) students simply lack the mathematical skills needed to solve problems in physics, or (2) students do not know how to apply the mathematical skills they have to particular problem situations in physics. While many students do lack the requisite mathematical skills, a major finding from this work is that the majority of students possess the requisite mathematical skills, yet fail to use or interpret them in the context of physics. In this thesis I propose a theoretical framework to analyze and describe students' mathematical thinking in physics. In particular, I attempt to answer two questions. What are the cognitive tools involved in formal mathematical thinking in physics? And, why do students make the kinds of mistakes they do when using mathematics in physics? According to the proposed theoretical framework there are three major theoretical constructs: mathematical resources, which are the knowledge elements that are activated in mathematical thinking and problem solving; epistemic games, which are patterns of activities that use particular kinds of knowledge to create new knowledge or solve a problem; and frames, which are structures of expectations that determine how individuals interpret situations or events. The empirical basis for this study comes from videotaped sessions of college students solving homework problems. The students are enrolled in an algebra-based introductory physics course. The videotapes were transcribed and analyzed using the aforementioned theoretical framework. Two important results from this work are: (1) the construction of a theoretical framework that offers researchers a vocabulary (ontological classification of cognitive structures) and grammar (relationship between the cognitive structures) for understanding the nature and origin of mathematical use in the context of physics, and (2) a detailed understanding, in terms of the proposed theoretical framework, of the errors that students make when using mathematics in the context of physics.

  2. Field-widened Michelson interferometer for spectral discrimination in high-spectral-resolution lidar: theoretical framework.

    PubMed

    Cheng, Zhongtao; Liu, Dong; Luo, Jing; Yang, Yongying; Zhou, Yudi; Zhang, Yupeng; Duan, Lulin; Su, Lin; Yang, Liming; Shen, Yibing; Wang, Kaiwei; Bai, Jian

    2015-05-04

    A field-widened Michelson interferometer (FWMI) is developed to act as the spectral discriminator in high-spectral-resolution lidar (HSRL). This realization is motivated by the wide-angle Michelson interferometer (WAMI), which has been used broadly in atmospheric wind and temperature detection. This paper describes, for the first time, an independent theoretical framework for the application of the FWMI in HSRL. Within the framework, the operating principles and application requirements of the FWMI are discussed in comparison with those of the WAMI. Theoretical foundations for designing this type of interferometer are introduced based on these comparisons. Moreover, a general performance estimation model for the FWMI is established, which provides common guidelines for the performance budget and evaluation of the FWMI in both the design and operation stages. Examples incorporating many practical imperfections or conditions that may degrade the performance of the FWMI are given to illustrate the implementation of the modeling. This theoretical framework presents a complete and powerful tool for solving most of the theoretical and engineering problems encountered in FWMI applications, including design, parameter calibration, prior performance budget, posterior performance estimation, and so on. It should be a valuable contribution to the lidar community in developing a new generation of HSRLs based on the FWMI spectroscopic filter.
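
    As textbook background for the record above (an idealized sketch, not the paper's performance model), a Michelson interferometer with a fixed optical path difference \Delta acts as the periodic spectral filter

        T(\nu) = \frac{1}{2}\left[1 + \cos\left(\frac{2\pi\nu\Delta}{c}\right)\right].

    Tuning a transmission peak onto the laser frequency passes the narrowband aerosol (Mie) return while partially rejecting the Doppler-broadened molecular (Rayleigh) return spread across the fringe; field widening keeps the effective \Delta nearly constant over ray angle, so this discrimination survives a wide field of view.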

  3. Developmental engineering: a new paradigm for the design and manufacturing of cell-based products. Part II: from genes to networks: tissue engineering from the viewpoint of systems biology and network science.

    PubMed

    Lenas, Petros; Moos, Malcolm; Luyten, Frank P

    2009-12-01

    The field of tissue engineering is moving toward a new concept of "in vitro biomimetics of in vivo tissue development." In Part I of this series, we proposed a theoretical framework integrating the concepts of developmental biology with those of process design to provide the rules for the design of biomimetic processes. We named this methodology "developmental engineering" to emphasize that it is not the tissue but the process of in vitro tissue development that has to be engineered. To formulate the process design rules in a rigorous way that will allow a computational design, we should refer to mathematical methods to model the biological process taking place in vitro. Tissue functions cannot be attributed to individual molecules but rather to complex interactions between the numerous components of a cell and interactions between cells in a tissue that form a network. For tissue engineering to advance to the level of a technologically driven discipline amenable to well-established principles of process engineering, a scientifically rigorous formulation is needed of the general design rules so that the behavior of networks of genes, proteins, or cells that govern the unfolding of developmental processes could be related to the design parameters. Now that sufficient experimental data exist to construct plausible mathematical models of many biological control circuits, explicit hypotheses can be evaluated using computational approaches to facilitate process design. Recent progress in systems biology has shown that the empirical concepts of developmental biology that we used in Part I to extract the rules of biomimetic process design can be expressed in rigorous mathematical terms. This allows the accurate characterization of manufacturing processes in tissue engineering as well as the properties of the artificial tissues themselves. In addition, network science has recently shown that the behavior of biological networks strongly depends on their topology and has developed the necessary concepts and methods to describe it, allowing therefore a deeper understanding of the behavior of networks during biomimetic processes. These advances thus open the door to a transition for tissue engineering from a substantially empirical endeavor to a technology-based discipline comparable to other branches of engineering.

  4. A theoretical framework for psychiatric nursing practice.

    PubMed

    Onega, L L

    1991-01-01

    Traditionally, specific theoretical frameworks which are congruent with psychiatric nursing practice have been poorly articulated. The purpose of this paper is to identify and discuss a philosophical base, a theoretical framework, application to psychiatric nursing, and issues related to psychiatric nursing knowledge development and practice. A philosophical framework that is likely to be congruent with psychiatric nursing, based on the nature of human beings, health, psychiatric nursing and reality, is identified. Aaron Antonovsky's Salutogenic Model is discussed and applied to psychiatric nursing. This model provides a helpful way for psychiatric nurses to organize their thinking processes and ultimately improve the health care services that they offer to their clients. Goal setting and nursing interventions using this model are discussed. Additionally, Antonovsky's model is applied to nursing research areas such as hardiness, uncertainty, suffering, empathy and literary works. Finally, specific issues related to psychiatric nursing are addressed.

  5. Conceptualizing and Measuring Working Memory and its Relationship to Aphasia

    PubMed Central

    Wright, Heather Harris; Fergadiotis, Gerasimos

    2011-01-01

    Background General agreement exists in the literature that individuals with aphasia can exhibit a working memory deficit that contributes to their language processing impairments. Though conceptualized within different working memory frameworks, researchers have suggested that individuals with aphasia have limited working memory capacity, impaired attention-control processes as well as impaired inhibitory mechanisms. However, across studies investigating working memory ability in individuals with aphasia, different measures have been used to quantify their working memory ability and identify the relationship between working memory and language performance. Aims The primary objectives of this article are to (1) review current working memory theoretical frameworks, (2) review tasks used to measure working memory, and (3) discuss findings from studies that have investigated working memory as they relate to language processing in aphasia. Main Contribution Though findings have been consistent across studies investigating working memory ability in individuals with aphasia, discussion of how working memory is conceptualized and defined is often missing, as is discussion of results within a theoretical framework. This is critical, as working memory is conceptualized differently across the different theoretical frameworks. They differ in explaining what limits capacity and the source of individual differences as well as how information is encoded, maintained, and retrieved. When test methods are considered within a theoretical framework, specific hypotheses can be tested and stronger conclusions that are less susceptible to different interpretations can be made. Conclusions Working memory ability has been investigated in numerous studies with individuals with aphasia. To better understand the underlying cognitive constructs that contribute to the language deficits exhibited by individuals with aphasia, future investigations should operationally define the cognitive constructs of interest and discuss findings within theoretical frameworks. PMID:22639480

  6. Cultural Competence in the Treatment of Addictions: Theory, Practice and Evidence.

    PubMed

    Gainsbury, Sally M

    2017-07-01

    Culturally and linguistically diverse (CALD) populations often have high rates of addictive disorders, but lower rates of treatment seeking and completion than the mainstream population. A significant barrier to treatment is the lack of culturally relevant and appropriate treatment. A literature review was conducted to identify relevant literature related to cultural competence in mental health services delivery and specifically treatment for addictive disorders. Several theoretical models of cultural competence in therapy have been developed, but the lack of rigorous research limits the empirical evidence available. Research indicates that culturally competent treatment practices, including providing therapy and materials in the client's language, knowledge, understanding and appreciation for cultural perspectives and nuances, involving the wider family and community, and training therapists, can enhance client engagement, retention and treatment outcomes for substance use and gambling. Further methodologically rigorous research is needed to isolate the impact of cultural competence for the treatment of addictions and guide research to determine treatment efficacy within specific CALD populations. Training therapists and recruiting therapists and researchers from CALD communities is important to ensure an ongoing focus and improved outcomes for CALD populations due to the importance of engaging these populations with addiction treatment. Key Practitioner Message: The treatment needs of culturally diverse individuals with addictions are often not met. Theoretical models can guide therapists in incorporating cultural competence. Culturally targeted treatments increase recruitment, retention and treatment outcomes. Cultural competence includes matching clinicians and clients on linguistic and cultural backgrounds as well as being mindful of the impact of culture on clients' experience of addiction problems. Few methodologically rigorous trials have been conducted to guide treatment practices, and research needs to be incorporated into existing culturally relevant treatment services.

  7. Sensemaking in a Value Based Context for Large Scale Complex Engineered Systems

    NASA Astrophysics Data System (ADS)

    Sikkandar Basha, Nazareen

    The design and development of Large-Scale Complex Engineered Systems (LSCES) requires the involvement of multiple teams, numerous levels of the organization, and interactions with large numbers of people and interdisciplinary departments. Traditionally, requirements-driven Systems Engineering (SE) is used in the design and development of these LSCES, with requirements capturing stakeholder preferences for the system. Due to the complexity of the system, multiple levels of interaction within the organization are required to elicit its requirements. Since LSCES involve people and interactions between teams and interdisciplinary departments, they are socio-technical in nature. Requirements elicitation in most large-scale system projects is subject to creep in time and cost, owing to the uncertainty and ambiguity of requirements during design and development. In an organizational structure, cost and time overruns can occur at any level and iterate back and forth, compounding both. Past research has shown that rigorous approaches, such as value-based design, can be used to control such creep. Before these rigorous approaches can be applied, however, the decision maker needs a proper understanding of requirements creep and of the state of the system when the creep occurs. Sensemaking is used to understand the state of the system when creep occurs and to provide guidance to the decision maker. This research proposes the use of the Cynefin framework, a sensemaking framework, in the design and development of LSCES. It can aid understanding of the system and support decision making to minimize the value gap due to requirements creep by eliminating the ambiguity that arises during design and development. A sample hierarchical organization is used to demonstrate the state of the system at the occurrence of requirements creep, in terms of cost and time, using the Cynefin framework. These trials are repeated for different requirements and at different sub-system levels. The results show that the Cynefin framework can improve the value of the system and can be used for predictive analysis. Decision makers can use these findings, together with rigorous approaches, to improve the design of Large-Scale Complex Engineered Systems.

  8. MIDER: Network Inference with Mutual Information Distance and Entropy Reduction

    PubMed Central

    Villaverde, Alejandro F.; Ross, John; Morán, Federico; Banga, Julio R.

    2014-01-01

    The prediction of links among variables from a given dataset is a task referred to as network inference or reverse engineering. It is an open problem in bioinformatics and systems biology, as well as in other areas of science. Information theory, which uses concepts such as mutual information, provides a rigorous framework for addressing it. While a number of information-theoretic methods are already available, most of them focus on a particular type of problem, introducing assumptions that limit their generality. Furthermore, many of these methods lack a publicly available implementation. Here we present MIDER, a method for inferring network structures with information theoretic concepts. It consists of two steps: first, it provides a representation of the network in which the distance among nodes indicates their statistical closeness. Second, it refines the prediction of the existing links to distinguish between direct and indirect interactions and to assign directionality. The method accepts as input time-series data related to some quantitative features of the network nodes (such as concentrations, if the nodes are chemical species). It takes into account time delays between variables, and allows choosing among several definitions and normalizations of mutual information. It is general purpose: it may be applied to any type of network, cellular or otherwise. A Matlab implementation including source code and data is freely available (http://www.iim.csic.es/~gingproc/mider.html). The performance of MIDER has been evaluated on seven different benchmark problems that cover the main types of cellular networks, including metabolic, gene regulatory, and signaling. Comparisons with state-of-the-art information-theoretic methods have demonstrated the competitive performance of MIDER, as well as its versatility. Its use does not demand any a priori knowledge from the user; the default settings and the adaptive nature of the method provide good results for a wide range of problems without requiring tuning. PMID:24806471
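
    MIDER itself ships as a Matlab implementation (see the record above). Purely as an illustration of its first, information-theoretic step, here is a minimal Python sketch, with hypothetical variable names and a simple histogram estimator of mutual information:

        import numpy as np

        def mutual_information(x, y, bins=10):
            """Histogram estimate of the mutual information I(X;Y), in nats."""
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = pxy / pxy.sum()                  # joint distribution
            px = pxy.sum(axis=1, keepdims=True)    # marginal of X
            py = pxy.sum(axis=0, keepdims=True)    # marginal of Y
            nz = pxy > 0                           # avoid log(0)
            return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

        # Toy time series: z is driven by x; y is independent noise.
        rng = np.random.default_rng(0)
        x = rng.normal(size=2000)
        y = rng.normal(size=2000)
        z = x + 0.3 * rng.normal(size=2000)

        # Ranking pairs by mutual information: the (x, z) link stands out.
        print(mutual_information(x, z), mutual_information(y, z))

    An entropy-based distance such as d(X, Y) = H(X, Y) - I(X; Y) can then place statistically close nodes near each other, which is the representation step the abstract describes.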

  10. Review of finite fields: Applications to discrete Fourier transforms and Reed-Solomon coding

    NASA Technical Reports Server (NTRS)

    Wong, J. S. L.; Truong, T. K.; Benjauthrit, B.; Mulhall, B. D. L.; Reed, I. S.

    1977-01-01

    An attempt is made to provide a step-by-step approach to the subject of finite fields. Rigorous proofs and highly theoretical materials are avoided. The simple concepts of groups, rings, and fields are discussed and developed more or less heuristically. Examples are used liberally to illustrate the meaning of definitions and theories. Applications include discrete Fourier transforms and Reed-Solomon coding.
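
    In the same heuristic spirit as the review, the following Python toy (illustrative, not from the report) shows the two facts underlying finite-field Fourier transforms and Reed-Solomon codes: every nonzero element of a prime field GF(p) is invertible, and the nonzero elements form a cyclic group swept out by a generator.

        # Arithmetic in the prime field GF(7).
        p = 7

        def inv(a, p=p):
            # Fermat's little theorem: a^(p-2) is the inverse of a (mod p), a != 0.
            return pow(a, p - 2, p)

        # Every nonzero element has a multiplicative inverse.
        assert all((a * inv(a)) % p == 1 for a in range(1, p))

        # 3 generates the multiplicative group GF(7)*: its powers are 3, 2, 6, 4, 5, 1.
        g = 3
        print([pow(g, k, p) for k in range(1, p)])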

  11. Music-therapy analyzed through conceptual mapping

    NASA Astrophysics Data System (ADS)

    Martinez, Rodolfo; de la Fuente, Rebeca

    2002-11-01

    Conceptual maps have lately been employed as a learning tool, as a modern study technique, and as a new way to understand intelligence, allowing the development of a strong theoretical frame of reference against which to test research hypotheses. This paper presents a music-therapy analysis based on this tool to produce a conceptual mapping network, which ranges from magic through the rigor of the hard sciences.

  12. Developing a targeted, theory-informed implementation intervention using two theoretical frameworks to address health professional and organisational factors: a case study to improve the management of mild traumatic brain injury in the emergency department.

    PubMed

    Tavender, Emma J; Bosch, Marije; Gruen, Russell L; Green, Sally E; Michie, Susan; Brennan, Sue E; Francis, Jill J; Ponsford, Jennie L; Knott, Jonathan C; Meares, Sue; Smyth, Tracy; O'Connor, Denise A

    2015-05-25

    Despite the availability of evidence-based guidelines for the management of mild traumatic brain injury in the emergency department (ED), variations in practice exist. Interventions designed to implement recommended behaviours can reduce this variation. Using theory to inform intervention development is advocated; however, there is no consensus on how to select or apply theory. Integrative theoretical frameworks, based on syntheses of theories and theoretical constructs relevant to implementation, have the potential to assist in the intervention development process. This paper describes the process of applying two theoretical frameworks to investigate the factors influencing recommended behaviours and the choice of behaviour change techniques and modes of delivery for an implementation intervention. A stepped approach was followed: (i) identification of locally applicable and actionable evidence-based recommendations as targets for change, (ii) selection and use of two theoretical frameworks for identifying barriers to and enablers of change (Theoretical Domains Framework and Model of Diffusion of Innovations in Service Organisations) and (iii) identification and operationalisation of intervention components (behaviour change techniques and modes of delivery) to address the barriers and enhance the enablers, informed by theory, evidence and feasibility/acceptability considerations. We illustrate this process in relation to one recommendation, prospective assessment of post-traumatic amnesia (PTA) by ED staff using a validated tool. Four recommendations for managing mild traumatic brain injury were targeted with the intervention. The intervention targeting the PTA recommendation consisted of 14 behaviour change techniques and addressed 6 theoretical domains and 5 organisational domains. The mode of delivery was informed by six Cochrane reviews. It was delivered via five intervention components: (i) local stakeholder meetings, (ii) identification of local opinion leader teams, (iii) a train-the-trainer workshop for appointed local opinion leaders, (iv) local training workshops for delivery by trained local opinion leaders and (v) provision of tools and materials to prompt recommended behaviours. Two theoretical frameworks were used in a complementary manner to inform intervention development in managing mild traumatic brain injury in the ED. The effectiveness and cost-effectiveness of the developed intervention is being evaluated in a cluster randomised trial, part of the Neurotrauma Evidence Translation (NET) program.

  13. A mobile application of breast cancer e-support program versus routine care in the treatment of Chinese women with breast cancer undergoing chemotherapy: study protocol for a randomized controlled trial.

    PubMed

    Zhu, Jiemin; Ebert, Lyn; Liu, Xiangyu; Chan, Sally Wai-Chi

    2017-04-26

    Women with breast cancer undergoing chemotherapy suffer from a number of symptoms and report receiving inadequate support from health care professionals. Innovative and easily accessible interventions are lacking. Breast Cancer e-Support is a mobile application (app) that provides patients with individually tailored information and a support group of peers and health care professionals. Breast Cancer e-Support aims to promote women's self-efficacy, social support and symptom management, thus improving their quality of life and psychological well-being. A single-blinded, multi-centre, randomised, 6-month, parallel-group superiority design will be used. Based on Bandura's self-efficacy theory and the social exchange theory, Breast Cancer e-Support has four modules: 1) a Learning forum; 2) a Discussion forum; 3) an Ask-the-Expert forum; and 4) a Personal Stories forum. Women with breast cancer (n = 108) who are commencing chemotherapy will be recruited from two university-affiliated hospitals in China. They will be randomly assigned to either a control group that receives routine care or an intervention group that receives routine care plus access to the Breast Cancer e-Support program during their four cycles of chemotherapy. Self-efficacy, social support, symptom distress, quality of life, and anxiety and depression will be measured at baseline, then one week and 12 weeks post-intervention. This is the first study of its kind in China to evaluate the use of a mobile application intervention with a rigorous research design and theoretical framework. This study will contribute to evidence regarding the effectiveness of a theory-based mobile application to support women with breast cancer undergoing chemotherapy. The results should provide a better understanding of the role of self-efficacy and social support in reducing symptom distress and of the credibility of using a theoretical framework to develop internet-based interventions. The results will provide evidence to support the implementation of an innovative and easily accessible intervention that enhances health outcomes. ACTRN: ACTRN12616000639426, registered 17 May 2016.

  14. Guidelines for a graph-theoretic implementation of structural equation modeling

    USGS Publications Warehouse

    Grace, James B.; Schoolmaster, Donald R.; Guntenspergen, Glenn R.; Little, Amanda M.; Mitchell, Brian R.; Miller, Kathryn M.; Schweiger, E. William

    2012-01-01

    Structural equation modeling (SEM) is increasingly being chosen by researchers as a framework for gaining scientific insights from the quantitative analyses of data. New ideas and methods emerging from the study of causality, influences from the field of graphical modeling, and advances in statistics are expanding the rigor, capability, and even purpose of SEM. Guidelines for implementing the expanded capabilities of SEM are currently lacking. In this paper we describe new developments in SEM that we believe constitute a third-generation of the methodology. Most characteristic of this new approach is the generalization of the structural equation model as a causal graph. In this generalization, analyses are based on graph theoretic principles rather than analyses of matrices. Also, new devices such as metamodels and causal diagrams, as well as an increased emphasis on queries and probabilistic reasoning, are now included. Estimation under a graph theory framework permits the use of Bayesian or likelihood methods. The guidelines presented start from a declaration of the goals of the analysis. We then discuss how theory frames the modeling process, requirements for causal interpretation, model specification choices, selection of estimation method, model evaluation options, and use of queries, both to summarize retrospective results and for prospective analyses. The illustrative example presented involves monitoring data from wetlands on Mount Desert Island, home of Acadia National Park. Our presentation walks through the decision process involved in developing and evaluating models, as well as drawing inferences from the resulting prediction equations. In addition to evaluating hypotheses about the connections between human activities and biotic responses, we illustrate how the structural equation (SE) model can be queried to understand how interventions might take advantage of an environmental threshold to limit Typha invasions. The guidelines presented provide for an updated definition of the SEM process that subsumes the historical matrix approach under a graph-theory implementation. The implementation is also designed to permit complex specifications and to be compatible with various estimation methods. Finally, they are meant to foster the use of probabilistic reasoning in both retrospective and prospective considerations of the quantitative implications of the results.
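
    As a minimal illustration of the graph-based view described above (a generic sketch with hypothetical variable names, not code from the guidelines), each structural equation can be estimated locally from a node's parents in the causal graph:

        import numpy as np

        # Hypothetical causal graph for a wetland-style example:
        # human_activity -> water_quality -> plant_cover
        rng = np.random.default_rng(1)
        n = 500
        human_activity = rng.normal(size=n)
        water_quality = -0.6 * human_activity + rng.normal(scale=0.5, size=n)
        plant_cover = 0.8 * water_quality + rng.normal(scale=0.5, size=n)

        graph = {  # node -> its parents in the causal diagram
            "water_quality": ["human_activity"],
            "plant_cover": ["water_quality"],
        }
        data = {"human_activity": human_activity,
                "water_quality": water_quality,
                "plant_cover": plant_cover}

        # One ordinary-least-squares fit per structural equation.
        for node, parents in graph.items():
            X = np.column_stack([data[p] for p in parents] + [np.ones(n)])
            coef, *_ = np.linalg.lstsq(X, data[node], rcond=None)
            print(node, dict(zip(parents + ["intercept"], np.round(coef, 2))))

    Estimating node by node against an explicit graph, rather than fitting one covariance matrix, is what makes queries and interventions on individual paths straightforward.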

  15. A Voice-Based E-Examination Framework for Visually Impaired Students in Open and Distance Learning

    ERIC Educational Resources Information Center

    Azeta, Ambrose A.; Inam, Itorobong A.; Daramola, Olawande

    2018-01-01

    Voice-based systems allow users access to information on the internet over a voice interface. Prior studies on Open and Distance Learning (ODL) e-examination systems that make use of a voice interface do not sufficiently exhibit an intelligent form of assessment, which diminishes the rigor of examinations. The objective of this paper is to improve on…

  16. School Accountability, Autonomy, Choice, and the Level of Student Achievement: International Evidence from PISA 2003. OECD Education Working Papers, No. 13

    ERIC Educational Resources Information Center

    Wößmann, Ludger; Lüdemann, Elke; Schütz, Gabriela; West, Martin R.

    2007-01-01

    Accountability, autonomy, and choice play a leading role in recent school reforms in many countries. This report provides new evidence on whether students perform better in school systems that have such institutional measures in place. We implement an internationally comparative approach within a rigorous micro-econometric framework that accounts…

  17. Improvements to Strategic Planning and Implementation through Enhanced Correlation with Decision-Making Frameworks

    ERIC Educational Resources Information Center

    McCready, John W.

    2010-01-01

    The purpose of this study was to examine use of decision-making tools and feedback in strategic planning in order to develop a rigorous process that would promote the efficiency of strategic planning for acquisitions in the United States Coast Guard (USCG). Strategic planning is critical to agencies such as the USCG in order to be effective…

  18. A Reduced Basis Method with Exact-Solution Certificates for Symmetric Coercive Equations

    DTIC Science & Technology

    2013-11-06

    the energy associated with the infinite-dimensional weak solution of parametrized symmetric coercive partial differential equations with piecewise...builds bounds with respect to the infinite-dimensional weak solution, aims to entirely remove the issue of the "truth" within the certified reduced basis...framework. We in particular introduce a reduced basis method that provides rigorous upper and lower bounds

  19. Toward the Development of a Program Quality Framework for Career and Technical Education Programs: A Researcher-Practitioner Collaborative Project

    ERIC Educational Resources Information Center

    Brodersen, R. Marc; Yanoski, David; Hyslop, Alisha; Imperatore, Catherine

    2016-01-01

    Career and technical education (CTE) programs of study are subject to rigorous state and federal accountability systems that provide information on key student outcomes. However, while these outcome measures can form a basis for identifying high- and low-performing programs, they are insufficient for answering underlying questions about how or why…

  20. Supporting Excellence: A Framework for Developing, Implementing, and Sustaining a High-Quality District Curriculum. First Edition

    ERIC Educational Resources Information Center

    Council of the Great City Schools, 2017

    2017-01-01

    In the ongoing effort to improve instructional standards in our nation's urban public schools, the Council of the Great City Schools has released resources to help districts determine the quality and alignment of instructional materials at each grade level; to ensure that materials for English language learners are rigorous and aligned to district…

  1. Exploring the Influence of 21st Century Skills in a Dual Language Program: A Case Study

    ERIC Educational Resources Information Center

    Heinrichs, Christine R.

    2016-01-01

    Preparing students as 21st century learners is a key reform in education. The Partnership for 21st Century Skills developed a framework that identifies outcomes needed for successful implementation of rigorous standards. The Dual Language (DL) program was identified as a structure for reform with systems and practices which can be used to prepare…

  2. Principals in the Pipeline: Districts Construct a Framework to Develop School Leadership

    ERIC Educational Resources Information Center

    Mendels, Pamela

    2012-01-01

    A diverse school district hugging the eastern border of Washington, D.C., Prince George's County, has introduced rigorous hiring methods and other practices to boost the quality of leadership in its 198 schools. In so doing, the district has also earned a spot among the pioneers in efforts nationally to ensure that public schools are led by the…

  3. Reconsidering Social Cohesion: Developing a Definition and Analytical Framework for Empirical Research

    ERIC Educational Resources Information Center

    Chan, Joseph; To, Ho-Pong; Chan, Elaine

    2006-01-01

    Despite its growing currency in academic and policy circles, social cohesion is a term in need of a clearer and more rigorous definition. This article provides a critical review of the ways social cohesion has been conceptualized in the literature; in many cases, definitions are too loosely made, with a common confusion between the content and the…

  4. Rating a Teacher Observation Tool: Five Ways to Ensure Classroom Observations are Focused and Rigorous

    ERIC Educational Resources Information Center

    New Teacher Project, 2011

    2011-01-01

    This "Rating a Teacher Observation Tool" identifies five simple questions and provides an easy-to-use scorecard to help policymakers decide whether an observation framework is likely to produce fair and accurate results. The five questions are: (1) Do the criteria and tools cover the classroom performance areas most connected to student outcomes?…

  5. Mother Tongue Education in Primary Teacher Education in Kenya: A Language Management Critique of the Quota System

    ERIC Educational Resources Information Center

    Mwaniki, Munene

    2014-01-01

    Mother tongue education (MTE) has been a subject of rigorous debate for more than half a century, in both industrialised and developing societies. Despite disparate views on MTE, there is an uneasy consensus on its importance in educational systems, especially in the foundational years. Using the Language Management Framework, the article provides…

  6. Rigor and Relevance of Contextualized Measures in Mathematics, Literacy and Inquiry Learning for Rural Public Schooling in Medellin

    ERIC Educational Resources Information Center

    Amador-Lankster, Clara

    2018-01-01

    The purpose of this article is to discuss a Fulbright Evaluation Framework and to analyze findings resulting from implementation of two contextualized measures designed as LEARNING BY DOING in response to achievement expectations from the National Education Ministry in Colombia in three areas. The goal of the Fulbright-funded project was to…

  7. Academic Rigor and Economic Value: GED[R] and High School Students' Perceptions and Misperceptions of the GED[R] vs. the High School Diploma

    ERIC Educational Resources Information Center

    Horne, Lela M.; Rachal, John R.; Shelley, Kyna

    2012-01-01

    A mixed methods framework utilized quantitative and qualitative data to determine whether statistically significant differences existed between high school and GED[R] student perceptions of credential value. An exploratory factor analysis (n=326) extracted four factors and then a MANOVA procedure was performed with a stratified quota sample…

  8. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    PubMed

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1].
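
    A minimal sketch of the review's central requirement, namely that a kernel applied to all pairs of subjects must yield a positive semidefinite matrix, using toy data and a Gaussian kernel (illustrative choices, not the paper's):

        import numpy as np

        # Toy "genomic" data: 6 subjects x 10 markers coded 0/1/2.
        rng = np.random.default_rng(42)
        G = rng.integers(0, 3, size=(6, 10)).astype(float)

        # Gaussian (RBF) kernel: values near 1 mean more similar subjects.
        sq = ((G[:, None, :] - G[None, :, :]) ** 2).sum(-1)  # squared distances
        K = np.exp(-sq / sq.mean())

        # Positive semidefiniteness: all eigenvalues >= 0 (up to round-off).
        print(np.linalg.eigvalsh(K).min() >= -1e-10)

    A matrix built this way can be dropped directly into the kernel machinery the review lists, for example as the covariance structure in a mixed model or in a score test.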

  9. Properties of field functionals and characterization of local functionals

    NASA Astrophysics Data System (ADS)

    Brouder, Christian; Dang, Nguyen Viet; Laurent-Gengoux, Camille; Rejzner, Kasia

    2018-02-01

    Functionals (i.e., functions of functions) are widely used in quantum field theory and solid-state physics. In this paper, functionals are given a rigorous mathematical framework and their main properties are described. The choice of the proper space of test functions (smooth functions) and of the relevant concept of differential (Bastiani differential) are discussed. The relation between the multiple derivatives of a functional and the corresponding distributions is described in detail. It is proved that, in a neighborhood of every test function, the support of a smooth functional is uniformly compact and the order of the corresponding distribution is uniformly bounded. Relying on a recent work by Dabrowski, several spaces of functionals are furnished with a complete and nuclear topology. In view of physical applications, it is shown that most formal manipulations can be given a rigorous meaning. A new concept of local functionals is proposed and two characterizations of them are given: the first one uses the additivity (or Hammerstein) property, the second one is a variant of Peetre's theorem. Finally, the first step of a cohomological approach to quantum field theory is carried out by proving a global Poincaré lemma and defining multi-vector fields and graded functionals within our framework.
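
    For orientation, the notion of differential the paper adopts can be stated compactly; this is the standard Bastiani definition, paraphrased here rather than quoted. For a functional F on a space of smooth test functions,

        DF(\varphi)[\psi] = \lim_{t \to 0} \frac{F(\varphi + t\psi) - F(\varphi)}{t},

    and F is Bastiani differentiable when this limit exists for every direction \psi and the map (\varphi, \psi) \mapsto DF(\varphi)[\psi] is jointly continuous; iterating the definition produces the multiple derivatives that the paper relates to compactly supported distributions.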

  10. Ecological Dynamics as a Theoretical Framework for Development of Sustainable Behaviours towards the Environment

    ERIC Educational Resources Information Center

    Brymer, Eric; Davids, Keith

    2013-01-01

    This paper proposes how the theoretical framework of ecological dynamics can provide an influential model of the learner and the learning process to pre-empt effective behaviour changes. Here we argue that ecological dynamics supports a well-established model of the learner ideally suited to the environmental education context because of its…

  11. An Exploration of E-Learning Benefits for Saudi Arabia: Toward Policy Reform

    ERIC Educational Resources Information Center

    Alrashidi, Abdulaziz

    2013-01-01

    Purpose: The purpose of this study was to examine policies and solutions addressing (a) improving education for citizens of the Kingdom of Saudi Arabia and (b) providing alternative instructional delivery methods, including e-learning for those living in remote areas. Theoretical Framework: The theoretical framework of this study was based on the…

  12. Applying a Conceptual Design Framework to Study Teachers' Use of Educational Technology

    ERIC Educational Resources Information Center

    Holmberg, Jörgen

    2017-01-01

    Theoretical outcomes of design-based research (DBR) are often presented in the form of local theory design principles. This article suggests a complementary theoretical construction in DBR, in the form of a "design framework" at a higher abstract level, to study and inform educational design with ICT in different situated contexts.…

  13. A Theoretical Framework to Guide the Re-Engineering of Technology Education

    ERIC Educational Resources Information Center

    Kelley, Todd; Kellam, Nadia

    2009-01-01

    Before leaders in technology education are able to identify a theoretical framework upon which a curriculum is to stand, they must first grapple with two opposing views of the purpose of technology education--education for all learners or career/technical education. Dakers (2006) identifies two opposing philosophies that can serve as a framework…

  14. Functional Path Analysis as a Multivariate Technique in Developing a Theory of Participation in Adult Education.

    ERIC Educational Resources Information Center

    Martin, James L.

    This paper reports on attempts by the author to construct a theoretical framework of adult education participation using a theory development process and the corresponding multivariate statistical techniques. Two problems are identified: the lack of a theoretical framework in studying problems, and the limiting of statistical analysis to univariate…

  15. The Relationships between Students' Use of Instant Messaging and Their Psychological Sense of Community

    ERIC Educational Resources Information Center

    Thomas, Amanda Garland

    2009-01-01

    The purpose of this study was to understand the extent to which students' psychological sense of community was influenced by instant messaging (IM) use, drawing on the psychological sense of community theoretical framework created by McMillan and Chavis (1986), and the student development theoretical frameworks created by Schlossberg (1989) and Astin (1984). Thus, this…

  16. Proverbs as Theoretical Frameworks for Lifelong Learning in Indigenous African Education

    ERIC Educational Resources Information Center

    Avoseh, Mejai B. M.

    2013-01-01

    Every aspect of a community's life and values in indigenous Africa provide the theoretical framework for education. The holistic worldview of the traditional system places a strong emphasis on the centrality of the human element and orature in the symmetrical relationship between life and learning. This article focuses on proverbs and the words…

  17. Unpacking Teacher-Researcher Collaboration with Three Theoretical Frameworks: A Case of Expansive Learning Activity?

    ERIC Educational Resources Information Center

    Gade, Sharada

    2015-01-01

    A long association with a mathematics teacher at a Grade 4-6 school in Sweden is the basis for reporting a case of teacher-researcher collaboration. Three theoretical frameworks used to study its development over time are relational knowing, relational agency and cogenerative dialogue. While relational knowing uses narrative perspectives to explore the…

  18. A Theoretical Framework for Organizing the Effect of the Internet on Cognitive Development

    ERIC Educational Resources Information Center

    Johnson, Genevieve Marie

    2006-01-01

    The number of children and adolescents accessing the Internet as well as the amount of time online are steadily increasing. The most common online activities include playing video games, navigating web sites, and communicating via chat rooms, email, and instant messaging. A theoretical framework for understanding the effects of Internet use on…

  19. A Theoretical Framework for Serious Game Design: Exploring Pedagogy, Play and Fidelity and Their Implications for the Design Process

    ERIC Educational Resources Information Center

    Rooney, Pauline

    2012-01-01

    It is widely acknowledged that digital games can provide an engaging, motivating and "fun" experience for students. However an entertaining game does not necessarily constitute a meaningful, valuable learning experience. For this reason, experts espouse the importance of underpinning serious games with a sound theoretical framework which…

  20. Variation Theory: A Theory of Learning and a Useful Theoretical Framework for Chemical Education Research

    ERIC Educational Resources Information Center

    Bussey, Thomas J.; Orgill, MaryKay; Crippen, Kent J.

    2013-01-01

    Instructors are constantly baffled by the fact that two students who are sitting in the same class, who have access to the same materials, can come to understand a particular chemistry concept differently. Variation theory offers a theoretical framework from which to explore possible variations in experience and the resulting differences in…

  1. Developing a Theoretical Framework for Examining Student Understanding of Fractional Concepts: An Historical Accounting

    ERIC Educational Resources Information Center

    Cooper, Susan M.; Wilkerson, Trena L.; Montgomery, Mark; Mechell, Sara; Arterbury, Kristin; Moore, Sherrie

    2012-01-01

    In 2007, a group of mathematics educators and researchers met to examine rational numbers and why children have such an issue with them. An extensive review of the literature on fractional understanding was conducted. The ideas in that literature were then consolidated into a theoretical framework for examining fractions. Once that theoretical…

  2. Measuring implementation behaviour of menu guidelines in the childcare setting: confirmatory factor analysis of a theoretical domains framework questionnaire (TDFQ).

    PubMed

    Seward, Kirsty; Wolfenden, Luke; Wiggers, John; Finch, Meghan; Wyse, Rebecca; Oldmeadow, Christopher; Presseau, Justin; Clinton-McHarg, Tara; Yoong, Sze Lin

    2017-04-04

    While there are a number of frameworks which focus on supporting the implementation of evidence-based approaches, few psychometrically valid measures exist to assess constructs within these frameworks. This study aimed to develop and psychometrically assess a scale measuring each domain of the Theoretical Domains Framework for use in assessing the implementation of dietary guidelines within a non-health care setting (childcare services). A 75-item, 14-domain Theoretical Domains Framework Questionnaire (TDFQ) was developed and administered via telephone interview to 202 centre-based childcare service cooks who had a role in planning the service menu. Confirmatory factor analysis (CFA) was undertaken to assess the reliability, discriminant validity and goodness of fit of the 14-domain theoretical domain framework measure. For the CFA, five iterative processes of adjustment were undertaken in which 14 items were removed, resulting in a final measure consisting of 14 domains and 61 items. For the final measure: the chi-square goodness of fit statistic was 3447.19; the Standardized Root Mean Square Residual (SRMR) was 0.070; the Root Mean Square Error of Approximation (RMSEA) was 0.072; and the Comparative Fit Index (CFI) had a value of 0.78. While only one of the three indices supports goodness of fit of the measurement model tested, the 14-domain model with 61 items showed good discriminant validity and internally consistent items. Future research should aim to assess the psychometric properties of the developed TDFQ in other community-based settings.
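
    To help readers interpret the reported indices, the usual definitions are sketched below; these are standard formulas and conventional cutoffs, not taken from the paper, which reports index values but not all inputs (for example, model degrees of freedom). With model chi-square \chi^2_M on df_M degrees of freedom, baseline-model chi-square \chi^2_B on df_B, and sample size N,

        RMSEA = \sqrt{\frac{\max(\chi^2_M - df_M,\, 0)}{df_M\,(N - 1)}}, \qquad
        CFI = 1 - \frac{\max(\chi^2_M - df_M,\, 0)}{\max(\chi^2_B - df_B,\, 0)}.

    Common rules of thumb take SRMR \le 0.08, RMSEA \le 0.06, and CFI \ge 0.90 as indicating acceptable fit; by these, only the reported SRMR of 0.070 clears its threshold, matching the paper's conclusion that just one index supported goodness of fit.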

  3. Obesity in sub-Saharan Africa: development of an ecological theoretical framework.

    PubMed

    Scott, Alison; Ejikeme, Chinwe Stella; Clottey, Emmanuel Nii; Thomas, Joy Goens

    2013-03-01

    The prevalence of overweight and obesity is increasing in sub-Saharan Africa (SSA). There is a need for theoretical frameworks to catalyze further research and to inform the development of multi-level, context-appropriate interventions. In this commentary, we propose a preliminary ecological theoretical framework to conceptualize factors that contribute to increases in overweight and obesity in SSA. The framework is based on a Causality Continuum model [Coreil et al. Social and Behavioral Foundations of Public Health. Sage Publications, Thousand Oaks] that considers distant, intermediate and proximate influences. The influences incorporated in the model include globalization and urbanization as distant factors; occupation, social relationships, built environment and cultural perceptions of weight as intermediate factors and caloric intake, physical inactivity and genetics as proximate factors. The model illustrates the interaction of factors along a continuum, from the individual to the global marketplace, in shaping trends in overweight and obesity in SSA. The framework will be presented, each influence elucidated and implications for research and intervention development discussed. There is a tremendous need for further research on obesity in SSA. An improved evidence base will serve to validate and develop the proposed framework further.

  4. Overarching framework for data-based modelling

    NASA Astrophysics Data System (ADS)

    Schelter, Björn; Mader, Malenka; Mader, Wolfgang; Sommerlade, Linda; Platt, Bettina; Lai, Ying-Cheng; Grebogi, Celso; Thiel, Marco

    2014-02-01

    One of the main modelling paradigms for complex physical systems is the network. When estimating the network structure from measured signals, several assumptions, such as stationarity, are typically made in the estimation process. Violating these assumptions renders standard analysis techniques fruitless. We here propose a framework to estimate the network structure from measurements of arbitrary non-linear, non-stationary, stochastic processes. To this end, we propose a rigorous mathematical theory that underlies this framework. Based on this theory, we present a highly efficient algorithm and the corresponding statistics, which are directly applicable to measured signals. We demonstrate its performance in a simulation study. In experiments on transitions between vigilance stages in rodents, we infer small network structures with complex, time-dependent interactions; this suggests biomarkers for such transitions, the key to understanding and diagnosing numerous diseases such as dementia. We argue that the suggested framework combines features that other approaches followed so far lack.
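
    The paper's own estimator is not reproduced here. Purely to illustrate what time-resolved network estimation for non-stationary signals means, a generic sliding-window correlation sketch in Python (all names, window sizes and coupling strengths hypothetical):

        import numpy as np

        # Two channels whose coupling switches on halfway: non-stationary.
        rng = np.random.default_rng(3)
        T, win = 4000, 500
        x = rng.normal(size=T)
        y = rng.normal(size=T)
        y[T // 2:] += 0.9 * x[T // 2:]

        # Re-estimate the x-y link inside each window.
        for start in range(0, T - win + 1, win):
            seg = slice(start, start + win)
            r = np.corrcoef(x[seg], y[seg])[0, 1]
            print(f"window {start:4d}-{start + win:4d}: corr = {r:+.2f}")

    The printed correlations are near zero in the first half and large afterwards; a stationarity-assuming estimate over the whole record would blur the two regimes together, which is the failure mode the abstract warns about.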

  5. A Formal Framework for the Analysis of Algorithms That Recover From Loss of Separation

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Cesar A.

    2008-01-01

    We present a mathematical framework for the specification and verification of state-based conflict resolution algorithms that recover from loss of separation. In particular, we propose rigorous definitions of horizontal and vertical maneuver correctness that yield horizontal and vertical separation, respectively, in a bounded amount of time. We also provide sufficient conditions for independent correctness, i.e., separation under the assumption that only one aircraft maneuvers, and for implicitly coordinated correctness, i.e., separation under the assumption that both aircraft maneuver. An important benefit of this approach is that different aircraft can execute different algorithms and implicit coordination will still be achieved, as long as they all meet the explicit criteria of the framework. Towards this end we have sought to make the criteria as general as possible. The framework presented in this paper has been formalized and mechanically verified in the Prototype Verification System (PVS).
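
    To make the notion of recovery in bounded time concrete, here is a minimal sketch under a constant relative-velocity assumption of our own (the paper's PVS-verified criteria are more general): given the relative horizontal position p and velocity v of two aircraft currently in loss of separation, decide whether horizontal separation D is regained by time T.

      # Horizontal-recovery check under a constant-velocity assumption
      # (an illustration, not the paper's verified criteria).
      import math

      def recovers_horizontally(p, v, D, T):
          px, py = p; vx, vy = v
          a = vx * vx + vy * vy                  # |v|^2
          b = 2.0 * (px * vx + py * vy)          # 2 p.v
          c = px * px + py * py - D * D          # |p|^2 - D^2 (< 0 in loss of separation)
          if c >= 0.0:
              return True                        # already separated
          if a == 0.0:
              return False                       # no relative motion: never recovers
          t_exit = (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
          return t_exit <= T                     # separation only grows after t_exit

      print(recovers_horizontally(p=(1.0, 0.0), v=(2.0, 0.0), D=5.0, T=5.0))  # True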

  6. Harnessing Implementation Science to Increase the Impact of Health Equity Research.

    PubMed

    Chinman, Matthew; Woodward, Eva N; Curran, Geoffrey M; Hausmann, Leslie R M

    2017-09-01

    Health disparities are differences in health or health care between groups based on social, economic, and/or environmental disadvantage. Disparity research often follows three steps: detecting (phase 1), understanding (phase 2), and reducing (phase 3) disparities. Although disparities have narrowed over time, many remain. We argue that implementation science could enhance disparities research by broadening the scope of phase 2 studies and offering rigorous methods to test disparity-reducing implementation strategies in phase 3 studies. We briefly review the focus of phase 2 and phase 3 disparities research. We then provide a decision tree and case examples to illustrate how implementation science frameworks and research designs could further enhance disparity research. Most health disparities research emphasizes patient and provider factors as the predominant mechanisms underlying disparities. Applying implementation science frameworks like the Consolidated Framework for Implementation Research could help disparities research widen its scope in phase 2 studies and, in turn, develop broader disparity-reducing implementation strategies in phase 3 studies. Many phase 3 studies of disparity-reducing implementation strategies are similar to case studies, whose designs cannot fully test causality. Implementation science research designs offer rigorous methods that could accelerate the pace at which equity is achieved in real-world practice. Disparities can be considered a "special case" of implementation challenges: evidence-based clinical interventions are delivered to, and received by, vulnerable populations at lower rates. Bringing together health disparities research and implementation science could advance equity more than either could achieve on its own.

  7. Designing effective human-automation-plant interfaces: a control-theoretic perspective.

    PubMed

    Jamieson, Greg A; Vicente, Kim J

    2005-01-01

    In this article, we propose the application of a control-theoretic framework to human-automation interaction. The framework consists of a set of conceptual distinctions that should be respected in automation research and design. We demonstrate how existing automation interface designs in some nuclear plants fail to recognize these distinctions. We further show the value of the approach by applying it to modes of automation. The design guidelines that have been proposed in the automation literature are evaluated from the perspective of the framework. This comparison shows that the framework reveals insights that are frequently overlooked in this literature. A new set of design guidelines is introduced that builds upon the contributions of previous research and draws complementary insights from the control-theoretic framework. The result is a coherent and systematic approach to the design of human-automation-plant interfaces that will yield more concrete design criteria and a broader set of design tools. Applications of this research include improving the effectiveness of human-automation interaction design and the relevance of human-automation interaction research.

  8. An integrated organisation-wide data quality management and information governance framework: theoretical underpinnings.

    PubMed

    Liaw, Siaw-Teng; Pearce, Christopher; Liyanage, Harshana; Liaw, Gladys S S; de Lusignan, Simon

    2014-01-01

    Increasing investment in eHealth aims to improve cost effectiveness and safety of care. Data extraction and aggregation can create new data products to improve professional practice and provide feedback to improve the quality of source data. A previous systematic review concluded that locally relevant clinical indicators and use of clinical record systems could support clinical governance. We aimed to extend and update the review with a theoretical framework. We searched PubMed, Medline, Web of Science, ABI Inform (Proquest) and Business Source Premier (EBSCO) using the terms curation, information ecosystem, data quality management (DQM), data governance, information governance (IG) and data stewardship. We focused on and analysed the scope of DQM and IG processes, theoretical frameworks, and determinants of the processing, quality assurance, presentation and sharing of data across the enterprise. There are good theoretical reasons for integrated governance, but there is variable alignment of DQM, IG and health system objectives across the health enterprise. Ethical constraints exist that require health information ecosystems to process data in ways that are aligned with improving health and system efficiency and ensuring patient safety. Despite an increasingly 'big-data' environment, DQM and IG in health services are still fragmented across the data production cycle. We extend current work on DQM and IG with a theoretical framework for integrated IG across the data cycle. The dimensions of this theory-based framework would require testing with qualitative and quantitative studies to examine the applicability and utility, along with an evaluation of its impact on data quality across the health enterprise.

  9. Death of a Simulated Pediatric Patient: Toward a More Robust Theoretical Framework.

    PubMed

    McBride, Mary E; Schinasi, Dana Aronson; Moga, Michael Alice; Tripathy, Shreepada; Calhoun, Aaron

    2017-12-01

    A theoretical framework was recently proposed that encapsulates learner responses to simulated death due to action or inaction in the pediatric context. This framework, however, was developed at an institution that allows simulated death and thus does not address the experience of those centers at which this technique is not used. To address this, we performed a parallel qualitative study with the intent of augmenting the initial framework. We conducted focus groups, using a constructivist grounded theory approach, using physicians and nurses who have experienced a simulated cardiac arrest. The participants were recruited via e-mail. Transcripts were analyzed by coders blinded to the original framework to generate a list of provisional themes that were iteratively refined. These themes were then compared with the themes from the original article and used to derive a consensus model that incorporated the most relevant features of each. Focus group data yielded 7 themes. Six were similar to those developed in the original framework. One important exception was noted; however, those learners not exposed to patient death due to action or inaction often felt that the mannequin's survival was artificial. This additional theme was incorporated into a revised framework. The original framework addresses most aspects of learner reactions to simulated death. Our work suggests that adding the theme pertaining to the lack of realism that can be perceived when the mannequin is unexpectedly saved results in a more robust theoretical framework transferable to centers that do not allow mannequin death.

  10. Inferring the nature of anthropogenic threats from long-term abundance records.

    PubMed

    Shoemaker, Kevin T; Akçakaya, H Resit

    2015-02-01

    Diagnosing the processes that threaten species persistence is critical for recovery planning and risk forecasting. Dominant threats are typically inferred by experts on the basis of a patchwork of informal methods. Transparent, quantitative diagnostic tools would contribute much-needed consistency, objectivity, and rigor to the process of diagnosing anthropogenic threats. Long-term census records, available for an increasingly large and diverse set of taxa, may exhibit characteristic signatures of specific threatening processes and thereby provide information for threat diagnosis. We developed a flexible Bayesian framework for diagnosing threats on the basis of long-term census records and diverse ancillary sources of information. We tested this framework with simulated data from artificial populations subjected to varying degrees of exploitation and habitat loss and several real-world abundance time series for which threatening processes are relatively well understood: bluefin tuna (Thunnus maccoyii) and Atlantic cod (Gadus morhua) (exploitation) and Red Grouse (Lagopus lagopus scotica) and Eurasian Skylark (Alauda arvensis) (habitat loss). Our method correctly identified the process driving population decline for over 90% of time series simulated under moderate to severe threat scenarios. Successful identification of threats approached 100% for severe exploitation and habitat loss scenarios. Our method identified threats less successfully when threatening processes were weak and when populations were simultaneously affected by multiple threats. Our method selected the presumed true threat model for all real-world case studies, although results were somewhat ambiguous in the case of the Eurasian Skylark. In the latter case, incorporation of an ancillary source of information (records of land-use change) increased the weight assigned to the presumed true model from 70% to 92%, illustrating the value of the proposed framework in bringing diverse sources of information into a common rigorous framework. Ultimately, our framework may greatly assist conservation organizations in documenting threatening processes and planning species recovery. © 2014 Society for Conservation Biology.
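
    The core of such a diagnosis is Bayesian model comparison. A minimal sketch of the weighting step follows, with made-up likelihood values (the paper's threat models and priors are far richer):

      # Posterior model probabilities from log marginal likelihoods,
      # assuming equal prior model probabilities. Values are illustrative.
      import numpy as np

      def model_weights(log_marginal_likelihoods):
          z = np.asarray(log_marginal_likelihoods, dtype=float)
          z -= z.max()                  # stabilise the exponentials
          w = np.exp(z)
          return w / w.sum()

      # e.g. exploitation vs. habitat-loss vs. no-threat models:
      print(model_weights([-120.3, -123.1, -131.8]).round(3))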

  11. Communication: Rigorous quantum dynamics of O + O₂ exchange reactions on an ab initio potential energy surface substantiate the negative temperature dependence of rate coefficients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yaqin; Sun, Zhigang (zsun@dicp.ac.cn); Center for Advanced Chemical Physics, University of Science and Technology of China, 96 Jinzhai Road, Hefei 230026

    2014-08-28

    The kinetics and dynamics of several O + O₂ isotope exchange reactions have been investigated on a recently determined accurate global O₃ potential energy surface using a time-dependent wave packet method. The agreement between calculated and measured rate coefficients is significantly improved over previous work. More importantly, the experimentally observed negative temperature dependence of the rate coefficients is for the first time rigorously reproduced theoretically. This negative temperature dependence can be attributed to the absence in the new potential energy surface of a submerged "reef" structure, which was present in all previous potential energy surfaces. In addition, contributions of rotationally excited states of the diatomic reactant further accentuate the negative temperature dependence.

  12. Rigorous derivation of the effective model describing a non-isothermal fluid flow in a vertical pipe filled with porous medium

    NASA Astrophysics Data System (ADS)

    Beneš, Michal; Pažanin, Igor

    2018-03-01

    This paper reports an analytical investigation of non-isothermal fluid flow in a thin (or long) vertical pipe filled with a porous medium, via asymptotic analysis. We assume that the fluid inside the pipe is cooled (or heated) by the surrounding medium and that the flow is governed by the prescribed pressure drop between the pipe's ends. Starting from the dimensionless Darcy-Brinkman-Boussinesq system, we formally derive a macroscopic model describing the effective flow at small Brinkman-Darcy number. The asymptotic approximation is given by explicit formulae for the velocity, pressure and temperature, clearly acknowledging the effects of the cooling (heating) and of the porous structure. A theoretical error analysis is carried out to indicate the order of accuracy and to provide a rigorous justification of the effective model.

  13. Leader or Manager: Academic Library Leader's Leadership Orientation Considered Ideal by Faculty, Administrators and Librarians at Private, Nonprofit, Doctoral Universities in Southern California

    ERIC Educational Resources Information Center

    Tripuraneni, Vinaya L.

    2010-01-01

    Purpose: The purpose of this study is to identify the leadership orientation of the academic library leader considered ideal by faculty, administrators and librarians in private, non-profit, doctoral universities in Southern California. Theoretical Framework: The theoretical framework used for this study was Bolman and Deal's Leadership…

  14. Toward an Instructional Philosophy: "A Theoretical Framework for Teaching and Training at Salman Bin Abdulaziz University (SAU)"

    ERIC Educational Resources Information Center

    Qandile, Yasine A.; Al-Qasim, Wajeeh Q.

    2014-01-01

    The purpose of this study is to construct a clear instructional philosophy for Salman bin Abdulaziz University as a fundamental basis for teaching and training, as well as a theoretical framework for curriculum design and development. The study attempts to answer the main questions pertaining to the basic structure of contemporary higher…

  15. First-Year Biology Students' Understandings of Meiosis: An Investigation Using a Structural Theoretical Framework

    ERIC Educational Resources Information Center

    Quinn, Frances; Pegg, John; Panizzon, Debra

    2009-01-01

    Meiosis is a biological concept that is both complex and important for students to learn. This study aims to explore first-year biology students' explanations of the process of meiosis, using an explicit theoretical framework provided by the Structure of the Observed Learning Outcome (SOLO) model. The research was based on responses of 334…

  16. Rural Employment, Migration, and Economic Development: Theoretical Issues and Empirical Evidence from Africa. Africa Rural Employment Paper No. 1.

    ERIC Educational Resources Information Center

    Byerlee, Derek; Eicher, Carl K.

    Employment problems in Africa were examined with special emphasis on rural employment and migration within the context of overall economic development. A framework was provided for analyzing rural employment in development; that framework was used to analyze empirical information from Africa; and theoretical issues were raised in analyzing rural…

  17. Toward an Integrative Theoretical Framework for Explaining Beliefs about Wife Beating: A Study among Students of Nursing from Turkey

    ERIC Educational Resources Information Center

    Haj-Yahia, Muhammad M.; Uysal, Aynur

    2011-01-01

    An integrative theoretical framework was tested as the basis for explaining beliefs about wife beating among Turkish nursing students. Based on a survey design, 406 nursing students (404 females) in all 4 years of undergraduate studies completed a self-administered questionnaire. Questionnaires were distributed and collected from the participants…

  18. The Pedagogy of Primary Historical Sources in Mathematics: Classroom Practice Meets Theoretical Frameworks

    ERIC Educational Resources Information Center

    Barnett, Janet Heine; Lodder, Jerry; Pengelley, David

    2014-01-01

    We analyze our method of teaching with primary historical sources within the context of theoretical frameworks for the role of history in teaching mathematics developed by Barbin, Fried, Jahnke, Jankvist, and Kjeldsen and Blomhøj, and more generally from the perspective of Sfard's theory of learning as communication. We present case studies…

  19. Understanding, Selecting, and Integrating a Theoretical Framework in Dissertation Research: Creating the Blueprint for Your "House"

    ERIC Educational Resources Information Center

    Grant, Cynthia; Osanloo, Azadeh

    2014-01-01

    The theoretical framework is one of the most important aspects in the research process, yet is often misunderstood by doctoral candidates as they prepare their dissertation research study. The importance of theory-driven thinking and acting is emphasized in relation to the selection of a topic, the development of research questions, the…

  20. [Development of the theoretical framework and the item pool of the peri-operative recovery scale for integrative medicine].

    PubMed

    Su, Bi-ying; Liu, Shao-nan; Li, Xiao-yan

    2011-11-01

    To describe the line of thought and the procedures used in developing the theoretical framework and the item pool of the peri-operative recovery scale for integrative medicine, in preparation for the development of the scale and its psychometric testing. Under the guidance of Chinese medicine theories and of psychometric scale development, the theoretical framework and the item pool of the scale were initially laid out through literature retrieval, expert consultation, etc. The scale covers the domains of physical function, mental function, activity function, pain, and general assessment. In addition, social function is included, which is suitable for pre-operative testing and for long-term therapeutic efficacy testing after discharge from hospital. Each domain covers the correlated Zang-Fu organs, qi, blood, and patient-reported outcomes. In total, 122 items were initially included in the item pool according to the theoretical framework of the scale. The peri-operative recovery scale for integrative medicine embodies the combination of Chinese medicine theories and patient-reported outcome concepts, and can reasonably assess the peri-operative recovery outcomes of patients treated with integrative medicine.

  1. Experiences of using the Theoretical Domains Framework across diverse clinical environments: a qualitative study.

    PubMed

    Phillips, Cameron J; Marshall, Andrea P; Chaves, Nadia J; Jankelowitz, Stacey K; Lin, Ivan B; Loy, Clement T; Rees, Gwyneth; Sakzewski, Leanne; Thomas, Susie; To, The-Phung; Wilkinson, Shelley A; Michie, Susan

    2015-01-01

    The Theoretical Domains Framework (TDF) is an integrative framework developed from a synthesis of psychological theories as a vehicle to help apply theoretical approaches to interventions aimed at behavior change. This study explores experiences of TDF use by professionals from multiple disciplines across diverse clinical settings. Mixed methods were used to examine the experiences, attitudes, and perspectives of health professionals in using the TDF in health care implementation projects. Individual interviews were conducted with ten health care professionals from six disciplines who used the TDF in implementation projects. Deductive content and thematic analysis were used. Three main themes and associated subthemes were identified: 1) reasons for use of the TDF (increased confidence, broader perspective, and theoretical underpinnings); 2) challenges using the TDF (time and resources, operationalization of the TDF); and 3) future use of the TDF. The TDF provided a useful, flexible framework for a diverse group of health professionals working across different clinical settings for the assessment of barriers and the targeting of resources to influence behavior change for implementation projects. The development of practical tools and training or support is likely to aid the utility of the TDF.

  3. A Computational Framework for Automation of Point Defect Calculations

    NASA Astrophysics Data System (ADS)

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei; Lany, Stephan; Stevanovic, Vladan (National Renewable Energy Laboratory, Golden, Colorado 80401)

    A complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory has been developed. The framework provides an effective and efficient method for defect structure generation and for the creation of simple yet customizable workflows to analyze defect calculations. The package provides the capability to compute widely accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band-filling correction for shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology. We believe that a robust automated tool like this will enable the materials-by-design community to assess the impact of point defects on materials performance.
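
    The quantity such workflows ultimately assemble is the formation energy of a defect D in charge state q. The formula below is common practice in the field; the function, variable names and numbers are our illustrative choices, not the package's API:

      # E_f(D,q) = E_tot(D,q) - E_tot(host) - sum_i n_i * mu_i
      #            + q * (E_VBM + E_Fermi + dV_align) + E_corr
      def defect_formation_energy(e_defect, e_host, added_atoms, mu, q,
                                  e_vbm, e_fermi, dv_align=0.0, e_corr=0.0):
          """added_atoms: {species: n}, n > 0 if added, n < 0 if removed;
          mu: {species: chemical potential}; all energies in eV."""
          chem = sum(n * mu[s] for s, n in added_atoms.items())
          return (e_defect - e_host - chem
                  + q * (e_vbm + e_fermi + dv_align) + e_corr)

      # Hypothetical oxygen vacancy in charge state q = +2:
      print(defect_formation_energy(e_defect=-432.1, e_host=-440.6,
                                    added_atoms={"O": -1}, mu={"O": -4.9},
                                    q=2, e_vbm=1.2, e_fermi=0.8, dv_align=0.05))

    The image-charge and band-filling corrections enter through e_corr, and the potential alignment through dv_align.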

  4. A GROUNDED THEORY STUDY OF THE PROCESS USED TO NEGOTIATE CONDOM USE AMONG AFRICAN-AMERICAN WOMEN: REVIEW OF THE LITERATURE.

    PubMed

    Hunter, Teressa Sanders; Tilley, Donna Scott

    2015-01-01

    This review of the literature identifies themes, variables, goals, and gaps in the literature related to HIV and AIDS among African-American women. Black Feminist Epistemology and symbolic interactionism are used as the theoretical perspective and philosophical framework to examine the experiences and social behaviors of African-American women and as a guiding framework to explain the findings from the literature. This theoretical perspective and philosophical framework can also be used to understand the processes used by African-American women in behavioral, social, and intimate interactions.

  5. Addressing the need for an infection prevention and control framework that incorporates the role of surveillance: a discussion paper.

    PubMed

    Mitchell, Brett G; Gardner, Anne

    2014-03-01

    To present a discussion on theoretical frameworks in infection prevention and control. Infection prevention and control programmes have been in place for several years in response to the incidence of healthcare-associated infections and their associated morbidity and mortality. Theoretical frameworks play an important role in formalizing the understanding of infection prevention activities. Discussion paper. A literature search using electronic databases was conducted for articles in English addressing theoretical frameworks in infection prevention and control published between 1980 and 2012. Nineteen papers that included a reference to frameworks were identified in the review. A narrative analysis of these papers was completed. Two models were identified and neither included the role of surveillance. To reduce the risk of acquiring a healthcare-associated infection, a multifaceted approach to infection prevention is required. One key component in this approach is surveillance. The review identified two infection prevention and control frameworks, yet these are rarely applied in infection prevention and control programmes. Only one framework considered the multifaceted approach required for infection prevention. It did not, however, incorporate the role of surveillance. We present a framework that incorporates the role of surveillance into a biopsychosocial approach to infection prevention and control. Infection prevention and control programmes and associated research are led primarily by nurses. There is a need for an explicit infection prevention and control framework incorporating the important role that surveillance has in infection prevention activities. This study presents one framework for further critique and discussion. © 2013 John Wiley & Sons Ltd.

  6. Decision support models for solid waste management: Review and game-theoretic approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karmperis, Athanasios C., E-mail: athkarmp@mail.ntua.gr; Army Corps of Engineers, Hellenic Army General Staff, Ministry of Defence; Aravossis, Konstantinos

    Highlights:
    ► The mainly used decision support frameworks for solid waste management are reviewed.
    ► The LCA, CBA and MCDM models are presented and their strengths, weaknesses, similarities and possible combinations are analyzed.
    ► The game-theoretic approach in a solid waste management context is presented.
    ► The waste management bargaining game is introduced as a specific decision support framework.
    ► Cooperative and non-cooperative game-theoretic approaches to decision support for solid waste management are discussed.
    Abstract: This paper surveys decision support models that are commonly used in the solid waste management area. Most models are developed within three decision support frameworks, which are life-cycle assessment, cost–benefit analysis and multi-criteria decision-making. These frameworks are reviewed and their strengths and weaknesses as well as their critical issues are analyzed, while their possible combinations and extensions are also discussed. Furthermore, the paper presents how cooperative and non-cooperative game-theoretic approaches can be used for the purpose of modeling and analyzing decision-making in situations with multiple stakeholders. Specifically, since a waste management model is sustainable when considering not only environmental and economic but also social aspects, the waste management bargaining game is introduced as a specific decision support framework in which future models can be developed.
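
    As a toy illustration of the bargaining idea (a two-player example of our own, not the paper's game): the symmetric Nash bargaining solution splits the net cooperative surplus equally above the disagreement payoffs.

      # Symmetric Nash bargaining over a cooperative surplus S with
      # disagreement payoffs d1, d2. Scenario and numbers are illustrative.
      def nash_bargain(S, d1, d2):
          net = S - d1 - d2
          if net < 0:
              return d1, d2            # no agreement improves on the status quo
          return d1 + net / 2.0, d2 + net / 2.0

      # Municipality and contractor bargaining over 10.0 units of joint savings:
      print(nash_bargain(S=10.0, d1=2.0, d2=3.0))   # -> (4.5, 5.5)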

  7. Why do children and adolescents bully their peers? A critical review of key theoretical frameworks.

    PubMed

    Thomas, Hannah J; Connor, Jason P; Scott, James G

    2018-05-01

    Bullying is a significant public health problem for children and adolescents worldwide. Evidence suggests that both being bullied (bullying victimisation) and bullying others (bullying perpetration) are associated with concurrent and future mental health problems. The onset and course of bullying perpetration are influenced by individual as well as systemic factors. Identifying effective solutions to address bullying requires a fundamental understanding of why it occurs. Drawing from multi-disciplinary domains, this review provides a summary and synthesis of the key theoretical frameworks applied to understanding and intervening on the issue of bullying. A number of explanatory models have been used to elucidate the dynamics of bullying, and broadly these correspond with either system (e.g., social-ecological, family systems, peer-group socialisation) or individual-level (e.g., developmental psychopathology, genetic, resource control, social-cognitive) frameworks. Each theory adds a unique perspective; however, no single framework comprehensively explains why bullying occurs. This review demonstrates that the integration of theoretical perspectives achieves a more nuanced understanding of bullying which is necessary for strengthening evidence-based interventions. Future progress requires researchers to integrate both the systems and individual-level theoretical frameworks to further improve current interventions. More effective intervention across different systems as well as tailoring interventions to the specific needs of the individuals directly involved in bullying will reduce exposure to a key risk factor for mental health problems.

  8. Local and global approaches to the problem of Poincaré recurrences. Applications in nonlinear dynamics

    NASA Astrophysics Data System (ADS)

    Anishchenko, V. S.; Boev, Ya. I.; Semenova, N. I.; Strelkova, G. I.

    2015-07-01

    We review rigorous and numerical results on the statistics of Poincaré recurrences related to the modern development of the Poincaré recurrence problem. We analyze and describe the rigorous results achieved both in the classical (local) approach and in the recently developed global approach. These results are illustrated by numerical simulation data for simple chaotic and ergodic systems. It is shown that the basic theoretical laws can be applied to noisy systems if the probability measure is ergodic and stationary. Poincaré recurrences are studied numerically in nonautonomous systems. Statistical characteristics of recurrences are analyzed in the framework of the global approach for the cases of positive and zero topological entropy. We show that for positive entropy there is a relationship between the Afraimovich-Pesin dimension, Lyapunov exponents and the Kolmogorov-Sinai entropy, both without and in the presence of external noise. The case of zero topological entropy is exemplified by numerical results for the Poincaré recurrence statistics in the circle map. We show and prove that the dependence of minimal recurrence times on the return region size demonstrates universal properties for the golden and the silver ratio. The behavior of Poincaré recurrences is analyzed at the critical point of Feigenbaum attractor birth. We explore Poincaré recurrences for an ergodic set which is generated in the stroboscopic section of a nonautonomous oscillator and is similar to a circle shift. Based on the obtained results, we show how Poincaré recurrence statistics can be applied to solving a number of nonlinear dynamics issues. We propose and illustrate alternative methods for diagnosing effects of external and mutual synchronization of chaotic systems in the context of the local and global approaches. The properties of the recurrence time probability density can be used to detect the stochastic resonance phenomenon. We also discuss how the fractal dimension of chaotic attractors can be estimated using Poincaré recurrence statistics.
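
    The zero-entropy circle-shift case is easy to reproduce numerically. In this short sketch (interval sizes and the iteration cap are our choices), the minimal return times for the golden rotation number land on Fibonacci numbers:

      # Minimal Poincaré return time of the circle shift x -> x + rho (mod 1)
      # to the interval [0, eps); rho is the golden mean.
      import math

      def min_return_time(rho, eps, n_max=1000000):
          x = 0.0
          for n in range(1, n_max + 1):
              x = (x + rho) % 1.0
              if x < eps:
                  return n
          return None

      rho = (math.sqrt(5.0) - 1.0) / 2.0
      for eps in (0.1, 0.01, 0.001):
          print(eps, min_return_time(rho, eps))   # -> 5, 89, 610 (Fibonacci numbers)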

  9. Professionalization of the Senior Chinese Officer Corps Trends and Implications

    DTIC Science & Technology

    1997-01-01

    The officers who retired were Ye Jianying, Nie Rongzhen, Xu Xiangqian, Wang Zhen, Song Renqiong, and Li Desheng. Of course, the political impact of... increased education level, functional specialization, and adherence to retirement norms. Li Cheng and Lynn White, in their 1993 Asian Survey article... making rigorous comparative analysis untenable. Second, Li and White do not place their results or analysis in any theoretical context. In…

  10. Hollow-cylinder waveguide isolators for use at millimeter wavelengths

    NASA Technical Reports Server (NTRS)

    Kanda, M.; May, W. G.

    1974-01-01

    A semiconductor waveguide isolator, consisting of a hollow semiconductor column mounted coaxially in a circular waveguide with a longitudinal dc magnetic field, is considered. An elementary, physical analysis based on the excitation of plane waves in the guide and a more rigorous mode-matching analysis are presented. These theoretical predictions are compared with experimental results for an InSb isolator at 94 GHz and 75 K.

  11. The relationship of rain-induced cross-polarization discrimination to attenuation for 10 to 30 GHz earth-space radio links

    NASA Technical Reports Server (NTRS)

    Stutzman, W. L.; Runyon, D. L.

    1984-01-01

    Rain depolarization is quantified through the cross-polarization discrimination (XPD) versus attenuation relationship. Such a relationship is derived by curve fitting to a rigorous theoretical model (the multiple scattering model) to determine the variation of the parameters involved. This simple isolation model (SIM) is compared to data from several earth-space link experiments and to three other models.
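
    Such fits are usually expressed in the semi-empirical form XPD = U - V log10(A). A one-line sketch with placeholder coefficients (U and V below are not the values derived in the paper):

      # XPD (dB) from co-polar attenuation A (dB); U, V are placeholders.
      import math

      def xpd_db(attenuation_db, U=35.0, V=20.0):
          return U - V * math.log10(attenuation_db)

      print(xpd_db(10.0))   # -> 15.0 dB of discrimination at 10 dB attenuation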

  12. Tunable intraparticle frameworks for creating complex heterostructured nanoparticle libraries

    NASA Astrophysics Data System (ADS)

    Fenton, Julie L.; Steimle, Benjamin C.; Schaak, Raymond E.

    2018-05-01

    Complex heterostructured nanoparticles with precisely defined materials and interfaces are important for many applications. However, rationally incorporating such features into nanoparticles with rigorous morphology control remains a synthetic bottleneck. We define a modular divergent synthesis strategy that progressively transforms simple nanoparticle synthons into increasingly sophisticated products. We introduce a series of tunable interfaces into zero-, one-, and two-dimensional copper sulfide nanoparticles using cation exchange reactions. Subsequent manipulation of these intraparticle frameworks yielded a library of 47 distinct heterostructured metal sulfide derivatives, including particles that contain asymmetric, patchy, porous, and sculpted nanoarchitectures. This generalizable mix-and-match strategy provides predictable retrosynthetic pathways to complex nanoparticle features that are otherwise inaccessible.

  13. Integrating Content and Literacy in Social Studies: Assessing Instructional Materials and Student Work from a Common Core-Aligned Intervention

    ERIC Educational Resources Information Center

    Reisman, Abby

    2017-01-01

    The Common Core State Standards (CCSS) call on science and social studies teachers to engage in literacy instruction that prepares students for the academic rigors of college. The Literacy Design Collaborative (LDC) designed a framework to address the challenge of literacy-content integration. At the heart of the intervention are fill-in-the-blank…

  14. Self-report: psychology's four-letter word.

    PubMed

    Haeffel, Gerald J; Howard, George S

    2010-01-01

    Self-report continues to be one of the most widely used measurement strategies in psychology despite longstanding concerns about its validity and scientific rigor. In this article, the merits of self-report are examined from a philosophy of science perspective. A framework is also provided for evaluating self-report measures. Specifically, four issues are presented that can be used as a decision aid when making choices about measurement.

  15. The necessity of a theory of biology for tissue engineering: metabolism-repair systems.

    PubMed

    Ganguli, Suman; Hunt, C Anthony

    2004-01-01

    Since there is no widely accepted global theory of biology, tissue engineering and bioengineering lack a theoretical understanding of the systems being engineered. By default, tissue engineering operates with a "reductionist" theoretical approach, inherited from traditional engineering of non-living materials. Long term, that approach is inadequate, since it ignores essential aspects of biology. Metabolism-repair systems are a theoretical framework which explicitly represents two "functional" aspects of living organisms: self-repair and self-replication. Since repair and replication are central to tissue engineering, we advance metabolism-repair systems as a potential theoretical framework for tissue engineering. We present an overview of the framework, and indicate directions to pursue for extending it to the context of tissue engineering. We focus on biological networks, both metabolic and cellular, as one such direction. The construction of these networks, in turn, depends on biological protocols. Together these concepts may help point the way to a global theory of biology appropriate for tissue engineering.

  16. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    PubMed

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.

  17. Combinatorial compatibility as habit-controlling factor in lysozyme crystallization I. Monomeric and tetrameric F faces derived graph-theoretically

    NASA Astrophysics Data System (ADS)

    Strom, C. S.; Bennema, P.

    1997-03-01

    A series of two articles discusses possible morphological evidence for oligomerization of growth units in the crystallization of tetragonal lysozyme, based on a rigorous graph-theoretic derivation of the F faces. In the first study (Part I), the growth layers are derived as valid networks satisfying the conditions of F slices in the context of the PBC theory using the graph-theoretic method implemented in program FFACE [C.S. Strom, Z. Krist. 172 (1985) 11]. The analysis is performed in monomeric and alternative tetrameric and octameric formulations of the unit cell, assuming tetramer formation according to the strongest bonds. F (flat) slices with thickness R·d_hkl (1/2 < R ≤ 1) are predicted theoretically in the forms {1 1 0}, {0 1 1}, {1 1 1}. The relevant energies are established in the broken-bond model. The relation between possible oligomeric specifications of the unit cell and combinatorially feasible F slice compositions in these orientations is explored.

  18. Self, College Experiences, and Society: Rethinking the Theoretical Foundations of Student Development Theory

    ERIC Educational Resources Information Center

    Winkle-Wagner, Rachelle

    2012-01-01

    This article examines the psychological theoretical foundations of college student development theory and the theoretical assumptions of this framework. A complementary, sociological perspective and the theoretical assumptions of this approach are offered. The potential limitations of the overuse of each perspective are considered. The conclusion…

  19. Threshold for extinction and survival in stochastic tumor immune system

    NASA Astrophysics Data System (ADS)

    Li, Dongxi; Cheng, Fangjuan

    2017-10-01

    This paper mainly investigates the stochastic character of tumor growth and extinction in the presence of an immune response of a host organism. Firstly, a mathematical model describing the interaction and competition between the tumor cells and the immune system is established based on Michaelis-Menten enzyme kinetics. Then, the threshold conditions for extinction, weak persistence and stochastic persistence of tumor cells are derived by rigorous theoretical proofs. Finally, stochastic simulations are performed to substantiate and illustrate the conclusions we have derived. The modeling results will be beneficial for understanding the concept of immunoediting and for developing cancer immunotherapy. Besides, our simple theoretical model can help to obtain new insight into the complexity of tumor growth.
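
    The flavor of such threshold results can be illustrated with a schematic Euler-Maruyama simulation of a stochastic logistic equation (a stand-in of our own; the paper's Michaelis-Menten tumor-immune model is more detailed). For dX = rX(1 - X/K)dt + sigma·X dW, extinction is expected when sigma^2 > 2r:

      # Euler-Maruyama simulation of a noisy logistic model, illustrating
      # noise-induced extinction versus persistence. Parameters are ours.
      import numpy as np

      def simulate(sigma, x0=0.5, r=1.0, K=1.0, dt=1e-3, T=50.0, seed=42):
          rng = np.random.default_rng(seed)
          x = x0
          for _ in range(int(T / dt)):
              dW = rng.normal(0.0, np.sqrt(dt))
              x += r * x * (1.0 - x / K) * dt + sigma * x * dW
              if x <= 1e-6:
                  return 0.0                     # extinct
          return x

      for sigma in (0.5, 2.0):   # below / above the threshold sigma^2 = 2r
          print(sigma, simulate(sigma))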

  20. Determination of effective mass of heavy hole from phonon-assisted excitonic luminescence spectra in ZnO

    NASA Astrophysics Data System (ADS)

    Shi, S. L.; Xu, S. J.

    2011-03-01

    Longitudinal optical (LO) phonon-assisted luminescence spectra of free excitons in high-quality ZnO crystal were investigated both experimentally and theoretically. By using the rigorous Segall-Mahan model based on the Green's function, good agreement between the experimental emission spectra involving one or two LO phonons and the theoretical spectra can be achieved with only one adjustable parameter (the effective mass of the heavy hole). This leads to determination of the heavy-hole effective masses mh⊥ = 0.8 m0 and mh∥ = 5.0 m0 in ZnO. The influence of the anisotropic effective masses of heavy holes on the phonon sidebands is also discussed.

  1. Polarization sensitivity testing of off-plane reflection gratings

    NASA Astrophysics Data System (ADS)

    Marlowe, Hannah; McEntaffer, Randal L.; DeRoo, Casey T.; Miles, Drew M.; Tutt, James H.; Laubis, Christian; Soltwisch, Victor

    2015-09-01

    Off-Plane reflection gratings were previously predicted to have different efficiencies when the incident light is polarized in the transverse-magnetic (TM) versus transverse-electric (TE) orientations with respect to the grating grooves. However, more recent theoretical calculations which rigorously account for finitely conducting, rather than perfectly conducting, grating materials no longer predict significant polarization sensitivity. We present the first empirical results for radially ruled, laminar groove profile gratings in the off-plane mount which demonstrate no difference in TM versus TE efficiency across our entire 300-1500 eV bandpass. These measurements together with the recent theoretical results confirm that grazing incidence off-plane reflection gratings using real, not perfectly conducting, materials are not polarization sensitive.

  2. MUSIC-characterization of small scatterers for normal measurement data

    NASA Astrophysics Data System (ADS)

    Griesmaier, Roland; Hanke, Martin

    2009-07-01

    We investigate the reconstruction of the positions of a collection of small metallic objects buried beneath the ground from measurements of the vertical component of scattered fields corresponding to vertically polarized dipole excitations on a horizontal two-dimensional measurement device above the surface of the ground. A MUSIC reconstruction method for this problem has recently been proposed by Iakovleva et al (2007 IEEE Trans. Antennas Propag. 55 2598). In this paper, we give a rigorous theoretical justification of this method. To that end we prove a characterization of the positions of the scatterers in terms of the measurement data, applying an asymptotic analysis of the scattered fields. We present numerical results to illustrate our theoretical findings.
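
    Generic MUSIC (not the paper's specific asymptotic analysis) builds an indicator from the singular value decomposition of the multistatic response matrix: test points whose Green's-function column lies in the signal subspace produce a large value of 1/||P_noise g(z)||. A self-contained sketch follows, with a free-space surrogate for g(z) as our modeling assumption:

      # MUSIC indicator from the SVD of a multistatic response matrix K.
      import numpy as np

      def music_indicator(K, greens_columns, n_scatterers):
          """greens_columns: shape (n_receivers, n_testpoints)."""
          U, s, Vh = np.linalg.svd(K)
          noise = U[:, n_scatterers:]              # noise-subspace basis
          proj = noise.conj().T @ greens_columns   # coordinates in the noise subspace
          return 1.0 / np.linalg.norm(proj, axis=0)

      rx = np.linspace(-1.0, 1.0, 16)              # receiver x-positions
      def g(z):                                    # surrogate point-source column
          d = np.hypot(rx - z[0], z[1])
          return np.exp(1j * 50.0 * d) / d

      scatterers = [(-0.3, -0.5), (0.4, -0.7)]
      K = sum(np.outer(g(z), g(z)) for z in scatterers)
      test_points = [(-0.3, -0.5), (0.0, -0.5), (0.4, -0.7)]
      G = np.stack([g(z) for z in test_points], axis=1)
      print(music_indicator(K, G, n_scatterers=2))  # peaks at the true positions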

  3. Exploring Environmental Factors in Nursing Workplaces That Promote Psychological Resilience: Constructing a Unified Theoretical Model.

    PubMed

    Cusack, Lynette; Smith, Morgan; Hegney, Desley; Rees, Clare S; Breen, Lauren J; Witt, Regina R; Rogers, Cath; Williams, Allison; Cross, Wendy; Cheung, Kin

    2016-01-01

    Building nurses' resilience to complex and stressful practice environments is necessary to keep skilled nurses in the workplace and to ensure safe patient care. A unified theoretical framework titled the Health Services Workplace Environmental Resilience Model (HSWERM) is presented to explain the environmental factors in the workplace that promote nurses' resilience. The framework builds on a previously published theoretical model of individual resilience, which identified the key constructs of psychological resilience as self-efficacy, coping and mindfulness, but did not examine environmental factors in the workplace that promote nurses' resilience. The unified framework was developed through a literary synthesis drawing on data from international studies and literature reviews on the nursing workforce in hospitals. The most frequent workplace environmental factors were identified, extracted and clustered in alignment with the key constructs of psychological resilience. Six major organizational concepts emerged that relate to a positive resilience-building workplace and form the foundation of the theoretical model: three concepts relate to nursing staff support (professional, practice, personal) and three to nursing staff development (professional, practice, personal) within the workplace environment. The unified theoretical model incorporates these concepts within the workplace context, linking to the nurse and then impacting personal resilience and workplace outcomes; its use has the potential to increase staff retention and quality of patient care.

  4. Rehabilitation goal setting with community dwelling adults with acquired brain injury: a theoretical framework derived from clinicians' reflections on practice.

    PubMed

    Prescott, Sarah; Fleming, Jennifer; Doig, Emmah

    2017-06-11

    The aim of this study was to explore clinicians' experiences of implementing goal setting with community-dwelling clients with acquired brain injury, in order to develop a goal-setting practice framework. Grounded theory methodology was employed. Clinicians, representing six disciplines across seven services, were recruited and interviewed until theoretical saturation was achieved. A total of 22 clinicians were interviewed. A theoretical framework was developed to explain how clinicians support clients to actively engage in goal setting in routine practice. The framework incorporates three phases: a needs identification phase, a goal operationalisation phase, and an intervention phase. Contextual factors, including personal and environmental influences, also affect how clinicians and clients engage in this process. Clinicians use additional strategies to support clients with impaired self-awareness, including structured communication and metacognitive strategies to operationalise goals. For clients with emotional distress, clinicians provide additional time and intervention directed at new identity development. The goal-setting practice framework may guide clinicians' understanding of how to engage in client-centred goal setting in brain injury rehabilitation. There is a predilection towards a client-centred goal-setting approach in the community setting; however, contextual factors can inhibit implementation of this approach. Implications for Rehabilitation: The theoretical framework describes processes used to develop achievable client-centred goals with people with brain injury. Building rapport is a core strategy to engage clients with brain injury in goal setting. Clients with self-awareness impairment benefit from additional metacognitive strategies to participate in goal setting. Clients with emotional distress may need additional time for new identity development.

  5. Episodic Laryngeal Breathing Disorders: Literature Review and Proposal of Preliminary Theoretical Framework.

    PubMed

    Shembel, Adrianna C; Sandage, Mary J; Verdolini Abbott, Katherine

    2017-01-01

    The purposes of this literature review were (1) to identify and assess frameworks for clinical characterization of episodic laryngeal breathing disorders (ELBD) and their subtypes, (2) to integrate concepts from these frameworks into a novel theoretical paradigm, and (3) to provide a preliminary algorithm to classify clinical features of ELBD for future study of its clinical manifestations and underlying pathophysiological mechanisms. This is a literature review. Peer-reviewed literature from 1983 to 2015 pertaining to models for ELBD was searched using PubMed, Ovid, Proquest, the Cochrane Database of Systematic Reviews, and Google Scholar. Theoretical models for ELBD were identified, evaluated, and integrated into a novel comprehensive framework. Consensus across three salient models provided a working definition and inclusionary criteria for ELBD within the new framework; inconsistencies and discrepancies within the models provided an analytic platform for future research. Comparison among the three conceptual models, (1) irritable larynx syndrome, (2) dichotomous triggers, and (3) periodic occurrence of laryngeal obstruction, showed that the models uniformly consider ELBD to involve episodic laryngeal obstruction causing dyspnea. The models differed in their description of the source of dyspnea, in their inclusion of corollary behaviors, in their inclusion of other laryngeal-based behaviors (e.g., cough), and in the types of triggers. The proposed integrated theoretical framework for ELBD provides a preliminary systematic platform for the identification of key clinical feature patterns indicative of ELBD and associated clinical subgroups. This algorithmic paradigm should evolve with better understanding of this spectrum of disorders and its underlying pathophysiological mechanisms. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  6. Individual behavioral phenotypes: an integrative meta-theoretical framework. Why "behavioral syndromes" are not analogs of "personality".

    PubMed

    Uher, Jana

    2011-09-01

    Animal researchers are increasingly interested in individual differences in behavior. Their interpretation as meaningful differences in behavioral strategies stable over time and across contexts, adaptive, heritable, and acted upon by natural selection has triggered new theoretical developments. However, the analytical approaches used to explore behavioral data still address population-level phenomena, and statistical methods suitable to analyze individual behavior are rarely applied. I discuss fundamental investigative principles and analytical approaches to explore whether, in what ways, and under which conditions individual behavioral differences are actually meaningful. I elaborate the meta-theoretical ideas underlying common theoretical concepts and integrate them into an overarching meta-theoretical and methodological framework. This unravels commonalities and differences, and shows that assumptions of analogy to concepts of human personality are not always warranted and that some theoretical developments may be based on methodological artifacts. Yet, my results also highlight possible directions for new theoretical developments in animal behavior research. Copyright © 2011 Wiley Periodicals, Inc.

  7. Modeling intragranular diffusion in low-connectivity granular media

    NASA Astrophysics Data System (ADS)

    Ewing, Robert P.; Liu, Chongxuan; Hu, Qinhong

    2012-03-01

    Characterizing the diffusive exchange of solutes between bulk water in an aquifer and water in the intragranular pores of the solid phase is still challenging despite decades of study. Many disparities between observation and theory could be attributed to low connectivity of the intragranular pores. The presence of low connectivity indicates that a useful conceptual framework is percolation theory. The present study was initiated to develop a percolation-based finite difference (FD) model, and to test it rigorously against both random walk (RW) simulations of diffusion starting from nonequilibrium, and data on Borden sand published by Ball and Roberts (1991a,b) and subsequently reanalyzed by Haggerty and Gorelick (1995) using a multirate mass transfer (MRMT) approach. The percolation-theoretical model is simple and readily incorporated into existing FD models. The FD model closely matches the RW results using only a single fitting parameter, across a wide range of pore connectivities. Simulation of the Borden sand experiment without pore connectivity effects reproduced the MRMT analysis, but including low pore connectivity effects improved the fit. Overall, the theory and simulation results show that low intragranular pore connectivity can produce diffusive behavior that appears as if the solute had undergone slow sorption, despite the absence of any sorption process, thereby explaining some hitherto confusing aspects of intragranular diffusion.
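
    The qualitative effect of low pore connectivity is easy to see with a toy random-walk experiment on a site-percolation lattice (lattice size, probabilities and step counts below are our illustrative choices, not the paper's FD model):

      # Walkers released at the centre of a partially blocked grain escape
      # far less often as the fraction of open pore sites drops toward the
      # square-lattice site-percolation threshold (~0.593).
      import numpy as np

      def escape_fraction(p_open, L=41, steps=1500, walkers=200, seed=0):
          rng = np.random.default_rng(seed)
          open_site = rng.random((L, L)) < p_open
          centre = (L // 2, L // 2)
          open_site[centre] = True                 # start on an open site
          moves = ((1, 0), (-1, 0), (0, 1), (0, -1))
          escaped = 0
          for _ in range(walkers):
              x, y = centre
              for _ in range(steps):
                  dx, dy = moves[rng.integers(4)]
                  nx, ny = x + dx, y + dy
                  if not (0 <= nx < L and 0 <= ny < L):
                      escaped += 1                 # left the grain
                      break
                  if open_site[nx, ny]:            # blocked sites are never entered
                      x, y = nx, ny
          return escaped / walkers

      for p in (1.0, 0.7, 0.55):                   # well above, above, near threshold
          print(p, escape_fraction(p))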

  8. Toward a scientifically rigorous basis for developing mapped ecological regions.

    USGS Publications Warehouse

    McMahon, G.; Wiken, E.B.; Gauthier, D.A.

    2004-01-01

    Despite the wide use of ecological regions in conservation and resource-management evaluations and assessments, a commonly accepted theoretical basis for ecological regionalization does not exist. This fact, along with the paucity of focus on ecological regionalization by professional associations, journals, and faculties, has inhibited the advancement of a broadly acceptable scientific basis for the development, use, and verification of ecological regions. The central contention of this article is that ecological regions should improve our understanding of geographic and ecological phenomena associated with biotic and abiotic processes occurring in individual regions and also of processes characteristic of interactions and dependencies among multiple regions. Research associated with any ecoregional framework should facilitate development of hypotheses about ecological phenomena and dominant landscape elements associated with these phenomena, how these phenomena are structured in space, and how they function in a hierarchy. Success in addressing the research recommendations outlined in this article cannot occur within an ad hoc, largely uncoordinated research environment. Successful implementation of this plan will require activities--coordination, funding, and education--that are both scientific and administrative in nature. Perhaps the most important element of an infrastructure to support the scientific work of ecoregionalization would be a national or international authority similar to the Water and Science Technology Board of the National Academy of Sciences.

  9. A sense of life: computational and experimental investigations with models of biochemical and evolutionary processes.

    PubMed

    Mishra, Bud; Daruwala, Raoul-Sam; Zhou, Yi; Ugel, Nadia; Policriti, Alberto; Antoniotti, Marco; Paxia, Salvatore; Rejali, Marc; Rudra, Archisman; Cherepinsky, Vera; Silver, Naomi; Casey, William; Piazza, Carla; Simeoni, Marta; Barbano, Paolo; Spivak, Marina; Feng, Jiawu; Gill, Ofer; Venkatesh, Mysore; Cheng, Fang; Sun, Bing; Ioniata, Iuliana; Anantharaman, Thomas; Hubbard, E Jane Albert; Pnueli, Amir; Harel, David; Chandru, Vijay; Hariharan, Ramesh; Wigler, Michael; Park, Frank; Lin, Shih-Chieh; Lazebnik, Yuri; Winkler, Franz; Cantor, Charles R; Carbone, Alessandra; Gromov, Mikhael

    2003-01-01

    We collaborate in a research program aimed at creating a rigorous framework, experimental infrastructure, and computational environment for understanding, experimenting with, manipulating, and modifying a diverse set of fundamental biological processes at multiple scales and spatio-temporal modes. The novelty of our research is based on an approach that (i) requires coevolution of experimental science and theoretical techniques and (ii) exploits a certain universality in biology guided by a parsimonious model of evolutionary mechanisms operating at the genomic level and manifesting at the proteomic, transcriptomic, phylogenic, and other higher levels. Our current program in "systems biology" endeavors to marry large-scale biological experiments with the tools to ponder and reason about large, complex, and subtle natural systems. To achieve this ambitious goal, ideas and concepts are combined from many different fields: biological experimentation, applied mathematical modeling, computational reasoning schemes, and large-scale numerical and symbolic simulations. From a biological viewpoint, the basic issues are many: (i) understanding common and shared structural motifs among biological processes; (ii) modeling biological noise due to interactions among a small number of key molecules or loss of synchrony; (iii) explaining the robustness of these systems in spite of such noise; and (iv) cataloging multistatic behavior and adaptation exhibited by many biological processes.

  10. Quinary wurtzite Zn-Ga-Ge-N-O solid solutions and their photocatalytic properties under visible light irradiation

    PubMed Central

    Xie, Yinghao; Wu, Fangfang; Sun, Xiaoqin; Chen, Hongmei; Lv, Meilin; Ni, Shuang; Liu, Gang; Xu, Xiaoxiang

    2016-01-01

    Wurtzite solid solutions between GaN and ZnO highlight an intriguing paradigm for water splitting into hydrogen and oxygen using solar energy. However, large composition discrepancy often occurs inside the compound owing to the volatile nature of Zn, thereby prescribing rigorous terms on synthetic conditions. Here we demonstrate the merits of constituting quinary Zn-Ga-Ge-N-O solid solutions by introducing Ge into the wurtzite framework. The presence of Ge not only mitigates the vaporization of Zn but also strongly promotes particle crystallization. Synthetic details for these quinary compounds were systematically explored and their photocatalytic properties were thoroughly investigated. Proper starting molar ratios of Zn/Ga/Ge are of primary importance for single-phase formation, high particle crystallinity, and good photocatalytic performance. Efficient photocatalytic hydrogen and oxygen production from water was achieved for these quinary solid solutions and is strongly correlated with the Ge content in the structure. The apparent quantum efficiency of the optimized sample approaches 1.01% for hydrogen production and 1.14% for oxygen production. Theoretical calculations reveal the critical role of Zn for the band gap reduction in these solid solutions, and their superior photocatalytic activity can be understood in terms of the preservation of Zn in the structure as well as the good crystallinity obtained after introducing Ge. PMID:26755070
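
    The apparent quantum efficiencies quoted above follow from the standard bookkeeping for photocatalytic water splitting: AQE = (number of reacted charge carriers / number of incident photons) x 100%, with two electrons consumed per evolved H2 molecule and four holes per evolved O2 molecule. A minimal sketch of that arithmetic (function and variable names are illustrative, not from the paper):

        # Hedged sketch: standard apparent-quantum-efficiency (AQE) arithmetic
        # for photocatalytic water splitting.
        def aqe_h2(h2_molecules_per_s, incident_photons_per_s):
            # two electrons are consumed per evolved H2 molecule
            return 100.0 * 2 * h2_molecules_per_s / incident_photons_per_s

        def aqe_o2(o2_molecules_per_s, incident_photons_per_s):
            # four holes are consumed per evolved O2 molecule
            return 100.0 * 4 * o2_molecules_per_s / incident_photons_per_s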

  11. Quinary wurtzite Zn-Ga-Ge-N-O solid solutions and their photocatalytic properties under visible light irradiation.

    PubMed

    Xie, Yinghao; Wu, Fangfang; Sun, Xiaoqin; Chen, Hongmei; Lv, Meilin; Ni, Shuang; Liu, Gang; Xu, Xiaoxiang

    2016-01-12

    Wurtzite solid solutions between GaN and ZnO highlight an intriguing paradigm for water splitting into hydrogen and oxygen using solar energy. However, large composition discrepancy often occurs inside the compound owing to the volatile nature of Zn, thereby prescribing rigorous terms on synthetic conditions. Here we demonstrate the merits of constituting quinary Zn-Ga-Ge-N-O solid solutions by introducing Ge into the wurtzite framework. The presence of Ge not only mitigates the vaporization of Zn but also strongly promotes particle crystallization. Synthetic details for these quinary compounds were systematically explored and their photocatalytic properties were thoroughly investigated. Proper starting molar ratios of Zn/Ga/Ge are of primary importance for single-phase formation, high particle crystallinity, and good photocatalytic performance. Efficient photocatalytic hydrogen and oxygen production from water was achieved for these quinary solid solutions and is strongly correlated with the Ge content in the structure. The apparent quantum efficiency of the optimized sample approaches 1.01% for hydrogen production and 1.14% for oxygen production. Theoretical calculations reveal the critical role of Zn for the band gap reduction in these solid solutions, and their superior photocatalytic activity can be understood in terms of the preservation of Zn in the structure as well as the good crystallinity obtained after introducing Ge.

  12. Assessing health status and quality-of-life instruments: attributes and review criteria.

    PubMed

    Aaronson, Neil; Alonso, Jordi; Burnam, Audrey; Lohr, Kathleen N; Patrick, Donald L; Perrin, Edward; Stein, Ruth E

    2002-05-01

    The field of health status and quality of life (QoL) measurement--as a formal discipline with a cohesive theoretical framework, accepted methods, and diverse applications--has been evolving for the better part of 30 years. To identify health status and QoL instruments and review them against rigorous criteria as a precursor to creating an instrument library for later dissemination, the Medical Outcomes Trust in 1994 created an independently functioning Scientific Advisory Committee (SAC). In the mid-1990s, the SAC defined a set of attributes and criteria to carry out instrument assessments; 5 years later, it updated and revised these materials to take account of the expanding theories and technologies upon which such instruments were being developed. This paper offers the SAC's current conceptualization of eight key attributes of health status and QoL instruments (i.e., conceptual and measurement model; reliability; validity; responsiveness; interpretability; respondent and administrative burden; alternate forms; and cultural and language adaptations) and the criteria by which instruments would be reviewed on each of those attributes. These are suggested guidelines for the field to consider and debate; as measurement techniques become both more familiar and more sophisticated, we expect that experts will wish to update and refine these criteria accordingly.

  13. Quinary wurtzite Zn-Ga-Ge-N-O solid solutions and their photocatalytic properties under visible light irradiation

    NASA Astrophysics Data System (ADS)

    Xie, Yinghao; Wu, Fangfang; Sun, Xiaoqin; Chen, Hongmei; Lv, Meilin; Ni, Shuang; Liu, Gang; Xu, Xiaoxiang

    2016-01-01

    Wurtzite solid solutions between GaN and ZnO highlight an intriguing paradigm for water splitting into hydrogen and oxygen using solar energy. However, large composition discrepancy often occurs inside the compound owing to the volatile nature of Zn, thereby prescribing rigorous terms on synthetic conditions. Here we demonstrate the merits of constituting quinary Zn-Ga-Ge-N-O solid solutions by introducing Ge into the wurtzite framework. The presence of Ge not only mitigates the vaporization of Zn but also strongly promotes particle crystallization. Synthetic details for these quinary compounds were systematically explored and their photocatalytic properties were thoroughly investigated. Proper starting molar ratios of Zn/Ga/Ge are of primary importance for single-phase formation, high particle crystallinity, and good photocatalytic performance. Efficient photocatalytic hydrogen and oxygen production from water was achieved for these quinary solid solutions and is strongly correlated with the Ge content in the structure. The apparent quantum efficiency of the optimized sample approaches 1.01% for hydrogen production and 1.14% for oxygen production. Theoretical calculations reveal the critical role of Zn for the band gap reduction in these solid solutions, and their superior photocatalytic activity can be understood in terms of the preservation of Zn in the structure as well as the good crystallinity obtained after introducing Ge.

  14. Rare variation facilitates inferences of fine-scale population structure in humans.

    PubMed

    O'Connor, Timothy D; Fu, Wenqing; Mychaleckyj, Josyf C; Logsdon, Benjamin; Auer, Paul; Carlson, Christopher S; Leal, Suzanne M; Smith, Joshua D; Rieder, Mark J; Bamshad, Michael J; Nickerson, Deborah A; Akey, Joshua M

    2015-03-01

    Understanding the genetic structure of human populations has important implications for the design and interpretation of disease mapping studies and for reconstructing human evolutionary history. To date, inferences of human population structure have primarily been made with common variants. However, recent large-scale resequencing studies have shown an abundance of rare variation in humans, which may be particularly useful for making inferences of fine-scale population structure. To this end, we used an information theory framework and extensive coalescent simulations to rigorously quantify the informativeness of rare and common variation to detect signatures of fine-scale population structure. We show that rare variation affords unique insights into patterns of recent population structure. Furthermore, to empirically assess our theoretical findings, we analyzed high-coverage exome sequences in 6,515 European and African American individuals. As predicted, rare variants are more informative than common polymorphisms in revealing a distinct cluster of European-American individuals, and subsequent analyses demonstrate that these individuals are likely of Ashkenazi Jewish ancestry. Our results provide new insights into population structure using rare variation, which will be an important factor to account for in rare-variant association studies. © The Author 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
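
    The informativeness the authors quantify comes from an information-theoretic framework; the exact statistic is not reproduced here, but a simple per-variant proxy is the mutual information between genotype and population label. A sketch on synthetic genotypes (all data below are made up for illustration):

        import numpy as np
        from sklearn.metrics import mutual_info_score

        # Hypothetical genotypes (0/1/2 alternate-allele counts) at one rare
        # and one common variant, plus population labels for 1,000 individuals.
        rng = np.random.default_rng(0)
        pop = rng.integers(0, 2, size=1000)                       # two populations
        rare = rng.binomial(2, np.where(pop == 0, 0.001, 0.02))   # rare variant
        common = rng.binomial(2, np.where(pop == 0, 0.30, 0.35))  # common variant

        # Mutual information (in nats) between genotype and population label:
        print(mutual_info_score(pop, rare), mutual_info_score(pop, common))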

  15. Using qualitative methods for attribute development for discrete choice experiments: issues and recommendations.

    PubMed

    Coast, Joanna; Al-Janabi, Hareth; Sutton, Eileen J; Horrocks, Susan A; Vosper, A Jane; Swancutt, Dawn R; Flynn, Terry N

    2012-06-01

    Attribute generation for discrete choice experiments (DCEs) is often poorly reported, and it is unclear whether this element of research is conducted rigorously. This paper explores issues associated with developing attributes for DCEs and contrasts different qualitative approaches. The paper draws on eight studies: four developed attributes for measures, and four developed attributes for more ad hoc policy questions. Issues that have become apparent through these studies include the following: the theoretical framework of random utility theory and the need for attributes that are neither too close to the latent construct nor too intrinsic to people's personality; the need to think about attribute development as a two-stage process involving conceptual development followed by refinement of language to convey the intended meaning; and the difficulty in resolving tensions inherent in the reductiveness of condensing complex and nuanced qualitative findings into precise terms. The comparison of alternative qualitative approaches suggests that the nature of data collection will depend both on the characteristics of the question (its sensitivity, for example) and on the availability of existing qualitative information. An iterative, constant comparative approach to analysis is recommended. Finally, the paper provides a series of recommendations for improving the reporting of this element of DCE studies. Copyright © 2011 John Wiley & Sons, Ltd.

  16. Tokunaga river networks: New empirical evidence and applications to transport problems

    NASA Astrophysics Data System (ADS)

    Tejedor, A.; Zaliapin, I. V.

    2013-12-01

    The Tokunaga self-similarity has proven to be an important constraint for observed river networks. Notably, various Horton laws are naturally satisfied by Tokunaga networks, which makes this model of considerable interest for theoretical analysis and modeling of environmental transport. Recall that Horton self-similarity is a weaker property of a tree graph that addresses its principal branching; it is a counterpart of the power-law size distribution for a system's elements. The stronger Tokunaga self-similarity addresses so-called side branching; it ensures that different levels of a hierarchy have the same probabilistic structure (in a sense that can be rigorously defined). We describe an improved statistical framework for testing self-similarity in a finite tree and estimating the related parameters. The developed inference is applied to the major river basins in the continental United States and the Iberian Peninsula. The results demonstrate the validity of the Tokunaga model for the majority of the examined networks, with a very narrow (universal) range of parameter values. Next, we explore possible relationships between Tokunaga parameter anomalies (deviations from the universal values) and the climatic and geomorphologic characteristics of a region. Finally, we apply the Tokunaga model to explore the vulnerability of river networks, defined via the reaction of river discharge to a storm.
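
    Horton-Strahler ordering is the branching hierarchy on which both Horton and Tokunaga self-similarity are defined. A minimal sketch of the ordering rule on a rooted tree (the tree and names are illustrative; the paper's estimation framework for side-branch statistics is more involved):

        # Hedged sketch: Horton-Strahler order of each node in a rooted tree.
        # A leaf has order 1; an internal node takes the maximum child order,
        # plus 1 if that maximum is attained by two or more children.
        def strahler(children, node):
            kids = children.get(node, [])
            if not kids:
                return 1
            orders = [strahler(children, k) for k in kids]
            top = max(orders)
            return top + 1 if orders.count(top) >= 2 else top

        # Toy network: a root draining two order-2 subtrees has order 3.
        children = {0: [1, 2], 1: [3, 4], 2: [5, 6]}
        print(strahler(children, 0))  # prints 3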

  17. Psychosocial measures used to assess the effectiveness of school-based nutrition education programs: review and analysis of self-report instruments for children 8 to 12 years old.

    PubMed

    Hernández-Garbanzo, Yenory; Brosh, Joanne; Serrano, Elena L; Cason, Katherine L; Bhattarai, Ranju

    2013-01-01

    To identify the psychometric properties of evaluation instruments that measure mediators of dietary behaviors in school-aged children. Systematic search of scientific databases limited to 1999-2010. Psychometric properties related to the development and testing of self-report instruments for children 8-12 years old. Systematic search of 189 articles and review of 15 instruments (20 associated articles) meeting the inclusion criteria. Search terms used included children, school, nutrition, diet, nutrition education, and evaluation. Fourteen studies used a theoretical framework to guide the instrument's development. Knowledge and self-efficacy were the most commonly used psychosocial measures. Twelve instruments focused on specific nutrition-related behaviors. Eight instruments included over 40 items and used age-appropriate response formats. Acceptable reliability properties were most commonly reported for attitude and self-efficacy measures. Although most of the instruments were reviewed by experts (n = 8) and/or pilot-tested (n = 9), only 7 were rigorously tested for both types of validity and with low-income youth. Results from this review suggest that additional research is needed to develop more robust psychosocial measures of dietary behaviors for low-income youth audiences. Copyright © 2013 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  18. Toward a Scientifically Rigorous Basis for Developing Mapped Ecological Regions

    NASA Astrophysics Data System (ADS)

    McMahon, Gerard; Wiken, Ed B.; Gauthier, David A.

    2004-04-01

    Despite the wide use of ecological regions in conservation and resource-management evaluations and assessments, a commonly accepted theoretical basis for ecological regionalization does not exist. This fact, along with the paucity of focus on ecological regionalization by professional associations, journals, and faculties, has inhibited the advancement of a broadly acceptable scientific basis for the development, use, and verification of ecological regions. The central contention of this article is that ecological regions should improve our understanding of geographic and ecological phenomena associated with biotic and abiotic processes occurring in individual regions and also of processes characteristic of interactions and dependencies among multiple regions. Research associated with any ecoregional framework should facilitate development of hypotheses about ecological phenomena and dominant landscape elements associated with these phenomena, how these phenomena are structured in space, and how they function in a hierarchy. Success in addressing the research recommendations outlined in this article cannot occur within an ad hoc, largely uncoordinated research environment. Successful implementation of this plan will require activities—coordination, funding, and education—that are both scientific and administrative in nature. Perhaps the most important element of an infrastructure to support the scientific work of ecoregionalization would be a national or international authority similar to the Water Science and Technology Board of the National Academy of Sciences.

  19. Enhancing the isotropy of lateral resolution in coherent structured illumination microscopy

    PubMed Central

    Park, Joo Hyun; Lee, Jae Yong; Lee, Eun Seong

    2014-01-01

    We present a method to improve the isotropy of spatial resolution in structured illumination microscopy (SIM) implemented for imaging non-fluorescent samples. To alleviate the problem of anisotropic resolution in the previous scheme of coherent SIM that employs two orthogonal standing-wave illumination patterns, referred to as the orthogonal SIM, we introduce a hexagonal-lattice illumination that incorporates three standing-wave fields simultaneously superimposed at orientations equally divided in the lateral plane. A theoretical formulation is worked out rigorously for coherent image formation with such a simultaneous multiple-beam illumination, and an explicit Fourier-domain framework is derived for reconstructing an image with enhanced resolution. Using a computer-synthesized resolution target as a 2D coherent sample, we perform numerical simulations to examine the imaging characteristics of our three-angle SIM compared with the orthogonal SIM. The investigation of 2D resolving power with various test patterns of different periods and orientations reveals that the orientation-dependent undulation of lateral resolution can be reduced from 27% to 8% by using the three-angle SIM, while the best resolution (0.54 times the resolution limit of conventional coherent imaging) in the directions of structured illumination is degraded only slightly, by 4.6%, relative to that of the orthogonal SIM. PMID:24940548
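
    The Fourier-domain reconstruction rests on a standard fact: multiplying the sample field by a standing-wave pattern cos(k0 x) places half-amplitude replicas of its spectrum at +/- k0, folding high spatial frequencies into the detection passband. A 1D numpy illustration of that mechanism (a toy demonstration, not the paper's reconstruction code):

        import numpy as np

        N = 256
        x = np.arange(N)
        sample = np.random.default_rng(0).random(N)  # stand-in 1D sample field
        k0 = 32                                      # illumination frequency (cycles per N samples)
        illum = np.cos(2 * np.pi * k0 * x / N)       # standing-wave illumination

        S = np.fft.fft(sample)
        M = np.fft.fft(sample * illum)
        # Cosine modulation = half-amplitude spectral copies shifted by +/- k0:
        expected = 0.5 * (np.roll(S, k0) + np.roll(S, -k0))
        print(np.allclose(M, expected))              # True (up to round-off)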

  20. Multi-Source Multi-Target Dictionary Learning for Prediction of Cognitive Decline

    PubMed Central

    Zhang, Jie; Li, Qingyang; Caselli, Richard J.; Thompson, Paul M.; Ye, Jieping; Wang, Yalin

    2017-01-01

    Alzheimer’s Disease (AD) is the most common type of dementia. Identifying correct biomarkers may help determine pre-symptomatic AD subjects and enable early intervention. Recently, multi-task sparse feature learning has been successfully applied in many computer vision and biomedical informatics studies. It aims to improve generalization performance by exploiting the features shared among different tasks. However, most existing algorithms are formulated as supervised learning schemes, whose drawback is either an insufficient number of features or missing label information. To address these challenges, we formulate an unsupervised framework for multi-task sparse feature learning based on a novel dictionary learning algorithm. To solve the unsupervised learning problem, we propose a two-stage Multi-Source Multi-Target Dictionary Learning (MMDL) algorithm. In stage 1, we propose a multi-source dictionary learning method to utilize the common and individual sparse features in different time slots. In stage 2, supported by a rigorous theoretical analysis, we develop a multi-task learning method to solve the missing label problem. Empirical studies on an N = 3970 longitudinal brain image data set, which involves 2 sources and 5 targets, demonstrate the improved prediction accuracy and speed efficiency of MMDL in comparison with other state-of-the-art algorithms. PMID:28943731
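
    As a point of reference for stage 1, a generic sparse dictionary-learning step is available in scikit-learn; the snippet below is a minimal stand-in on synthetic data, not the authors' MMDL algorithm:

        import numpy as np
        from sklearn.decomposition import DictionaryLearning

        X = np.random.default_rng(0).normal(size=(200, 50))  # synthetic: 200 samples, 50 features
        dl = DictionaryLearning(n_components=20, alpha=1.0, max_iter=100,
                                random_state=0)
        codes = dl.fit_transform(X)   # sparse codes, shape (200, 20)
        D = dl.components_            # learned dictionary atoms, shape (20, 50)
        print(codes.shape, D.shape)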

  1. Animal reintroductions: an innovative assessment of survival

    USGS Publications Warehouse

    Muths, Erin L.; Bailey, Larissa L.; Watry, Mary Kay

    2014-01-01

    Quantitative evaluations of reintroductions are infrequent, and assessments of milestones reached before a project is completed, or abandoned due to lack of funding, are rare. However, such assessments, which are promoted in adaptive management frameworks, are critical. Quantification can provide defensible estimates of biological success, such as the number of survivors from a released cohort, with associated cost per animal. It is unlikely that the global issues of endangered wildlife and population declines will abate; therefore, assurance colonies and reintroductions are likely to become more common. If such endeavors are to be successful biologically or achieve adequate funding, implementation must be more rigorous and accountable. We use a novel application of a multistate, robust design capture-recapture model to estimate survival of reintroduced tadpoles through metamorphosis (i.e., the number of individuals emerging from the pond) and thereby provide a quantitative measure of effort and success for an "in progress" reintroduction of toads. Our data also suggest that tadpoles released at later developmental stages have an increased probability of survival and that eggs laid in the wild hatched at higher rates than eggs laid by captive toads. We illustrate how an interim assessment can identify problems, highlight successes, and provide information for use in adjusting the effort or implementing a Decision-Theoretic adaptive management strategy.

  2. Perceptions of Why Women Stay in Physically Abusive Relationships: A Comparative Study of Chinese and U.S. College Students.

    PubMed

    Pugh, Brandie; Li, Luye; Sun, Ivan Y

    2018-05-01

    In both China and the United States, public attitudes toward intimate partner violence (IPV) have shifted from viewing IPV as a tolerable, private matter to viewing it as a matter of public concern that should be dealt with as a crime. Empirical and comparative examinations of the perceptions of why women stay in physically abusive relationships are lacking. Answering this question calls for comprehensive, methodologically rigorous research. Using survey data collected from approximately 1,000 college students from two Chinese and two U.S. universities, this study empirically compared and contrasted factors that shape U.S. and Chinese students' perceptions of why women remain in physically abusive relationships. Utilizing a theoretical framework of social constructionism, two common reasons were assessed: women stay in physically abusive relationships because of learned helplessness and because of positive beliefs in the relationship/hope for the future. The results show that viewing IPV as a crime, gender, and beliefs about the causes of IPV were robust predictors of college students' perceptions of why women stay in physically abusive relationships. U.S. college students were more likely to express sympathy and understanding toward why women remain in abusive relationships than Chinese students. Directions for future research and policy implications are discussed.

  3. Mixed reality framework for collective motion patterns of swarms with delay coupling

    NASA Astrophysics Data System (ADS)

    Szwaykowska, Klementyna; Schwartz, Ira

    The formation of coherent patterns in swarms of interacting self-propelled autonomous agents is an important subject for many applications within the field of distributed robotic systems. However, there are significant logistical challenges associated with testing fully distributed systems in real-world settings. In this paper, we provide a rigorous theoretical justification for the use of mixed-reality experiments as a stepping stone to fully physical testing of distributed robotic systems. We also model and experimentally realize a mixed-reality large-scale swarm of delay-coupled agents. Our analyses, assuming agents communicate over an Erdos-Renyi network, demonstrate the existence of stable coherent patterns that can be achieved only with delay coupling and that are robust to decreasing network connectivity and heterogeneity in agent dynamics. We show how the bifurcation structure for the emergence of different patterns changes with heterogeneity in agent acceleration capabilities and limited connectivity in the network, as a function of coupling strength and delay. Our results are verified through simulation as well as preliminary experimental results of delay-induced pattern formation in a mixed-reality swarm. K. S. was a National Research Council postdoctoral fellow. I.B.S. was supported by U.S. Naval Research Laboratory funding (N0001414WX00023) and the Office of Naval Research (N0001414WX20610).

  4. Robotic neurorehabilitation: a computational motor learning perspective

    PubMed Central

    Huang, Vincent S; Krakauer, John W

    2009-01-01

    Conventional neurorehabilitation appears to have little impact on impairment over and above that of spontaneous biological recovery. Robotic neurorehabilitation has the potential for a greater impact on impairment due to easy deployment, its applicability across a wide range of motor impairments, its high measurement reliability, and the capacity to deliver high-dosage and high-intensity training protocols. We first describe current knowledge of the natural history of arm recovery after stroke and of outcome prediction in individual patients. Rehabilitation strategies and outcome measures for impairment versus function are compared. The topics of dosage, intensity, and timing of rehabilitation are then discussed. Robots are particularly suitable for both rigorous testing and application of motor learning principles to neurorehabilitation. Computational motor control and learning principles derived from studies in healthy subjects are introduced in the context of robotic neurorehabilitation. Particular attention is paid to the ideas of context, task generalization, and training schedule. The assumptions that underlie the choice of both the movement trajectory programmed into the robot and the degree of active participation required of subjects are examined. We consider rehabilitation as a general learning problem, and examine it from the perspective of theoretical learning frameworks such as supervised and unsupervised learning. We discuss the limitations of current robotic neurorehabilitation paradigms and suggest new research directions from the perspective of computational motor learning. PMID:19243614

  5. Electronic ferroelectricity induced by charge and orbital orderings.

    PubMed

    Yamauchi, Kunihiko; Barone, Paolo

    2014-03-12

    After the revival of the magnetoelectric effect in the early 2000s, interest in multiferroic materials, which display the simultaneous presence of spontaneous long-range magnetic and dipolar order, has motivated an exponential growth of research activity, from both the experimental and theoretical perspectives. Within this context, and relying also on the rigorous formulation of macroscopic polarization provided by the Berry-phase approach, it has been possible to identify new microscopic mechanisms responsible for the appearance of ferroelectricity. In particular, it has been realized that electronic spin, charge, and orbital degrees of freedom may be responsible for the breaking of space-inversion symmetry, a necessary condition for the appearance of electric polarization, even in centrosymmetric crystal structures. In view of its immediate potential application in magnetoelectric-based devices, many efforts have been made to understand how magnetic orderings may lead to ferroelectric polarization, and to identify candidate materials. On the other hand, the role of charge and orbital degrees of freedom, which have received much less attention, has been predicted to be non-negligible in several cases. Here, we review recent theoretical advances in the field of so-called electronic ferroelectricity, focusing on the possible mechanisms by which charge- and/or orbital-ordering effects may cause the appearance of macroscopic polarization. Generally, a naive distinction can be drawn between materials displaying almost localized electrons and those characterized by a strong covalent character and delocalized electrons. As for the latter, an intuitive understanding of the basic mechanisms is provided in the framework of tight-binding model Hamiltonians, which are used to shed light on unusual charge/orbital effects in half-doped manganites, whereas the case of magnetite will be thoroughly discussed in light of recent progress pointing to an electronic origin of its proposed ferroelectric and magnetoelectric properties.

  6. Flexoelectricity from density-functional perturbation theory

    NASA Astrophysics Data System (ADS)

    Stengel, Massimiliano

    2013-11-01

    We derive the complete flexoelectric tensor, including electronic and lattice-mediated effects, of an arbitrary insulator in terms of the microscopic linear response of the crystal to atomic displacements. The basic ingredient, which can be readily calculated from first principles in the framework of density-functional perturbation theory, is the quantum-mechanical probability current response to a long-wavelength acoustic phonon. Its second-order Taylor expansion in the wave vector q around the Γ (q=0) point in the Brillouin zone naturally yields the flexoelectric tensor. At order one in q we recover Martin's theory of piezoelectricity [Martin, Phys. Rev. B 5, 1607 (1972)], thus providing an alternative derivation thereof. To put our derivations on firm theoretical grounds, we perform a thorough analysis of the nonanalytic behavior of the dynamical matrix and other response functions in a vicinity of Γ. Based on this analysis, we find that there is an ambiguity in the specification of the “zero macroscopic field” condition in the flexoelectric case; such arbitrariness can be related to an analytic band-structure term, in close analogy to the theory of deformation potentials. As a by-product, we derive a rigorous generalization of the Cochran-Cowley formula [Cochran and Cowley, J. Phys. Chem. Solids 23, 447 (1962)] to higher orders in q. This can be of great utility in building reliable atomistic models of electromechanical phenomena, as well as for improving the accuracy of the calculation of phonon dispersion curves. Finally, we discuss the physical interpretation of the various contributions to the flexoelectric response, either in the static or dynamic regime, and we relate our findings to earlier theoretical works on the subject.
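
    In one common convention (an illustrative statement of the standard definitions, not a formula quoted from this paper), the induced polarization separates into piezoelectric and flexoelectric parts,

        P_\alpha = e_{\alpha\beta\gamma}\,\varepsilon_{\beta\gamma}
                 + \mu_{\alpha\beta\gamma\delta}\,\partial_\delta \varepsilon_{\beta\gamma},

    so the piezoelectric tensor e appears at first order, and the flexoelectric tensor \mu at second order, in the wave-vector expansion of the current response to a long-wavelength acoustic phonon described in the abstract.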

  7. Acoustic waves in unsaturated soils

    NASA Astrophysics Data System (ADS)

    Lo, Wei-Cheng; Sposito, Garrison

    2013-09-01

    Seminal papers by Brutsaert (1964) and Brutsaert and Luthin (1964) provided the first rigorous theoretical framework for examining the poroelastic behavior of unsaturated soils, including an important application linking acoustic wave propagation to soil hydraulic properties. Theoretical developments during the 50 years that followed have led Lo et al. (2005) to a comprehensive model of these phenomena, but the relationship of its elasticity parameters to standard poroelasticity parameters measured in hydrogeology has not been established. In the present study, we develop this relationship for three key parameters--the Gassmann modulus, the Skempton coefficient, and the Biot-Willis coefficient--by generalizing them to an unsaturated porous medium. We demonstrate the remarkable result that well-known and widely applied relationships among these parameters for a porous medium saturated by a single fluid are also valid under very general conditions for unsaturated soils. We show further that measurement of the Biot-Willis coefficient along with three of the six elasticity coefficients in the model of Lo et al. (2005) is sufficient to characterize poroelastic behavior. The elasticity coefficients in the model of Lo et al. (2005) are sensitive to the dependence of capillary pressure on water saturation, and its viscous-drag coefficients are functions of relative permeability, implying that hysteresis in the water retention curve and hydraulic conductivity function should affect acoustic wave behavior in unsaturated soils. To quantify these as-yet unknown effects, we performed numerical simulations for Dune sand at two representative wave excitation frequencies. Our results show that the acoustic wave investigated by Brutsaert and Luthin (1964) propagates at essentially the same speed during imbibition and drainage, but is attenuated more during drainage than imbibition. Overall, effects on acoustic wave behavior caused by hysteresis become more significant as the excitation frequency increases.
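
    For the single-fluid limit that these generalizations reduce to, the classical relations are standard; a sketch of two of them (textbook single-fluid poroelasticity, not the generalized unsaturated expressions derived in the paper):

        # Hedged sketch: classical single-fluid poroelastic relations.
        # K_dry: drained frame bulk modulus; K_s: solid grain bulk modulus;
        # K_f: pore fluid bulk modulus; phi: porosity. All moduli in Pa.
        def biot_willis(K_dry, K_s):
            return 1.0 - K_dry / K_s

        def gassmann_K_sat(K_dry, K_s, K_f, phi):
            # Gassmann's equation for the undrained (saturated) bulk modulus.
            a = biot_willis(K_dry, K_s)
            return K_dry + a**2 / (phi / K_f + (a - phi) / K_s)

        print(gassmann_K_sat(K_dry=9e9, K_s=36e9, K_f=2.2e9, phi=0.2))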

  8. Generation of quantum entangled states in nonlinear plasmonic structures and metamaterials (Presentation Recording)

    NASA Astrophysics Data System (ADS)

    Poddubny, Alexander N.; Sukhorukov, Andrey A.

    2015-09-01

    The practical development of quantum plasmonic circuits incorporating non-classical interference [1] and sources of entangled states calls for a versatile quantum theoretical framework which can fully describe the generation and detection of entangled photons and plasmons. However, the majority of the presently used theoretical approaches are typically limited to toy models assuming loss-less and nondispersive elements or including just a few resonant modes. Here, we present a rigorous Green function approach describing entangled photon-plasmon state generation through spontaneous wave mixing in realistic metal-dielectric nanostructures. Our approach is based on the local Huttner-Barnett quantization scheme [2], which enables problem formulation in terms of a Hermitian Hamiltonian where the losses and dispersion are fully encoded in the electromagnetic Green functions. Hence, the problem can be addressed by standard quantum mechanical perturbation theory, overcoming mathematical difficulties associated with other quantization schemes. We derive explicit expressions with clear physical meaning for the spatially dependent two-photon detection probability, single-photon detection probability, and single-photon density matrix. In the limiting case of low-loss nondispersive waveguides our approach reproduces the previous results [3,4]. Importantly, our technique is far more general and can quantitatively describe the generation and detection of spatially entangled photons in arbitrary metal-dielectric structures taking into account actual losses and dispersion. This is essential for the design and optimization of plasmonic structures for the generation and control of quantum entangled states. [1] J.S. Fakonas, H. Lee, Y.A. Kelaita and H.A. Atwater, Nature Photonics 8, 317 (2014) [2] W. Vogel and D.-G. Welsch, Quantum Optics, Wiley (2006). [3] D.A. Antonosyan, A.S. Solntsev and A.A. Sukhorukov, Phys. Rev. A 90, 043845 (2014) [4] L.-G. Helt, J.E. Sipe and M.J. Steel, arXiv:1407.4219

  9. Investigating the Cosmic Web with Topological Data Analysis

    NASA Astrophysics Data System (ADS)

    Cisewski-Kehe, Jessi; Wu, Mike; Fasy, Brittany; Hellwing, Wojciech; Lovell, Mark; Rinaldo, Alessandro; Wasserman, Larry

    2018-01-01

    Data exhibiting complicated spatial structures are common in many areas of science (e.g., cosmology, biology) but can be difficult to analyze. Persistent homology is a popular approach within the area of Topological Data Analysis (TDA) that offers a new way to represent, visualize, and interpret complex data by extracting topological features, which can be used to infer properties of the underlying structures. In particular, TDA may be useful for analyzing the large-scale structure (LSS) of the Universe, which is an intricate and spatially complex web of matter. In order to understand the physics of the Universe, theoretical and computational cosmologists develop large-scale simulations that allow for visualizing and analyzing the LSS under varying physical assumptions. Each point in the 3D data set represents a galaxy or a cluster of galaxies, and topological summaries ("persistence diagrams") can be obtained summarizing the different ordered holes in the data (e.g., connected components, loops, voids). The topological summaries are interesting and informative descriptors of the Universe on their own, but hypothesis tests using the topological summaries would provide a way to make more rigorous comparisons of LSS under different theoretical models. For example, the prevailing cosmological model includes cold dark matter (CDM); however, while the case for CDM is strong, there are some observational inconsistencies with this theory. Another possibility is warm dark matter (WDM). It is of interest to see whether a CDM Universe and a WDM Universe produce LSS that is topologically distinct. We present several possible test statistics for two-sample hypothesis tests using the topological summaries, carry out a simulation study to investigate the suitability of the proposed test statistics using simulated data from a variation of the Voronoi foam model, and finally apply the proposed inference framework to WDM vs. CDM cosmological simulation data.
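
    A skeleton of one such two-sample test, assuming the ripser and persim packages are available (the bottleneck distance and the point clouds here are placeholders for the statistics explored in the work, and the loop recomputes diagrams naively, so it is illustrative rather than efficient):

        import numpy as np
        from ripser import ripser      # persistent homology of a point cloud
        from persim import bottleneck  # bottleneck distance between diagrams

        def h1_diagram(points):
            return ripser(points)['dgms'][1]   # H1 (loops) persistence diagram

        def permutation_test(A, B, n_perm=200, seed=0):
            rng = np.random.default_rng(seed)
            observed = bottleneck(h1_diagram(A), h1_diagram(B))
            pooled = np.vstack([A, B])
            exceed = 0
            for _ in range(n_perm):
                idx = rng.permutation(len(pooled))
                A2, B2 = pooled[idx[:len(A)]], pooled[idx[len(A):]]
                if bottleneck(h1_diagram(A2), h1_diagram(B2)) >= observed:
                    exceed += 1
            return (exceed + 1) / (n_perm + 1)   # permutation p-value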

  10. The Fundamentals of Care Framework as a Point-of-Care Nursing Theory.

    PubMed

    Kitson, Alison L

    Nursing theories have attempted to shape the everyday practice of clinical nurses and patient care. However, many theories--because of their level of abstraction and distance from everyday caring activity--have failed to help nurses undertake the routine practical aspects of nursing care in a theoretically informed way. The purpose of the paper is to present a point-of-care theoretical framework, called the fundamentals of care (FOC) framework, which explains, guides, and potentially predicts the quality of care nurses provide to patients, their carers, and family members. Person-centered fundamental care (PCFC), the outcome for the patient and the nurse and the goal of the FOC framework, is achieved through the active management of the practice process, which involves the nurse and the patient working together to integrate three core dimensions: establishing the nurse-patient relationship, integrating the FOC into the patient's care plan, and ensuring that the setting or context where care is transacted and coordinated is conducive to achieving PCFC outcomes. Each dimension has multiple elements and subelements, which require unique assessment for each nurse-patient encounter. The FOC framework is presented along with two scenarios to demonstrate its usefulness. The dimensions, elements, and subelements are described, and the next steps in the framework's development are articulated.

  11. New theoretical framework for designing nonionic surfactant mixtures that exhibit a desired adsorption kinetics behavior.

    PubMed

    Moorkanikkara, Srinivas Nageswaran; Blankschtein, Daniel

    2010-12-21

    How does one design a surfactant mixture using a set of available surfactants such that it exhibits a desired adsorption kinetics behavior? The traditional approach used to address this design problem involves conducting trial-and-error experiments with specific surfactant mixtures. This approach is typically time-consuming and resource-intensive and becomes increasingly challenging when the number of surfactants that can be mixed increases. In this article, we propose a new theoretical framework to identify a surfactant mixture that most closely meets a desired adsorption kinetics behavior. Specifically, the new theoretical framework involves (a) formulating the surfactant mixture design problem as an optimization problem using an adsorption kinetics model and (b) solving the optimization problem using a commercial optimization package. The proposed framework aims to identify the surfactant mixture that most closely satisfies the desired adsorption kinetics behavior subject to the predictive capabilities of the chosen adsorption kinetics model. Experiments can then be conducted at the identified surfactant mixture condition to validate the predictions. We demonstrate the reliability and effectiveness of the proposed theoretical framework through a realistic case study by identifying a nonionic surfactant mixture consisting of up to four alkyl poly(ethylene oxide) surfactants (C10E4, C12E5, C12E6, and C10E8) such that it most closely exhibits a desired dynamic surface tension (DST) profile. Specifically, we use the Mulqueen-Stebe-Blankschtein (MSB) adsorption kinetics model (Mulqueen, M.; Stebe, K. J.; Blankschtein, D. Langmuir 2001, 17, 5196-5207) to formulate the optimization problem as well as the SNOPT commercial optimization solver to identify a surfactant mixture consisting of these four surfactants that most closely exhibits the desired DST profile. Finally, we compare the experimental DST profile measured at the surfactant mixture condition identified by the new theoretical framework with the desired DST profile and find good agreement between the two profiles.
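
    The shape of the optimization problem can be sketched generically with scipy in place of SNOPT; the model function below is a placeholder for the MSB adsorption kinetics model, and the target profile and variable names are illustrative:

        import numpy as np
        from scipy.optimize import minimize

        t = np.linspace(0.01, 100, 50)             # times at which DST is compared
        desired = 50.0 + 22.0 * np.exp(-t / 10.0)  # hypothetical target DST (mN/m)

        def predicted_dst(x, t):
            # Placeholder for the MSB adsorption-kinetics model evaluated at
            # mixture composition x (mole fractions of the four surfactants).
            return 45.0 + 27.0 * np.exp(-t / (5.0 + 20.0 * x[0] + 10.0 * x[1]))

        def objective(x):
            return np.sum((predicted_dst(x, t) - desired) ** 2)

        cons = ({'type': 'eq', 'fun': lambda x: np.sum(x) - 1.0},)  # fractions sum to 1
        res = minimize(objective, x0=np.full(4, 0.25), bounds=[(0.0, 1.0)] * 4,
                       constraints=cons, method='SLSQP')
        print(res.x)  # composition that most closely matches the target profile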

  12. Patient Autonomy in a High-Tech Care Context - A Theoretical Framework.

    PubMed

    Lindberg, Catharina; Fagerström, Cecilia; Willman, Ania

    2018-06-12

    To synthesise and interpret previous findings with the aim of developing a theoretical framework for patient autonomy in a high-tech care context. Putting the somewhat abstract concept of patient autonomy into practice can prove difficult since when it is highlighted in healthcare literature the patient perspective is often invisible. Autonomy presumes that a person has experience, education, self-discipline and decision-making capacity. Reference to autonomy in relation to patients in high-tech care environments could therefore be considered paradoxical, as in most cases these persons are vulnerable, with impaired physical and/or metacognitive capacity, thus making extended knowledge of patient autonomy for these persons even more important. Theory development. The basic approaches in theory development by Walker and Avant were used to create a theoretical framework through an amalgamation of the results from three qualitative studies conducted previously by the same research group. A theoretical framework - the control-partnership-transition framework - was delineated disclosing different parts co-creating the prerequisites for patient autonomy in high-tech care environments. Assumptions and propositional statements that guide theory development were also outlined, as were guiding principles for use in day-to-day nursing care. Four strategies used by patients were revealed: the strategy of control, the strategy of partnership, the strategy of trust, and the strategy of transition. An extended knowledge base, founded on theoretical reasoning about patient autonomy, could facilitate nursing care that would allow people to remain/become autonomous in the role of patient in high-tech care environments. The control-partnership-transition framework would be of help in supporting and defending patient autonomy when caring for individual patients, as it provides an understanding of the strategies employed by patients to achieve autonomy in high-tech care contexts. The guiding principles for patient autonomy presented could be used in nursing guidelines. This article is protected by copyright. All rights reserved. This article is protected by copyright. All rights reserved.

  13. Machine learning in the string landscape

    NASA Astrophysics Data System (ADS)

    Carifio, Jonathan; Halverson, James; Krioukov, Dmitri; Nelson, Brent D.

    2017-09-01

    We utilize machine learning to study the string landscape. Deep data dives and conjecture generation are proposed as useful frameworks for utilizing machine learning in the landscape, and examples of each are presented. A decision tree accurately predicts the number of weak Fano toric threefolds arising from reflexive polytopes, each of which determines a smooth F-theory compactification, and linear regression generates a previously proven conjecture for the gauge group rank in an ensemble of 4/3 × 2.96 × 10^755 F-theory compactifications. Logistic regression generates a new conjecture for when E6 arises in the large ensemble of F-theory compactifications, which is then rigorously proven. This result may be relevant for the appearance of visible sectors in the ensemble. Through conjecture generation, machine learning is useful not only for numerics, but also for rigorous results.
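
    The conjecture-generation workflow can be sketched with scikit-learn: fit an interpretable model, then read its decision rule as a candidate exact statement to prove. Everything below (features, labels, threshold) is synthetic, not the F-theory data set:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X = rng.integers(0, 10, size=(5000, 3)).astype(float)  # toy geometric features
        y = (X[:, 0] + 2 * X[:, 1] > 12).astype(int)           # toy "gauge group arises" label

        clf = LogisticRegression().fit(X, y)
        # A sharp, nearly separating linear rule suggests an exact statement
        # ("the group arises iff a1*f1 + a2*f2 > c") that one can try to prove.
        print(clf.coef_, clf.intercept_, clf.score(X, y))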

  14. Multi-scale theoretical investigation of hydrogen storage in covalent organic frameworks.

    PubMed

    Tylianakis, Emmanuel; Klontzas, Emmanouel; Froudakis, George E

    2011-03-01

    The quest for efficient hydrogen storage materials has been the limiting step towards the commercialization of hydrogen as an energy carrier and has attracted a lot of attention from the scientific community. Sophisticated multi-scale theoretical techniques have been considered a valuable tool for the prediction of materials' storage properties. Such techniques have also been used for the investigation of hydrogen storage in a novel category of porous materials known as Covalent Organic Frameworks (COFs). These framework materials consist of light elements and are characterized by exceptional physicochemical properties such as large surface areas and pore volumes. Combinations of ab initio, Molecular Dynamics (MD), and Grand Canonical Monte-Carlo (GCMC) calculations have been performed to investigate hydrogen adsorption in these ultra-light materials. The purpose of the present review is to summarize the theoretical hydrogen storage studies that have been published since the discovery of COFs. Experimental and theoretical studies have proven that COFs have comparable or better hydrogen storage abilities than other competitive materials such as MOFs. The key factors that can lead to the improvement of the hydrogen storage properties of COFs are highlighted, accompanied by some recently presented theoretical multi-scale studies concerning these factors.

  15. Teaching for clinical reasoning - helping students make the conceptual links.

    PubMed

    McMillan, Wendy Jayne

    2010-01-01

    Dental educators complain that students struggle to apply what they have learnt theoretically in the clinical context. This paper is premised on the assumption that there is a relationship between conceptual thinking and clinical reasoning. The paper provides a theoretical framework for understanding the relationship between conceptual learning and clinical reasoning. A review of current literature is used to explain the way in which conceptual understanding influences clinical reasoning and the transfer of theoretical understandings to the clinical context. The paper argues that the connections made between concepts are what is significant about conceptual understanding. From this point of departure the paper describes teaching strategies that facilitate the kinds of learning opportunities that students need in order to develop conceptual understanding and to be able to transfer knowledge from theoretical to clinical contexts. Along with a variety of teaching strategies, the value of concept maps is discussed. The paper provides a framework for understanding the difficulties that students have in developing conceptual networks appropriate for later clinical reasoning. In explaining how students learn for clinical application, the paper provides a theoretical framework that can inform how dental educators facilitate the conceptual learning, and later clinical reasoning, of their students.

  16. How Do the First Days Count? A Case Study of Qatar Experience in Emergency Risk Communication during the MERS-CoV Outbreak.

    PubMed

    Nour, Mohamed; Alhajri, Mohd; Farag, Elmoubasher A B A; Al-Romaihi, Hamad E; Al-Thani, Mohamed; Al-Marri, Salih; Savoia, Elena

    2017-12-19

    This case study is the first to be developed in the Middle East region to document what happened during the response to the 2013 MERS outbreak in Qatar. It provides a description of key epidemiologic events, news released by a prime daily newspaper, and the main Emergency Risk Communication (ERC) actions that were undertaken by public health authorities. Using the Crisis and Emergency Risk Communication (CERC) theoretical framework, the study analyzes how the ERC strategies performed during the first days of the outbreak might have contributed to the outbreak management. MERS-CoV-related events were chronologically tracked, together with the relevant stories that were published in a major newspaper over the course of three distinct phases of the epidemic. The collected media stories were then assessed against the practiced ERC activities during the same time frame. The CERC framework was partially followed during the early days of the MERS-CoV epidemic, which were characterized by overwhelming uncertainty. The Supreme Council of Health's (SCH's) commitment to a proactive and open risk communication strategy since day one contributed to creating the SCH's image as a credible source of information and allowed for the quick initiation of the overall response efforts. Yet, conflicting messages and over-reassurance were among the observed pitfalls of the implemented ERC strategy. The adoption of CERC principles can help restore and maintain the credibility of responding agencies. Further work is needed to develop more rigorous and comprehensive research strategies that address the sharing of information by mainstream as well as social media, for a more accurate assessment of the impact of the ERC strategy.

  17. A new statistical framework to assess structural alignment quality using information compression

    PubMed Central

    Collier, James H.; Allison, Lloyd; Lesk, Arthur M.; Garcia de la Banda, Maria; Konagurthu, Arun S.

    2014-01-01

    Motivation: Progress in protein biology depends on the reliability of results from a handful of computational techniques, structural alignments being one. Recent reviews have highlighted substantial inconsistencies and differences between alignment results generated by the ever-growing stock of structural alignment programs. The lack of consensus on how the quality of structural alignments must be assessed has been identified as the main cause for the observed differences. Current methods assess structural alignment quality by constructing a scoring function that attempts to balance conflicting criteria, mainly alignment coverage and fidelity of structures under superposition. This traditional approach to measuring alignment quality, the subject of considerable literature, has failed to solve the problem. Further development along the same lines is unlikely to rectify the current deficiencies in the field. Results: This paper proposes a new statistical framework to assess structural alignment quality and significance based on lossless information compression. This is a radical departure from the traditional approach of formulating scoring functions. It links the structural alignment problem to the general class of statistical inductive inference problems, solved using the information-theoretic criterion of minimum message length. Based on this, we developed an efficient and reliable measure of structural alignment quality, I-value. The performance of I-value is demonstrated in comparison with a number of popular scoring functions, on a large collection of competing alignments. Our analysis shows that I-value provides a rigorous and reliable quantification of structural alignment quality, addressing a major gap in the field. Availability: http://lcb.infotech.monash.edu.au/I-value Contact: arun.konagurthu@monash.edu Supplementary information: Online supplementary data are available at http://lcb.infotech.monash.edu.au/I-value/suppl.html PMID:25161241

  18. On the Ice Nucleation Spectrum

    NASA Technical Reports Server (NTRS)

    Barahona, D.

    2012-01-01

    This work presents a novel formulation of the ice nucleation spectrum, i.e., the function relating the ice crystal concentration to cloud formation conditions and aerosol properties. The new formulation is physically based and explicitly accounts for the dependency of the ice crystal concentration on temperature, supersaturation, and cooling rate, and on particle size, surface area, and composition. This is achieved by introducing the concepts of the ice nucleation coefficient (the number of ice germs present in a particle) and the nucleation probability dispersion function (the distribution of ice nucleation coefficients within the aerosol population). The new formulation is used to generate ice nucleation parameterizations for the homogeneous freezing of cloud droplets and for heterogeneous deposition ice nucleation on dust and soot ice nuclei. For homogeneous freezing, it was found that increasing the dispersion in the droplet volume distribution increases the fraction of supercooled droplets in the population. For heterogeneous ice nucleation, the new formulation consistently describes singular and stochastic behavior within a single framework. Using a fundamentally stochastic approach, both cooling rate independence and constancy of the ice nucleation fraction over time, features typically associated with singular behavior, were reproduced. Analysis of the temporal dependency of the ice nucleation spectrum suggested that experimental methods that measure the ice nucleation fraction over a few seconds would tend to underestimate the ice nuclei concentration. It is shown that inferring the aerosol heterogeneous ice nucleation properties from measurements of the onset supersaturation and temperature may carry significant error, as the variability in ice nucleation properties within the aerosol population is not accounted for. This work provides a simple and rigorous ice nucleation framework where theoretical predictions, laboratory measurements, and field campaign data can be reconciled, and which is suitable for application in atmospheric modeling studies.
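
    On a Poisson reading of the ice nucleation coefficient (the expected number of ice germs in a particle), the frozen fraction of a population follows directly; the snippet below is a schematic illustration consistent with those definitions, not the paper's parameterization:

        import numpy as np

        def frozen_fraction(n_germ):
            # Probability that a particle carries at least one ice germ when
            # the germ count is Poisson with mean n_germ (the ice nucleation
            # coefficient).
            return 1.0 - np.exp(-np.asarray(n_germ))

        # Averaging over a population with dispersed nucleation coefficients
        # (a stand-in for the nucleation probability dispersion function):
        coeffs = np.random.default_rng(0).lognormal(mean=-1.0, sigma=1.0, size=10_000)
        print(frozen_fraction(coeffs).mean())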

  19. Theoretical Models and Operational Frameworks in Public Health Ethics

    PubMed Central

    Petrini, Carlo

    2010-01-01

    The article is divided into three sections: (i) an overview of the main ethical models in public health (theoretical foundations); (ii) a summary of several published frameworks for public health ethics (practical frameworks); and (iii) a few general remarks. Rather than maintaining the superiority of one position over the others, the main aim of the article is to summarize the basic approaches proposed thus far concerning the development of public health ethics by describing and comparing the various ideas in the literature. With this in mind, an extensive list of references is provided. PMID:20195441

  20. Unsupervised active learning based on hierarchical graph-theoretic clustering.

    PubMed

    Hu, Weiming; Hu, Wei; Xie, Nianhua; Maybank, Steve

    2009-10-01

    Most existing active learning approaches are supervised. Supervised active learning has the following problems: inefficiency in dealing with the semantic gap between the distribution of samples in the feature space and their labels, inability to select new samples that belong to categories that have not yet appeared in the training samples, and lack of adaptability to changes in the semantic interpretation of sample categories. To tackle these problems, we propose an unsupervised active learning framework based on hierarchical graph-theoretic clustering. In the framework, two promising graph-theoretic clustering algorithms, namely dominant-set clustering and spectral clustering, are combined in a hierarchical fashion. Our framework has some advantages, such as ease of implementation, flexibility in architecture, and adaptability to changes in the labeling. Evaluations on data sets for network intrusion detection, image classification, and video classification have demonstrated that our active learning framework can effectively reduce the workload of manual classification while maintaining a high accuracy of automatic classification. It is shown that, overall, our framework outperforms support-vector-machine-based supervised active learning, particularly in dealing much more efficiently with new samples whose categories have not yet appeared in the training samples.
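
    Of the two graph-theoretic algorithms combined in the hierarchy, spectral clustering has a standard library implementation; the snippet below is a minimal stand-in for one level of the hierarchy on synthetic data (dominant-set clustering has no equally ubiquitous routine, so it is omitted):

        import numpy as np
        from sklearn.cluster import SpectralClustering

        X = np.random.default_rng(0).normal(size=(300, 8))  # unlabeled samples
        labels = SpectralClustering(n_clusters=5, affinity='rbf',
                                    random_state=0).fit_predict(X)
        # In an unsupervised active-learning loop, samples near each cluster
        # representative would be passed down the hierarchy or to an annotator.
        print(np.bincount(labels))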
