Sample records for theory analysis methods

  1. The Tongue and Quill

    DTIC Science & Technology

    2004-08-01

    Excerpt from a research-methods outline: I. Qualitative Research Methods — the historical method, ethnography, phenomenological study, grounded theory study, and content analysis; II. Quantitative Research Methods…

  2. Between practice and theory: Melanie Klein, Anna Freud and the development of child analysis.

    PubMed

    Donaldson, G

    1996-04-01

    An examination of the early history of child analysis in the writings of Melanie Klein and Anna Freud reveals how two different and opposing approaches to child analysis arose at the same time. The two methods of child analysis are rooted in a differential emphasis on psychoanalytic theory and practice. The Kleinian method derives from the application of technique while the Anna Freudian method is driven by theory. Furthermore, by holding to the Freudian theory of child development Anna Freud was forced to limit the scope of child analysis, while Klein's application of Freudian practice has led to new discoveries about the development of the infant psyche.

  3. Using Molecular Modeling in Teaching Group Theory Analysis of the Infrared Spectra of Organometallic Compounds

    ERIC Educational Resources Information Center

    Wang, Lihua

    2012-01-01

    A new method is introduced for teaching group theory analysis of the infrared spectra of organometallic compounds using molecular modeling. The main focus of this method is to enhance student understanding of the symmetry properties of vibrational modes and of the group theory analysis of infrared (IR) spectra by using visual aids provided by…

  4. Nonstandard Methods in Lie Theory

    ERIC Educational Resources Information Center

    Goldbring, Isaac Martin

    2009-01-01

    In this thesis, we apply model theory to Lie theory and geometric group theory. These applications of model theory come via nonstandard analysis. In Lie theory, we use nonstandard methods to prove two results. First, we give a positive solution to the local form of Hilbert's Fifth Problem, which asks whether every locally euclidean local…

  5. Concept analysis and the building blocks of theory: misconceptions regarding theory development.

    PubMed

    Bergdahl, Elisabeth; Berterö, Carina M

    2016-10-01

    The purpose of this article is to discuss the attempts to justify concept analysis as a way to construct theory - a notion often advocated in nursing. The notion that concepts are the building blocks or threads from which theory is constructed is often repeated; it can be found in many articles and well-known textbooks, yet it is seldom explained or defended. The notion of concepts as building blocks has also been questioned by several authors, although most of these authors seem to agree to some degree that concepts are essential components from which theory is built. Discussion paper. Literature was reviewed to synthesize and debate current knowledge. Our point is that theory is not built by concept analysis or clarification, and we show that this notion has its basis in some serious misunderstandings. We argue that concept analysis is not part of sound scientific method and should be abandoned. The current methods of concept analysis in nursing have no foundation in the philosophy of science or in the philosophy of language. The type of concept analysis performed in nursing is not a way to 'construct' theory. Rather, theories are formed by creative endeavour to propose a solution to a scientific and/or practical problem. The bottom line is that the current style and form of concept analysis in nursing should be abandoned in favour of methods in line with the modern theory of science. © 2016 John Wiley & Sons Ltd.

  6. Realist explanatory theory building method for social epidemiology: a protocol for a mixed method multilevel study of neighbourhood context and postnatal depression.

    PubMed

    Eastwood, John G; Jalaludin, Bin B; Kemp, Lynn A

    2014-01-01

    A recent criticism of social epidemiological studies, and of multilevel studies in particular, has been a paucity of theory. We present here the protocol for a study that aims to build a theory of the social epidemiology of maternal depression. We use a critical realist approach, which is trans-disciplinary, encompassing both quantitative and qualitative traditions, and which assumes both ontological and hierarchical stratification of reality. We describe a critical realist Explanatory Theory Building Method comprising (1) an emergent phase, (2) a construction phase, and (3) a confirmatory phase. A concurrent triangulated mixed method multilevel cross-sectional study design is described. The Emergent Phase uses interviews, focus groups, exploratory data analysis, exploratory factor analysis, regression, and multilevel Bayesian spatial data analysis to detect and describe phenomena. Abductive and retroductive reasoning will be applied to categorical principal component analysis, exploratory factor analysis, regression, coding of concepts and categories, constant comparative analysis, drawing of conceptual networks, and situational analysis to generate theoretical concepts. The Theory Construction Phase will include: (1) defining stratified levels; (2) analytic resolution; (3) abductive reasoning; (4) comparative analysis (triangulation); (5) retroduction; (6) postulate and proposition development; (7) comparison and assessment of theories; and (8) conceptual frameworks and model development. The strength of the critical realist methodology described is the extent to which this paradigm is able to support the epistemological, ontological, axiological, methodological and rhetorical positions of both quantitative and qualitative research in the field of social epidemiology. The extensive multilevel Bayesian studies, intensive qualitative studies, latent variable theory, abductive triangulation, and Inference to Best Explanation provide a strong foundation for Theory Construction. The study will contribute to defining the role that realism and mixed methods can play in explaining the social determinants and developmental origins of health and disease.

  7. Modal analysis applied to circular, rectangular, and coaxial waveguides

    NASA Technical Reports Server (NTRS)

    Hoppe, D. J.

    1988-01-01

    Recent developments in the analysis of various waveguide components and feedhorns using Modal Analysis (Mode Matching Method) are summarized. A brief description of the theory is presented, and the important features of the method are pointed out. Specific examples in circular, rectangular, and coaxial waveguides are included, with comparisons between the theory and experimental measurements. Extensions to the methods are described.

  8. Structural reliability analysis under evidence theory using the active learning kriging model

    NASA Astrophysics Data System (ADS)

    Yang, Xufeng; Liu, Yongshou; Ma, Panke

    2017-11-01

    Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
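
    The active-learning loop described here can be sketched as follows: a Gaussian-process (kriging) surrogate is retrained and enriched at the candidate point whose sign prediction is most in doubt, using the common U learning function. The toy performance function g, the uniform candidate pool standing in for the interval Monte Carlo samples, and the stopping threshold are illustrative assumptions, not the paper's code.

    ```python
    # Hedged sketch: active-learning kriging for sign prediction of a
    # performance function g(x); failure is the event g(x) < 0.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def g(x):  # toy performance function (assumption, not the paper's)
        return x[:, 0] ** 2 + x[:, 1] - 4.0

    rng = np.random.default_rng(0)
    X_pool = rng.uniform(-3, 3, size=(2000, 2))   # candidate points
    X_train = rng.uniform(-3, 3, size=(12, 2))    # initial design
    y_train = g(X_train)

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    for _ in range(30):
        gp.fit(X_train, y_train)
        mu, sigma = gp.predict(X_pool, return_std=True)
        U = np.abs(mu) / np.maximum(sigma, 1e-12)  # low U: sign most uncertain
        if U.min() >= 2.0:                         # sign reliable everywhere
            break
        x_new = X_pool[np.argmin(U)].reshape(1, -1)
        X_train = np.vstack([X_train, x_new])
        y_train = np.append(y_train, g(x_new))

    print("fraction of pool predicted failed:",
          np.mean(gp.predict(X_pool) < 0.0))
    ```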

  9. Methodological tools for the collection and analysis of participant observation data using grounded theory.

    PubMed

    Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi

    2014-11-01

    To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data, but PO gives a distinctive insight, revealing what people are really doing instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study was carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. Discussion of using the grounded theory method and PO as a data collection tool. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging: it was time-consuming and required rigorous, simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis through to building theory. Using PO as a data collection method in qualitative nursing research provides insights that cannot be seen or revealed by other data collection methods, yet it is not commonly discussed in nursing research. This paper can therefore be a useful tool for those who intend to use PO and grounded theory in their nursing research.

  10. The Constant Comparative Analysis Method Outside of Grounded Theory

    ERIC Educational Resources Information Center

    Fram, Sheila M.

    2013-01-01

    This commentary addresses the gap in the literature regarding discussion of the legitimate use of Constant Comparative Analysis Method (CCA) outside of Grounded Theory. The purpose is to show the strength of using CCA to maintain the emic perspective and how theoretical frameworks can maintain the etic perspective throughout the analysis. My…

  11. Comparing the Effectiveness of SPSS and EduG Using Different Designs for Generalizability Theory

    ERIC Educational Resources Information Center

    Teker, Gulsen Tasdelen; Guler, Nese; Uyanik, Gulden Kaya

    2015-01-01

    Generalizability theory (G theory) provides a broad conceptual framework for social sciences such as psychology and education, and a comprehensive construct for numerous measurement events by using analysis of variance, a strong statistical method. G theory, as an extension of both classical test theory and analysis of variance, is a model which…
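
    As a concrete illustration of the ANOVA machinery that G theory extends, the sketch below estimates variance components and a relative generalizability coefficient for a fully crossed persons x items design. The simulated scores and the one-facet design are our assumptions; SPSS and EduG handle far more general designs.

    ```python
    # Hedged sketch: variance components and a G coefficient for a fully
    # crossed p x i design, estimated from ANOVA mean squares.
    import numpy as np

    rng = np.random.default_rng(1)
    n_p, n_i = 50, 8
    scores = rng.normal(5, 1, (n_p, 1)) + rng.normal(0, 0.5, (1, n_i)) \
             + rng.normal(0, 0.8, (n_p, n_i))      # person + item + residual

    grand = scores.mean()
    p_mean = scores.mean(axis=1)                   # person means
    i_mean = scores.mean(axis=0)                   # item means

    ss_p = n_i * np.sum((p_mean - grand) ** 2)
    ss_i = n_p * np.sum((i_mean - grand) ** 2)
    ss_res = np.sum((scores - p_mean[:, None] - i_mean[None, :] + grand) ** 2)

    ms_p = ss_p / (n_p - 1)
    ms_i = ss_i / (n_i - 1)
    ms_res = ss_res / ((n_p - 1) * (n_i - 1))

    var_res = ms_res                               # sigma^2(pi,e)
    var_p = (ms_p - ms_res) / n_i                  # sigma^2(p)
    var_i = (ms_i - ms_res) / n_p                  # sigma^2(i)
    e_rho2 = var_p / (var_p + var_res / n_i)       # relative G coefficient
    print(f"var_p={var_p:.3f} var_i={var_i:.3f} var_res={var_res:.3f} "
          f"Erho2={e_rho2:.3f}")
    ```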

  12. Theory, Method, and Triangulation in the Study of Street Children.

    ERIC Educational Resources Information Center

    Lucchini, Riccardo

    1996-01-01

    Describes how a comparative study of street children in Montevideo (Uruguay), Rio de Janeiro, and Mexico City contributes to a synergism between theory and method. Notes how theoretical approaches of symbolic interactionism, genetic structuralism, and habitus theory complement interview, participant observation, and content analysis methods;…

  13. Recent developments in rotary-wing aerodynamic theory

    NASA Technical Reports Server (NTRS)

    Johnson, W.

    1986-01-01

    Current progress in the computational analysis of rotary-wing flowfields is surveyed, and some typical results are presented in graphs. Topics examined include potential theory, rotating coordinate systems, lifting-surface theory (moving singularity, fixed wing, and rotary wing), panel methods (surface singularity representations, integral equations, and compressible flows), transonic theory (the small-disturbance equation), wake analysis (hovering rotor-wake models and transonic blade-vortex interaction), limitations on computational aerodynamics, and viscous-flow methods (dynamic-stall theories and lifting-line theory). It is suggested that the present algorithms and advanced computers make it possible to begin working toward the ultimate goal of turbulent Navier-Stokes calculations for an entire rotorcraft.

  14. Applicability of linearized-theory attached-flow methods to design and analysis of flap systems at low speeds for thin swept wings with sharp leading edges

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Darden, Christine M.

    1987-01-01

    Low-speed experimental force data on a series of thin swept wings with sharp leading edges and leading- and trailing-edge flaps are compared with predictions made using a linearized-theory method which includes estimates of vortex forces. These comparisons were made to assess the effectiveness of linearized-theory methods for use in the design and analysis of flap systems in subsonic flow. Results demonstrate that linearized-theory, attached-flow methods (with approximate representation of vortex forces) can form the basis of a rational system for flap design and analysis. Even attached-flow methods that do not take vortex forces into account can be used for the selection of optimized flap-system geometry, but design-point performance levels tend to be underestimated unless vortex forces are included. Illustrative examples of the use of these methods in the design of efficient low-speed flap systems are included.

  15. Realist theory construction for a mixed method multilevel study of neighbourhood context and postnatal depression.

    PubMed

    Eastwood, John G; Kemp, Lynn A; Jalaludin, Bin B

    2016-01-01

    We have recently described a protocol for a study that aims to build a theory of neighbourhood context and postnatal depression. That protocol proposed a critical realist Explanatory Theory Building Method comprising (1) an emergent phase, (2) a construction phase, and (3) a confirmatory phase. A concurrent triangulated mixed method multilevel cross-sectional study design was described. The protocol also described in detail the Theory Construction Phase, which will be presented here. The Theory Construction Phase will include: (1) defining stratified levels; (2) analytic resolution; (3) abductive reasoning; (4) comparative analysis (triangulation); (5) retroduction; (6) postulate and proposition development; (7) comparison and assessment of theories; and (8) conceptual frameworks and model development. The stratified levels of analysis in this study were predominantly social and psychological. The abductive analysis used the theoretical frames of: Stress Process; Social Isolation; Social Exclusion; Social Services; Social Capital; Acculturation Theory; and global-economic-level mechanisms. Realist propositions are presented for each analysis of triangulated data. Inference to best explanation is used to assess and compare theories. A conceptual framework of maternal depression, stress and context is presented that includes examples of mechanisms at psychological, social, cultural and global-economic levels. Stress was identified as a necessary mechanism that has the tendency to cause several outcomes including depression, anxiety, and health harming behaviours. The conceptual framework subsequently included conditional mechanisms identified through the retroduction, including the stressors of isolation and expectations and the buffers of social support and trust. The meta-theory of critical realism is used here to generate and construct social epidemiological theory using stratified ontology and both abductive and retroductive analysis. The findings will be applied to the development of a middle range theory and subsequent programme theory for local perinatal child and family interventions.

  16. An Information-Correction Method for Testlet-Based Test Analysis: From the Perspectives of Item Response Theory and Generalizability Theory. Research Report. ETS RR-17-27

    ERIC Educational Resources Information Center

    Li, Feifei

    2017-01-01

    An information-correction method for testlet-based tests is introduced. This method takes advantage of both generalizability theory (GT) and item response theory (IRT). The measurement error for the examinee proficiency parameter is often underestimated when a unidimensional conditional-independence IRT model is specified for a testlet dataset. By…

  17. A Comparison of Measurement Equivalence Methods Based on Confirmatory Factor Analysis and Item Response Theory.

    ERIC Educational Resources Information Center

    Flowers, Claudia P.; Raju, Nambury S.; Oshima, T. C.

    Current interest in the assessment of measurement equivalence emphasizes two methods of analysis: linear and nonlinear procedures. This study simulated data using the graded response model to examine the performance of linear (confirmatory factor analysis or CFA) and nonlinear (item-response-theory-based differential item function or IRT-Based…

  18. Theoretical investigation of cyromazine tautomerism using density functional theory and Møller–Plesset perturbation theory methods

    USDA-ARS?s Scientific Manuscript database

    A computational chemistry analysis of six unique tautomers of cyromazine, a pesticide used for fly control, was performed with density functional theory (DFT) and canonical second order Møller–Plesset perturbation theory (MP2) methods to gain insight into the contributions of molecular structure to ...

  19. Toward a new methodological paradigm for testing theories of health behavior and health behavior change.

    PubMed

    Noar, Seth M; Mehrotra, Purnima

    2011-03-01

    Traditional theory testing commonly applies cross-sectional (and occasionally longitudinal) survey research to test health behavior theory. Since such correlational research cannot demonstrate causality, a number of researchers have called for the increased use of experimental methods for theory testing. We introduce the multi-methodological theory-testing (MMTT) framework for testing health behavior theory. The MMTT framework introduces a set of principles that broaden the perspective of how we view evidence for health behavior theory. It suggests that while correlational survey research designs represent one method of testing theory, the weaknesses of this approach demand that complementary approaches be applied. Such approaches include randomized lab and field experiments, mediation analysis of theory-based interventions, and meta-analysis. These alternative approaches to theory testing can demonstrate causality in a much more robust way than is possible with correlational survey research methods. Such approaches should thus be increasingly applied in order to more completely and rigorously test health behavior theory. Greater application of research derived from the MMTT may lead researchers to refine and modify theory and ultimately make theory more valuable to practitioners. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  20. Grounded Theory as a "Family of Methods": A Genealogical Analysis to Guide Research

    ERIC Educational Resources Information Center

    Babchuk, Wayne A.

    2011-01-01

    This study traces the evolution of grounded theory from a nuclear to an extended family of methods and considers the implications that decision-making based on informed choices throughout all phases of the research process has for realizing the potential of grounded theory for advancing adult education theory and practice. [This paper was…

  2. An EGR performance evaluation and decision-making approach based on grey theory and grey entropy analysis.

    PubMed

    Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan

    2018-01-01

    Exhaust gas recirculation (EGR) is one of the main methods of reducing NOx emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate EGR performance and determine the optimal EGR rate, tasks which currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization.
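
    The grey relational core of such a decision model can be illustrated as follows. The candidate matrix, the choice of criteria, and the equal weights (which the paper instead derives from grey entropy analysis) are placeholders, not the paper's data.

    ```python
    # Hedged sketch of grey relational analysis (GRA) for ranking candidate
    # EGR settings; rows = candidate EGR rates, columns = criteria.
    import numpy as np

    data = np.array([[0.82, 0.40, 0.91],      # e.g. NOx, smoke, efficiency
                     [0.65, 0.55, 0.88],
                     [0.50, 0.70, 0.84]])
    benefit = np.array([False, False, True])  # True: larger is better

    span = np.ptp(data, axis=0)
    norm = np.where(benefit,
                    (data - data.min(axis=0)) / span,
                    (data.max(axis=0) - data) / span)

    delta = np.abs(1.0 - norm)                # distance to the ideal sequence
    rho = 0.5                                 # distinguishing coefficient
    xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())

    weights = np.full(data.shape[1], 1 / 3)   # entropy weights would go here
    grade = (xi * weights).sum(axis=1)        # grey relational grade
    print("ranking (best first):", np.argsort(-grade), grade.round(3))
    ```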

  3. Grounded theory in music therapy research.

    PubMed

    O'Callaghan, Clare

    2012-01-01

    Grounded theory is one of the most common methodologies used in constructivist (qualitative) music therapy research. Researchers use the term "grounded theory" when denoting varying research designs and theoretical outcomes. This may be challenging for novice researchers when considering whether grounded theory is appropriate for their research phenomena. This paper examines grounded theory within music therapy research. Grounded theory is briefly described, including some of its "contested" ideas. A literature search was conducted using the descriptor "music therapy and grounded theory" in PubMed, CINAHL, PsycINFO, SCOPUS, ERIC (CSA), and Web of Science databases, and a music therapy monograph series. A descriptive analysis was performed on the uncovered studies to examine researched phenomena, grounded theory methods used, and how findings were presented. Thirty music therapy research projects were found in refereed journals and monographs from 1993 to "in press." The Strauss and Corbin approach to grounded theory dominates the field. Descriptors used to signify grounded theory components in the studies varied greatly. Researchers have used partial or complete grounded theory methods to examine clients', family members', staff, music therapy "overhearers," music therapists', and students' experiences, as well as music therapy creative products and professional views, issues, and literature. Seven grounded theories were offered. It is suggested that grounded theory researchers clarify what and who inspired their design, why partial grounded theory methods were used (when relevant), and their ontology. By elucidating assumptions underpinning the data collection, analysis, and findings' contribution, researchers will continue to improve music therapy research using grounded theory methods.

  4. Micromechanics-Based Progressive Failure Analysis of Composite Laminates Using Different Constituent Failure Theories

    NASA Technical Reports Server (NTRS)

    Moncada, Albert M.; Chattopadhyay, Aditi; Bednarcyk, Brett A.; Arnold, Steven M.

    2008-01-01

    Predicting failure in a composite can be done with ply level mechanisms and/or micro level mechanisms. This paper uses the Generalized Method of Cells and High-Fidelity Generalized Method of Cells micromechanics theories, coupled with classical lamination theory, as implemented within NASA's Micromechanics Analysis Code with Generalized Method of Cells. The code is able to implement different failure theories on the level of both the fiber and the matrix constituents within a laminate. A comparison is made among maximum stress, maximum strain, Tsai-Hill, and Tsai-Wu failure theories. To verify the failure theories the Worldwide Failure Exercise (WWFE) experiments have been used. The WWFE is a comprehensive study that covers a wide range of polymer matrix composite laminates. The numerical results indicate good correlation with the experimental results for most of the composite layups, but also point to the need for more accurate resin damage progression models.

  5. Using grounded theory to create a substantive theory of promoting schoolchildren's mental health.

    PubMed

    Puolakka, Kristiina; Haapasalo-Pesu, Kirsi-Maria; Kiikkala, Irma; Astedt-Kurki, Päivi; Paavilainen, Eija

    2013-01-01

    To discuss the creation of a substantive theory using grounded theory. This article provides an example of generating theory from a study of mental health promotion at a high school in Finland. Grounded theory is a method for creating explanatory theory. It is a valuable tool for health professionals when studying phenomena that affect patients' health, offering a deeper understanding of nursing methods and knowledge. Interviews with school employees, students and parents, and verbal responses to the 'school wellbeing profile survey', as well as working group memos related to the development activities. Participating children were aged between 12 and 15. The analysis was conducted by applying the grounded theory method and involved open coding of the material, constant comparison, axial coding and selective coding after identifying the core category. The analysis produced concepts about mental health promotion in school and assumptions about relationships. Grounded theory proved to be an effective means of eliciting people's viewpoints on mental health promotion. The personal views of different parties make it easier to identify an action applicable to practice.

  6. Statistical analysis of activation and reaction energies with quasi-variational coupled-cluster theory

    NASA Astrophysics Data System (ADS)

    Black, Joshua A.; Knowles, Peter J.

    2018-06-01

    The performance of quasi-variational coupled-cluster (QV) theory applied to the calculation of activation and reaction energies has been investigated. A statistical analysis of results obtained for six different sets of reactions has been carried out, and the results have been compared to those from standard single-reference methods. In general, the QV methods lead to increased activation energies and larger absolute reaction energies compared to those obtained with traditional coupled-cluster theory.

  7. Applicability of Complexity Theory to Martian Fluvial Systems: A Preliminary Analysis

    NASA Technical Reports Server (NTRS)

    Rosenshein, E. B.

    2003-01-01

    In the last 15 years, terrestrial geomorphology has been revolutionized by the theories of chaotic systems, fractals, self-organization, and self-organized criticality. Except for the application of fractal theory to the analysis of lava flows and rampart craters on Mars, these theories have not yet been applied to problems of Martian landscape evolution. These complexity theories are elucidated below, along with the methods used to relate them to the realities of Martian fluvial systems.

  8. Integrated analysis on static/dynamic aeroelasticity of curved panels based on a modified local piston theory

    NASA Astrophysics Data System (ADS)

    Yang, Zhichun; Zhou, Jian; Gu, Yingsong

    2014-10-01

    A flow-field-modified local piston theory, applied to the integrated analysis of static/dynamic aeroelastic behaviors of curved panels, is proposed in this paper. The local flow field parameters used in the modification are obtained by a CFD technique, which has the advantage of simulating the steady flow field accurately. This flow-field-modified local piston theory for aerodynamic loading is applied to the analysis of static aeroelastic deformation and flutter stability of curved panels in hypersonic flow. In addition, comparisons are made between results obtained using the present method and the curvature-modified method. They show that when the curvature of the panel is relatively small, the static aeroelastic deformations and flutter stability boundaries obtained by the two methods differ little, while for panels with larger curvatures the present method gives larger static aeroelastic deformations and smaller flutter stability boundaries than the curvature-modified method, with the discrepancy increasing as panel curvature increases. The existing curvature-modified method is therefore non-conservative compared with the proposed flow-field-modified method from the standpoint of hypersonic flight vehicle safety, and the proposed flow-field-modified local piston theory enlarges the application range of piston theory for curved panels.
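
    For reference, first-order piston theory and the kind of local-flow modification described above can be written as follows, with w the panel deflection. This is the generic form from the piston-theory literature, not necessarily the paper's exact expression.

    ```latex
    % Classical first-order piston theory (freestream quantities):
    \Delta p = \rho_\infty a_\infty
      \left( \frac{\partial w}{\partial t} + V_\infty \frac{\partial w}{\partial x} \right)
    % Local piston theory: the freestream values are replaced by local
    % steady-flow values (\rho_l, a_l, V_l), here extracted from CFD:
    \Delta p = \rho_l\, a_l
      \left( \frac{\partial w}{\partial t} + V_l \frac{\partial w}{\partial x} \right)
    ```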

  9. Conducting meta-analyses of HIV prevention literatures from a theory-testing perspective.

    PubMed

    Marsh, K L; Johnson, B T; Carey, M P

    2001-09-01

    Using illustrations from HIV prevention research, the current article advocates approaching meta-analysis as a theory-testing scientific method rather than as merely a set of rules for quantitative analysis. Like other scientific methods, meta-analysis has central concerns with internal, external, and construct validity. The focus of a meta-analysis should only rarely be merely to describe the effects of health promotion; rather, it should be to understand and explain phenomena and the processes underlying them. The methodological decisions meta-analysts make in conducting reviews should be guided by a consideration of the underlying goals of the review (e.g., simple effect size estimation or, preferably, theory testing). From the advocated perspective that a health behavior meta-analyst should test theory, the authors present a number of issues to be considered during the conduct of meta-analyses.

  10. Mixing a Grounded Theory Approach with a Randomized Controlled Trial Related to Intimate Partner Violence: What Challenges Arise for Mixed Methods Research?

    PubMed Central

    Catallo, Cristina; Jack, Susan M.; Ciliska, Donna; MacMillan, Harriet L.

    2013-01-01

    Little is known about how to systematically integrate complex qualitative studies within the context of randomized controlled trials. A two-phase sequential explanatory mixed methods study was conducted in Canada to understand how women decide to disclose intimate partner violence in emergency department settings. Mixing an RCT (with a subanalysis of data) with a grounded theory approach required methodological modifications to maintain the overall rigour of this mixed methods study. Modifications were made to the following areas of the grounded theory approach to support the overall integrity of the mixed methods study design: recruitment of participants, maximum variation and negative case sampling, data collection, and analysis methods. Recommendations for future studies include: (1) planning at the outset to incorporate a qualitative approach with an RCT and to determine logical points during the RCT to integrate the qualitative component and (2) consideration for the time needed to carry out an RCT and a grounded theory approach, especially to support recruitment, data collection, and analysis. Data mixing strategies should be considered during early stages of the study, so that appropriate measures can be developed and used in the RCT to support initial coding structures and data analysis needs of the grounded theory phase. PMID:23577245

  11. Defining, illustrating and reflecting on logic analysis with an example from a professional development program.

    PubMed

    Tremblay, Marie-Claude; Brousselle, Astrid; Richard, Lucie; Beaudet, Nicole

    2013-10-01

    Program designers and evaluators should make a point of testing the validity of a program's intervention theory before investing either in implementation or in any type of evaluation. In this context, logic analysis can be a particularly useful option, since it can be used to test the plausibility of a program's intervention theory using scientific knowledge. Professional development in public health is one field among several that would truly benefit from logic analysis, as it appears to be generally lacking in theorization and evaluation. This article presents the application of this analysis method to an innovative public health professional development program, the Health Promotion Laboratory. More specifically, this paper aims to (1) define the logic analysis approach and differentiate it from similar evaluative methods; (2) illustrate the application of this method by a concrete example (logic analysis of a professional development program); and (3) reflect on the requirements of each phase of logic analysis, as well as on the advantages and disadvantages of such an evaluation method. Using logic analysis to evaluate the Health Promotion Laboratory showed that, generally speaking, the program's intervention theory appeared to have been well designed. By testing and critically discussing logic analysis, this article also contributes to further improving and clarifying the method. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Refined methods of aeroelastic analysis and optimization. [swept wings, propeller theory, and subsonic flutter

    NASA Technical Reports Server (NTRS)

    Ashley, H.

    1984-01-01

    Graduate research activity in the following areas is reported: the divergence of laminated composite lifting surfaces, subsonic propeller theory and aeroelastic analysis, and cross sectional resonances in wind tunnels.

  13. Coding, Constant Comparisons, and Core Categories: A Worked Example for Novice Constructivist Grounded Theorists.

    PubMed

    Giles, Tracey M; de Lacey, Sheryl; Muir-Cochrane, Eimear

    2016-01-01

    Grounded theory method has been described extensively in the literature. Yet, the varying processes portrayed can be confusing for novice grounded theorists. This article provides a worked example of the data analysis phase of a constructivist grounded theory study that examined family presence during resuscitation in acute health care settings. Core grounded theory methods are exemplified, including initial and focused coding, constant comparative analysis, memo writing, theoretical sampling, and theoretical saturation. The article traces the construction of the core category "Conditional Permission" from initial and focused codes, subcategories, and properties, through to its position in the final substantive grounded theory.

  14. Careful with Those Priors: A Note on Bayesian Estimation in Two-Parameter Logistic Item Response Theory Models

    ERIC Educational Resources Information Center

    Marcoulides, Katerina M.

    2018-01-01

    This study examined the use of Bayesian analysis methods for the estimation of item parameters in a two-parameter logistic item response theory model. Using simulated data under various design conditions with both informative and non-informative priors, the parameter recovery of Bayesian analysis methods were examined. Overall results showed that…
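
    A minimal sketch of Bayesian estimation for one item's 2PL parameters, assuming known abilities and a crude random-walk Metropolis sampler; the lognormal/normal priors, the simulated data, and the sampler settings are illustrative choices, not the study's design conditions.

    ```python
    # Hedged sketch: posterior sampling for 2PL item parameters (a, b),
    # with P(correct) = logistic(a * (theta - b)) and informative priors.
    import numpy as np

    rng = np.random.default_rng(2)
    theta = rng.normal(0, 1, 500)                  # known abilities (toy)
    a_true, b_true = 1.2, -0.3
    y = rng.binomial(1, 1 / (1 + np.exp(-a_true * (theta - b_true))))

    def log_posterior(a, b):
        if a <= 0:
            return -np.inf
        eta = a * (theta - b)
        loglik = np.sum(y * eta - np.log1p(np.exp(eta)))  # Bernoulli-logit
        lp = -np.log(a) - np.log(a) ** 2 / (2 * 0.5 ** 2)  # a ~ LogNormal(0, 0.5)
        lp += -b ** 2 / 2                                  # b ~ Normal(0, 1)
        return loglik + lp

    a, b, samples = 1.0, 0.0, []
    for _ in range(5000):                          # random-walk Metropolis
        a_p, b_p = a + rng.normal(0, 0.1), b + rng.normal(0, 0.1)
        if np.log(rng.uniform()) < log_posterior(a_p, b_p) - log_posterior(a, b):
            a, b = a_p, b_p
        samples.append((a, b))
    post = np.array(samples[1000:])                # drop burn-in
    print("posterior means (a, b):", post.mean(axis=0).round(2))
    ```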

  15. A Qualitative Analysis Framework Using Natural Language Processing and Graph Theory

    ERIC Educational Resources Information Center

    Tierney, Patrick J.

    2012-01-01

    This paper introduces a method of extending natural language-based processing of qualitative data analysis with the use of a very quantitative tool--graph theory. It is not an attempt to convert qualitative research to a positivist approach with a mathematical black box, nor is it a "graphical solution". Rather, it is a method to help qualitative…
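
    One way such a pipeline can look: tokenize text, build a weighted term co-occurrence graph, and read centrality as a cue for candidate themes. The toy documents, the crude tokenizer, and the networkx library choice are our assumptions, not the paper's implementation.

    ```python
    # Hedged sketch: qualitative text -> co-occurrence graph -> centrality.
    from itertools import combinations
    import networkx as nx

    docs = [
        "nurses described workload stress and coping strategies",
        "coping strategies reduced stress during night shifts",
        "workload increased during night shifts",
    ]
    stop = {"and", "the", "during", "of"}

    G = nx.Graph()
    for doc in docs:
        terms = [t for t in doc.lower().split() if t not in stop]
        for u, v in combinations(set(terms), 2):   # co-occurrence in a doc
            w = G.get_edge_data(u, v, {"weight": 0})["weight"]
            G.add_edge(u, v, weight=w + 1)

    # degree centrality as a cue for candidate themes
    central = sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1])
    print(central[:5])
    ```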

  16. Contact stresses in gear teeth: A new method of analysis

    NASA Technical Reports Server (NTRS)

    Somprakit, Paisan; Huston, Ronald L.; Oswald, Fred B.

    1991-01-01

    A new, innovative procedure called point load superposition is presented for determining the contact stresses in mating gear teeth. It is believed that this procedure will greatly extend both the range of applicability and the accuracy of gear contact stress analysis. Point load superposition is based upon fundamental solutions from the theory of elasticity. It is an iterative numerical procedure which has distinct advantages over the classical Hertz method, the finite element method, and existing applications of the boundary element method. Specifically, friction and sliding effects, which are either excluded from or difficult to study with the classical methods, are routinely handled with the new procedure. Presented here are the basic theory and the algorithms. Several examples are given. Results are consistent with those of the classical theories. Applications to spur gears are discussed.
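
    A worked note on the kind of fundamental solution such a superposition scheme can build on; the paper's exact kernel is not quoted here, so the Flamant half-plane solution below is our illustration. A normal line load P on an elastic half-plane produces a purely radial stress field, and the effect of a distributed contact pressure p(s) then follows by superposition:

    ```latex
    % Flamant solution for a normal line load P on an elastic half-plane:
    \sigma_r = -\frac{2P\cos\theta}{\pi r}, \qquad
    \sigma_\theta = \tau_{r\theta} = 0
    % Superposition over a contact pressure distribution p(s):
    \sigma_r(\mathbf{x}) = -\int \frac{2\,p(s)\cos\theta_s}{\pi r_s}\,\mathrm{d}s
    ```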

  17. Application of abstract harmonic analysis to the high-speed recognition of images

    NASA Technical Reports Server (NTRS)

    Usikov, D. A.

    1979-01-01

    Methods are constructed for rapidly computing correlation functions using the theory of abstract harmonic analysis. The theory developed includes as a particular case the familiar Fourier transform method for a correlation function which makes it possible to find images which are independent of their translation in the plane. Two examples of the application of the general theory described are the search for images, independent of their rotation and scale, and the search for images which are independent of their translations and rotations in the plane.
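
    The translation-invariant special case mentioned above reduces to the correlation theorem: cross-correlation computed with FFTs, as in this sketch (the toy image and patch location are illustrative).

    ```python
    # Hedged sketch: locating a patch by FFT cross-correlation, the
    # Fourier special case of the abstract-harmonic-analysis framework.
    import numpy as np

    rng = np.random.default_rng(3)
    image = rng.normal(size=(128, 128))
    template = image[40:56, 70:86].copy()      # patch to be located

    pad = np.zeros_like(image)
    pad[: template.shape[0], : template.shape[1]] = template
    # correlation theorem: corr = IFFT( FFT(image) * conj(FFT(pad)) )
    corr = np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(pad))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    print("recovered offset:", (dy, dx))       # expect (40, 70)
    ```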

  18. Solution of elastic-plastic stress analysis problems by the p-version of the finite element method

    NASA Technical Reports Server (NTRS)

    Szabo, Barna A.; Actis, Ricardo L.; Holzer, Stefan M.

    1993-01-01

    The solution of small strain elastic-plastic stress analysis problems by the p-version of the finite element method is discussed. The formulation is based on the deformation theory of plasticity and the displacement method. Practical realization of controlling discretization errors for elastic-plastic problems is the main focus. Numerical examples which include comparisons between the deformation and incremental theories of plasticity under tight control of discretization errors are presented.

  19. Realist identification of group-level latent variables for perinatal social epidemiology theory building.

    PubMed

    Eastwood, John Graeme; Jalaludin, Bin Badrudin; Kemp, Lynn Ann; Phung, Hai Ngoc

    2014-01-01

    We have previously reported in this journal on an ecological study of perinatal depressive symptoms in South Western Sydney. In that article, we briefly reported on a factor analysis that was utilized to identify empirical indicators for analysis. In this article, we report on the mixed method approach that was used to identify those latent variables. Social epidemiology has been slow to embrace a latent variable approach to the study of social, political, economic, and cultural structures and mechanisms, partly for philosophical reasons. Critical realist ontology and epistemology have been advocated as an appropriate methodological approach to both theory building and theory testing in the health sciences. We describe here an emergent mixed method approach that uses qualitative methods to identify latent constructs followed by factor analysis using empirical indicators chosen to measure identified qualitative codes. Comparative analysis of the findings is reported together with a limited description of realist approaches to abstract reasoning.

  20. Similitude design for the vibration problems of plates and shells: A review

    NASA Astrophysics Data System (ADS)

    Zhu, Yunpeng; Wang, You; Luo, Zhong; Han, Qingkai; Wang, Deyou

    2017-06-01

    Similitude design plays a vital role in the analysis of vibration and shock problems encountered in large engineering equipment. Similitude design, including dimensional analysis and governing equation method, is founded on the dynamic similitude theory. This study reviews the application of similitude design methods in engineering practice and summarizes the major achievements of the dynamic similitude theory in structural vibration and shock problems in different fields, including marine structures, civil engineering structures, and large power equipment. This study also reviews the dynamic similitude design methods for thin-walled and composite material plates and shells, including the most recent work published by the authors. Structure sensitivity analysis is used to evaluate the scaling factors to attain accurate distorted scaling laws. Finally, this study discusses the existing problems and the potential of the dynamic similitude theory for the analysis of vibration and shock problems of structures.
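
    As a worked example of the complete-similitude scaling laws reviewed above, the thin-plate bending frequency relation (a standard result, used here as our illustration) yields the frequency scale factor directly from the geometry and material scale factors:

    ```latex
    % Thin elastic plate: \omega \propto \frac{h}{L^{2}}
    %   \sqrt{E / (\rho\,(1 - \nu^{2}))}
    % With scale factors \lambda_x = x_m / x_p (model over prototype),
    % the natural-frequency scaling law follows, provided \lambda_\nu = 1:
    \lambda_\omega = \frac{\lambda_h}{\lambda_L^{2}}
      \sqrt{\frac{\lambda_E}{\lambda_\rho}}
    ```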

  1. The value of theory in programmes to implement clinical guidelines: Insights from a retrospective mixed-methods evaluation of a programme to increase adherence to national guidelines for chronic disease in primary care

    PubMed Central

    Sheringham, Jessica; Solmi, Francesca; Ariti, Cono; Baim-Lance, Abigail; Morris, Steve; Fulop, Naomi J.

    2017-01-01

    Background: Programmes have had limited success in improving guideline adherence for chronic disease. Use of theory is recommended but is often absent in programmes conducted in ‘real-world’ rather than research settings. Materials and methods: This mixed-methods study tested a retrospective theory-based approach to evaluate a ‘real-world’ programme in primary care to improve adherence to national guidelines for chronic obstructive pulmonary disease (COPD). Qualitative data, comprising analysis of documents generated throughout the programme (n>300), in-depth interviews with planners (clinicians, managers and improvement experts involved in devising, planning, and implementing the programme, n = 14) and providers (practice clinicians, n = 14), were used to construct programme theories, experiences of implementation and contextual factors influencing care. Quantitative analyses comprised controlled before-and-after analyses to test ‘early’ and ‘evolved’ programme theories with comparators grounded in each theory. ‘Early’ theory predicted the programme would reduce emergency hospital admissions (EHA). It was tested using national analysis of standardized borough-level EHA rates between programme and comparator boroughs. ‘Evolved’ theory predicted practices with higher programme participation would increase guideline adherence and reduce EHA and costs. It was tested using a difference-in-differences analysis with linked primary and secondary care data to compare changes in diagnosis, management, EHA and costs, over time and by programme participation. Results: Contrary to programme planners’ predictions in ‘early’ and ‘evolved’ programme theories, admissions did not change following the programme. However, consistent with ‘evolved’ theory, higher guideline adoption occurred in practices with greater programme participation. Conclusions: Retrospectively constructing theories based on the ideas of programme planners can enable evaluators to address some limitations encountered when evaluating programmes without a theoretical base. Prospectively articulating theory aided by existing models and mid-range implementation theories may strengthen guideline adoption efforts by prompting planners to scrutinise implementation methods. Benefits of deriving programme theory, with or without the aid of mid-range implementation theories, however, may be limited when the evidence underpinning guidelines is flawed. PMID:28328942

  2. Qualitative research in healthcare: an introduction to grounded theory using thematic analysis.

    PubMed

    Chapman, A L; Hadfield, M; Chapman, C J

    2015-01-01

    In today's NHS, qualitative research is increasingly important as a method of assessing and improving quality of care. Grounded theory has developed as an analytical approach to qualitative data over the last 40 years. It is primarily an inductive process whereby theoretical insights are generated from data, in contrast to deductive research where theoretical hypotheses are tested via data collection. Grounded theory has been one of the main contributors to the acceptance of qualitative methods in a wide range of applied social sciences. The influence of grounded theory as an approach is, in part, based on its provision of an explicit framework for analysis and theory generation. Furthermore the stress upon grounding research in the reality of participants has also given it credence in healthcare research. As with all analytical approaches, grounded theory has drawbacks and limitations. It is important to have an understanding of these in order to assess the applicability of this approach to healthcare research. In this review we outline the principles of grounded theory, and focus on thematic analysis as the analytical approach used most frequently in grounded theory studies, with the aim of providing clinicians with the skills to critically review studies using this methodology.

  3. Fatigue Analysis of Overhead Sign and Signal Structures

    DOT National Transportation Integrated Search

    1994-05-01

    This report documents methods of fatigue analysis for overhead sign and signal structures. The main purpose of this report is to combine pertinent wind loading and vibration theory, fatigue damage theory, and experimental data into a useable fatigue ...

  4. Survey and analysis of research on supersonic drag-due-to-lift minimization with recommendations for wing design

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Mann, Michael J.

    1992-01-01

    A survey of research on drag-due-to-lift minimization at supersonic speeds was conducted, including a study of the effectiveness of current design and analysis methods. The results show that a linearized theory analysis with estimated attainable thrust and vortex force effects can predict with reasonable accuracy the lifting efficiency of flat wings. Significantly better wing performance can be achieved through the use of twist and camber. Although linearized theory methods tend to overestimate the amount of twist and camber required for a given application and provide an overly optimistic performance prediction, these deficiencies can be overcome by implementation of recently developed empirical corrections. Numerous examples of the correlation of experiment and theory are presented to demonstrate the applicability and limitations of linearized theory methods with and without empirical corrections. The use of an Euler code for the estimation of aerodynamic characteristics of a twisted and cambered wing and its application to design by iteration are discussed.

  5. Hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis method for mid-frequency analysis of built-up systems with epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yin, Shengwen; Yu, Dejie; Yin, Hui; Lü, Hui; Xia, Baizhan

    2017-09-01

    Considering the epistemic uncertainties within the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model when it is used for the response analysis of built-up systems in the mid-frequency range, the hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis (ETFE/SEA) model is established by introducing the evidence theory. Based on the hybrid ETFE/SEA model and the sub-interval perturbation technique, the hybrid Sub-interval Perturbation and Evidence Theory-based Finite Element/Statistical Energy Analysis (SIP-ETFE/SEA) approach is proposed. In the hybrid ETFE/SEA model, the uncertainty in the SEA subsystem is modeled by a non-parametric ensemble, while the uncertainty in the FE subsystem is described by the focal element and basic probability assignment (BPA), and dealt with evidence theory. Within the hybrid SIP-ETFE/SEA approach, the mid-frequency response of interest, such as the ensemble average of the energy response and the cross-spectrum response, is calculated analytically by using the conventional hybrid FE/SEA method. Inspired by the probability theory, the intervals of the mean value, variance and cumulative distribution are used to describe the distribution characteristics of mid-frequency responses of built-up systems with epistemic uncertainties. In order to alleviate the computational burdens for the extreme value analysis, the sub-interval perturbation technique based on the first-order Taylor series expansion is used in ETFE/SEA model to acquire the lower and upper bounds of the mid-frequency responses over each focal element. Three numerical examples are given to illustrate the feasibility and effectiveness of the proposed method.
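
    The first-order Taylor bound behind the sub-interval perturbation step has the generic form below (notation ours). Splitting each focal element into sub-intervals and applying the bound on each keeps the expansion accurate when the interval widths are not small:

    ```latex
    % First-order interval bounds for a response y with interval parameters
    % x_i \in [x_i^c - \Delta x_i,\; x_i^c + \Delta x_i]:
    y(\mathbf{x}) \approx y(\mathbf{x}^c)
      + \sum_i \left.\frac{\partial y}{\partial x_i}\right|_{\mathbf{x}^c}
        (x_i - x_i^c),
    \qquad
    \overline{y},\,\underline{y} = y(\mathbf{x}^c)
      \pm \sum_i \left| \left.\frac{\partial y}{\partial x_i}
        \right|_{\mathbf{x}^c} \right| \Delta x_i
    ```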

  6. Combined linear theory/impact theory method for analysis and design of high speed configurations

    NASA Technical Reports Server (NTRS)

    Brooke, D.; Vondrasek, D. V.

    1980-01-01

    Pressure distributions on a wing body at Mach 4.63 are calculated. The combined theory is shown to give improved predictions over either linear theory or impact theory alone. The combined theory is also applied in the inverse design mode to calculate optimum camber slopes at Mach 4.63. Comparisons with optimum camber slopes obtained from unmodified linear theory show large differences. Analysis of the results indicate that the combined theory correctly predicts the effect of thickness on the loading distributions at high Mach numbers, and that finite thickness wings optimized at high Mach numbers using unmodified linear theory will not achieve the minimum drag characteristics for which they are designed.

  7. Shear Lag in Box Beams Methods of Analysis and Experimental Investigations

    NASA Technical Reports Server (NTRS)

    Kuhn, Paul; Chiarito, Patrick T

    1942-01-01

    The bending stresses in the covers of box beams or wide-flange beams differ appreciably from the stresses predicted by the ordinary bending theory on account of shear deformation of the flanges. The problem of predicting these differences has become known as the shear-lag problem. The first part of this paper deals with methods of shear-lag analysis suitable for practical use. The second part of the paper describes strain-gage tests made by the NACA to verify the theory. Three tests published by other investigators are also analyzed by the proposed method. The third part of the paper gives numerical examples illustrating the methods of analysis. An appendix gives comparisons with other methods, particularly with the method of Ebner and Koller.

  8. A variable-order laminated plate theory based on the variational-asymptotical method

    NASA Technical Reports Server (NTRS)

    Lee, Bok W.; Sutyrin, Vladislav G.; Hodges, Dewey H.

    1993-01-01

    The variational-asymptotical method is a mathematical technique by which the three-dimensional analysis of laminated plate deformation can be split into a linear, one-dimensional, through-the-thickness analysis and a nonlinear, two-dimensional, plate analysis. The elastic constants used in the plate analysis are obtained from the through-the-thickness analysis, along with approximate, closed-form three-dimensional distributions of displacement, strain, and stress. In this paper, a theory based on this technique is developed which is capable of approximating three-dimensional elasticity to any accuracy desired. The asymptotical method allows for the approximation of the through-the-thickness behavior in terms of the eigenfunctions of a certain Sturm-Liouville problem associated with the thickness coordinate. These eigenfunctions contain all the necessary information about the nonhomogeneities along the thickness coordinate of the plate and thus possess the appropriate discontinuities in the derivatives of displacement. The theory is presented in this paper along with numerical results for the eigenfunctions of various laminated plates.

  9. Research on numerical algorithms for large space structures

    NASA Technical Reports Server (NTRS)

    Denman, E. D.

    1982-01-01

    Numerical algorithms for large space structures were investigated, with particular emphasis on decoupling methods for analysis and design. Numerous aspects of the analysis of large systems, ranging from algebraic theory to lambda matrices to identification algorithms, were considered. A general treatment of the algebraic theory of lambda matrices is presented, and the theory is applied to second order lambda matrices.

  10. Category's analysis and operational project capacity method of transformation in design

    NASA Astrophysics Data System (ADS)

    Obednina, S. V.; Bystrova, T. Y.

    2015-10-01

    The method of transformation is attracting widespread interest in fields such as contemporary design. However, in design theory little attention has been paid to the categorical status of the term "transformation". This paper presents a conceptual analysis of transformation based on the theory of form employed in the influential essays of Aristotle and Thomas Aquinas. Transformation as a method of shaping in design is explored, and the potential application of the term in design is demonstrated.

  11. Grounded Theory in Medical Education Research.

    PubMed

    Tavakol, Mohsen; Torabi, Sima; Akbar Zeinaloo, Ali

    2006-12-01

    The grounded theory method provides a systematic way to generate theoretical constructs or concepts that illuminate psychosocial processes common to individuals who have a similar experience of the phenomenon under investigation. There has been an increase in the number of published research reports that use the grounded theory method. However, there has been less medical education research based on the grounded theory tradition. The purpose of this paper is to introduce the basic tenets of the qualitative research paradigm, with specific reference to grounded theory. The paper aims to encourage readers to think about how they might use the grounded theory method in medical education research and to apply it to their own areas of interest. The important features of a grounded theory, as well as its implications for medical education research, are explored. Data collection and analysis are also discussed. It seems reasonable to incorporate knowledge of this kind in medical education research.

  12. Feasibility of combining linear theory and impact theory methods for the analysis and design of high speed configurations

    NASA Technical Reports Server (NTRS)

    Brooke, D.; Vondrasek, D. V.

    1978-01-01

    The aerodynamic influence coefficients calculated using an existing linear theory program were used to modify the pressures calculated using impact theory. Application of the combined approach to several wing-alone configurations shows that the combined approach gives improved predictions of the local pressure and loadings over either linear theory alone or impact theory alone. The approach not only removes most of the short-comings of the individual methods, as applied in the Mach 4 to 8 range, but also provides the basis for an inverse design procedure applicable to high speed configurations.

  13. Innovating Method of Existing Mechanical Product Based on TRIZ Theory

    NASA Astrophysics Data System (ADS)

    Zhao, Cunyou; Shi, Dongyan; Wu, Han

    The main ways of product development are adaptive design and variant design based on existing products. In this paper, a conceptual design framework and its flow model for innovating products are put forward by combining conceptual design methods with TRIZ theory. A process system model of innovative design is constructed, comprising requirement analysis, total function analysis and decomposition, engineering problem analysis, solution finding, and preliminary design; this establishes the basis for the innovative design of existing products.

  14. Challenges in combining different data sets during analysis when using grounded theory.

    PubMed

    Rintala, Tuula-Maria; Paavilainen, Eija; Astedt-Kurki, Päivi

    2014-05-01

    To describe the challenges in combining two data sets during grounded theory analysis. The use of grounded theory in nursing research is common. It is a suitable method for studying human action and interaction. It is recommended that many alternative sources of data are collected to create as rich a data set as possible. Data were drawn from interviews with people with diabetes (n=19) and their family members (n=19), and the two data sets were combined during analysis. When using grounded theory, there are numerous challenges in collecting and managing data, especially for the novice researcher. One challenge is to combine different data sets during the analysis. There are many methodological textbooks about grounded theory, but little has been written about combining different data sets. Discussion is needed on the management of data and the challenges of grounded theory. This article provides a means for combining different data sets in the grounded theory analysis process.

  15. Developing Written Text Production Competence Using the Reader-Response Method

    ERIC Educational Resources Information Center

    Demény, Paraschiva

    2012-01-01

    The first part of the present paper deals with the analysis of the literary theory and linguistic background of the reader-response method, respectively with the presentation of the process of composition and its psychological components. The reader-response textual interpretation method can take several different approaches of literary theory,…

  16. [Cybernetics and biology].

    PubMed

    Vasil'ev, G F

    2013-01-01

    Owing to methodological shortcomings, control theory has not yet realized its potential for the analysis of biological systems. To obtain the full benefit of the method, a parametric model of control is proposed for use in addition to the algorithmic model of control (to date the only model used in control theory), and the reasoning behind it is explained. The suggested approach makes the full potential of the modern theory of control available for the analysis of biological systems. The cybernetic approach is illustrated using the system regulating the rise of glucose concentration in blood as an example.

  17. Shape design sensitivity analysis and optimization of three dimensional elastic solids using geometric modeling and automatic regridding. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Yao, Tse-Min; Choi, Kyung K.

    1987-01-01

    An automatic regridding method and a three dimensional shape design parameterization technique were constructed and integrated into a unified theory of shape design sensitivity analysis. An algorithm was developed for general shape design sensitivity analysis of three dimensional elastic solids. Numerical implementation of this shape design sensitivity analysis method was carried out using the finite element code ANSYS. The unified theory of shape design sensitivity analysis uses the material derivative of continuum mechanics with a design velocity field that represents shape change effects over the structural design. The automatic regridding method generates a domain velocity field with the boundary displacement method. Shape design parameterization for three dimensional surface design problems was illustrated using a Bezier surface with boundary perturbations that depend linearly on the perturbation of design parameters. A linearization method of optimization, LINRM, was used to obtain optimum shapes. Three examples from different engineering disciplines were investigated to demonstrate the accuracy and versatility of this shape design sensitivity analysis method.

  18. An analysis of possible applications of fuzzy set theory to the actuarial credibility theory

    NASA Technical Reports Server (NTRS)

    Ostaszewski, Krzysztof; Karwowski, Waldemar

    1992-01-01

    In this work, we review the basic concepts of actuarial credibility theory from the point of view of introducing applications of the fuzzy set-theoretic method. We show how the concept of actuarial credibility can be modeled through the fuzzy set membership functions and how fuzzy set methods, especially fuzzy pattern recognition, can provide an alternative tool for estimating credibility.
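
    As a toy illustration of the modelling idea (the functional form and the full-credibility standard below are assumed, not taken from the paper), partial credibility can be read as a fuzzy membership grade in the set "fully credible experience" rather than as an all-or-nothing cutoff:

      import numpy as np

      def credibility_membership(n, n_full=1082.0):
          """Membership grade rising smoothly from 0 toward 1 at n_full claims.

          Uses the classical square-root partial-credibility rule as the
          membership function; n_full = 1082 is a textbook full-credibility
          standard, assumed here purely for illustration.
          """
          x = np.clip(n / n_full, 0.0, None)
          return np.minimum(1.0, np.sqrt(x))

      for n in (50, 300, 1082, 2000):
          print(n, round(float(credibility_membership(n)), 3))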

  19. Apps of Steel: Are Exercise Apps Providing Consumers with Realistic Expectations?: A Content Analysis of Exercise Apps for Presence of Behavior Change Theory

    ERIC Educational Resources Information Center

    Cowan, Logan T.; Van Wagenen, Sarah A.; Brown, Brittany A.; Hedin, Riley J.; Seino-Stephan, Yukiko; Hall, P. Cougar; West, Joshua H.

    2013-01-01

    Objective. To quantify the presence of health behavior theory constructs in iPhone apps targeting physical activity. Methods. This study used a content analysis of 127 apps from Apple's (App Store) "Health & Fitness" category. Coders downloaded the apps and then used an established theory-based instrument to rate each app's inclusion of…

  20. Development of an Analysis Method to Identify the Root Causes of Finding from the Air Force Environmental Compliance Assessment and Management Program (ECAMP)

    DTIC Science & Technology

    1994-09-01

    Theories and Applications: Theories of Motivation; Maslow's Hierarchy of Needs; Herzberg's Motivation-Hygiene Theory; Intrinsic vs Extrinsic Assumptions; McGregor's Theory X and Theory Y Assumptions; Vroom's Expectancy Theory; Applications; Tell People What…

  1. Analysis of the photophysical properties of zearalenone using density functional theory

    USDA-ARS?s Scientific Manuscript database

    The intrinsic photophysical properties of the resorcylic acid moiety of zearalenone offer a convenient label-free method to determine zearalenone levels in contaminated agricultural products. Density functional theory and steady-state fluorescence methods were applied to investigate the role of stru...

  2. Application of the variational-asymptotical method to composite plates

    NASA Technical Reports Server (NTRS)

    Hodges, Dewey H.; Lee, Bok W.; Atilgan, Ali R.

    1992-01-01

    A method is developed for the 3D analysis of laminated plate deformation which is an extension of a variational-asymptotical method by Atilgan and Hodges (1991). Both methods are based on the treatment of plate deformation by splitting the 3D analysis into linear through-the-thickness analysis and 2D plate analysis. Whereas the first technique tackles transverse shear deformation in the second asymptotical approximation, the present method simplifies its treatment and restricts it to the first approximation. Both analytical techniques are applied to the linear cylindrical bending problem, and the strain and stress distributions are derived and compared with those of the exact solution. The present theory provides more accurate results than those of the classical laminated-plate theory for the transverse displacement of 2-, 3-, and 4-layer cross-ply laminated plates. The method can give reliable estimates of the in-plane strain and displacement distributions.

  3. Extending methods: using Bourdieu's field analysis to further investigate taste

    NASA Astrophysics Data System (ADS)

    Schindel Dimick, Alexandra

    2015-06-01

    In this commentary on Per Anderhag, Per-Olof Wickman and Karim Hamza's article Signs of taste for science, I consider how their study is situated within the concern for the role of science education in the social and cultural production of inequality. Their article provides a finely detailed methodology for analyzing the constitution of taste within science education classrooms. Nevertheless, because the authors' socially situated methodology draws upon Bourdieu's theories, it seems equally important to extend these methods to consider how and why students make particular distinctions within a relational context—a key aspect of Bourdieu's theory of cultural production. By situating the constitution of taste within Bourdieu's field analysis, researchers can explore the ways in which students' tastes and social positionings are established and transformed through time, space, place, and their ability to navigate the field. I describe the process of field analysis in relation to the authors' paper and suggest that combining the authors' methods with a field analysis can provide a strong methodological and analytical framework in which theory and methods combine to create a detailed understanding of students' interest in relation to their context.

  4. Fully Nonlinear Modeling and Analysis of Precision Membranes

    NASA Technical Reports Server (NTRS)

    Pai, P. Frank; Young, Leyland G.

    2003-01-01

    High precision membranes are used in many current space applications. This paper presents a fully nonlinear membrane theory with forward and inverse analyses of high precision membrane structures. The fully nonlinear membrane theory is derived from Jaumann strains and stresses, exact coordinate transformations, the concept of local relative displacements, and orthogonal virtual rotations. In this theory, energy and Newtonian formulations are fully correlated, and every structural term can be interpreted in terms of vectors. Fully nonlinear ordinary differential equations (ODEs) governing the large static deformations of known axisymmetric membranes under known axisymmetric loading (i.e., forward problems) are presented as first-order ODEs, and a method for obtaining numerically exact solutions using the multiple shooting procedure is shown. A method for obtaining the undeformed geometry of any axisymmetric membrane with a known inflated geometry and a known internal pressure (i.e., inverse problems) is also derived. Numerical results from forward analysis are verified using results in the literature, and results from inverse analysis are verified using known exact solutions and solutions from the forward analysis. Results show that the membrane theory and the proposed numerical methods for solving nonlinear forward and inverse membrane problems are accurate.
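
    The forward problem above is a two-point boundary-value problem in first-order form. The sketch below illustrates the shooting idea on a classic nonlinear test case (y'' = 1.5 y^2 with y(0) = 4 and y(1) = 1), standing in for the membrane ODEs and using simple rather than multiple shooting:

      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import brentq

      def rhs(t, s):
          """First-order form of the toy BVP y'' = 1.5*y**2."""
          y, yp = s
          return [yp, 1.5 * y ** 2]

      def shoot(slope):
          """Integrate from z = 0 and return the boundary residual at z = 1."""
          sol = solve_ivp(rhs, (0.0, 1.0), [4.0, slope], rtol=1e-10)
          return sol.y[0, -1] - 1.0

      # Find the unknown initial slope that satisfies the far-end condition.
      s_star = brentq(shoot, -20.0, -5.0)
      print("initial slope:", s_star)    # ~ -8.0, matching y = 4/(1+z)**2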

  5. Analysis and design of nonlinear resonances via singularity theory

    NASA Astrophysics Data System (ADS)

    Cirillo, G. I.; Habib, G.; Kerschen, G.; Sepulchre, R.

    2017-03-01

    Bifurcation theory and continuation methods are well-established tools for the analysis of nonlinear mechanical systems subject to periodic forcing. We illustrate the added value and the complementary information provided by singularity theory with one distinguished parameter. While tracking bifurcations reveals the qualitative changes in the behaviour, tracking singularities reveals how structural changes are themselves organised in parameter space. The complementarity of that information is demonstrated in the analysis of detached resonance curves in a two-degree-of-freedom system.

  6. Die sokratische Lehrstrategie und ihre Relevanz fur die heutige Didaktik (The Socratic Method and Its Relevance for Modern Teaching).

    ERIC Educational Resources Information Center

    Kanakis, Ioannis

    1997-01-01

    Examines the Socratic method through a comparative analysis of early Platonic dialogs with theories of critical rationalism and cognitive theories based on achievement motivation. Presents details of the Socratic strategy of teaching and learning, including critical reflection, conversation, and intellectual honesty; asserts that these methods are…

  7. Phenomenography and Grounded Theory as Research Methods in Computing Education Research Field

    ERIC Educational Resources Information Center

    Kinnunen, Paivi; Simon, Beth

    2012-01-01

    This paper discusses two qualitative research methods, phenomenography and grounded theory. We introduce both methods' data collection and analysis processes and the type of results you may get at the end, using examples from computing education research. We highlight some of the similarities and differences between the aim, data collection and…

  8. Subsonic and Supersonic Flutter Analysis of a Highly Tapered Swept-Wing Planform, Including Effects of Density Variation and Finite Wing Thickness, and Comparison with Experiments

    NASA Technical Reports Server (NTRS)

    Yates, Carson, Jr.

    1967-01-01

    The flutter characteristics of several wings with an aspect ratio of 4.0, a taper ratio of 0.2, and a quarter-chord sweepback of 45 deg. have been investigated analytically for Mach numbers up to 2.0. The calculations were based on the modified-strip-analysis method, the subsonic-kernel-function method, piston theory, and quasi-steady second-order theory. Results of the analysis and comparisons with experiment indicated that: (1) Flutter speeds were accurately predicted by the modified strip analysis, although accuracy at the highest Mach numbers required the use of nonlinear aerodynamic theory (which accounts for effects of wing thickness) for the calculation of the aerodynamic parameters. (2) An abrupt increase of flutter-speed coefficient with increasing Mach number, observed experimentally in the transonic range, was also indicated by the modified strip analysis. (3) In the low supersonic range for some densities, a discontinuous variation of flutter frequency with Mach number was indicated by the modified strip analysis. An abrupt change of frequency appeared experimentally in the transonic range. (4) Differences in flutter-speed-coefficient levels obtained from tests at low supersonic Mach numbers in two wind tunnels were also predicted by the modified strip analysis and were shown to be caused primarily by differences in mass ratio. (5) Flutter speeds calculated by the subsonic-kernel-function method were in good agreement with experiment and with the results of the modified strip analysis. (6) Flutter speeds obtained from piston theory and from quasi-steady second-order theory were higher than experimental values by at least 38 percent.

  9. Aerodynamic design and analysis system for supersonic aircraft. Part 1: General description and theoretical development

    NASA Technical Reports Server (NTRS)

    Middleton, W. D.; Lundry, J. L.

    1975-01-01

    An integrated system of computer programs has been developed for the design and analysis of supersonic configurations. The system uses linearized theory methods for the calculation of surface pressures and supersonic area rule concepts in combination with linearized theory for calculation of aerodynamic force coefficients. Interactive graphics are optional at the user's request. This part presents a general description of the system and describes the theoretical methods used.

  10. Order, topology and preference

    NASA Technical Reports Server (NTRS)

    Sertel, M. R.

    1971-01-01

    Some standard order-related and topological notions, facts, and methods are brought to bear on central topics in the theory of preference and the theory of optimization. Consequences of connectivity are considered, especially from the viewpoint of normally preordered spaces. Examples are given showing how the theory of preference, or utility theory, can be applied to social analysis.

  11. Application of the Bootstrap Methods in Factor Analysis.

    ERIC Educational Resources Information Center

    Ichikawa, Masanori; Konishi, Sadanori

    1995-01-01

    A Monte Carlo experiment was conducted to investigate the performance of bootstrap methods in normal theory maximum likelihood factor analysis when the distributional assumption was satisfied or unsatisfied. Problems arising with the use of bootstrap methods are highlighted. (SLD)
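
    A minimal sketch of the approach on synthetic one-factor data (scikit-learn's FactorAnalysis is used as a stand-in for a normal-theory maximum likelihood fit; all values below are assumed): resample rows with replacement, refit the model, and read bootstrap standard errors off the distribution of loadings.

      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(0)

      # Synthetic data with one common factor and p = 4 observed variables.
      n, p = 200, 4
      f = rng.normal(size=(n, 1))
      X = f @ np.array([[0.8, 0.7, 0.6, 0.5]]) + 0.5 * rng.normal(size=(n, p))

      def loadings(data):
          fa = FactorAnalysis(n_components=1, random_state=0).fit(data)
          L = fa.components_.ravel()
          return L * np.sign(L[0])       # fix the sign indeterminacy

      boot = np.array([loadings(X[rng.integers(0, n, n)]) for _ in range(200)])
      print("loadings:     ", loadings(X).round(2))
      print("bootstrap SEs:", boot.std(axis=0).round(3))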

  12. A study of the limitations of linear theory methods as applied to sonic boom calculations

    NASA Technical Reports Server (NTRS)

    Darden, Christine M.

    1990-01-01

    Current sonic boom minimization theories have been reviewed to emphasize the capabilities and flexibilities of the methods. Flexibility is important because the designer must meet optimized area constraints while reducing the impact on vehicle aerodynamic performance. Preliminary comparisons of sonic booms predicted for two Mach 3 concepts illustrate the benefits of shaping. Finally, for very simple bodies of revolution, sonic boom predictions were made using two methods - a modified linear theory method and a nonlinear method - for signature shapes that were both farfield N-waves and midfield waves. Preliminary analysis on these simple bodies verified that current modified linear theory prediction methods become inadequate for predicting midfield signatures at Mach numbers above 3. The importance of impulse in the sonic boom disturbance, and of the three-dimensional effects that could not be simulated with bodies of revolution, will determine the validity of current modified linear theory methods in predicting midfield signatures at lower Mach numbers.

  13. Is it really theoretical? A review of sampling in grounded theory studies in nursing journals.

    PubMed

    McCrae, Niall; Purssell, Edward

    2016-10-01

    Grounded theory is a distinct method of qualitative research, where core features are theoretical sampling and constant comparative analysis. However, inconsistent application of these activities has been observed in published studies. This review assessed the use of theoretical sampling in grounded theory studies in nursing journals. An adapted systematic review was conducted. Three leading nursing journals (2010-2014) were searched for studies stating grounded theory as the method. Sampling was assessed using a concise rating tool. A high proportion (86%) of the 134 articles described an iterative process of data collection and analysis. However, half of the studies did not demonstrate theoretical sampling, with many studies declaring or indicating a purposive sampling approach throughout. Specific reporting guidelines for grounded theory studies should be developed to ensure that study reports describe an iterative process of fieldwork and theoretical development. © 2016 John Wiley & Sons Ltd.

  14. The Philosophy of Information as an Underlying and Unifying Theory of Information Science

    ERIC Educational Resources Information Center

    Tomic, Taeda

    2010-01-01

    Introduction: Philosophical analyses of the theoretical principles underlying its sub-domains reveal the philosophy of information as an underlying meta-theory of information science. Method: Conceptual research on the knowledge sub-domains in information science and philosophy, and analysis of their mutual connection. Analysis: Similarities between…

  15. The Theory of Planned Behavior and Helmet Use among College Students

    ERIC Educational Resources Information Center

    Ross, Lisa Thomson; Ross, Thomas P.; Farber, Sarah; Davidson, Caroline; Trevino, Meredith; Hawkins, Ashley

    2011-01-01

    Objectives: To assess undergraduate helmet use attitudes and behaviors in accordance with the theory of planned behavior (TPB). We predicted helmet wearers and nonwearers would differ on our subscales. Methods: Participants (N = 414, 69% female, 84% white) completed a survey. Results: Principal component analysis and reliability analysis guided…

  16. Microgenetic Learning Analysis: A Methodology for Studying Knowledge in Transition

    ERIC Educational Resources Information Center

    Parnafes, O.; diSessa, A. A.

    2013-01-01

    This paper introduces and exemplifies a qualitative method for studying learning, "microgenetic learning analysis" (MLA), which is aimed jointly at developing theory and at establishing useful empirical results. Among modern methodologies, the focus on theory is somewhat distinctive. We use two strategies to describe MLA. First, we develop a…

  17. Who uses nursing theory? A univariate descriptive analysis of five years' research articles.

    PubMed

    Bond, A Elaine; Eshah, Nidal Farid; Bani-Khaled, Mohammed; Hamad, Atef Omar; Habashneh, Samira; Kataua', Hussein; al-Jarrah, Imad; Abu Kamal, Andaleeb; Hamdan, Falastine Rafic; Maabreh, Roqia

    2011-06-01

    Since the early 1950s, nursing leaders have worked diligently to build the Scientific Discipline of Nursing, integrating Theory, Research and Practice. Recently, the role of theory has again come into question, with some scientists claiming nurses are not using theory to guide their research, with which to improve practice. The purposes of this descriptive study were to determine: (i) Were nursing scientists' research articles in leading nursing journals based on theory? (ii) If so, were the theories nursing theories or borrowed theories? (iii) Were the theories integrated into the studies, or were they used as organizing frameworks? Research articles from seven top ISI journals were analysed, excluding regularly featured columns, meta-analyses, secondary analyses, case studies and literature reviews. The authors used King's dynamic interacting systems framework and Goal Attainment Theory as an organizing framework. They developed consensus on how to identify the integration of theory, searching the Title, Abstract, Aims, Methods, Discussion and Conclusion sections of each research article, whether quantitative or qualitative. Of 2857 articles published in the seven journals from 2002 through 2006, 2184 (76%) were research articles. Of the 837 (38%) authors who used theories, 460 (55%) used nursing theories and 377 (45%) used other theories; 776 (93%) of those who used theory integrated it into their studies, including qualitative studies, while 51 (7%) reported using theory as an organizing framework for their studies. Closer analysis revealed that theory principles were implicitly implied, even in research reports that did not explicitly report theory usage. Increasing numbers of nursing research articles (though not percentagewise) continue to be guided by theory, and not always by nursing theory. Newer nursing research methods may not explicitly state the use of nursing theory, though it is implicitly implied. © 2010 The Authors. Scandinavian Journal of Caring Sciences © 2010 Nordic College of Caring Science.

  18. Detecting spatio-temporal modes in multivariate data by entropy field decomposition

    NASA Astrophysics Data System (ADS)

    Frank, Lawrence R.; Galinsky, Vitaly L.

    2016-09-01

    A new data analysis method that addresses a general problem of detecting spatio-temporal variations in multivariate data is presented. The method utilizes two recent and complementary general approaches to data analysis, information field theory (IFT) and entropy spectrum pathways (ESP). Both methods reformulate and incorporate Bayesian theory, and thus use prior information to uncover the underlying structure of the unknown signal. Unification of ESP and IFT creates an approach that is non-Gaussian and nonlinear by construction and is found to produce unique spatio-temporal modes of signal behavior that can be ranked according to their significance, from which space-time trajectories of parameter variations can be constructed and quantified. Two brief examples of real world applications of the theory, to the analysis of data sets of completely different, unrelated natures that lack any underlying similarity, are also presented. The first example provides an analysis of resting state functional magnetic resonance imaging data that allowed us to create an efficient and accurate computational method for assessing and categorizing brain activity. The second example demonstrates the potential of the method in the application to the analysis of a strong atmospheric storm circulation system during the complicated stage of tornado development and formation using data recorded by a mobile Doppler radar. A reference implementation of the method will be made available as part of the QUEST toolkit that is currently under development at the Center for Scientific Computation in Imaging.

  19. One-dimensional analysis of filamentary composite beam columns with thin-walled open sections

    NASA Technical Reports Server (NTRS)

    Lo, Patrick K.-L.; Johnson, Eric R.

    1986-01-01

    Vlasov's one-dimensional structural theory for thin-walled open section bars was originally developed and used for metallic elements. The theory was recently extended to laminated bars fabricated from advanced composite materials. The purpose of this research is to provide a study and assessment of the extended theory. The focus is on flexural and torsional-flexural buckling of thin-walled, open section, laminated composite columns. Buckling loads are computed from the theory using a linear bifurcation analysis and a geometrically nonlinear beam column analysis by the finite element method. Results from the analyses are compared to available test data.

  20. Theory analysis for Pender's health promotion model (HPM) by Barnum's criteria: a critical perspective.

    PubMed

    Khoshnood, Zohreh; Rayyani, Masoud; Tirgari, Batool

    2018-01-13

    Background Analysis of nursing theoretical works and their role in knowledge development is presented as an essential process of critical reflection. The health promotion model (HPM) focuses on helping people achieve higher levels of well-being and identifies background factors that influence health behaviors. Objectives This paper aims to evaluate and critique the HPM using Barnum's criteria. Methods The present study reviewed books and articles retrieved from the ProQuest, PubMed, and Blackwell databases. The model was evaluated against Barnum's criteria for the analysis, application, and evaluation of nursing theories. The criteria selected by Barnum embrace both internal and external criticism: internal criticism deals with how the components of a theory fit with each other (the internal construction of the theory), while external criticism deals with the way a theory relates to the extended world (considering the theory in its relationships to human beings, nursing, and health). Results The electronic database search yielded over 27,717 titles and abstracts. Following removal of duplicates, 18,963 titles and abstracts were screened using the inclusion criteria and 1278 manuscripts were retrieved. Of these, 80 were specific to the HPM and 23 to the analysis of nursing theories relevant to the aim of this article. After final selection using the inclusion criteria for this review, 28 manuscripts were identified as examining the factors contributing to theory analysis. Evaluation of the HPM showed that its philosophical claims and their content are consistent and clear. The HPM has a logical structure and has been applied to diverse age groups from differing cultures with varying health concerns. Conclusion Among the strategies for theory critique, the Barnum approach is structured and accurate, and considers theory in its relationships to human beings, community psychiatric nursing, and health. According to Pender, nursing assessment, diagnosis, and interventions are utilized to operationalize the HPM through practical application and research.

  1. Gender similarities and differences.

    PubMed

    Hyde, Janet Shibley

    2014-01-01

    Whether men and women are fundamentally different or similar has been debated for more than a century. This review summarizes major theories designed to explain gender differences: evolutionary theories, cognitive social learning theory, sociocultural theory, and expectancy-value theory. The gender similarities hypothesis raises the possibility of theorizing gender similarities. Statistical methods for the analysis of gender differences and similarities are reviewed, including effect sizes, meta-analysis, taxometric analysis, and equivalence testing. Then, relying mainly on evidence from meta-analyses, gender differences are reviewed in cognitive performance (e.g., math performance), personality and social behaviors (e.g., temperament, emotions, aggression, and leadership), and psychological well-being. The evidence on gender differences in variance is summarized. The final sections explore applications of intersectionality and directions for future research.

  2. A Practical Method of Policy Analysis by Simulating Policy Options

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    This article focuses on a method of policy analysis that has evolved from the previous articles in this issue. The first section, "Toward a Theory of Educational Production," identifies concepts from science and achievement production to be incorporated into this policy analysis method. Building on Kuhn's (1970) discussion regarding paradigms, the…

  3. Reliability assessment of different plate theories for elastic wave propagation analysis in functionally graded plates.

    PubMed

    Mehrkash, Milad; Azhari, Mojtaba; Mirdamadi, Hamid Reza

    2014-01-01

    The importance of elastic wave propagation problem in plates arises from the application of ultrasonic elastic waves in non-destructive evaluation of plate-like structures. However, precise study and analysis of acoustic guided waves especially in non-homogeneous waveguides such as functionally graded plates are so complicated that exact elastodynamic methods are rarely employed in practical applications. Thus, the simple approximate plate theories have attracted much interest for the calculation of wave fields in FGM plates. Therefore, in the current research, the classical plate theory (CPT), first-order shear deformation theory (FSDT) and third-order shear deformation theory (TSDT) are used to obtain the transient responses of flexural waves in FGM plates subjected to transverse impulsive loadings. Moreover, comparing the results with those based on a well recognized hybrid numerical method (HNM), we examine the accuracy of the plate theories for several plates of various thicknesses under excitations of different frequencies. The material properties of the plate are assumed to vary across the plate thickness according to a simple power-law distribution in terms of volume fractions of constituents. In all analyses, spatial Fourier transform together with modal analysis are applied to compute displacement responses of the plates. A comparison of the results demonstrates the reliability ranges of the approximate plate theories for elastic wave propagation analysis in FGM plates. Furthermore, based on various examples, it is shown that whenever the plate theories are used within the appropriate ranges of plate thickness and frequency content, solution process in wave number-time domain based on modal analysis approach is not only sufficient but also efficient for finding the transient waveforms in FGM plates. Copyright © 2013 Elsevier B.V. All rights reserved.

  4. Mediation Analysis of an Adolescent HIV/STI/Pregnancy Prevention Intervention

    ERIC Educational Resources Information Center

    Glassman, Jill R.; Franks, Heather M.; Baumler, Elizabeth R.; Coyle, Karin K.

    2014-01-01

    Most interventions designed to prevent HIV/STI/pregnancy risk behaviours in young people have multiple components based on psychosocial theories (e.g. social cognitive theory) dictating sets of mediating variables to influence to achieve desired changes in behaviours. Mediation analysis is a method for investigating the extent to which a variable…

  5. A Meta-Analytic Review of Research on Gender Differences in Sexuality, 1993-2007

    ERIC Educational Resources Information Center

    Petersen, Jennifer L.; Hyde, Janet Shibley

    2010-01-01

    In 1993 Oliver and Hyde conducted a meta-analysis on gender differences in sexuality. The current study updated that analysis with current research and methods. Evolutionary psychology, cognitive social learning theory, social structural theory, and the gender similarities hypothesis provided predictions about gender differences in sexuality. We…

  6. ATLAS, an integrated structural analysis and design system. Volume 6: Design module theory

    NASA Technical Reports Server (NTRS)

    Backman, B. F.

    1979-01-01

    The automated design theory underlying the operation of the ATLAS Design Module is described. The methods, applications and limitations associated with the fully stressed design, the thermal fully stressed design and a regional optimization algorithm are presented. A discussion of the convergence characteristics of the fully stressed design is also included. Derivations and concepts specific to the ATLAS design theory are shown, while conventional terminology and established methods are identified by references.

  7. Comparisons of Exploratory and Confirmatory Factor Analysis.

    ERIC Educational Resources Information Center

    Daniel, Larry G.

    Historically, most researchers conducting factor analysis have used exploratory methods. However, more recently, confirmatory factor analytic methods have been developed that can directly test theory either during factor rotation using "best fit" rotation methods or during factor extraction, as with the LISREL computer programs developed…

  8. Decision Making Methods in Space Economics and Systems Engineering

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    2006-01-01

    This viewgraph presentation reviews various methods of decision making and the impact that they have on space economics and systems engineering. Some of the methods discussed are: Present Value and Internal Rate of Return (IRR); Cost-Benefit Analysis; Real Options; Cost-Effectiveness Analysis; Cost-Utility Analysis; Multi-Attribute Utility Theory (MAUT); and Analytic Hierarchy Process (AHP).
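
    The first two items reduce to a few lines of arithmetic. The sketch below (cash flows assumed purely for illustration) computes a project's net present value at a 10% discount rate and finds the internal rate of return as the root of the NPV function:

      import numpy as np
      from scipy.optimize import brentq

      # Assumed cash flows: -100 up front, then +30 per year for 5 years.
      cash = np.array([-100.0, 30, 30, 30, 30, 30])

      def npv(rate, cf=cash):
          """Net present value of cash flows cf at the given discount rate."""
          t = np.arange(len(cf))
          return float(np.sum(cf / (1.0 + rate) ** t))

      print("NPV @ 10%:", round(npv(0.10), 2))     # ~ +13.7
      irr = brentq(npv, 1e-6, 1.0)                 # rate where NPV crosses zero
      print("IRR:", round(irr, 4))                 # ~ 0.152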

  9. Further studies using matched filter theory and stochastic simulation for gust loads prediction

    NASA Technical Reports Server (NTRS)

    Scott, Robert C.; Pototzky, Anthony S.; Perry, Boyd III

    1993-01-01

    This paper describes two analysis methods -- one deterministic, the other stochastic -- for computing maximized and time-correlated gust loads for aircraft with nonlinear control systems. The first method is based on matched filter theory; the second is based on stochastic simulation. The paper summarizes the methods, discusses the selection of gust intensity for each method and presents numerical results. A strong similarity between the results from the two methods is seen to exist for both linear and nonlinear configurations.
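
    For a linear system, the matched-filter construction can be sketched in a few lines: the unit-energy excitation that maximizes a load response is the time-reversed impulse response of the gust-to-load dynamics. The second-order system below is an assumed stand-in for an aeroelastic model, not the paper's configuration.

      import numpy as np
      from scipy.signal import lti, impulse

      # Stand-in "gust-to-load" dynamics (assumed lightly damped 2nd order).
      sys = lti([1.0], [1.0, 0.4, 4.0])
      t, h = impulse(sys, T=np.linspace(0.0, 30.0, 3000))

      dt = t[1] - t[0]
      energy = np.sum(h ** 2) * dt
      w = h[::-1] / np.sqrt(energy)      # matched (time-reversed) excitation

      # Convolving the matched excitation with h gives the maximized,
      # time-correlated load history; its peak equals the 2-norm of h.
      y = np.convolve(w, h)[: len(t)] * dt
      print("peak load:", y.max(), " ||h||_2:", np.sqrt(energy))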

  10. Detecting Spatio-Temporal Modes in Multivariate Data by Entropy Field Decomposition

    PubMed Central

    Frank, Lawrence R.; Galinsky, Vitaly L.

    2016-01-01

    A new data analysis method that addresses a general problem of detecting spatio-temporal variations in multivariate data is presented. The method utilizes two recent and complementary general approaches to data analysis, information field theory (IFT) and entropy spectrum pathways (ESP). Both methods reformulate and incorporate Bayesian theory, and thus use prior information to uncover the underlying structure of the unknown signal. Unification of ESP and IFT creates an approach that is non-Gaussian and non-linear by construction and is found to produce unique spatio-temporal modes of signal behavior that can be ranked according to their significance, from which space-time trajectories of parameter variations can be constructed and quantified. Two brief examples of real world applications of the theory, to the analysis of data sets of completely different, unrelated natures that lack any underlying similarity, are also presented. The first example provides an analysis of resting state functional magnetic resonance imaging (rsFMRI) data that allowed us to create an efficient and accurate computational method for assessing and categorizing brain activity. The second example demonstrates the potential of the method in the application to the analysis of a strong atmospheric storm circulation system during the complicated stage of tornado development and formation using data recorded by a mobile Doppler radar. A reference implementation of the method will be made available as part of the QUEST toolkit that is currently under development at the Center for Scientific Computation in Imaging. PMID:27695512

  11. Prior approval: the growth of Bayesian methods in psychology.

    PubMed

    Andrews, Mark; Baguley, Thom

    2013-02-01

    Within the last few years, Bayesian methods of data analysis in psychology have proliferated. In this paper, we briefly review the history of the Bayesian approach to statistics and consider the implications that Bayesian methods have for the theory and practice of data analysis in psychology.

  12. The Circuit Theory Behind Coupled-Mode Magnetic Resonance-Based Wireless Power Transmission.

    PubMed

    Kiani, Mehdi; Ghovanloo, Maysam

    2012-09-01

    Inductive coupling is a viable scheme to wirelessly energize devices with a wide range of power requirements from nanowatts in radio frequency identification tags to milliwatts in implantable microelectronic devices, watts in mobile electronics, and kilowatts in electric cars. Several analytical methods for estimating the power transfer efficiency (PTE) across inductive power transmission links have been devised based on circuit and electromagnetic theories by electrical engineers and physicists, respectively. However, a direct side-by-side comparison between these two approaches is lacking. Here, we have analyzed the PTE of a pair of capacitively loaded inductors via reflected load theory (RLT) and compared it with a method known as coupled-mode theory (CMT). We have also derived PTE equations for multiple capacitively loaded inductors based on both RLT and CMT. We have proven that both methods basically result in the same set of equations in steady state and either method can be applied for short- or midrange coupling conditions. We have verified the accuracy of both methods through measurements, and also analyzed the transient response of a pair of capacitively loaded inductors. Our analysis shows that the CMT is only applicable to coils with high quality factor (Q) and large coupling distance. It simplifies the analysis by reducing the order of the differential equations by half compared to the circuit theory.
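
    For the two-coil case, the RLT analysis yields a closed-form PTE in terms of the coupling coefficient and quality factors. The sketch below uses one commonly cited form of that result (an illustration under assumed coil parameters, not a transcription of the paper's equations):

      import numpy as np

      def pte_two_coil(k, Q1, Q2, QL):
          """PTE of a 2-coil link from reflected load theory (common form)."""
          Q2L = Q2 * QL / (Q2 + QL)      # loaded secondary quality factor
          x = k ** 2 * Q1 * Q2L          # figure of merit of the link
          return (x / (1.0 + x)) * (Q2L / QL)

      for k in (0.01, 0.05, 0.2):        # weak to strong coupling (assumed)
          print(k, round(pte_two_coil(k, Q1=200.0, Q2=100.0, QL=10.0), 3))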

  13. The Circuit Theory Behind Coupled-Mode Magnetic Resonance-Based Wireless Power Transmission

    PubMed Central

    Kiani, Mehdi; Ghovanloo, Maysam

    2014-01-01

    Inductive coupling is a viable scheme to wirelessly energize devices with a wide range of power requirements from nanowatts in radio frequency identification tags to milliwatts in implantable microelectronic devices, watts in mobile electronics, and kilowatts in electric cars. Several analytical methods for estimating the power transfer efficiency (PTE) across inductive power transmission links have been devised based on circuit and electromagnetic theories by electrical engineers and physicists, respectively. However, a direct side-by-side comparison between these two approaches is lacking. Here, we have analyzed the PTE of a pair of capacitively loaded inductors via reflected load theory (RLT) and compared it with a method known as coupled-mode theory (CMT). We have also derived PTE equations for multiple capacitively loaded inductors based on both RLT and CMT. We have proven that both methods basically result in the same set of equations in steady state and either method can be applied for short- or midrange coupling conditions. We have verified the accuracy of both methods through measurements, and also analyzed the transient response of a pair of capacitively loaded inductors. Our analysis shows that the CMT is only applicable to coils with high quality factor (Q) and large coupling distance. It simplifies the analysis by reducing the order of the differential equations by half compared to the circuit theory. PMID:24683368

  14. Statistical energy analysis computer program, user's guide

    NASA Technical Reports Server (NTRS)

    Trudell, R. W.; Yano, L. I.

    1981-01-01

    A high-frequency random vibration analysis method, the statistical energy analysis (SEA) method, is examined. The SEA method accomplishes high-frequency response prediction for arbitrary structural configurations. A general SEA computer program is described. A summary of SEA theory, example problems of SEA program application, and a complete program listing are presented.
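
    At its core, SEA solves a linear power balance for band-averaged subsystem energies. A two-subsystem sketch (loss factors, modal densities, and input power all assumed for illustration):

      import numpy as np

      # Power balance: P_i = omega * (eta_i*E_i + eta_ij*E_i - eta_ji*E_j).
      omega = 2.0 * np.pi * 1000.0       # band centre frequency, rad/s
      eta1, eta2 = 0.01, 0.02            # internal loss factors
      eta12 = 0.005                      # coupling loss factor, 1 -> 2
      n1, n2 = 50.0, 80.0                # modal densities
      eta21 = eta12 * n1 / n2            # SEA reciprocity relation

      A = omega * np.array([[eta1 + eta12, -eta21],
                            [-eta12, eta2 + eta21]])
      P = np.array([1.0, 0.0])           # 1 W injected into subsystem 1
      E = np.linalg.solve(A, P)          # band-averaged subsystem energies
      print("E1, E2:", E)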

  15. [The grounded theory as a methodological alternative for nursing research].

    PubMed

    dos Santos, Sérgio Ribeiro; da Nóbrega, Maria Miriam

    2002-01-01

    This study presents grounded theory, an interpretative and systematic research method applicable to the development of studies in nursing, whose theoretical support is symbolic interactionism. The purpose of the paper is to describe grounded theory as an alternative methodology for the construction of knowledge in nursing. The study highlights four topics: the basic principle, the basic concepts, the trajectory of the method, and the process of data analysis. We conclude that the systematization of data and its interpretation, grounded in the experience of social actors, constitute strong subsidies for generating theories through this research tool.

  16. A pedagogical derivation of the matrix element method in particle physics data analysis

    NASA Astrophysics Data System (ADS)

    Sumowidagdo, Suharyo

    2018-03-01

    The matrix element method provides a direct connection between the underlying theory of particle physics processes and detector-level physical observables. I present a pedagogically oriented derivation of the matrix element method, drawing on elementary concepts from probability theory, statistics, and the process of experimental measurement. The level of treatment should be suitable for a beginning research student in phenomenology or experimental high energy physics.
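
    In commonly used notation (assumed here rather than quoted from the paper), the per-event likelihood at the heart of the matrix element method weights each parton-level configuration y by the squared matrix element and a transfer function linking y to the detector-level observables x:

      \[
        P(x \mid \alpha) \;=\; \frac{1}{\sigma(\alpha)}
        \int \mathrm{d}\Phi(y)\, f(q_1)\, f(q_2)\,
        \bigl|\mathcal{M}_\alpha(y)\bigr|^{2}\, W(x \mid y),
        \qquad
        \mathcal{L}(\alpha) \;=\; \prod_{i=1}^{N} P(x_i \mid \alpha),
      \]

    where f(q_1) and f(q_2) are the parton distribution functions, dPhi(y) is the phase-space measure, and sigma(alpha) normalizes the likelihood; the theory parameters alpha are then estimated by maximizing L(alpha).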

  17. Backscatter analysis of dihedral corner reflectors using physical optics and the physical theory of diffraction

    NASA Technical Reports Server (NTRS)

    Griesser, Timothy; Balanis, Constantine A.

    1987-01-01

    The backscatter cross sections of dihedral corner reflectors in the azimuthal plane are determined by both physical optics (PO) and the physical theory of diffraction (PTD), yielding results for the vertical and horizontal polarizations. In the first analysis method, geometrical optics is used in place of PO at the initial reflections in order to maintain the planar character of the reflected wave and reduce the complexity of the analysis. In the second method, PO is used at almost every reflection in order to maximize the accuracy of the PTD solution, at the expense of a rapid increase in complexity. Induced surface current densities and the resulting cross-section patterns are illustrated for the two methods.

  18. Bayesian Analysis of Multidimensional Item Response Theory Models: A Discussion and Illustration of Three Response Style Models

    ERIC Educational Resources Information Center

    Leventhal, Brian C.; Stone, Clement A.

    2018-01-01

    Interest in Bayesian analysis of item response theory (IRT) models has grown tremendously due to the appeal of the paradigm among psychometricians, advantages of these methods when analyzing complex models, and availability of general-purpose software. Possible models include models which reflect multidimensionality due to designed test structure,…

  19. A Proposal for the Implementation of Programs for Culturally Diverse Students on a Predominantly White University Campus.

    ERIC Educational Resources Information Center

    Land, Elizabeth R.; Land, Warren A.

    An analysis was done of methods for dealing with cultural insensitivity found on predominantly white university campuses and of strategies for remedying the dissatisfaction of students from minority groups with their college experience. The analysis used Arthur Chickering's vectors of development theory and Alexander Astin's theory of student…

  20. The Analysis of Ratings Using Generalizability Theory for Student Outcome Assessment. AIR 1988 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Erwin, T. Dary

    Rating scales are a typical method for evaluating a student's performance in outcomes assessment. The analysis of the quality of information from rating scales poses special measurement problems when researchers work with faculty in their development. Generalizability measurement theory offers a set of techniques for estimating errors or…

  1. An Analysis of Measured Pressure Signatures From Two Theory-Validation Low-Boom Models

    NASA Technical Reports Server (NTRS)

    Mack, Robert J.

    2003-01-01

    Two wing/fuselage/nacelle/fin concepts were designed to check the validity and applicability of sonic-boom minimization theory, sonic-boom analysis methods, and the low-boom design methodology in use at the end of the 1980s. Models of these concepts were built, and the pressure signatures they generated were measured in the wind tunnel. The results of these measurements led to three conclusions: (1) the existing methods could adequately predict sonic-boom characteristics of wing/fuselage/fin(s) configurations if the equivalent area distributions of each component were smooth and continuous; (2) these methods needed revision so the engine-nacelle volume and the nacelle-wing interference lift disturbances could be accurately predicted; and (3) current nacelle-configuration integration methods had to be updated. With these changes in place, the existing sonic-boom analysis and minimization methods could be effectively applied to supersonic-cruise concepts for acceptable/tolerable sonic-boom overpressures during cruise.

  2. An Analysis of Class II Supplies Requisitions in the Korean Army’s Organizational Supply

    DTIC Science & Technology

    2009-03-26

    five methods for qualitative research: case study, ethnography, phenomenological study, grounded theory, and content analysis. Table 9 provides a brief overview of the five methods.

  3. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series:
      T. W. Anderson, The Statistical Analysis of Time Series
      T. S. Arthanari & Yadolah Dodge, Mathematical Programming in Statistics
      Emil Artin, Geometric Algebra
      Norman T. J. Bailey, The Elements of Stochastic Processes with Applications to the Natural Sciences
      Robert G. Bartle, The Elements of Integration and Lebesgue Measure
      George E. P. Box & Norman R. Draper, Evolutionary Operation: A Statistical Method for Process Improvement
      George E. P. Box & George C. Tiao, Bayesian Inference in Statistical Analysis
      R. W. Carter, Finite Groups of Lie Type: Conjugacy Classes and Complex Characters
      R. W. Carter, Simple Groups of Lie Type
      William G. Cochran & Gertrude M. Cox, Experimental Designs, Second Edition
      Richard Courant, Differential and Integral Calculus, Volume I
      Richard Courant, Differential and Integral Calculus, Volume II
      Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume I
      Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume II
      D. R. Cox, Planning of Experiments
      Harold S. M. Coxeter, Introduction to Geometry, Second Edition
      Charles W. Curtis & Irving Reiner, Representation Theory of Finite Groups and Associative Algebras
      Charles W. Curtis & Irving Reiner, Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I
      Charles W. Curtis & Irving Reiner, Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II
      Cuthbert Daniel, Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition
      Bruno de Finetti, Theory of Probability, Volume I
      Bruno de Finetti, Theory of Probability, Volume 2
      W. Edwards Deming, Sample Design in Business Research

  4. Development of a Higher Order Laminate Theory for Modeling Composites with Induced Strain Actuators

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Seeley, Charles E.

    1996-01-01

    A refined higher order plate theory is developed to investigate the actuation mechanism of piezoelectric materials surface bonded or embedded in composite laminates. The current analysis uses a displacement field which accurately accounts for transverse shear stresses. Some higher order terms are identified by using the conditions that shear stresses vanish at all free surfaces. Therefore, all boundary conditions for displacements and stresses are satisfied in the present theory. The analysis is implemented using the finite element method which provides a convenient means to construct a numerical solution due to the discrete nature of the actuators. The higher order theory is computationally less expensive than a full three dimensional analysis. The theory is also shown to agree well with published experimental results. Numerical examples are presented for composite plates with thicknesses ranging from thin to very thick.

  5. A tribute to John Gibbon.

    PubMed

    Church, Russell M.

    2002-04-28

    This article provides an overview of the published research of John Gibbon. It describes his experimental research on scalar timing and his development of scalar timing theory. It also describes his methods of research which included mathematical analysis, conditioning methods, psychophysical methods and secondary data analysis. Finally, it describes his application of scalar timing theory to avoidance and punishment, autoshaping, temporal perception and timed behavior, foraging, circadian rhythms, human timing, and the effect of drugs on timed perception and timed performance of Parkinson's patients. The research of Gibbon has shown the essential role of timing in perception, classical conditioning, instrumental learning, behavior in natural environments and in neuropsychology.

  6. A general numerical analysis of the superconducting quasiparticle mixer

    NASA Technical Reports Server (NTRS)

    Hicks, R. G.; Feldman, M. J.; Kerr, A. R.

    1985-01-01

    For very low noise millimeter-wave receivers, the superconductor-insulator-superconductor (SIS) quasiparticle mixer is now competitive with conventional Schottky mixers. Tucker (1979, 1980) has developed a quantum theory of mixing which has provided a basis for the rapid improvement in SIS mixer performance. The present paper is concerned with a general method of numerical analysis for SIS mixers which allows arbitrary terminating impedances for all the harmonic frequencies. This analysis provides an approach for an examination of the range of validity of the three-frequency results of the quantum mixer theory. The new method has been implemented with the aid of a Fortran computer program.

  7. Development and application of an analysis of axisymmetric body effects on helicopter rotor aerodynamics using modified slender body theory

    NASA Technical Reports Server (NTRS)

    Yamauchi, G.; Johnson, W.

    1984-01-01

    A computationally efficient body analysis designed to couple with a comprehensive helicopter analysis is developed in order to calculate the body-induced aerodynamic effects on rotor performance and loads. A modified slender body theory is used as the body model. With the objective of demonstrating the accuracy, efficiency, and application of the method, the analysis at this stage is restricted to axisymmetric bodies at zero angle of attack. By comparing with results from an exact analysis for simple body shapes, it is found that the modified slender body theory provides an accurate potential flow solution for moderately thick bodies, with only a 10%-20% increase in computational effort over that of an isolated rotor analysis. The computational ease of this method provides a means for routine assessment of body-induced effects on a rotor. Results are given for several configurations that typify those being used in the Ames 40- by 80-Foot Wind Tunnel and in the rotor-body aerodynamic interference tests being conducted at Ames. A rotor-hybrid airship configuration is also analyzed.

  8. Efficient High-Fidelity, Geometrically Exact, Multiphysics Structural Models

    DTIC Science & Technology

    2011-10-14

    functionally graded core. International Journal for Numerical Methods in Engineering, 68:940–966, 2006. F. Shang, Z. Wang, and Z. Li. Analysis of... normal deformable plate theory and MLPG method with radial basis functions. Composite Structures, 80:539–552, 2007. W. Zhen and W. Chen. A higher-order... functionally graded plates by using higher-order shear and normal deformable plate theory and MLPG method with radial basis functions. Composite Structures, 80

  9. Modelling and Analysis of the Excavation Phase by the Theory of Blocks Method of Tunnel 4 Kherrata Gorge, Algeria

    NASA Astrophysics Data System (ADS)

    Boukarm, Riadh; Houam, Abdelkader; Fredj, Mohammed; Boucif, Rima

    2017-12-01

    The aim of our work is to check stability during tunnel excavation in the rock mass of Kherrata, on the route connecting the cities of Bejaia and Setif. Characterization through the Q-system (Barton's method) and the RMR (Bieniawski classification) allowed us to conclude that the quality of the rock mass is average in limestone and poor in fractured limestone. Modelling of the excavation phase using the theory of blocks method (UNWEDGE software), with parameters drawn from the recommendations of the classifications, then allowed us to check stability and to conclude that the use of geomechanical classifications and the theory of blocks can be considered reliable in preliminary design.

  10. Improving the evaluation of therapeutic interventions in multiple sclerosis: the role of new psychometric methods.

    PubMed

    Hobart, J; Cano, S

    2009-02-01

    In this monograph we examine the added value of new psychometric methods (Rasch measurement and Item Response Theory) over traditional psychometric approaches by comparing and contrasting their psychometric evaluations of existing sets of rating scale data. We have concentrated on Rasch measurement rather than Item Response Theory because we believe that it is the more advantageous method for health measurement from a conceptual, theoretical and practical perspective. Our intention is to provide an authoritative document that describes the principles of Rasch measurement and the practice of Rasch analysis in a clear, detailed, non-technical form that is accurate and accessible to clinicians and researchers in health measurement. A comparison was undertaken of traditional and new psychometric methods in five large sets of rating scale data: (1) evaluation of the Rivermead Mobility Index (RMI) in data from 666 participants in the Cannabis in Multiple Sclerosis (CAMS) study; (2) evaluation of the Multiple Sclerosis Impact Scale (MSIS-29) in data from 1725 people with multiple sclerosis; (3) evaluation of test-retest reliability of MSIS-29 in data from 150 people with multiple sclerosis; (4) examination of the use of Rasch analysis to equate scales purporting to measure the same health construct in 585 people with multiple sclerosis; and (5) comparison of relative responsiveness of the Barthel Index and Functional Independence Measure in data from 1400 people undergoing neurorehabilitation. Both Rasch measurement and Item Response Theory are conceptually and theoretically superior to traditional psychometric methods. Findings from each of the five studies show that Rasch analysis is empirically superior to traditional psychometric methods for evaluating rating scales, developing rating scales, analysing rating scale data, understanding and measuring stability and change, and understanding the health constructs we seek to quantify. There is considerable added value in using Rasch analysis rather than traditional psychometric methods in health measurement. Future research directions include the need to reproduce our findings in a range of clinical populations, detailed head-to-head comparisons of Rasch analysis and Item Response Theory, and the application of Rasch analysis to clinical practice.
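
    The dichotomous Rasch model at the centre of these analyses is compact enough to sketch directly (the abilities and difficulties below are illustrative assumptions, not data from the monograph):

      import numpy as np

      def rasch_p(theta, b):
          """P(X = 1 | theta, b) = exp(theta - b) / (1 + exp(theta - b))."""
          return 1.0 / (1.0 + np.exp(-(theta - b)))

      b = np.array([-1.0, 0.0, 1.5])     # item difficulties (assumed)
      theta = 0.5                        # person ability (assumed)
      print(rasch_p(theta, b))           # easier items -> higher probability

      # Log-likelihood of one response pattern: the quantity a Rasch
      # analysis maximizes when estimating theta and b from scale data.
      x = np.array([1, 1, 0])
      p = rasch_p(theta, b)
      print(np.sum(x * np.log(p) + (1 - x) * np.log(1.0 - p)))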

  11. Scale-free crystallization of two-dimensional complex plasmas: Domain analysis using Minkowski tensors

    NASA Astrophysics Data System (ADS)

    Böbel, A.; Knapek, C. A.; Räth, C.

    2018-05-01

    Experiments on the recrystallization processes in two-dimensional complex plasmas are analyzed to rigorously test a recently developed scale-free phase transition theory. The "fractal-domain-structure" (FDS) theory is based on the kinetic theory of Frenkel. It assumes the formation of homogeneous domains, separated by defect lines, during crystallization and a fractal relationship between domain area and boundary length. For the defect number fraction and system energy a scale-free power-law relation is predicted. The long-range scaling behavior of the bond-order correlation function shows clearly that the complex plasma phase transitions are not of the Kosterlitz, Thouless, Halperin, Nelson, and Young type. Previous preliminary results obtained by counting the number of dislocations and applying a bond-order metric for structural analysis are reproduced. These findings are supplemented by extending the use of the bond-order metric to measure the defect number fraction and furthermore applying state-of-the-art analysis methods, allowing a systematic testing of the FDS theory with unprecedented scrutiny: A morphological analysis of lattice structure is performed via Minkowski tensor methods. Minkowski tensors form a complete family of additive, motion covariant and continuous morphological measures that are sensitive to nonlinear properties. The FDS theory is rigorously confirmed and predictions of the theory are reproduced extremely well. The predicted scale-free power-law relation between defect number fraction and system energy is verified for one more order of magnitude at high energies compared to the inherently discontinuous bond-order metric. It is found that the fractal relation between crystalline domain area and circumference is independent of the experiment, the particular Minkowski tensor method, and the particular choice of parameters. Thus, the fractal relationship seems to be inherent to two-dimensional phase transitions in complex plasmas. Minkowski tensor analysis turns out to be a powerful tool for investigations of crystallization processes. It is capable of revealing nonlinear local topological properties, while still providing easily interpretable results founded on a solid mathematical framework.
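
    The fractal area-boundary relation at the heart of the FDS theory can be tested with a simple log-log fit; the sketch below uses synthetic domain data (not the experimental measurements) to show the procedure:

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic domain boundary lengths and areas obeying A ~ L**1.7 with scatter.
        boundary = rng.uniform(10.0, 1000.0, size=200)
        area = boundary**1.7 * np.exp(rng.normal(0.0, 0.1, size=200))

        # The power law is a straight line in log-log coordinates: log A = k log L + c.
        k, c = np.polyfit(np.log(boundary), np.log(area), 1)
        print(f"estimated scaling exponent: {k:.3f}")   # close to 1.7 here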

  12. Study on the application of MRF and the D-S theory to image segmentation of the human brain and quantitative analysis of the brain tissue

    NASA Astrophysics Data System (ADS)

    Guan, Yihong; Luo, Yatao; Yang, Tao; Qiu, Lei; Li, Junchang

    2012-01-01

    The spatial information encoded in a Markov random field (MRF) model of the image is used in segmentation; it effectively removes noise and yields more accurate segmentation results. Based on the fuzziness and clustering of pixel grayscale information, the clustering centres of the different tissues and the background in a medical image are found with the fuzzy c-means clustering method. The threshold points for multi-threshold segmentation are then found with a two-dimensional histogram method, and the image is segmented accordingly. Finally, multivariate information is fused on the basis of Dempster-Shafer evidence theory, yielding the fused segmentation. This paper combines these three theories to propose a new human brain image segmentation method. Experimental results show that the segmentation is more consistent with human vision and is of real value for the accurate quantitative analysis of brain tissue.

  13. An Investigation of the Accuracy of Alternative Methods of True Score Estimation in High-Stakes Mixed-Format Examinations.

    ERIC Educational Resources Information Center

    Klinger, Don A.; Rogers, W. Todd

    2003-01-01

    The estimation accuracy of procedures based on classical test score theory and item response theory (generalized partial credit model) were compared for examinations consisting of multiple-choice and extended-response items. Analysis of British Columbia Scholarship Examination results found an error rate of about 10 percent for both methods, with…

  14. Integrating Cross-Case Analyses and Process Tracing in Set-Theoretic Research: Strategies and Parameters of Debate

    ERIC Educational Resources Information Center

    Beach, Derek; Rohlfing, Ingo

    2018-01-01

    In recent years, there has been increasing interest in the combination of two methods on the basis of set theory. In our introduction and this special issue, we focus on two variants of cross-case set-theoretic methods--"qualitative comparative analysis" (QCA) and typological theory (TT)--and their combination with process tracing (PT).…

  15. Implementation of Laminate Theory Into Strain Rate Dependent Micromechanics Analysis of Polymer Matrix Composites

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.

    2000-01-01

    A research program is in progress to develop strain rate dependent deformation and failure models for the analysis of polymer matrix composites subject to impact loads. Previously, strain rate dependent inelastic constitutive equations developed to model the polymer matrix were implemented into a mechanics of materials based micromechanics method. In the current work, the computation of the effective inelastic strain in the micromechanics model was modified to fully incorporate the Poisson effect. The micromechanics equations were also combined with classical laminate theory to enable the analysis of symmetric multilayered laminates subject to in-plane loading. A quasi-incremental trapezoidal integration method was implemented to integrate the constitutive equations within the laminate theory. Verification studies were conducted on an AS4/PEEK composite with a variety of laminate configurations and strain rates. The predicted results compared well with experimentally obtained values.
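
    For illustration, a minimal sketch of the classical-laminate-theory step described above, assembling the in-plane extensional stiffness matrix A for a hypothetical [0/90]s layup with AS4/PEEK-like ply constants (values assumed here, not taken from the paper):

        import numpy as np

        def q_matrix(E1, E2, G12, nu12):
            # Reduced (plane-stress) stiffness matrix of a unidirectional ply.
            nu21 = nu12 * E2 / E1
            d = 1.0 - nu12 * nu21
            return np.array([[E1 / d, nu12 * E2 / d, 0.0],
                             [nu12 * E2 / d, E2 / d, 0.0],
                             [0.0, 0.0, G12]])

        def q_bar(Q, theta_deg):
            # Rotate the ply stiffness into the laminate axes (Voigt notation).
            t = np.radians(theta_deg)
            c, s = np.cos(t), np.sin(t)
            T = np.array([[c * c, s * s, 2 * c * s],
                          [s * s, c * c, -2 * c * s],
                          [-c * s, c * s, c * c - s * s]])
            R = np.diag([1.0, 1.0, 2.0])   # engineering shear strain correction
            return np.linalg.inv(T) @ Q @ R @ T @ np.linalg.inv(R)

        # Hypothetical AS4/PEEK-like ply constants (Pa) and a [0/90]s layup.
        Q = q_matrix(E1=134e9, E2=8.9e9, G12=5.1e9, nu12=0.28)
        angles, t_ply = [0.0, 90.0, 90.0, 0.0], 0.125e-3   # degrees, metres

        # In-plane extensional stiffness: A = sum over plies of Qbar * thickness.
        A = sum(q_bar(Q, ang) * t_ply for ang in angles)
        print(np.round(A / 1e6, 1))   # MN/m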

  16. Incompressible boundary-layer stability analysis of LFC experimental data for sub-critical Mach numbers. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Berry, S. A.

    1986-01-01

    An incompressible boundary-layer stability analysis of Laminar Flow Control (LFC) experimental data was completed and the results are presented. This analysis was undertaken for three reasons: to study laminar boundary-layer stability on a modern swept LFC airfoil; to calculate incompressible design limits of linear stability theory as applied to a modern airfoil at high subsonic speeds; and to verify the use of linear stability theory as a design tool. The experimental data were taken from the slotted LFC experiment recently completed in the NASA Langley 8-Foot Transonic Pressure Tunnel. Linear stability theory was applied and the results were compared with transition data to arrive at correlated n-factors. Results of the analysis showed that for the configuration and cases studied, Tollmien-Schlichting (TS) amplification was the dominating disturbance influencing transition. For these cases, incompressible linear stability theory correlated with an n-factor for TS waves of approximately 10 at transition. The n-factor method correlated rather consistently to this value despite a number of non-ideal conditions, which indicates that the method is useful as a design tool for advanced laminar flow airfoils.
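
    The n-factor method lends itself to a compact sketch (with a synthetic amplification-rate distribution standing in for the computed TS growth rates): the local spatial amplification rate is integrated along the surface, and transition is correlated with the station where N reaches about 10.

        import numpy as np

        # Synthetic streamwise stations (m) and local TS amplification rates (1/m).
        x = np.linspace(0.0, 2.0, 201)
        growth = 8.0 * np.exp(-((x - 1.2) ** 2))   # hypothetical -alpha_i distribution

        # n-factor: running trapezoidal integral of the amplification rate.
        dN = 0.5 * (growth[1:] + growth[:-1]) * np.diff(x)
        N = np.concatenate(([0.0], np.cumsum(dN)))

        if N.max() >= 10.0:
            x_tr = x[np.argmax(N >= 10.0)]
            print(f"max N = {N.max():.1f}; N = 10 reached near x = {x_tr:.2f} m")
        else:
            print(f"max N = {N.max():.1f}; amplification never reaches N = 10")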

  17. A Bifactor Multidimensional Item Response Theory Model for Differential Item Functioning Analysis on Testlet-Based Items

    ERIC Educational Resources Information Center

    Fukuhara, Hirotaka; Kamata, Akihito

    2011-01-01

    A differential item functioning (DIF) detection method for testlet-based data was proposed and evaluated in this study. The proposed DIF model is an extension of a bifactor multidimensional item response theory (MIRT) model for testlets. Unlike traditional item response theory (IRT) DIF models, the proposed model takes testlet effects into…

  18. Interlaminar Stresses by Refined Beam Theories and the Sinc Method Based on Interpolation of Highest Derivative

    NASA Technical Reports Server (NTRS)

    Slemp, Wesley C. H.; Kapania, Rakesh K.; Tessler, Alexander

    2010-01-01

    Computation of interlaminar stresses from the higher-order shear and normal deformable beam theory and the refined zigzag theory was performed using the Sinc method based on Interpolation of Highest Derivative. The Sinc method based on Interpolation of Highest Derivative was proposed as an efficient method for determining through-the-thickness variations of interlaminar stresses from one- and two-dimensional analysis by integration of the equilibrium equations of three-dimensional elasticity. However, the use of traditional equivalent single layer theories often results in inaccuracies near the boundaries and when the lamina have extremely large differences in material properties. Interlaminar stresses in symmetric cross-ply laminated beams were obtained by solving the higher-order shear and normal deformable beam theory and the refined zigzag theory with the Sinc method based on Interpolation of Highest Derivative. Interlaminar stresses and bending stresses from the present approach were compared with a detailed finite element solution obtained by ABAQUS/Standard. The results illustrate the ease with which the Sinc method based on Interpolation of Highest Derivative can be used to obtain the through-the-thickness distributions of interlaminar stresses from the beam theories. Moreover, the results indicate that the refined zigzag theory is a substantial improvement over the Timoshenko beam theory due to the piecewise continuous displacement field which more accurately represents interlaminar discontinuities in the strain field. The higher-order shear and normal deformable beam theory more accurately captures the interlaminar stresses at the ends of the beam because it allows transverse normal strain. However, the continuous nature of the displacement field requires a large number of monomial terms before the interlaminar stresses are computed as accurately as the refined zigzag theory.

  19. A generalization of random matrix theory and its application to statistical physics.

    PubMed

    Wang, Duan; Zhang, Xin; Horvatic, Davor; Podobnik, Boris; Eugene Stanley, H

    2017-02-01

    To study the statistical structure of cross-correlations in empirical data, we generalize random matrix theory and propose a new method of cross-correlation analysis, known as autoregressive random matrix theory (ARRMT). ARRMT takes into account the influence of auto-correlations in the study of cross-correlations in multiple time series. We first analytically and numerically determine how auto-correlations affect the eigenvalue distribution of the correlation matrix. Then we introduce ARRMT with a detailed procedure of how to implement the method. Finally, we illustrate the method using two examples: inflation rates and air pressure data for 95 US cities.
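
    A toy version of the effect motivating ARRMT (synthetic data; the full procedure is in the paper): independent AR(1) series have no population cross-correlation, yet auto-correlation pushes the eigenvalues of their sample correlation matrix beyond the Marchenko-Pastur band that holds for white noise.

        import numpy as np

        rng = np.random.default_rng(1)
        n_series, n_obs, phi = 95, 2000, 0.6   # sizes and AR(1) coefficient (invented)

        # Independent AR(1) series: x_t = phi * x_{t-1} + white noise.
        x = np.zeros((n_series, n_obs))
        noise = rng.normal(size=(n_series, n_obs))
        for t in range(1, n_obs):
            x[:, t] = phi * x[:, t - 1] + noise[:, t]

        # Largest eigenvalue of the sample correlation matrix vs. the
        # Marchenko-Pastur upper edge for uncorrelated white-noise series.
        eigvals = np.linalg.eigvalsh(np.corrcoef(x))
        mp_upper = (1.0 + np.sqrt(n_series / n_obs)) ** 2
        print(f"largest eigenvalue {eigvals[-1]:.2f} vs MP edge {mp_upper:.2f}")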

  20. Error analysis and correction of discrete solutions from finite element codes

    NASA Technical Reports Server (NTRS)

    Thurston, G. A.; Stein, P. A.; Knight, N. F., Jr.; Reissner, J. E.

    1984-01-01

    Many structures are an assembly of individual shell components. Therefore, results for stresses and deflections from finite element solutions for each shell component should agree with the equations of shell theory. This paper examines the problem of applying shell theory to the error analysis and the correction of finite element results. The general approach to error analysis and correction is discussed first. Relaxation methods are suggested as one approach to correcting finite element results for all or parts of shell structures. Next, the problem of error analysis of plate structures is examined in more detail. The method of successive approximations is adapted to take discrete finite element solutions and to generate continuous approximate solutions for postbuckled plates. Preliminary numerical results are included.

  1. Damage Based Analysis (DBA) - Theory, Derivation and Practical Application Using Both an Acceleration and Pseudo Velocity Approach

    NASA Technical Reports Server (NTRS)

    Grillo, Vince

    2017-01-01

    The objective of this presentation is to give a brief overview of the theory behind the Damage Based Analysis (DBA) method, an overview of its derivation, and a practical application of the theory using the Python computer language. The theory and derivation use both acceleration and pseudo-velocity approaches to derive a series of equations for processing in Python. We compare the results of the acceleration and pseudo-velocity methods and discuss the implementation of the Python functions. We also discuss the efficiency of the methods and the amount of computer time required for the solution. In conclusion, DBA offers a powerful method to evaluate the amount of energy imparted into a system, in the form of both amplitude and duration, during qualification testing and flight environments. Many forms of steady-state and transient vibratory motion can be characterized using this technique. DBA provides a more robust alternative to traditional methods such as Power Spectral Density (PSD) with a maximax approach.
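
    As a hedged sketch of the kind of computation involved (not Grillo's actual code; the shock input and damping values are invented), a pseudo-velocity shock response spectrum can be formed by driving a single-degree-of-freedom oscillator at each natural frequency with the base acceleration and scaling the peak relative displacement by the natural frequency:

        import numpy as np

        def pseudo_velocity_srs(accel, dt, freqs_hz, zeta=0.05):
            # Pseudo-velocity SRS: omega_n * max|z| of a base-excited SDOF system,
            # z'' + 2*zeta*wn*z' + wn^2*z = -a(t), via semi-implicit Euler stepping.
            out = []
            for fn in freqs_hz:
                wn = 2.0 * np.pi * fn
                z, zdot, zmax = 0.0, 0.0, 0.0
                for a in accel:
                    zddot = -a - 2.0 * zeta * wn * zdot - wn * wn * z
                    zdot += zddot * dt
                    z += zdot * dt
                    zmax = max(zmax, abs(z))
                out.append(wn * zmax)
            return np.array(out)

        # Hypothetical half-sine shock: 50 g peak, 11 ms duration, 10 kHz sampling.
        dt = 1.0e-4
        t = np.arange(0.0, 0.1, dt)
        accel = np.where(t < 0.011, 50 * 9.81 * np.sin(np.pi * t / 0.011), 0.0)

        pv = pseudo_velocity_srs(accel, dt, np.logspace(1, 3, 30))
        print(pv.round(3)[:5])   # pseudo-velocity (m/s) at the lowest frequencies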

  2. Damage Based Analysis (DBA): Theory, Derivation and Practical Application - Using Both an Acceleration and Pseudo-Velocity Approach

    NASA Technical Reports Server (NTRS)

    Grillo, Vince

    2016-01-01

    The objective of this presentation is to give a brief overview of the theory behind the Damage Based Analysis (DBA) method, an overview of its derivation, and a practical application of the theory using the Python computer language. The theory and derivation use both acceleration and pseudo-velocity approaches to derive a series of equations for processing in Python. We compare the results of the acceleration and pseudo-velocity methods and discuss the implementation of the Python functions. We also discuss the efficiency of the methods and the amount of computer time required for the solution. In conclusion, DBA offers a powerful method to evaluate the amount of energy imparted into a system, in the form of both amplitude and duration, during qualification testing and flight environments. Many forms of steady-state and transient vibratory motion can be characterized using this technique. DBA provides a more robust alternative to traditional methods such as Power Spectral Density (PSD) with a maximax approach.

  3. Interpretive focus groups: a participatory method for interpreting and extending secondary analysis of qualitative data.

    PubMed

    Redman-MacLaren, Michelle; Mills, Jane; Tommbe, Rachael

    2014-01-01

    Participatory approaches to qualitative research practice constantly change in response to evolving research environments. Researchers are increasingly encouraged to undertake secondary analysis of qualitative data, despite epistemological and ethical challenges. Interpretive focus groups can be described as a more participative method for groups to analyse qualitative data. To facilitate interpretive focus groups with women in Papua New Guinea to extend analysis of existing qualitative data and co-create new primary data. The purpose of this was to inform a transformational grounded theory and subsequent health promoting action. A two-step approach was used in a grounded theory study about how women experience male circumcision in Papua New Guinea. Participants analysed portions or 'chunks' of existing qualitative data in story circles and built upon this analysis by using the visual research method of storyboarding. New understandings of the data were evoked when women in interpretive focus groups analysed the data 'chunks'. Interpretive focus groups encouraged women to share their personal experiences about male circumcision. The visual method of storyboarding enabled women to draw pictures to represent their experiences. This provided an additional focus for whole-of-group discussions about the research topic. Interpretive focus groups offer opportunity to enhance trustworthiness of findings when researchers undertake secondary analysis of qualitative data. The co-analysis of existing data and co-generation of new data between research participants and researchers informed an emergent transformational grounded theory and subsequent health promoting action.

  4. Interpretive focus groups: a participatory method for interpreting and extending secondary analysis of qualitative data

    PubMed Central

    Redman-MacLaren, Michelle; Mills, Jane; Tommbe, Rachael

    2014-01-01

    Background Participatory approaches to qualitative research practice constantly change in response to evolving research environments. Researchers are increasingly encouraged to undertake secondary analysis of qualitative data, despite epistemological and ethical challenges. Interpretive focus groups can be described as a more participative method for groups to analyse qualitative data. Objective To facilitate interpretive focus groups with women in Papua New Guinea to extend analysis of existing qualitative data and co-create new primary data. The purpose of this was to inform a transformational grounded theory and subsequent health promoting action. Design A two-step approach was used in a grounded theory study about how women experience male circumcision in Papua New Guinea. Participants analysed portions or ‘chunks’ of existing qualitative data in story circles and built upon this analysis by using the visual research method of storyboarding. Results New understandings of the data were evoked when women in interpretive focus groups analysed the data ‘chunks’. Interpretive focus groups encouraged women to share their personal experiences about male circumcision. The visual method of storyboarding enabled women to draw pictures to represent their experiences. This provided an additional focus for whole-of-group discussions about the research topic. Conclusions Interpretive focus groups offer opportunity to enhance trustworthiness of findings when researchers undertake secondary analysis of qualitative data. The co-analysis of existing data and co-generation of new data between research participants and researchers informed an emergent transformational grounded theory and subsequent health promoting action. PMID:25138532

  5. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, T. H. H.

    1985-01-01

    Advanced stress analysis methods applicable to turbine engine structures are investigated. The construction of special elements which contain traction-free circular boundaries is investigated. New versions of the mixed variational principle and of hybrid stress elements are formulated. A method is established for the suppression of kinematic deformation modes. SemiLoof plate and shell elements are constructed by the assumed stress hybrid method. An elastic-plastic analysis is conducted by viscoplasticity theory using the mechanical subelement model.

  6. Analysis of Ward identities in supersymmetric Yang-Mills theory

    NASA Astrophysics Data System (ADS)

    Ali, Sajid; Bergner, Georg; Gerber, Henning; Montvay, Istvan; Münster, Gernot; Piemonte, Stefano; Scior, Philipp

    2018-05-01

    In numerical investigations of supersymmetric Yang-Mills theory on a lattice, the supersymmetric Ward identities are valuable for finding the critical value of the hopping parameter and for examining the size of supersymmetry breaking by the lattice discretisation. In this article we present an improved method for the numerical analysis of supersymmetric Ward identities, which takes into account the correlations between the various observables involved. We present the first complete analysis of supersymmetric Ward identities in N=1 supersymmetric Yang-Mills theory with gauge group SU(3). The results indicate that lattice artefacts scale to zero as O(a^2) towards the continuum limit in agreement with theoretical expectations.
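
    The quoted O(a^2) scaling can be checked with an elementary fit; the sketch below uses synthetic values standing in for measured Ward-identity violations at several lattice spacings:

        import numpy as np

        rng = np.random.default_rng(4)

        # Hypothetical Ward-identity violation at several lattice spacings (fm).
        a = np.array([0.10, 0.08, 0.06, 0.05])
        delta = 0.9 * a**2 + rng.normal(0.0, 0.0004, a.size)

        # If artefacts are O(a^2), a linear fit in a^2 extrapolates to zero at a = 0.
        c2, c0 = np.polyfit(a**2, delta, 1)
        print(f"continuum value c0 = {c0:.5f} (consistent with 0), slope c2 = {c2:.2f}")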

  7. Self construction in schizophrenia: a discourse analysis.

    PubMed

    Meehan, Trudy; MacLachlan, Malcolm

    2008-06-01

    Lysaker and Lysaker (Theory and Psychology, 12(2), 207-220, 2002) employ a dialogical theory of self in their writings on self disruption in schizophrenia. It is argued here that this theory could be enriched by incorporating a discursive and social constructionist model of self. Harré's model enables researchers to use subject positions to identify self construction in people with a diagnosis of schizophrenia that the dialogical model, using analysis of narrative, does not as easily recognize. The paper presents a discourse analysis of self construction in eight participants with a diagnosis of schizophrenia. Transcripts from semi-structured interviews are analysed, with the focus on how participants construct self in talk through the use of subject positioning. The findings indicate that Harré's theory of self and the implied method of discourse analysis enable more subtle and nuanced constructions of self to be identified than those highlighted by Lysaker and Lysaker (Theory and Psychology, 12(2), 207-220, 2002). The analysis of subject positions revealed that participants constructed self in the form of Harré's (The singular self: An introduction to the psychology of personhood, 1998, London: Sage) self1, self2, and self3. The findings suggest that there may be constructions of self used by people diagnosed with schizophrenia that are not recognized by current research methods focusing on narrative. The paper argues for the recognition of these constructions and, by implication, for a model of self that takes into account different levels of visibility of self construction in talk.

  8. Responding mindfully to distressing psychosis: A grounded theory analysis.

    PubMed

    Abba, Nicola; Chadwick, Paul; Stevenson, Chris

    2008-01-01

    This study investigates the psychological process involved when people with current distressing psychosis learned to respond mindfully to unpleasant psychotic sensations (voices, thoughts, and images). Sixteen participants were interviewed on completion of a mindfulness group program. Grounded theory methodology was used to generate a theory of the core psychological process, using a systematically applied set of methods linking analysis with data collection. The induced theory describes the experience of relating differently to psychosis through a three-stage process: centering in awareness of psychosis; allowing voices, thoughts, and images to come and go without reacting or struggling; and reclaiming power through acceptance of psychosis and the self. The conceptual and clinical applications of the theory and its limits are discussed.

  9. Using Grounded Theory Method to Capture and Analyze Health Care Experiences

    PubMed Central

    Foley, Geraldine; Timonen, Virpi

    2015-01-01

    Objective Grounded theory (GT) is an established qualitative research method, but few papers have encapsulated the benefits, limits, and basic tenets of doing GT research on user and provider experiences of health care services. GT can be used to guide the entire study method, or it can be applied at the data analysis stage only. Methods We summarize key components of GT and common GT procedures used by qualitative researchers in health care research. We draw on our experience of conducting a GT study on amyotrophic lateral sclerosis patients’ experiences of health care services. Findings We discuss why some approaches in GT research may work better than others, particularly when the focus of study is hard-to-reach population groups. We highlight the flexibility of procedures in GT to build theory about how people engage with health care services. Conclusion GT enables researchers to capture and understand health care experiences. GT methods are particularly valuable when the topic of interest has not previously been studied. GT can be applied to bring structure and rigor to the analysis of qualitative data. PMID:25523315

  10. Verticality and containment in song and improvisation: an application of schema theory to Nordoff-Robbins music therapy.

    PubMed

    Aigen, Kenneth

    2009-01-01

    This study illustrates the use of a new musicological method for analyzing music in music therapy. It examines two pieces of clinical music through the constructs of schema theory. It begins with an argument for enhanced musical analysis in music therapy as a means of elevating the status of explanation in the discipline. Schema theory is introduced as a means of integrating musical with clinical concerns. Some basic ideas in schema theory are explained, and the schemas of VERTICALITY and CONTAINER are presented as central to the analysis of music. Two transcriptions, one of a composed song and one of an improvisation, are examined in detail to illustrate how decisions in the temporal, melodic, and harmonic dimensions of the music are linked to specific clinical goals. The article concludes with a discussion of the implications of this type of musicological analysis for explanatory theory in music therapy.

  11. Finite element stress, vibration, and buckling analysis of laminated beams with the use of refined elements

    NASA Astrophysics Data System (ADS)

    Borovkov, Alexei I.; Avdeev, Ilya V.; Artemyev, A.

    1999-05-01

    In the present work, stress, vibration and buckling finite element analyses of laminated beams are performed. A review of the equivalent single-layer (ESL) laminate theories is given. Finite element algorithms and procedures, integrated into an original FEA program system and based on the classical laminated plate theory (CLPT), first-order shear deformation theory (FSDT), third-order theory of Reddy (TSDT-R) and third-order theory of Kant (TSDT-K), with the Lanczos method used for solving the eigenproblem, are developed. Several numerical tests and examples of bending, free vibration and buckling of multilayered and sandwich beams with various material and geometric properties and boundary conditions are solved. A new, effective higher-order hierarchical element for the accurate calculation of transverse shear stress is proposed. A comparative analysis of the results obtained by the considered models against solutions of 2D problems of heterogeneous anisotropic elasticity is performed.

  12. Lattice Methods and the Nuclear Few- and Many-Body Problem

    NASA Astrophysics Data System (ADS)

    Lee, Dean

    This chapter builds upon the review of lattice methods and effective field theory of the previous chapter. We begin with a brief overview of lattice calculations using chiral effective field theory and some recent applications. We then describe several methods for computing scattering on the lattice. After that we focus on the main goal, explaining the theory and algorithms relevant to lattice simulations of nuclear few- and many-body systems. We discuss the exact equivalence of four different lattice formalisms, the Grassmann path integral, transfer matrix operator, Grassmann path integral with auxiliary fields, and transfer matrix operator with auxiliary fields. Along with our analysis we include several coding examples and a number of exercises for the calculations of few- and many-body systems at leading order in chiral effective field theory.

  13. Review of LMIs, Interior Point Methods, Complexity Theory, and Robustness Analysis

    NASA Technical Reports Server (NTRS)

    Mesbahi, M.

    1996-01-01

    From end of intro: ...We would like to show that for certain problems in systems and control theory, there exist algorithms for which corresponding (xi) can be viewed as a certain measure of robustness, e.g., stability margin.

  14. Twenty-First Century Educational Theory and the Challenges of Modern Education: Appealing to the Heritage of the General Teaching Theory of the Secondary Educational Curriculum and the Learning Process

    ERIC Educational Resources Information Center

    Klarin, Mikhail V.

    2016-01-01

    The article presents an analysis of educational theory in light of the challenges confronting education in the twenty-first century. The author examines how our ideas about the methods for managing the transmission of culture, the subject of education, and the consequences of these changes for the theory of education have changed. The author…

  15. A study of the river velocity measurement techniques and analysis methods

    NASA Astrophysics Data System (ADS)

    Chung Yang, Han; Lun Chiang, Jie

    2013-04-01

    Velocity measurement technology can be traced back to the pitot-tube method of the 18th century; today's techniques rely on acoustics and radar, technologies that grew out of the Doppler principle, with the aim of devising measurement methods better suited to each flow condition and of obtaining more accurate data, from which the mean velocity is derived using surface velocity theory, maximum velocity theory, or the indicator theory. The main purpose of this article is to review the literature on river velocity measurement techniques and analysis methods, to explore the applicability of each velocity measurement instrument, and to describe the advantages and disadvantages of the different methods for estimating the mean velocity profile. An adequate review of these references provides a reference point for follow-up studies of velocity measurement. The review shows that different flow conditions require different measurement methods if accuracy is to be maintained, because each method has its own strengths and weaknesses. Traditional current meters can be used at low flows, while RiverRAD microwave radar or imaging techniques may be applied at high flows. In tidal rivers, an ADCP can quickly measure the vertical velocity distribution. In urban rivers, CW radar can be mounted on bridges, and in wide rivers RiverRAD microwave radar can be used to measure velocities. The literature also shows that using an Ultrasonic Doppler Current Profiler together with Chiu's theory to automate velocity observation can save manpower and resources, improve measurement accuracy, and reduce measurement risk; however, the great variability of river characteristics in Taiwan and the large amount of floating debris carried at high flows mean that automated measurement still needs further study. If the safety of personnel and instruments is the priority, non-contact velocity measurement combined with theoretical analysis can achieve real-time monitoring.
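
    As a small illustration of how a non-contact surface-velocity measurement is reduced to a mean velocity (the index coefficient below is a commonly assumed textbook value, not one from this paper; in practice it is calibrated per site):

        # Surface velocities (m/s) from a non-contact radar across a cross-section.
        surface_velocity = [1.8, 2.3, 2.6, 2.4, 1.9]

        # Commonly assumed surface-to-mean velocity index for natural channels;
        # illustrative only -- real studies calibrate this coefficient per site.
        K = 0.85

        mean_velocity = [K * v for v in surface_velocity]
        print([round(v, 2) for v in mean_velocity])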

  16. Social Cognitive and Planned Behavior Variables Associated with Stages of Change for Physical Activity in Spinal Cord Injury: A Multivariate Analysis

    ERIC Educational Resources Information Center

    Keegan, John; Ditchman, Nicole; Dutta, Alo; Chiu, Chung-Yi; Muller, Veronica; Chan, Fong; Kundu, Madan

    2016-01-01

    Purpose: To apply the constructs of social cognitive theory (SCT) and the theory of planned behavior (TPB) to understand the stages of change (SOC) for physical activities among individuals with a spinal cord injury (SCI). Method: Ex post facto design using multivariate analysis of variance (MANOVA). The participants were 144 individuals with SCI…

  17. An Analysis of Cross Racial Identity Scale Scores Using Classical Test Theory and Rasch Item Response Models

    ERIC Educational Resources Information Center

    Sussman, Joshua; Beaujean, A. Alexander; Worrell, Frank C.; Watson, Stevie

    2013-01-01

    Item response models (IRMs) were used to analyze Cross Racial Identity Scale (CRIS) scores. Rasch analysis scores were compared with classical test theory (CTT) scores. The partial credit model demonstrated a high goodness of fit and correlations between Rasch and CTT scores ranged from 0.91 to 0.99. CRIS scores are supported by both methods.…

  18. Quality of Malaysian Teachers Based on Education and Training: A Benefit and Earnings Returns Analysis Using Human Capital Theory

    ERIC Educational Resources Information Center

    Ismail, Ramlee; Awang, Marinah

    2017-01-01

    Purpose: The purpose of this paper is to investigate how the quality of teachers based on education and training provided under new reform policies in Malaysia affects their earnings outcomes. The study conducted a benefit and returns analysis guided by human capital theory. Design/methodology/approach: The study used survey research methods to…
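
    The standard earnings-returns tool in human capital theory is a Mincer-type regression; the sketch below fits one on synthetic teacher data (all numbers invented) to show the form of such an analysis:

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic teacher sample: years of schooling/training and experience.
        n = 500
        schooling = rng.integers(14, 20, size=n).astype(float)
        experience = rng.uniform(0.0, 30.0, size=n)
        log_wage = (1.5 + 0.08 * schooling + 0.04 * experience
                    - 0.0007 * experience**2 + rng.normal(0.0, 0.2, n))

        # Mincer regression: log(wage) ~ schooling + experience + experience^2.
        X = np.column_stack([np.ones(n), schooling, experience, experience**2])
        beta, *_ = np.linalg.lstsq(X, log_wage, rcond=None)
        print(f"estimated return to a year of education: {beta[1] * 100:.1f}%")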

  19. A system for aerodynamic design and analysis of supersonic aircraft. Part 4: Test cases

    NASA Technical Reports Server (NTRS)

    Middleton, W. D.; Lundry, J. L.

    1980-01-01

    An integrated system of computer programs was developed for the design and analysis of supersonic configurations. The system uses linearized theory methods for the calculation of surface pressures and supersonic area rule concepts in combination with linearized theory for calculation of aerodynamic force coefficients. Interactive graphics are optional at the user's request. Representative test cases and associated program output are presented.

  20. Analysis and testing of a new method for drop size measurement using laser scatter interferometry

    NASA Technical Reports Server (NTRS)

    Bachalo, W. D.; Houser, M. J.

    1984-01-01

    Research was conducted on a laser light scatter detection method for measuring the size and velocity of spherical particles. The method is based upon the measurement of the interference fringe pattern produced by spheres passing through the intersection of two laser beams. A theoretical analysis of the method was carried out using the geometrical optics theory. Experimental verification of the theory was obtained by using monodisperse droplet streams. Several optical configurations were tested to identify all of the parametric effects upon the size measurements. Both off-axis forward and backscatter light detection were utilized. Simulated spray environments and fuel spray nozzles were used in the evaluation of the method. The measurements of the monodisperse drops showed complete agreement with the theoretical predictions. The method was demonstrated to be independent of the beam intensity and extinction resulting from the surrounding drops. Signal processing concepts were considered and a method was selected for development.

  1. Efficiently Assessing Negative Cognition in Depression: An Item Response Theory Analysis of the Dysfunctional Attitude Scale

    ERIC Educational Resources Information Center

    Beevers, Christopher G.; Strong, David R.; Meyer, Bjorn; Pilkonis, Paul A.; Miller, Ivan R.

    2007-01-01

    Despite a central role for dysfunctional attitudes in cognitive theories of depression and the widespread use of the Dysfunctional Attitude Scale, form A (DAS-A; A. Weissman, 1979), the psychometric development of the DAS-A has been relatively limited. The authors used nonparametric item response theory methods to examine the DAS-A items and…

  2. Which theory for the origin of syphilis is true?

    PubMed

    Anteric, Ivana; Basic, Zeljana; Vilovic, Katarina; Kolic, Kresimir; Andjelinovic, Simun

    2014-12-01

    There are four theories about the origin of syphilis, of which the most widely supported is the Columbian theory. This theory suggests that syphilis was brought into Europe in 1493 AD by ship from the Caribbean islands. The aim of this study is to test all the theories on a sample of 403 skeletons: 135 from prehistory, 134 from the antique period, and 134 from the medieval period and new age, all from Dalmatia (Croatia). All skeletons were examined using standard anthropological methods. Paleopathological analysis was performed on each skeleton, with an additional radiographic examination of one isolated skeleton, looking for skeletal changes associated with treponematosis. Paleopathological analysis revealed one skeleton from the antique period (second to sixth century AD) that exhibited skeletal markers similar to those described in a clinical case in which congenital syphilis was confirmed by a Wassermann reaction. The skeletal remains of this person were examined macroscopically and radiographically, and differential diagnostics eliminated the other pathologies considered, as well as trauma. The finding of skeletal markers of syphilis on a skeleton from the antique period supports the theory of a pre-Columbian origin of syphilis. © 2014 International Society for Sexual Medicine.

  3. [Photography as analysis document, body and medicine: theory, method and criticism--the experience of Museo Nacional de Medicina Enrique Laval].

    PubMed

    Robinson, César Leyton; Caballero, Andrés Díaz

    2007-01-01

    This article is an experimental methodological reflection on the use of medical images as documents for constructing the history of medicine. The method is based on guidelines, or analysis topics, that encompass different ways of viewing the documents, from aesthetic, technical, social and political theories to historical and medical thinking. Some exercises are also included that develop the proposal for the reader: rediscovering the worlds in society that harbor these medical photographic archives, so as to obtain a new theoretical approach to the construction of the history of medical science.

  4. The Five Star Method: A Relational Dream Work Methodology

    ERIC Educational Resources Information Center

    Sparrow, Gregory Scott; Thurston, Mark

    2010-01-01

    This article presents a systematic method of dream work called the Five Star Method. Based on cocreative dream theory, which views the dream as the product of the interaction between dreamer and dream, this creative intervention shifts the principal focus in dream analysis from the interpretation of static imagery to the analysis of the dreamer's…

  5. An evaluation of a coupled microstructural approach for the analysis of functionally graded composites via the finite-element method

    NASA Technical Reports Server (NTRS)

    Pindera, Marek-Jerzy; Dunn, Patrick

    1995-01-01

    A comparison is presented between the predictions of finite-element analysis and a recently developed higher-order theory for functionally graded materials subjected to a through-thickness temperature gradient. In contrast to existing micromechanical theories that utilize classical (i.e., uncoupled) homogenization schemes to calculate micro-level and macro-level stress and displacement fields in materials with uniform or nonuniform fiber spacing (i.e., functionally graded materials), the new theory explicitly couples the microstructural details with the macrostructure of the composite. Previous thermo-elastic analysis has demonstrated that such coupling is necessary when: the temperature gradient is large with respect to the dimension of the reinforcement; the characteristic dimension of the reinforcement is large relative to the global dimensions of the composite; and the number of reinforcing fibers or inclusions is small. In these circumstances, the standard micromechanical analyses based on the concept of the representative volume element used to determine average composite properties produce questionable results. The comparison between the predictions of the finite-element method and the higher-order theory presented herein establishes the theory's accuracy in predicting thermal and stress fields within composites with a finite number of fibers in the thickness direction subjected to a through-thickness thermal gradient.

  6. Rolling bearing fault diagnosis based on information fusion using Dempster-Shafer evidence theory

    NASA Astrophysics Data System (ADS)

    Pei, Di; Yue, Jianhai; Jiao, Jing

    2017-10-01

    This paper presents a fault diagnosis method for rolling bearings based on information fusion. Acceleration sensors are arranged at different positions to acquire bearing vibration data as diagnostic evidence. Dempster-Shafer (D-S) evidence theory is used to fuse the multi-sensor data and improve diagnostic accuracy. The efficiency of the proposed method is demonstrated on a high-speed train transmission test bench. The experimental results show that the proposed method improves rolling bearing fault diagnosis accuracy compared with traditional signal analysis methods.
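
    A minimal sketch of the Dempster-Shafer combination step described above (frame and mass values invented, not the paper's data): two sensors assign basic probability masses over {normal, fault}, and Dempster's rule multiplies the masses and renormalizes, discarding the conflicting portion.

        from itertools import product

        def dempster_combine(m1, m2):
            # Dempster's rule over mass functions keyed by frozenset focal elements.
            combined, conflict = {}, 0.0
            for (a, wa), (b, wb) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + wa * wb
                else:
                    conflict += wa * wb
            return {s: w / (1.0 - conflict) for s, w in combined.items()}

        NORMAL, FAULT = frozenset({"normal"}), frozenset({"fault"})
        FRAME = NORMAL | FAULT   # mass on the whole frame expresses ignorance

        # Hypothetical evidence from two vibration sensors.
        sensor1 = {FAULT: 0.6, NORMAL: 0.1, FRAME: 0.3}
        sensor2 = {FAULT: 0.7, NORMAL: 0.2, FRAME: 0.1}

        print(dempster_combine(sensor1, sensor2))   # mass concentrates on "fault"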

  7. Probabilistic structural analysis methods for space propulsion system components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  8. Staying theoretically sensitive when conducting grounded theory research.

    PubMed

    Reay, Gudrun; Bouchal, Shelley Raffin; A Rankin, James

    2016-09-01

    Background Grounded theory (GT) is founded on the premise that underlying social patterns can be discovered and conceptualised into theories. The method and need for theoretical sensitivity are best understood in the historical context in which GT was developed. Theoretical sensitivity entails entering the field with no preconceptions, so as to remain open to the data and the emerging theory. Investigators also read literature from other fields to understand various ways to construct theories. Aim To explore the concept of theoretical sensitivity from a classical GT perspective, and discuss the ontological and epistemological foundations of GT. Discussion Difficulties in remaining theoretically sensitive throughout research are discussed and illustrated with examples. Emergence - the idea that theory and substance will emerge from the process of comparing data - and staying open to the data are emphasised. Conclusion Understanding theoretical sensitivity as an underlying guiding principle of GT helps the researcher make sense of important concepts, such as delaying the literature review, emergence and the constant comparative method (simultaneous collection, coding and analysis of data). Implications for practice Theoretical sensitivity and adherence to the GT research method allow researchers to discover theories that can bridge the gap between theory and practice.

  9. Multi-Level Discourse Analysis in a Physics Teaching Methods Course from the Psychological Perspective of Activity Theory

    ERIC Educational Resources Information Center

    Vieira, Rodrigo Drumond; Kelly, Gregory J.

    2014-01-01

    In this paper, we present and apply a multi-level method for discourse analysis in science classrooms. This method is based on the structure of human activity (activity, actions, and operations) and it was applied to study a pre-service physics teacher methods course. We argue that such an approach, based on a cultural psychological perspective,…

  10. Synonymie und Interlinguistik (Synonymy and Interlinguistics).

    ERIC Educational Resources Information Center

    Szabo, Rita Brdar; Brdar, Mario

    1993-01-01

    Discusses the relationship between traditional synonym theory and two perspectives of interlinguistics: contrastive lexical analysis and languages in contact research. The goal and methods of each are described briefly, and a new synonym conceptualization is proposed that better fits synchronic dynamics than the traditional theory. Examples from…

  11. Reflectance spectroscopy: quantitative analysis techniques for remote sensing applications.

    USGS Publications Warehouse

    Clark, R.N.; Roush, T.L.

    1984-01-01

    Several methods for the analysis of remotely sensed reflectance data are compared, including empirical methods and scattering theories, both of which are important for solving remote sensing problems. The concept of the photon mean path length and the implications for use in modeling reflectance spectra are presented.-from Authors

  12. On the Spectrum of Periodic Signals

    ERIC Educational Resources Information Center

    Al-Smadi, Adnan

    2004-01-01

    In theory, there are many methods for the representation of signals. In practice, however, Fourier analysis involving the resolution of signals into sinusoidal components is used widely. There are several methods for Fourier analysis available for representation of signals. If the signal is periodic, then the Fourier series is used to represent…
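
    As a small worked example of the Fourier-series representation discussed here (the signal is chosen purely for illustration), the coefficients of a periodic signal can be approximated directly from one period of samples:

        import numpy as np

        def fourier_coefficients(x, n_harmonics):
            # Approximate a_k, b_k from one period of uniformly spaced samples.
            N = len(x)
            t = np.arange(N) / N   # one period mapped onto [0, 1)
            a0 = x.mean()
            a = [2.0 * np.mean(x * np.cos(2 * np.pi * k * t))
                 for k in range(1, n_harmonics + 1)]
            b = [2.0 * np.mean(x * np.sin(2 * np.pi * k * t))
                 for k in range(1, n_harmonics + 1)]
            return a0, np.array(a), np.array(b)

        # Square wave: its series contains only odd sine harmonics, b_k = 4/(k*pi).
        N = 1000
        x = np.sign(np.sin(2 * np.pi * np.arange(N) / N))
        a0, a, b = fourier_coefficients(x, 5)
        print(np.round(b, 3))   # approx [1.273, 0, 0.424, 0, 0.255]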

  13. Network Data: Statistical Theory and New Models

    DTIC Science & Technology

    2016-02-17

    During this period of review, Bin Yu worked on many thrusts of high-dimensional statistical theory and methodologies. Her...research covered a wide range of topics in statistics including analysis and methods for spectral clustering for sparse and structured networks...[2,7,8,21], sparse modeling (e.g. Lasso) [4,10,11,17,18,19], statistical guarantees for the EM algorithm [3], statistical analysis of algorithm leveraging

  14. Unification Principle and a Geometric Field Theory

    NASA Astrophysics Data System (ADS)

    Wanas, Mamdouh I.; Osman, Samah N.; El-Kholy, Reham I.

    2015-08-01

    In the context of the geometrization philosophy, a covariant field theory is constructed. The theory satisfies the unification principle. The field equations of the theory are constructed depending on a general differential identity in the geometry used. The Lagrangian scalar used in the formalism is neither a curvature scalar nor a torsion scalar, but an alloy made of both, the W-scalar. The physical contents of the theory are explored using different methods. The analysis shows that the theory is capable of dealing with gravity, electromagnetism and material distribution with possible mutual interactions. The theory is shown to cover the domain of general relativity under certain conditions.

  15. The Economics of Terrorism: Economics Methods of Analysis in the Study of Terrorism and Counterterrorism

    DTIC Science & Technology

    2010-12-01

    addition to outlining definitions, data sources, choice theory, game theory, and the economic consequences of terrorism, this study identifies how... The authors are Maj Alain Rollin, Maj Meaghan Setter, and Dr. Rachel Lea Heide, under the direction of LCol William Yee...

  16. Intelligence Analysts at State and Major Urban Area Fusion Centers: An Evaluation of Education and Training Requirements

    DTIC Science & Technology

    2011-06-10

    Sharan Merriam, there are six standard approaches to qualitative research: phenomenology, grounded theory, ethnography, narrative analysis, critical...as to the available methods of research, qualitative and quantitative, and why the qualitative methodology was selected. It also provided the reader

  17. Using Grounded Theory Method to Capture and Analyze Health Care Experiences.

    PubMed

    Foley, Geraldine; Timonen, Virpi

    2015-08-01

    Grounded theory (GT) is an established qualitative research method, but few papers have encapsulated the benefits, limits, and basic tenets of doing GT research on user and provider experiences of health care services. GT can be used to guide the entire study method, or it can be applied at the data analysis stage only. We summarize key components of GT and common GT procedures used by qualitative researchers in health care research. We draw on our experience of conducting a GT study on amyotrophic lateral sclerosis patients' experiences of health care services. We discuss why some approaches in GT research may work better than others, particularly when the focus of study is hard-to-reach population groups. We highlight the flexibility of procedures in GT to build theory about how people engage with health care services. GT enables researchers to capture and understand health care experiences. GT methods are particularly valuable when the topic of interest has not previously been studied. GT can be applied to bring structure and rigor to the analysis of qualitative data. © Health Research and Educational Trust.

  18. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, Theodore H. H.

    1991-01-01

    The following tasks on the study of advanced stress analysis methods applicable to turbine engine structures are described: (1) constructions of special elements which contain traction-free circular boundaries; (2) formulation of new version of mixed variational principles and new version of hybrid stress elements; (3) establishment of methods for suppression of kinematic deformation modes; (4) construction of semiLoof plate and shell elements by assumed stress hybrid method; and (5) elastic-plastic analysis by viscoplasticity theory using the mechanical subelement model.

  19. Power System Transient Stability Based on Data Mining Theory

    NASA Astrophysics Data System (ADS)

    Cui, Zhen; Shi, Jia; Wu, Runsheng; Lu, Dan; Cui, Mingde

    2018-01-01

    In order to study power system stability, a power system transient stability assessment method based on data mining theory is designed. By introducing association rules analysis from data mining theory, an association classification method for transient stability assessment is presented, and a mathematical model of transient stability assessment based on data mining technology is established. Combining rule reasoning with classification prediction, the association classification method is used to perform transient stability assessment. The transient stability index is used to identify the samples that cannot be correctly classified by association classification. Then, according to the critical stability of each such sample, the time domain simulation method is used to determine its state, so as to ensure the accuracy of the final results. The results show that this stability assessment system improves the speed of operation while keeping the analysis results completely correct, and that the improved algorithm can find the inherent relation between changes in the power system operating mode and changes in the degree of transient stability.
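
    A toy version of the association-rule step such an assessment relies on (the operating records are invented): the support and confidence of a rule like "high load and weak tie-line imply instability" are simple counts over labelled operating states.

        # Toy labelled operating states: (high_load, weak_tie_line, unstable).
        records = [
            (1, 1, 1), (1, 1, 1), (1, 0, 0), (0, 1, 0),
            (1, 1, 0), (0, 0, 0), (1, 1, 1), (0, 1, 0),
        ]

        antecedent = [r for r in records if r[0] == 1 and r[1] == 1]
        both = [r for r in antecedent if r[2] == 1]

        support = len(both) / len(records)        # frequency of the full pattern
        confidence = len(both) / len(antecedent)  # reliability of the rule
        print(f"rule 'high load & weak tie-line -> unstable': "
              f"support {support:.2f}, confidence {confidence:.2f}")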

  20. The role of community mental health nurses caring for people with schizophrenia in Taiwan: a substantive grounded theory.

    PubMed

    Huang, Xuan-Yi; Yen, Wen-Jiuan; Liu, Shwu-Jiuan; Lin, Chouh-Jiuan

    2008-03-01

    The aim was to develop a practice theory that can be used to guide the direction of community nursing practice to help clients with schizophrenia and those who care for them. A substantive grounded theory was developed using the grounded theory method of Strauss and Corbin. Two groups of participants in Taiwan were selected using theoretical sampling: one group consisted of community mental health nurses, and the other of clients with schizophrenia and those who cared for them. The number of participants in each group was determined by theoretical saturation. Semi-structured one-to-one in-depth interviews and unstructured non-participant observation were utilized for data collection. Data analysis involved three stages: open, axial and selective coding. During the process of coding and analysis, both inductive and deductive thinking were utilized, and the constant comparative analysis process continued until data saturation occurred. To establish trustworthiness, the four criteria of credibility, transferability, dependability and confirmability were followed, along with field trial, audit trail, member check and peer debriefing for reliability and validity. A substantive grounded theory, on the role of community mental health nurses caring for people with schizophrenia in Taiwan, was thereby developed. In this paper, results and discussion focus on the causal conditions, context, intervening conditions, consequences and phenomenon. The theory is the first to contribute knowledge about the field of mental health home visiting services in Taiwan and provides guidance for the delivery of quality care to assist people in the community with schizophrenia and their carers.

  1. Parametric Stiffness Control of Flexible Structures

    NASA Technical Reports Server (NTRS)

    Moon, F. C.; Rand, R. H.

    1985-01-01

    An unconventional method for the control of flexible space structures using feedback control of certain elements of the stiffness matrix is discussed. The advantage of using this method of configuration control is that it can be accomplished in practical structures by changing the initial stress state in the structure. The initial stress state can be controlled hydraulically or by cables. The method leads, however, to nonlinear control equations. In particular, a long slender truss structure under cable-induced initial compression is examined. Both analytical and numerical analyses are presented. Nonlinear analysis using center manifold theory and normal form theory is used to determine criteria on the nonlinear control gains for stable or unstable operation. The analysis is made possible by the use of the exact computer algebra system MACSYMA.

  2. Risk analysis theory applied to fishing operations: A new approach on the decision-making problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cunha, J.C.S.

    1994-12-31

    In the past, decisions concerning whether to continue or interrupt a fishing operation were based primarily on the operator's previous experience. This procedure often led to wrong decisions and unnecessary loss of money and time. This paper describes a decision-making method based on risk analysis theory and previous operation results from the field under study. The method leads to more accurate decisions on a daily basis, allowing the operator to verify, for each day of the operation, whether the decision being carried out is the one with the highest probability of leading to the best economic result. An example of the method's application is provided at the end of the paper.
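
    A toy version of the expected-value comparison underlying such a method (all probabilities and costs are invented): each day, the expected cost of continuing the fishing operation is compared with the cost of stopping and sidetracking immediately.

        # Hypothetical figures for a single day's continue-or-stop decision.
        p_success = 0.25            # chance fishing succeeds today (from field history)
        daily_cost = 80_000         # cost of one more day of fishing (USD)
        sidetrack_cost = 900_000    # cost of abandoning fishing and sidetracking now
        expected_if_fail = 600_000  # expected further cost if today's attempt fails

        ev_continue = daily_cost + (1.0 - p_success) * expected_if_fail
        ev_stop = sidetrack_cost

        decision = "continue fishing" if ev_continue < ev_stop else "sidetrack now"
        print(f"E[continue] = {ev_continue:,.0f}, E[stop] = {ev_stop:,.0f} -> {decision}")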

  3. Reliability analysis of the objective structured clinical examination using generalizability theory.

    PubMed

    Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián

    2016-01-01

    Background The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. Methods An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. Results The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Conclusions Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements.
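
    How a generalizability coefficient follows from estimated variance components can be sketched in a few lines (the variance components below are invented for illustration, though chosen so that 18 stations give a G near the reported 0.93):

        # Hypothetical variance components from a persons x stations G-study.
        var_person = 1.20     # universe-score (student) variance
        var_residual = 1.60   # person-by-station interaction plus residual error

        def g_coefficient(n_stations):
            # Relative G coefficient for a crossed persons x stations design.
            return var_person / (var_person + var_residual / n_stations)

        # Decision study: how reliability grows with the number of stations.
        for n in (6, 12, 18, 24):
            print(f"{n:2d} stations -> G = {g_coefficient(n):.2f}")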

  4. Trends and Patterns in Cultural Resource Significance: An Historical Perspective and Annotated Bibliography

    DTIC Science & Technology

    1997-04-01

    to tracing historical trends in archaeological method and theory). The literature summarized here is extensive and is not accessible widely to the...of new significance assessment models. The more specific objectives in undertaking this literary review and interpretive analysis of archaeological...method and theory characteristic of the 'New Archaeology' of the late 1960s. Once these ideas had made their way into the early literature on

  5. Assessing item fit for unidimensional item response theory models using residuals from estimated item response functions.

    PubMed

    Haberman, Shelby J; Sinharay, Sandip; Chon, Kyong Hee

    2013-07-01

    Residual analysis (e.g. Hambleton & Swaminathan, Item response theory: principles and applications, Kluwer Academic, Boston, 1985; Hambleton, Swaminathan, & Rogers, Fundamentals of item response theory, Sage, Newbury Park, 1991) is a popular method to assess fit of item response theory (IRT) models. We suggest a form of residual analysis that may be applied to assess item fit for unidimensional IRT models. The residual analysis consists of a comparison of the maximum-likelihood estimate of the item characteristic curve with an alternative ratio estimate of the item characteristic curve. The large sample distribution of the residual is proved to be standardized normal when the IRT model fits the data. We compare the performance of our suggested residual to the standardized residual of Hambleton et al. (Fundamentals of item response theory, Sage, Newbury Park, 1991) in a detailed simulation study. We then calculate our suggested residuals using data from an operational test. The residuals appear to be useful in assessing the item fit for unidimensional IRT models.
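
    A hedged sketch of the general idea (not the authors' exact estimator): bin examinees by ability, compare observed proportions correct with the model's item characteristic curve, and standardize. The 2PL form, the grouping, and all numbers below are assumptions for illustration.

    ```python
    import numpy as np

    # Illustrative item-fit residuals: observed group proportions correct vs. a
    # 2PL item characteristic curve (ICC). The 2PL form and the data are
    # assumptions for illustration, not the paper's estimator.

    def icc_2pl(theta, a, b):
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    def standardized_residuals(theta_groups, n_per_group, n_correct, a, b):
        p_model = icc_2pl(theta_groups, a, b)
        p_obs = n_correct / n_per_group
        se = np.sqrt(p_model * (1.0 - p_model) / n_per_group)
        return (p_obs - p_model) / se  # approximately N(0, 1) if the model fits

    theta = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])   # ability-group centers
    n = np.array([120, 150, 160, 150, 120])         # examinees per group
    correct = np.array([18, 45, 88, 118, 110])      # observed correct counts
    print(standardized_residuals(theta, n, correct, a=1.2, b=0.3))
    ```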

  6. Rasch-family models are more valuable than score-based approaches for analysing longitudinal patient-reported outcomes with missing data.

    PubMed

    de Bock, Élodie; Hardouin, Jean-Benoit; Blanchin, Myriam; Le Neel, Tanguy; Kubis, Gildas; Bonnaud-Antignac, Angélique; Dantan, Étienne; Sébille, Véronique

    2016-10-01

    The objective was to compare classical test theory and Rasch-family models derived from item response theory for the analysis of longitudinal patient-reported outcomes data with possibly informative intermittent missing items. A simulation study was performed in order to assess and compare the performance of classical test theory and the Rasch model in terms of bias, control of the type I error, and power of the test of time effect. The type I error was controlled for both classical test theory and the Rasch model, whether data were complete or some items were missing. Both methods were unbiased and displayed similar power with complete data. When items were missing, the Rasch model remained unbiased and displayed higher power than classical test theory. The Rasch model performed better than the classical test theory approach for the analysis of longitudinal patient-reported outcomes with possibly informative intermittent missing items, mainly in terms of power. This study highlights the value of Rasch-based models in clinical research and epidemiology for the analysis of incomplete patient-reported outcomes data. © The Author(s) 2013.

  7. Two Paradoxes in Linear Regression Analysis.

    PubMed

    Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong

    2016-12-25

    Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.

  8. A computational system for aerodynamic design and analysis of supersonic aircraft. Part 1: General description and theoretical development

    NASA Technical Reports Server (NTRS)

    Middleton, W. D.; Lundry, J. L.

    1976-01-01

    An integrated system of computer programs was developed for the design and analysis of supersonic configurations. The system uses linearized theory methods for the calculation of surface pressures and supersonic area rule concepts in combination with linearized theory for calculation of aerodynamic force coefficients. Interactive graphics are optional at the user's request. Schematics of the program structure and the individual overlays and subroutines are described.

  9. Probability theory versus simulation of petroleum potential in play analysis

    USGS Publications Warehouse

    Crovelli, R.A.

    1987-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An objective was to replace an existing Monte Carlo simulation method in order to increase the efficiency of the appraisal process. Underlying the two methods is a single geologic model which considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The results of the model are resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and a closed form solution of all means and standard deviations, along with the probabilities of occurrence. © 1987 J.C. Baltzer A.G., Scientific Publishing Company.
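
    The flavor of the closed-form approach can be conveyed with a textbook identity: if a play bears hydrocarbons with probability p, and the amount given presence has mean mu and standard deviation sigma, the unconditional mean and standard deviation follow analytically, with no simulation. The sketch below is this generic identity only, not the full USGS methodology.

    ```python
    # Unconditional moments of "presence indicator times amount", a standard
    # probability identity. Numbers are illustrative, not an actual appraisal.

    def unconditional_moments(p, mu, sigma):
        mean = p * mu
        var = p * sigma**2 + p * (1.0 - p) * mu**2
        return mean, var**0.5

    mean, sd = unconditional_moments(p=0.4, mu=100.0, sigma=60.0)  # e.g. MMbbl
    print(f"mean = {mean:.1f}, sd = {sd:.1f}")
    ```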

  10. Evolutionary game theory for physical and biological scientists. I. Training and validating population dynamics equations.

    PubMed

    Liao, David; Tlsty, Thea D

    2014-08-06

    Failure to understand evolutionary dynamics has been hypothesized as limiting our ability to control biological systems. An increasing awareness of similarities between macroscopic ecosystems and cellular tissues has inspired optimism that game theory will provide insights into the progression and control of cancer. To realize this potential, the ability to compare game theoretic models and experimental measurements of population dynamics should be broadly disseminated. In this tutorial, we present an analysis method that can be used to train parameters in game theoretic dynamics equations, used to validate the resulting equations, and used to make predictions to challenge these equations and to design treatment strategies. The data analysis techniques in this tutorial are adapted from the analysis of reaction kinetics using the method of initial rates taught in undergraduate general chemistry courses. Reliance on computer programming is avoided to encourage the adoption of these methods as routine bench activities.
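
    A minimal sketch of the initial-rates idea transplanted to population dynamics: regress log counts on time over the earliest observations to estimate a per-capita growth rate, then repeat across initial population compositions to constrain a game-theoretic growth model. The data below are fabricated for illustration.

    ```python
    import numpy as np

    # Estimate an early per-capita growth rate from the slope of log-counts
    # over the first few time points (the "method of initial rates" analogy).
    # Counts are fabricated; a real analysis would repeat this across seeding
    # fractions of the competing cell types.

    t = np.array([0.0, 12.0, 24.0, 36.0])          # hours
    counts = np.array([1000.0, 1220.0, 1490.0, 1830.0])

    slope, _intercept = np.polyfit(t, np.log(counts), 1)  # d(ln N)/dt near t = 0
    print(f"initial per-capita growth rate ~ {slope:.4f} per hour")
    ```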

  11. 2D problems of surface growth theory with applications to additive manufacturing

    NASA Astrophysics Data System (ADS)

    Manzhirov, A. V.; Mikhin, M. N.

    2018-04-01

    We study 2D problems of surface growth theory of deformable solids and their applications to the analysis of the stress-strain state of AM fabricated products and structures. Statements of the problems are given, and a solution method based on the approaches of the theory of functions of a complex variable is suggested. Computations are carried out for model problems. Qualitative and quantitative results are discussed.

  12. The Gender Subtext of Organizational Learning

    ERIC Educational Resources Information Center

    Raaijmakers, Stephan; Bleijenbergh, Inge; Fokkinga, Brigit; Visser, Max

    2018-01-01

    Purpose: This paper aims to challenge the alleged gender-neutral character of Argyris and Schön's theory of organizational learning (1978). While theories in organizational science seem gender neutral at the surface, a closer analysis reveals they are often based on men's experiences. Design/methodology/approach: This paper uses the method of…

  13. Theory, Method and Practice of Neuroscientific Findings in Science Education

    ERIC Educational Resources Information Center

    Liu, Chia-Ju; Chiang, Wen-Wei

    2014-01-01

    This report provides an overview of neuroscience research that is applicable for science educators. It first offers a brief analysis of empirical studies in educational neuroscience literature, followed by six science concept learning constructs based on the whole brain theory: gaining an understanding of brain function; pattern recognition and…

  14. A Lawyer's Primer on Feminist Theory and Tort.

    ERIC Educational Resources Information Center

    Bender, Leslie

    1988-01-01

    An overview of major components of feminist theory is given and their use in critiquing tort law is illustrated, focusing in particular on a standard-of-care analysis. It is proposed that the same method can be used to examine many other aspects of negligence and tort law. (Author/MSE)

  15. The Role of Theory in Practice.

    ERIC Educational Resources Information Center

    Pyfer, Jean L.

    There are at least three ways in which educational theory can be used in practice: (1) to reexamine our traditional approaches, (2) to provide direction in future practice, and (3) to generate research. Reexamination of traditional approaches through analysis and utilization of theoretical methods is one means of promoting constant growth and…

  16. Information Work Analysis: An Approach to Research on Information Interactions and Information Behaviour in Context

    ERIC Educational Resources Information Center

    Huvila, Isto

    2008-01-01

    Introduction: A work roles and role theory-based approach to conceptualise human information activity, denoted information work analysis is discussed. The present article explicates the approach and its special characteristics and benefits in comparison to earlier methods of analysing human information work. Method: The approach is discussed in…

  17. Laboratory theory and methods for sediment analysis

    USGS Publications Warehouse

    Guy, Harold P.

    1969-01-01

    The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve-pipet, the VA tube-pipet, or the BW tube-VA tube, depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.

  18. BOOK REVIEW: Vortex Methods: Theory and Practice

    NASA Astrophysics Data System (ADS)

    Cottet, G.-H.; Koumoutsakos, P. D.

    2001-03-01

    The book Vortex Methods: Theory and Practice presents a comprehensive account of the numerical technique for solving fluid flow problems. It provides a very nice balance between the theoretical development and analysis of the various techniques and their practical implementation. In fact, the presentation of the rigorous mathematical analysis of these methods instills confidence in their implementation. The book goes into some detail on the more recent developments that attempt to account for viscous effects, in particular the presence of viscous boundary layers in some flows of interest. The presentation is very readable, with most points illustrated with well-chosen examples, some quite sophisticated. It is a very worthy reference book that should appeal to a large body of readers, from those interested in the mathematical analysis of the methods to practitioners of computational fluid dynamics. The use of the book as a text is compromised by its lack of exercises for students, but it could form the basis of a graduate special topics course. Juan Lopez

  19. An E-plane analysis of aperture-matched horn antennas using the moment method and the uniform geometrical theory of diffraction

    NASA Technical Reports Server (NTRS)

    Heedy, D. J.; Burnside, W. D.

    1984-01-01

    The moment method and the uniform geometrical theory of diffraction are utilized to obtain two separate solutions for the E-plane field pattern of an aperture-matched horn antenna. This particular horn antenna consists of a standard pyramidal horn with the following modifications: a rolled edge section attached to the aperture edges and a curved throat section. The resulting geometry provides significantly better performance in terms of the pattern, impedance, and frequency characteristics than normally obtainable. The moment method is used to calculate the E-plane pattern and VSWR of the antenna. However, at higher frequencies, large amounts of computation time are required. The uniform geometrical theory of diffraction provides a quick and efficient high frequency solution for the E-plane field pattern. In fact, the uniform geometrical theory of diffraction may be used to initially design the antenna; then, the moment method may be applied to fine tune the design. This procedure has been successfully applied to a compact range feed design.

  20. 'Safe passage': pregnant Iranian Kurdish women's choice of childbirth method.

    PubMed

    Shahoei, Roonak; Riji, Haliza Mohd; Saeedi, Zhila Abed

    2011-10-01

    This article is a report of a grounded theory study of the influence of emotions on women's selection of a method of childbirth. There is substantial evidence to indicate that a pregnant woman's emotions play an important role in the decision-making process of selecting a child delivery method. Despite this, however, there is a notable lack of research about the relationship between pregnant women's emotions and their choice of a childbirth method in developing countries. A qualitative study using the grounded theory approach was conducted. The data were collected from 22 Iranian Kurdish pregnant women in their third trimester using semi-structured interviews. Concurrent data collection and analysis took place between 2008 and 2009. A cumulative process of theoretical sampling and constant comparison was used to identify concepts and then expand, validate, and clarify them. The substantive grounded theory that was identified from data analysis was 'safe passage'. 'Safe passage' involved five phases that were not mutually exclusive in their occurrence. The five phases of the 'safe passage' theory that were identified from the data analysis were: 'safety of baby', 'fear', 'previous experience', 'social support' and 'faith'. The goal of 'safe passage' was to achieve a healthy delivery and to ensure the health of the newborn. 'Safe passage' was a process used to determine how the emotions of pregnant Iranian Kurdish women influenced their choice of the mode of child delivery. More research is needed in this field to develop a body of knowledge beneficial to midwifery education and practice. © 2011 Blackwell Publishing Ltd.

  1. Profitability analysis of KINGLONG nearly 5 years

    NASA Astrophysics Data System (ADS)

    Zhang, Mei; Wen, Jinghua

    2017-08-01

    Profitability analysis plays an important role in measuring business performance and forecasting a company's prospects. Taking King Long Motor as a case study and building on the basic theory of financial management, this paper combines theoretical and data analysis methods with indicators that measure profitability to analyze the company's profitability in detail, to identify the factors constraining that profitability and the drivers for improving it, and to offer recommendations for improving Kinglong's profitability so that the company can develop better and faster in the future.

  2. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  3. Applying Instructional Design Theories to Bioinformatics Education in Microarray Analysis and Primer Design Workshops

    PubMed Central

    2005-01-01

    The need to support bioinformatics training has been widely recognized by scientists, industry, and government institutions. However, the discussion of instructional methods for teaching bioinformatics is only beginning. Here we report on a systematic attempt to design two bioinformatics workshops for graduate biology students on the basis of Gagne's Conditions of Learning instructional design theory. This theory, although first published in the early 1970s, is still fundamental in instructional design and instructional technology. First, top-level as well as prerequisite learning objectives for a microarray analysis workshop and a primer design workshop were defined. Then a hierarchy of objectives for each workshop was created. Hands-on tutorials were designed to meet these objectives. Finally, events of learning proposed by Gagne's theory were incorporated into the hands-on tutorials. The resultant manuals were tested on a small number of trainees, revised, and applied in 1-day bioinformatics workshops. Based on this experience and on observations made during the workshops, we conclude that Gagne's Conditions of Learning instructional design theory provides a useful framework for developing bioinformatics training, but may not be optimal as a method for teaching it. PMID:16220141

  4. Bending analysis of embedded nanoplates based on the integral formulation of Eringen's nonlocal theory using the finite element method

    NASA Astrophysics Data System (ADS)

    Ansari, R.; Torabi, J.; Norouzzadeh, A.

    2018-04-01

    Due to the capability of Eringen's nonlocal elasticity theory to capture the small length scale effect, it is widely used to study the mechanical behaviors of nanostructures. Previous studies have indicated that in some cases, the differential form of this theory cannot correctly predict the behavior of the structure, and the integral form should be employed to avoid obtaining inconsistent results. The present study deals with the bending analysis of nanoplates resting on an elastic foundation based on the integral formulation of Eringen's nonlocal theory. Since the formulation is presented in a general form, arbitrary kernel functions can be used. The first-order shear deformation plate theory is considered to model the nanoplates, and the governing equations for both the integral and differential forms are presented. Finally, the finite element method is applied to solve the problem. Selected results are given to investigate the effects of the elastic foundation and to compare the predictions of the integral nonlocal model with those of its differential nonlocal and local counterparts. It is found that by the use of the proposed integral formulation of Eringen's nonlocal model, the paradox observed for the cantilever nanoplate is resolved.

  5. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.

  6. Transonic Unsteady Aerodynamics and Aeroelasticity 1987, part 1

    NASA Technical Reports Server (NTRS)

    Bland, Samuel R. (Compiler)

    1989-01-01

    Computational fluid dynamics methods have been widely accepted for transonic aeroelastic analysis. Previously, calculations with the TSD methods were used for 2-D airfoils, but now the TSD methods are applied to the aeroelastic analysis of the complete aircraft. The Symposium papers are grouped into five subject areas, two of which are covered in this part: (1) Transonic Small Disturbance (TSD) theory for complete aircraft configurations; and (2) Full potential and Euler equation methods.

  7. Nonlinear analysis of 0-3 polarized PLZT microplate based on the new modified couple stress theory

    NASA Astrophysics Data System (ADS)

    Wang, Liming; Zheng, Shijie

    2018-02-01

    In this study, based on the new modified couple stress theory, a size-dependent model for the nonlinear bending analysis of a pure 0-3 polarized PLZT plate is developed for the first time. The equilibrium equations are derived from a variational formulation based on the potential energy principle and the new modified couple stress theory. The Galerkin method is adopted to derive the nonlinear algebraic equations from the governing differential equations, and the nonlinear algebraic equations are then solved using the Newton-Raphson method. After simplification, the new model includes only one material length scale parameter. In addition, numerical examples are carried out to study the effect of the material length scale parameter on the nonlinear bending of a simply supported pure 0-3 polarized PLZT plate subjected to light illumination and a uniformly distributed load. The results indicate that the new model is able to capture the size effect and geometric nonlinearity.
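
    The Galerkin-plus-Newton-Raphson pipeline described above reduces, in the end, to solving a small nonlinear algebraic system. The sketch below shows a generic Newton-Raphson iteration on a toy cubic-stiffening system; it is a stand-in for illustration, not the PLZT plate equations.

    ```python
    import numpy as np

    # Generic Newton-Raphson iteration for a nonlinear algebraic system of the
    # kind a Galerkin discretization produces. The toy system K*x + c*x^3 = f
    # mimics cubic stiffening; it is not the paper's model.

    def newton_raphson(residual, jacobian, x0, tol=1e-10, max_iter=50):
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            r = residual(x)
            if np.linalg.norm(r) < tol:
                break
            x = x - np.linalg.solve(jacobian(x), r)  # Newton update
        return x

    K = np.array([[4.0, -1.0], [-1.0, 3.0]])
    f = np.array([1.0, 2.0])
    c = 0.5
    residual = lambda x: K @ x + c * x**3 - f
    jacobian = lambda x: K + np.diag(3.0 * c * x**2)
    print(newton_raphson(residual, jacobian, x0=[0.0, 0.0]))
    ```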

  8. A Modified Kirchhoff plate theory for Free Vibration analysis of functionally graded material plates using meshfree method

    NASA Astrophysics Data System (ADS)

    Nguyen Van Do, Vuong

    2018-04-01

    In this paper, a modified Kirchhoff theory is presented for free vibration analysis of functionally graded material (FGM) plates based on a modified radial point interpolation method (RPIM). Shear deformation effects are taken into account in the modified theory to avoid the locking phenomenon of thin plates. Owing to the proposed refined plate theory, the number of independent unknowns is reduced by one, leaving four degrees of freedom per node. The free vibration results computed by the modified RPIM are compared with other analytical solutions to verify the effectiveness and accuracy of the developed mesh-free method. Detailed parametric studies are then conducted, covering the effects of thickness ratio, boundary condition and material inhomogeneity on sample problems of square plates. The results illustrate that the modified mesh-free RPIM agrees well with exact solutions and yields stable, accurate predictions relative to other published analyses.

  9. Current distribution in a three-dimensional IC analyzed by a perturbation method. Part 1: A simple steady state theory

    NASA Technical Reports Server (NTRS)

    Edmonds, Larry D.

    1987-01-01

    The steady state current distribution in a three dimensional integrated circuit is presented. A device physics approach, based on a perturbation method rather than an equivalent lumped circuit approach, is used. The perturbation method allows the various currents to be expressed in terms of elementary solutions which are solutions to very simple boundary value problems. A Simple Steady State Theory is the subtitle because the most obvious limitation of the present version of the analysis is that all depletion region boundary surfaces are treated as equipotential surfaces. This may be an adequate approximation in some applications but it is an obvious weakness in the theory when applied to latched states. Examples that illustrate the use of these analytical methods are not given because they will be presented in detail in the future.

  10. Remembering and forgetting Freud in early twentieth-century dreams.

    PubMed

    Forrester, John

    2006-03-01

    The paper explores the use of Freud's methods of dream interpretation by four English writers of the early twentieth century: T. H. Pear, W. H. R. Rivers, Ernest Jones, and Alix Strachey. Each employed their own dreams in rather different ways: as part of an assessment of Freud's work as a psychological theory, as illustrative of the cogency of Freud's method and theories as part of the psychoanalytic process. Each adopted different approaches to the question of privacy and decorum. The paper argues that assessment of the impact of Freud's work must take account of the application of the method to the researcher's own dreams and the personal impact this process of analysis had upon them, and must also gauge how the dreamers' deployment of Freud's methods influenced their explicit relationship to him and his theories.

  11. A new uniformly valid asymptotic integration algorithm for elasto-plastic creep and unified viscoplastic theories including continuum damage

    NASA Technical Reports Server (NTRS)

    Chulya, Abhisak; Walker, Kevin P.

    1991-01-01

    A new scheme to integrate a system of stiff differential equations for both the elasto-plastic creep and the unified viscoplastic theories is presented. The method has high stability, allows large time increments, and is implicit and iterative. It is suitable for use with continuum damage theories. The scheme was incorporated into MARC, a commercial finite element code, through a user subroutine called HYPELA. Results from numerical problems under complex loading histories are presented for both small and large scale analysis. To demonstrate the scheme's accuracy and efficiency, comparisons to a self-adaptive forward Euler method are made.

  12. The all-too-flexible abductive method: ATOM's normative status.

    PubMed

    Romeijn, Jan-Willem

    2008-09-01

    The author discusses the abductive theory of method (ATOM) by Brian Haig from a philosophical perspective, connecting his theory with a number of issues and trends in contemporary philosophy of science. It is argued that as it stands, the methodology presented by Haig is too permissive. Both the use of analogical reasoning and the application of exploratory factor analysis leave us with too many candidate theories to choose from, and explanatory coherence cannot be expected to save the day. The author ends with some suggestions to remedy the permissiveness and lack of normative force in ATOM, deriving from the experimental practice within which psychological data are produced.

  13. A new uniformly valid asymptotic integration algorithm for elasto-plastic-creep and unified viscoplastic theories including continuum damage

    NASA Technical Reports Server (NTRS)

    Chulya, A.; Walker, K. P.

    1989-01-01

    A new scheme to integrate a system of stiff differential equations for both the elasto-plastic creep and the unified viscoplastic theories is presented. The method has high stability, allows large time increments, and is implicit and iterative. It is suitable for use with continuum damage theories. The scheme was incorporated into MARC, a commercial finite element code, through a user subroutine called HYPELA. Results from numerical problems under complex loading histories are presented for both small and large scale analysis. To demonstrate the scheme's accuracy and efficiency, comparisons to a self-adaptive forward Euler method are made.

  14. A computational system for aerodynamic design and analysis of supersonic aircraft. Part 2: User's manual

    NASA Technical Reports Server (NTRS)

    Middleton, W. D.; Lundry, J. L.; Coleman, R. G.

    1976-01-01

    An integrated system of computer programs was developed for the design and analysis of supersonic configurations. The system uses linearized theory methods for the calculation of surface pressures and supersonic area rule concepts in combination with linearized theory for calculation of aerodynamic force coefficients. Interactive graphics are optional at the user's request. This user's manual contains a description of the system, an explanation of its usage, the input definition, and example output.

  15. Two Paradoxes in Linear Regression Analysis

    PubMed Central

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

    Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214

  16. Development of multicomponent hybrid density functional theory with polarizable continuum model for the analysis of nuclear quantum effect and solvent effect on NMR chemical shift.

    PubMed

    Kanematsu, Yusuke; Tachikawa, Masanori

    2014-04-28

    We have developed the multicomponent hybrid density functional theory [MC_(HF+DFT)] method with the polarizable continuum model (PCM) for the analysis of molecular properties including both the nuclear quantum effect and the solvent effect. The chemical shifts and H/D isotope shifts of the picolinic acid N-oxide (PANO) molecule in chloroform and acetonitrile solvents are computed with the B3LYP electron exchange-correlation functional in our MC_(HF+DFT) method with PCM (MC_B3LYP/PCM). Our MC_B3LYP/PCM results for PANO are in reasonable agreement with the corresponding experimental chemical shifts and isotope shifts. We further investigated the applicability of our method to acetylacetone in several solvents.

  17. The application of Bandura's self-efficacy theory to abstinence-oriented alcoholism treatment.

    PubMed

    Rollnick, S; Heather, N

    1982-01-01

    This paper explores the relevance of self-efficacy theory (Bandura, 1977b) to the process of abstinence treatment and the phenomenon of relapse. By distinguishing between the particular efficacy and outcome expectations created in treatment it is possible to clarify some of the problems encountered between clinicians and alcoholics. Bandura's theory also explains why some treatment methods might be more effective than others. Analysis of relapse suggests that while some of the expectations created in treatment might serve to promote abstinence, others might unwittingly precipitate relapse. The understanding of abstinence treatment could be enhanced by the testing of hypotheses which emerge from this analysis.

  18. Tantra yukti method of theorization in ayurveda.

    PubMed

    Singh, Anuradha

    2003-01-01

    The method of theorization (the Tantra Yukti-s given in Ayurvedic texts) is analyzed against the backdrop of the scientific method. Thirty-six methodic devices are singled out from the texts for analysis in terms of truth-specific, theory-specific and discourse-specific issues. The paper also points out exact problems in the conception of method in Ayurveda and Science.

  19. Improving Predictions of Multiple Binary Models in ILP

    PubMed Central

    2014-01-01

    Despite the success of ILP systems in learning first-order rules from small numbers of examples and complexly structured data in various domains, they struggle to deal with multiclass problems. In most cases they boil down a multiclass problem into multiple black-box binary problems following the one-versus-one or one-versus-rest binarisation techniques and learn a theory for each one. When evaluating the learned theories of multiple class problems in the one-versus-rest paradigm particularly, there is a bias caused by the default rule toward the negative classes, leading to an unrealistically high performance, besides a lack of prediction integrity between the theories. Here we discuss the problem of using the one-versus-rest binarisation technique when it comes to evaluating multiclass data and propose several methods to remedy this problem. We also illustrate the methods and highlight their link to binary trees and Formal Concept Analysis (FCA). Our methods allow learning of a simple, consistent, and reliable multiclass theory by combining the rules of the multiple one-versus-rest theories into one rule list or rule set theory. Empirical evaluation over a number of data sets shows that our proposed methods produce coherent and accurate rule models from the rules learned by the ILP system Aleph. PMID:24696657

  20. Delamination Analysis Of Composite Curved Bars

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Jackson, Raymond H.

    1990-01-01

    Classical anisotropic elasticity theory used to construct "multilayer" composite semicircular curved bar subjected to end forces and end moments. Radial location and intensity of open-mode delamination stress calculated and compared with results obtained from anisotropic continuum theory and from finite element method. Multilayer theory gave more accurate predictions of location and intensity of open-mode delamination stress. Currently being applied to predict open-mode delamination stress concentrations in horse-shoe-shaped composite test coupons.

  1. A Single Session of rTMS Enhances Small-Worldness in Writer's Cramp: Evidence from Simultaneous EEG-fMRI Multi-Modal Brain Graph.

    PubMed

    Bharath, Rose D; Panda, Rajanikant; Reddam, Venkateswara Reddy; Bhaskar, M V; Gohel, Suril; Bhardwaj, Sujas; Prajapati, Arvind; Pal, Pramod Kumar

    2017-01-01

    Background and Purpose: Repetitive transcranial magnetic stimulation (rTMS) induces widespread changes in brain connectivity. As the network topology differences induced by a single session of rTMS are less known, we undertook this study to ascertain whether the network alterations had a small-world morphology, using multi-modal graph theory analysis of simultaneous EEG-fMRI. Method: Simultaneous EEG-fMRI was acquired in duplicate before (R1) and after (R2) a single session of rTMS in 14 patients with Writer's Cramp (WC). Whole brain neuronal and hemodynamic network connectivity were explored using graph theory measures, and the clustering coefficient, path length and small-world index were calculated for EEG and resting state fMRI (rsfMRI). Multi-modal graph theory analysis was used to evaluate the correlation of EEG and fMRI clustering coefficients. Result: A single session of rTMS was found to increase the clustering coefficient and small-worldness significantly in both EEG and fMRI (p < 0.05). Multi-modal graph theory analysis revealed significant modulations in the fronto-parietal regions immediately after rTMS. The rsfMRI revealed additional modulations in several deep brain regions including the cerebellum, insula and medial frontal lobe. Conclusion: Multi-modal graph theory analysis of simultaneous EEG-fMRI can supplement motor physiology methods in understanding the neurobiology of rTMS in vivo. Converging evidence from EEG and rsfMRI shows small-world morphology in the acute-phase network hyper-connectivity, indicating that the changes following low-frequency rTMS are probably not "noise".
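
    For orientation, a small-world index of the kind used here can be computed as sigma = (C/C_rand)/(L/L_rand), where C and L are the average clustering coefficient and characteristic path length and the subscripted quantities come from a size- and density-matched random graph. The sketch below uses a toy Watts-Strogatz network, not EEG/fMRI connectivity data.

    ```python
    import networkx as nx

    # Small-world index sketch: sigma = (C/C_rand) / (L/L_rand); sigma > 1 is
    # commonly read as small-world topology. Toy network, not brain connectivity.

    G = nx.connected_watts_strogatz_graph(n=90, k=6, p=0.1, seed=1)

    # Random reference with the same node and edge counts; regenerate until it
    # is connected so the average shortest path length is defined.
    seed = 1
    while True:
        R = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=seed)
        if nx.is_connected(R):
            break
        seed += 1

    C, C_rand = nx.average_clustering(G), nx.average_clustering(R)
    L, L_rand = (nx.average_shortest_path_length(G),
                 nx.average_shortest_path_length(R))
    sigma = (C / C_rand) / (L / L_rand)
    print(f"C/C_rand = {C / C_rand:.2f}, L/L_rand = {L / L_rand:.2f}, sigma = {sigma:.2f}")
    ```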

  2. Biological Embedding: Evaluation and Analysis of an Emerging Concept for Nursing Scholarship

    PubMed Central

    Nist, Marliese Dion

    2016-01-01

    Aim The purpose of this paper is to report the analysis of the concept of biological embedding. Background Research that incorporates a life course perspective is becoming increasingly prominent in the health sciences. Biological embedding is a central concept in life course theory and may be important for nursing theories to enhance our understanding of health states in individuals and populations. Before the concept of biological embedding can be used in nursing theory and research, an analysis of the concept is required to advance it toward full maturity. Design Concept analysis. Data Sources PubMed, CINAHL and PsycINFO were searched for publications using the term ‘biological embedding’ or ‘biological programming’ and published through 2015. Methods An evaluation of the concept was first conducted to determine the concept’s level of maturity and was followed by a concept comparison, using the methods for concept evaluation and comparison described by Morse. Results A consistent definition of biological embedding – the process by which early life experience alters biological processes to affect adult health outcomes – was found throughout the literature. The concept has been used in several theories that describe the mechanisms through which biological embedding might occur and highlight its role in the development of health trajectories. Biological embedding is a partially mature concept, requiring concept comparison with an overlapping concept – biological programming – to more clearly establish the boundaries of biological embedding. Conclusions Biological embedding has significant potential for theory development and application in multiple academic disciplines, including nursing. PMID:27682606

  3. The evaluation of student-centredness of teaching and learning: a new mixed-methods approach

    PubMed Central

    Lemos, Ana R.; Sandars, John E.; Alves, Palmira

    2014-01-01

    Objectives The aim of the study was to develop and consider the usefulness of a new mixed-methods approach to evaluate the student-centredness of teaching and learning on undergraduate medical courses. An essential paradigm for the evaluation was the coherence between how teachers conceptualise their practice (espoused theories) and their actual practice (theories-in-use). Methods The context was a module within an integrated basic sciences course in an undergraduate medical degree programme. The programme had an explicit intention of providing a student-centred curriculum. A content analysis framework based on Weimer’s dimensions of student-centred teaching was used to analyze data collected from individual interviews with seven teachers to identify espoused theories and 34h of classroom observations and one student focus group to identify theories-in-use. The interviewees were identified by purposeful sampling. The findings from the three methods were triangulated to evaluate the student-centredness of teaching and learning on the course. Results Different, but complementary, perspectives of the student-centredness of teaching and learning were identified by each method. The triangulation of the findings revealed coherence between the teachers’ espoused theories and theories-in-use. Conclusions A mixed-methods approach that combined classroom observations with interviews from a purposeful sample of teachers and students offered a useful evaluation of the extent of student-centredness of teaching and learning of this basic science course. Our case study suggests that this new approach is applicable to other courses in medical education. PMID:25341225

  4. Stepwise Analysis of Differential Item Functioning Based on Multiple-Group Partial Credit Model.

    ERIC Educational Resources Information Center

    Muraki, Eiji

    1999-01-01

    Extended an Item Response Theory (IRT) method for detection of differential item functioning to the partial credit model and applied the method to simulated data using a stepwise procedure. Then applied the stepwise DIF analysis based on the multiple-group partial credit model to writing trend data from the National Assessment of Educational…

  5. A Unifying Framework for Causal Analysis in Set-Theoretic Multimethod Research

    ERIC Educational Resources Information Center

    Rohlfing, Ingo; Schneider, Carsten Q.

    2018-01-01

    The combination of Qualitative Comparative Analysis (QCA) with process tracing, which we call set-theoretic multimethod research (MMR), is steadily becoming more popular in empirical research. Despite the fact that both methods have an elected affinity based on set theory, it is not obvious how a within-case method operating in a single case and a…

  6. Formative Research on the Simplifying Conditions Method (SCM) for Task Analysis and Sequencing.

    ERIC Educational Resources Information Center

    Kim, YoungHwan; Reigeluth, Charles M.

    The Simplifying Conditions Method (SCM) is a set of guidelines for task analysis and sequencing of instructional content under the Elaboration Theory (ET). This article introduces the fundamentals of SCM and presents the findings from a formative research study on SCM. It was conducted in two distinct phases: design and instruction. In the first…

  7. A Guide to Analyzing Message-Response Sequences and Group Interaction Patterns in Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Jeong, Allan

    2005-01-01

    This paper proposes a set of methods and a framework for evaluating, modeling, and predicting group interactions in computer-mediated communication. The method of sequential analysis is described along with specific software tools and techniques to facilitate the analysis of message-response sequences. In addition, the Dialogic Theory and its…

  8. Using Item Response Theory to Describe the Nonverbal Literacy Assessment (NVLA)

    ERIC Educational Resources Information Center

    Fleming, Danielle; Wilson, Mark; Ahlgrim-Delzell, Lynn

    2018-01-01

    The Nonverbal Literacy Assessment (NVLA) is a literacy assessment designed for students with significant intellectual disabilities. The 218-item test was initially examined using confirmatory factor analysis. This method showed that the test worked as expected, but the items loaded onto a single factor. This article uses item response theory to…

  9. A Proposal for Facilitating More Cooperation in Competitive Sports

    ERIC Educational Resources Information Center

    Jacobs, George M.; Teh, Jiexin; Spencer, Leonora

    2017-01-01

    This article utilises theories, methods and tools from the fields of Social Psychology and Education to suggest new metrics for the analysis of competitive sport. The hope is that these metrics will encourage cooperation to exist alongside of the dominant feelings of competition. The main theory from Social Psychology involved here is Social…

  10. Introduction to Multilevel Item Response Theory Analysis: Descriptive and Explanatory Models

    ERIC Educational Resources Information Center

    Sulis, Isabella; Toland, Michael D.

    2017-01-01

    Item response theory (IRT) models are the main psychometric approach for the development, evaluation, and refinement of multi-item instruments and scaling of latent traits, whereas multilevel models are the primary statistical method when considering the dependence between person responses when primary units (e.g., students) are nested within…

  11. Reasoning about Race and Pedagogy in Two Preservice Science Teachers: A Critical Race Theory Analysis

    ERIC Educational Resources Information Center

    Larkin, Douglas B.; Maloney, Tanya; Perry-Ryder, Gail M.

    2016-01-01

    This study describes the experiences of two preservice science teachers as they progress through their respective teacher education programs and uses critical race theory to examine the manner in which conceptions about race and its pedagogical implications change over time. Using a longitudinal case study method, participants' conceptual…

  12. Turbulent Chemically Reacting Flows According to a Kinetic Theory. Ph.D. Thesis; [statistical analysis/gas flow

    NASA Technical Reports Server (NTRS)

    Hong, Z. C.

    1975-01-01

    A review of various methods of calculating turbulent chemically reacting flow such as the Green Function, Navier-Stokes equation, and others is presented. Nonequilibrium degrees of freedom were employed to study the mixing behavior of a multiscale turbulence field. Classical and modern theories are discussed.

  13. Development and Validation of the Sorokin Psychosocial Love Inventory for Divorced Individuals

    ERIC Educational Resources Information Center

    D'Ambrosio, Joseph G.; Faul, Anna C.

    2013-01-01

    Objective: This study describes the development and validation of the Sorokin Psychosocial Love Inventory (SPSLI) measuring love actions toward a former spouse. Method: Classical measurement theory and confirmatory factor analysis (CFA) were utilized with an a priori theory and factor model to validate the SPSLI. Results: A 15-item scale…

  14. Class-Related Emotions in Secondary Physical Education: A Control-Value Theory Approach

    ERIC Educational Resources Information Center

    Simonton, Kelly L.; Garn, Alex C.; Solmon, Melinda Ann

    2017-01-01

    Purpose: Grounded in control-value theory, a model of students' achievement emotions in physical education (PE) was investigated. Methods: A path analysis tested hypotheses that students' (N = 529) perceptions of teacher responsiveness, assertiveness, and clarity predict control and value beliefs which, in turn, predict enjoyment and boredom.…

  15. Latin and Magic Squares

    ERIC Educational Resources Information Center

    Emanouilidis, Emanuel

    2005-01-01

    Latin squares have existed for hundreds of years but it wasn't until rather recently that Latin squares were used in other areas such as statistics, graph theory, coding theory and the generation of random numbers as well as in the design and analysis of experiments. This note describes Latin and diagonal Latin squares, a method of constructing…

  16. Bifurcation Analysis of an Electrostatically Actuated Nano-Beam Based on Modified Couple Stress Theory

    NASA Astrophysics Data System (ADS)

    Rezaei Kivi, Araz; Azizi, Saber; Norouzi, Peyman

    2017-12-01

    In this paper, the nonlinear size-dependent static and dynamic behavior of an electrostatically actuated nano-beam is investigated. A fully clamped nano-beam is considered for modeling the deformable electrode of the NEMS. The governing differential equation of motion is derived using the Hamiltonian principle based on couple stress theory, a non-classical theory that accounts for length scale effects. The nonlinear partial differential equation of motion is discretized into nonlinear Duffing-type ODEs using the Galerkin method. Static and dynamic pull-in instabilities obtained by both classical theory and MCST are compared. In the second stage of the analysis, a shooting technique is utilized to obtain the frequency response curve and to capture the periodic solutions of the motion; the stability of the periodic solutions is assessed by Floquet theory. The nonlinear dynamic behavior of the deformable electrode under AC harmonic actuation, together with size dependency, is investigated.

  17. The application of the statistical theory of extreme values to gust-load problems

    NASA Technical Reports Server (NTRS)

    Press, Harry

    1950-01-01

    An analysis is presented which indicates that the statistical theory of extreme values is applicable to the problems of predicting the frequency of encountering the larger gust loads and gust velocities for both specific test conditions as well as commercial transport operations. The extreme-value theory provides an analytic form for the distributions of maximum values of gust load and velocity. Methods of fitting the distribution are given along with a method of estimating the reliability of the predictions. The theory of extreme values is applied to available load data from commercial transport operations. The results indicate that the estimates of the frequency of encountering the larger loads are more consistent with the data and more reliable than those obtained in previous analyses. (author)
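
    In modern terms, the computation amounts to fitting a Type I (Gumbel) extreme-value distribution to per-flight maxima and extrapolating to rare exceedance levels. The sketch below uses synthetic data and scipy to illustrate the idea; it is not NACA's original fitting procedure.

    ```python
    import numpy as np
    from scipy import stats

    # Fit a Gumbel (Type I extreme value) distribution to per-flight maximum
    # gust loads and extrapolate a rare load level. Data are synthetic.

    rng = np.random.default_rng(0)
    flight_maxima = stats.gumbel_r.rvs(loc=1.2, scale=0.25, size=400,
                                       random_state=rng)

    loc, scale = stats.gumbel_r.fit(flight_maxima)
    load_1_in_1000 = stats.gumbel_r.ppf(1.0 - 1.0 / 1000.0, loc, scale)
    print(f"estimated 1-in-1000-flight load factor ~ {load_1_in_1000:.2f} g")
    ```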

  18. Optical methods in nano-biotechnology

    NASA Astrophysics Data System (ADS)

    Bruno, Luigi; Gentile, Francesco

    2016-01-01

    A scientific theory is not a mathematical paradigm. It is a framework that explains natural facts and may predict future observations. A scientific theory may be modified, improved, or rejected. Science is less a collection of theories and more the process that leads us either to deny some hypothesis, to maintain or accept somehow universal beliefs (or disbeliefs), or to create new models that may improve or replace precedent theories. This process cannot be entrusted to common sense, personal experiences or anecdotes (many precepts in physics are indeed counterintuitive), but rests on the rigorous design, observation and rational, statistical analysis of new experiments. Scientific results are always provisional: scientists rarely proclaim an absolute truth or absolute certainty. Uncertainty is inevitable at the frontiers of knowledge. Notably, this is the definition of the scientific method, and what we have written above echoes the opinion of Marcia McNutt, the Editor of Science: 'Science is a method for deciding whether what we choose to believe has a basis in the laws of nature or not'. A new discovery, a new theory that explains that discovery, and the scientific method itself need observations and verifications, and are susceptible to falsification.

  19. Radiative transfer modelling inside thermal protection system using hybrid homogenization method for a backward Monte Carlo method coupled with Mie theory

    NASA Astrophysics Data System (ADS)

    Le Foll, S.; André, F.; Delmas, A.; Bouilly, J. M.; Aspa, Y.

    2012-06-01

    A backward Monte Carlo method for modelling the spectral directional emittance of fibrous media has been developed. It uses Mie theory to calculate the radiative properties of single fibres, modelled as infinite cylinders, and the complex refractive index is computed by a Drude-Lorenz model for the dielectric function. The absorption and scattering coefficient are homogenised over several fibres, but the scattering phase function of a single one is used to determine the scattering direction of energy inside the medium. Sensitivity analysis based on several Monte Carlo results has been performed to estimate coefficients for a Multi-Linear Model (MLM) specifically developed for inverse analysis of experimental data. This model concurs with the Monte Carlo method and is highly computationally efficient. In contrast, the surface emissivity model, which assumes an opaque medium, shows poor agreement with the reference Monte Carlo calculations.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Weizhou, E-mail: wzw@lynu.edu.cn, E-mail: ybw@gzu.edu.cn; Zhang, Yu; Sun, Tao

    High-level coupled cluster singles, doubles, and perturbative triples [CCSD(T)] computations with up to the aug-cc-pVQZ basis set (1924 basis functions) and various extrapolations toward the complete basis set (CBS) limit are presented for the sandwich, T-shaped, and parallel-displaced benzene⋯naphthalene complex. Using the CCSD(T)/CBS interaction energies as a benchmark, the performance of some newly developed wave function and density functional theory methods has been evaluated. The best performing methods were found to be the dispersion-corrected PBE0 functional (PBE0-D3) and spin-component scaled zeroth-order symmetry-adapted perturbation theory (SCS-SAPT0). The success of SCS-SAPT0 is very encouraging because it provides one method for energy component analysis of π-stacked complexes with 200 atoms or more. Most newly developed methods do, however, overestimate the interaction energies. The results of energy component analysis show that interaction energies are overestimated mainly due to the overestimation of dispersion energy.

  1. On a class of Newton-like methods for solving nonlinear equations

    NASA Astrophysics Data System (ADS)

    Argyros, Ioannis K.

    2009-06-01

    We provide a semilocal convergence analysis for a certain class of Newton-like methods considered also in [I.K. Argyros, A unifying local-semilocal convergence analysis and applications for two-point Newton-like methods in Banach space, J. Math. Anal. Appl. 298 (2004) 374-397; I.K. Argyros, Computational theory of iterative methods, in: C.K. Chui, L. Wuytack (Eds.), Series: Studies in Computational Mathematics, vol. 15, Elsevier Publ. Co, New York, USA, 2007; J.E. Dennis, Toward a unified convergence theory for Newton-like methods, in: L.B. Rall (Ed.), Nonlinear Functional Analysis and Applications, Academic Press, New York, 1971], in order to approximate a locally unique solution of an equation in a Banach space. Using a combination of Lipschitz and center-Lipschitz conditions, instead of only Lipschitz conditions [F.A. Potra, Sharp error bounds for a class of Newton-like methods, Libertas Math. 5 (1985) 71-84], we provide an analysis with the following advantages over the work in [F.A. Potra, Sharp error bounds for a class of Newton-like methods, Libertas Math. 5 (1985) 71-84] which improved the works in [W.E. Bosarge, P.L. Falb, A multipoint method of third order, J. Optimiz. Theory Appl. 4 (1969) 156-166; W.E. Bosarge, P.L. Falb, Infinite dimensional multipoint methods and the solution of two point boundary value problems, Numer. Math. 14 (1970) 264-286; J.E. Dennis, On the Kantorovich hypothesis for Newton's method, SIAM J. Numer. Anal. 6 (3) (1969) 493-507; J.E. Dennis, Toward a unified convergence theory for Newton-like methods, in: L.B. Rall (Ed.), Nonlinear Functional Analysis and Applications, Academic Press, New York, 1971; H.J. Kornstaedt, Ein allgemeiner Konvergenzsatz für verschärfte Newton-Verfahren, in: ISNM, vol. 28, Birkhäuser Verlag, Basel and Stuttgart, 1975, pp. 53-69; P. Laasonen, Ein überquadratisch konvergenter iterativer Algorithmus, Ann. Acad. Sci. Fenn. Ser. I 450 (1969) 1-10; F.A. Potra, On a modified secant method, L'analyse numérique et la théorie de l'approximation 8 (2) (1979) 203-214; F.A. Potra, An application of the induction method of V. Pták to the study of Regula Falsi, Aplikace Matematiky 26 (1981) 111-120; F.A. Potra, On the convergence of a class of Newton-like methods, in: Iterative Solution of Nonlinear Systems of Equations, in: Lecture Notes in Mathematics, vol. 953, Springer-Verlag, New York, 1982; F.A. Potra, V. Pták, Nondiscrete induction and double step secant method, Math. Scand. 46 (1980) 236-250; F.A. Potra, V. Pták, On a class of modified Newton processes, Numer. Funct. Anal. Optim. 2 (1) (1980) 107-120; F.A. Potra, Sharp error bounds for a class of Newton-like methods, Libertas Math. 5 (1985) 71-84; J.W. Schmidt, Untere Fehlerschranken für Regula-Falsi Verfahren, Period. Math. Hungar. 9 (3) (1978) 241-247; J.W. Schmidt, H. Schwetlick, Ableitungsfreie Verfahren mit höherer Konvergenzgeschwindigkeit, Computing 3 (1968) 215-226; J.F. Traub, Iterative Methods for the Solution of Equations, Prentice Hall, Englewood Cliffs, New Jersey, 1964; M.A. Wolfe, Extended iterative methods for the solution of operator equations, Numer. Math. 31 (1978) 153-174]: larger convergence domain and weaker sufficient convergence conditions. Numerical examples further validating the results are also provided.

  2. Numerical-analytic implementation of the higher-order canonical Van Vleck perturbation theory for the interpretation of medium-sized molecule vibrational spectra.

    PubMed

    Krasnoshchekov, Sergey V; Isayeva, Elena V; Stepanov, Nikolay F

    2012-04-12

    Anharmonic vibrational states of semirigid polyatomic molecules are often studied using the second-order vibrational perturbation theory (VPT2). For efficient higher-order analysis, an approach based on the canonical Van Vleck perturbation theory (CVPT), the Watson Hamiltonian and operators of creation and annihilation of vibrational quanta is employed. This method allows analysis of the convergence of perturbation theory and solves a number of theoretical problems of VPT2; e.g., it yields the anharmonic constants y(ijk) and z(ijkl) and allows the reliable evaluation of vibrational IR and Raman anharmonic intensities in the presence of resonances. Darling-Dennison and higher-order resonance coupling coefficients can be reliably evaluated as well. The method is illustrated on classic molecules: water and formaldehyde. A number of theoretical conclusions result, including the necessity of using a sextic force field in the fourth order (CVPT4) and the nearly vanishing CVPT4 contributions for bending and wagging modes. The coefficients of perturbative Dunham-type Hamiltonians in high orders of CVPT are found to conform to the rules of equality at different orders, as proven analytically earlier for diatomic molecules. The method can serve as a good substitute for the more traditional VPT2.
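    For context, the perturbative Dunham-type vibrational energies referred to here take the standard form (a sketch in our notation, not the paper's):

```latex
E(n_1,\dots,n_N) = \sum_i \omega_i\left(n_i+\tfrac12\right)
  + \sum_{i\le j} x_{ij}\left(n_i+\tfrac12\right)\left(n_j+\tfrac12\right)
  + \sum_{i\le j\le k} y_{ijk}\left(n_i+\tfrac12\right)\left(n_j+\tfrac12\right)\left(n_k+\tfrac12\right)
  + \cdots
```

    VPT2 determines the harmonic frequencies ω_i and the constants x_ij; the higher-order CVPT4 treatment additionally yields the y_ijk (and z_ijkl) constants mentioned in the abstract.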

  3. A new blade element method for calculating the performance of high and intermediate solidity axial flow fans

    NASA Technical Reports Server (NTRS)

    Borst, H. V.

    1978-01-01

    A method is presented to design and predict the performance of axial flow rotors operating in a duct. The same method is suitable for the design of ducted fans and open propellers. The unified method is based on the blade element approach and on vortex theory for determining the three-dimensional effects, so that two-dimensional airfoil data can be used for determining the resultant force on each blade element. Resolution of this force in the thrust and torque planes and integration allows the total performance of the rotor, fan, or propeller to be predicted. Three different methods of analysis, one based on momentum flow theory, another on the vortex theory of propellers, and a third on the theory of ducted fans, agree and reduce cascade airfoil data to a single line as a function of the loading and induced angle of attack at constant inflow angle. The theory applies for any solidity from 0.01 to over 1 and any blade section camber. The effects of the duct and blade number can be determined, so the procedure applies over the entire range from two-blade open propellers, to ducted helicopter tail rotors, to axial flow compressors with or without guide vanes, and to wind tunnel drive fans.

  4. Transverse Vibration of Tapered Single-Walled Carbon Nanotubes Embedded in Viscoelastic Medium

    NASA Astrophysics Data System (ADS)

    Lei, Y. J.; Zhang, D. P.; Shen, Z. B.

    2017-12-01

    Based on nonlocal theory, Euler-Bernoulli beam theory and the Kelvin viscoelastic foundation model, free transverse vibration is studied for a tapered viscoelastic single-walled carbon nanotube (visco-SWCNT) embedded in a viscoelastic medium. First, the governing equations for vibration analysis are established. Then the natural frequencies are derived in closed form for SWCNTs with arbitrary boundary conditions by applying the transfer function method and the perturbation method. Numerical results are presented to discuss the effects of the nonlocal parameter, the relaxation time and taper parameter of the SWCNTs, and the material property parameters of the medium. This study demonstrates that the proposed model is applicable to vibration analysis of the tapered-SWCNT/viscoelastic-medium coupling system.
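    For orientation, a commonly used special case of the governing equation, a uniform nonlocal Euler-Bernoulli beam resting on a Kelvin viscoelastic foundation, reads as follows (the paper's tapered, viscoelastic tube adds spatially varying coefficients):

```latex
EI\,\frac{\partial^4 w}{\partial x^4}
  + \left(1-(e_0 a)^2\frac{\partial^2}{\partial x^2}\right)
    \left(\rho A\,\frac{\partial^2 w}{\partial t^2}
      + c\,\frac{\partial w}{\partial t} + k\,w\right) = 0
```

    where e_0a is the nonlocal parameter and k, c are the foundation stiffness and damping; setting e_0a = 0 recovers the classical beam on a Kelvin foundation.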

  5. Numerical methods for the inverse problem of density functional theory

    DOE PAGES

    Jensen, Daniel S.; Wasserman, Adam

    2017-07-17

    Here, the inverse problem of Kohn–Sham density functional theory (DFT) is often solved in an effort to benchmark and design approximate exchange-correlation potentials. The forward and inverse problems of DFT rely on the same equations but the numerical methods for solving each problem are substantially different. We examine both problems in this tutorial with a special emphasis on the algorithms and error analysis needed for solving the inverse problem. Two inversion methods based on partial differential equation constrained optimization and constrained variational ideas are introduced. We compare and contrast several different inversion methods applied to one-dimensional finite and periodic model systems.
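    As a flavor of the inverse problem, the sketch below inverts a one-particle density on a 1D grid with a simple multiplicative potential update. The paper's PDE-constrained and constrained-variational algorithms are more sophisticated; all names, grid sizes, and step sizes here are our own illustrative choices.

```python
import numpy as np

N, L = 120, 10.0
x = np.linspace(-L / 2, L / 2, N)
h = x[1] - x[0]

def ground_state_density(v):
    """Ground-state density of one particle in potential v (finite differences)."""
    # Kinetic energy: -(1/2) d^2/dx^2 discretized with a three-point stencil.
    T = (2 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)) / (2 * h * h)
    _, psi = np.linalg.eigh(T + np.diag(v))
    phi = psi[:, 0] / np.sqrt(h)          # normalize so that sum(n) * h = 1
    return phi**2

v_true = 0.5 * x**2                        # "unknown" potential: harmonic well
n_ref = ground_state_density(v_true)       # target density to invert

v = np.zeros(N)                            # initial guess for the potential
for _ in range(1500):
    n = ground_state_density(v)
    # Raise v where the density is too high, lower it where too low;
    # the small constant regularizes the ill-conditioned low-density tails.
    v += 0.05 * (n - n_ref) / (n_ref + 1e-12)
    if np.max(np.abs(n - n_ref)) < 1e-9:
        break

print("max density error:", np.max(np.abs(ground_state_density(v) - n_ref)))
```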

  6. How to do a grounded theory study: a worked example of a study of dental practices

    PubMed Central

    2011-01-01

    Background Qualitative methodologies are increasingly popular in medical research. Grounded theory is the methodology most often cited by authors of qualitative studies in medicine, but it has been suggested that many 'grounded theory' studies are not concordant with the methodology. In this paper we provide a worked example of a grounded theory project. Our aim is to provide a model for practice, to connect medical researchers with a useful methodology, and to increase the quality of 'grounded theory' research published in the medical literature. Methods We documented a worked example of using grounded theory methodology in practice. Results We describe our sampling, data collection, data analysis and interpretation. We explain how these steps were consistent with grounded theory methodology, and show how they related to one another. Grounded theory methodology assisted us to develop a detailed model of the process of adapting preventive protocols into dental practice, and to analyse variation in this process in different dental practices. Conclusions By employing grounded theory methodology rigorously, medical researchers can better design and justify their methods, and produce high-quality findings that will be more useful to patients, professionals and the research community. PMID:21902844

  7. Determination of the Shear Stress Distribution in a Laminate from the Applied Shear Resultant--A Simplified Shear Solution

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Aboudi, Jacob; Yarrington, Phillip W.

    2007-01-01

    The simplified shear solution method is presented for approximating the through-thickness shear stress distribution within a composite laminate based on laminated beam theory. The method does not consider the solution of a particular boundary value problem; rather, it requires only knowledge of the global shear loading, geometry, and material properties of the laminate or panel. It is thus analogous to lamination theory in that ply-level stresses can be efficiently determined from global load resultants (as determined, for instance, by finite element analysis) at a given location in a structure and used to evaluate the margin of safety on a ply-by-ply basis. The simplified shear solution stress distribution is zero at free surfaces, continuous at ply boundaries, and integrates to the applied shear load, as written out below. Comparisons to existing theories are made for a variety of laminates, and design examples are provided illustrating the use of the method for determining through-thickness shear stress margins in several types of composite panels and in the context of a finite element structural analysis.
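    In symbols, the three admissibility conditions listed above for the approximate distribution τ_xz(z) on a laminate of thickness h with ply interfaces z_k are (notation ours):

```latex
\tau_{xz}\!\left(\pm\tfrac{h}{2}\right)=0, \qquad
\tau_{xz}\!\left(z_k^{-}\right)=\tau_{xz}\!\left(z_k^{+}\right), \qquad
\int_{-h/2}^{h/2}\tau_{xz}(z)\,dz = Q_x
```

    where Q_x is the applied transverse shear resultant taken from the global solution.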

  8. The study on the Layout of the Charging Station in Chengdu

    NASA Astrophysics Data System (ADS)

    Cai, Yun; Zhang, Wanquan; You, Wei; Mao, Pan

    2018-03-01

    This paper comprehensively analyzes the factors affecting the layout of electric-vehicle charging stations and the principles governing station placement. A mathematical model is established using queuing theory from operations research, and the number of sites is optimized on the principles of conserving resources and of convenience to vehicle owners. Central-place theory is combined to determine the service radius, the gravity method is used to determine the initial locations, and finally the center-of-gravity method is used to fix each charging station's location.
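    A minimal sketch of the two quantitative steps follows: sizing a station with the M/M/c (Erlang-C) queue and siting it at a demand-weighted centroid. All parameter values and demand points are illustrative, not data from the study.

```python
import math

def erlang_c(lam, mu, c):
    """Probability that an arriving vehicle must wait in an M/M/c queue."""
    a = lam / mu                                    # offered load
    rho = a / c                                     # utilization, must be < 1
    idle = sum(a**k / math.factorial(k) for k in range(c))
    busy = a**c / (math.factorial(c) * (1 - rho))
    return busy / (idle + busy)

def mean_wait(lam, mu, c):
    """Mean queueing delay Wq = C(c, a) / (c*mu - lam)."""
    return erlang_c(lam, mu, c) / (c * mu - lam)

lam, mu = 12.0, 2.0            # arrivals per hour, services per charger per hour
c = int(lam / mu) + 1          # smallest charger count with utilization < 1
while mean_wait(lam, mu, c) > 10 / 60:   # target: under 10 minutes average wait
    c += 1
print("chargers needed:", c)

# Demand-weighted centroid (center of gravity) as the initial site location.
demand = [(1.0, 2.0, 30), (4.0, 0.5, 50), (2.5, 3.5, 20)]   # (x, y, trips/day)
total = sum(w for _, _, w in demand)
x0 = sum(x * w for x, _, w in demand) / total
y0 = sum(y * w for _, y, w in demand) / total
print("initial location:", (round(x0, 2), round(y0, 2)))
```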

  9. The Analysis and Discussion in the Effective Application of the Dispatcher Training Based on Case Teaching Method with the Cause from the Action of the Gap Protection of Main Transformer

    NASA Astrophysics Data System (ADS)

    Yuanyuan, Xu; Zhengmao, Zhang; Xiang, Fang; Yuanshuai, Xu; Xinxin, Song

    2018-03-01

    The combination of theory and practice is a difficult problem in dispatcher training. Through a typical case, this paper provides an effective case-based teaching method for dispatcher training, combining theoretical discussion of rules of experience with concrete cases to achieve vivid instruction. The method helps trainees understand and grasp the key points of the theory and improve their practical skills.

  10. Challenging convention: symbolic interactionism and grounded theory.

    PubMed

    Newman, Barbara

    2008-01-01

    Little is written in the literature about the decisions researchers make, and their justifications of method, in response to a particular clinical problem together with an appropriate and congruent theoretical perspective, particularly for Glaserian grounded theory. I contend that the utilisation of symbolic interactionism as a theoretical perspective to inform and guide the evolving research process and analysis of data when using the classic or Glaserian grounded theory (GT) method is not always appropriate. Within this article I offer an analysis of the key issues to be addressed when contemplating the use of Glaserian GT and the utilisation of an appropriate theoretical perspective, rather than accepting the convention of symbolic interactionism (SI). The analysis became imperative in a study I conducted that sought to explore the concerns, adaptive behaviours, psychosocial processes and relevant interactions over a 12-month period among newly diagnosed persons with end stage renal disease, dependent on haemodialysis in the home environment for survival. The reality of perception was central to the end product in the study. Human ethics approval was granted by six committees within the New South Wales Health Department and one from a university.

  11. What is an adequate sample size? Operationalising data saturation for theory-based interview studies.

    PubMed

    Francis, Jill J; Johnston, Marie; Robertson, Clare; Glidewell, Liz; Entwistle, Vikki; Eccles, Martin P; Grimshaw, Jeremy M

    2010-12-01

    In interview studies, sample size is often justified by interviewing participants until reaching 'data saturation'. However, there is no agreed method of establishing this. We propose principles for deciding saturation in theory-based interview studies (where conceptual categories are pre-established by existing theory). First, specify a minimum sample size for initial analysis (initial analysis sample). Second, specify how many more interviews will be conducted without new ideas emerging (stopping criterion). We demonstrate these principles in two studies, based on the theory of planned behaviour, designed to identify three belief categories (Behavioural, Normative and Control), using an initial analysis sample of 10 and stopping criterion of 3. Study 1 (retrospective analysis of existing data) identified 84 shared beliefs of 14 general medical practitioners about managing patients with sore throat without prescribing antibiotics. The criterion for saturation was achieved for Normative beliefs but not for other beliefs or studywise saturation. In Study 2 (prospective analysis), 17 relatives of people with Paget's disease of the bone reported 44 shared beliefs about taking genetic testing. Studywise data saturation was achieved at interview 17. We propose specification of these principles for reporting data saturation in theory-based interview studies. The principles may be adaptable for other types of studies.
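    The stopping rule is mechanical enough to sketch in code. Below, saturation is declared once the initial sample has been analysed and `stopping_criterion` consecutive interviews contribute no new belief codes; the interview data are invented placeholders.

```python
def saturation_point(interviews, initial_sample=10, stopping_criterion=3):
    """Return the interview number at which saturation is declared, or None."""
    seen, quiet_run = set(), 0
    for i, beliefs in enumerate(interviews, start=1):
        new_codes = set(beliefs) - seen
        seen |= set(beliefs)
        quiet_run = 0 if new_codes else quiet_run + 1
        if i >= initial_sample and quiet_run >= stopping_criterion:
            return i
    return None   # saturation not reached in the available interviews

interviews = [
    {"b1", "b2"}, {"b3"}, {"b1"}, {"b4"}, {"b2"},
    {"b5"}, {"b3"}, {"b6"}, {"b7"}, {"b1"},
    {"b2"}, {"b3"},
]
print(saturation_point(interviews))   # -> 12: interviews 10-12 added nothing new
```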

  12. A Study of Driver's Route Choice Behavior Based on Evolutionary Game Theory

    PubMed Central

    Jiang, Xiaowei; Ji, Yanjie; Deng, Wei

    2014-01-01

    This paper proposes a route choice analytic method that embeds cumulative prospect theory in evolutionary game theory to analyze how drivers adjust their route choice behaviors under the influence of traffic information. A simulated network with two alternative routes and one variable message sign is built to illustrate the analytic method. We assume that the drivers in the transportation system are boundedly rational and that the traffic information they receive is incomplete. An evolutionary game model is constructed to describe the evolutionary process of the drivers' route choice decision-making behaviors. We conclude that traffic information plays an important role in route choice behavior. The drivers' route decision-making process develops towards different evolutionarily stable states in accordance with different transportation situations. The analysis results also demonstrate that employing cumulative prospect theory and evolutionary game theory to study drivers' route choice behavior is effective. This analytic method provides academic support and suggestions for traffic guidance systems, and may optimize travel efficiency to a certain extent. PMID:25610455
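    For readers unfamiliar with the cumulative-prospect-theory ingredients such a model embeds, the sketch below shows a value function and probability-weighting function with the Tversky-Kahneman (1992) parameter estimates; it is a simplified illustration (a single weighting parameter for gains and losses), not the cited study's specification.

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """S-shaped value function: concave for gains, convex and steeper for losses."""
    return x**alpha if x >= 0 else -lam * (-x)**beta

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: small probabilities are overweighted."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# Prospect value of a route that saves 5 minutes with probability 0.3 but
# loses 10 minutes with probability 0.7, relative to the reference route time:
V = weight(0.3) * value(5) + weight(0.7) * value(-10)
print(round(V, 2))   # negative: this risky route looks unattractive
```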

  13. Sociological analysis and comparative education

    NASA Astrophysics Data System (ADS)

    Woock, Roger R.

    1981-12-01

    It is argued that comparative education is essentially a derivative field of study, in that it borrows theories and methods from academic disciplines. After a brief humanistic phase, in which history and philosophy were central for comparative education, sociology became an important source. In the mid-1950s and 1960s, sociology in the United States was characterised by Structural Functionalism as a theory, and Social Survey as a dominant methodology. Both were incorporated into the development of comparative education. Increasingly in the 1970s, and certainly today, the new developments in sociology are characterised by an attack on Positivism, which is seen as the philosophical position underlying both functionalism and survey methods. New or re-discovered theories with their attendant methodologies included Marxism, Phenomenological Sociology, Critical Theory, and Historical Social Science. The current relationship between comparative education and social science is one of uncertainty, but since social science is seen to be returning to its European roots, the hope is held out for the development of an integrated social theory and method which will provide a much stronger basis for developments in comparative education.

  14. A study of driver's route choice behavior based on evolutionary game theory.

    PubMed

    Jiang, Xiaowei; Ji, Yanjie; Du, Muqing; Deng, Wei

    2014-01-01

    This paper proposes a route choice analytic method that embeds cumulative prospect theory in evolutionary game theory to analyze how drivers adjust their route choice behaviors under the influence of traffic information. A simulated network with two alternative routes and one variable message sign is built to illustrate the analytic method. We assume that the drivers in the transportation system are boundedly rational and that the traffic information they receive is incomplete. An evolutionary game model is constructed to describe the evolutionary process of the drivers' route choice decision-making behaviors. We conclude that traffic information plays an important role in route choice behavior. The drivers' route decision-making process develops towards different evolutionarily stable states in accordance with different transportation situations. The analysis results also demonstrate that employing cumulative prospect theory and evolutionary game theory to study drivers' route choice behavior is effective. This analytic method provides academic support and suggestions for traffic guidance systems, and may optimize travel efficiency to a certain extent.

  15. Geometrically nonlinear analysis of laminated elastic structures

    NASA Technical Reports Server (NTRS)

    Reddy, J. N.

    1984-01-01

    Laminated composite plates and shells that can be used to model automobile bodies, aircraft wings and fuselages, and pressure vessels among many others were analyzed. The finite element method, a numerical technique for engineering analysis of structures, is used to model the geometry and approximate the solution. Various alternative formulations for analyzing laminated plates and shells are developed and their finite element models are tested for accuracy and economy in computation. These include the shear deformation laminate theory and degenerated 3-D elasticity theory for laminates.

  16. Dynamical density functional theory analysis of the laning instability in sheared soft matter.

    PubMed

    Scacchi, A; Archer, A J; Brader, J M

    2017-12-01

    Using dynamical density functional theory (DDFT) methods we investigate the laning instability of a sheared colloidal suspension. The nonequilibrium ordering at the laning transition is driven by nonaffine particle motion arising from interparticle interactions. Starting from a DDFT which incorporates the nonaffine motion, we perform a linear stability analysis that enables identification of the regions of parameter space where lanes form. We illustrate our general approach by applying it to a simple one-component fluid of soft penetrable particles.

  17. CLMNANAL: A C++ program for application of the Coleman stability analysis to rotorcraft

    NASA Technical Reports Server (NTRS)

    Lance, Michael B.

    1996-01-01

    This program is an adaptation of the theory of Robert P. Coleman and Arnold M. Feingold as presented in NACA Report 1351, 1958. This theory provided a method for the analysis of multiple-bladed rotor systems to determine the system susceptibility to ground resonance. Their treatment also provided a simple means for determining the required product of rotor and chassis damping factors to suppress the resonance. This C++ program is based on a FORTRAN 77 version of a similar code.

  18. The Neglect of Monotone Comparative Statics Methods

    ERIC Educational Resources Information Center

    Tremblay, Carol Horton; Tremblay, Victor J.

    2010-01-01

    Monotone methods enable comparative static analysis without the restrictive assumptions of the implicit-function theorem. Ease of use and flexibility in solving comparative static and game-theory problems have made monotone methods popular in the economics literature and in graduate courses, but they are still absent from undergraduate…

  19. Grounded theory: a methodological spiral from positivism to postmodernism.

    PubMed

    Mills, Jane; Chapman, Ysanne; Bonner, Ann; Francis, Karen

    2007-04-01

    Our aim in this paper is to explain a methodological/methods package devised to incorporate situational and social world mapping with frame analysis, based on a grounded theory study of Australian rural nurses' experiences of mentoring. Situational analysis, as conceived by Adele Clarke, shifts the research methodology of grounded theory from being located within a postpositivist paradigm to a postmodern paradigm. Clarke uses three types of maps during this process: situational, social world and positional, in combination with discourse analysis. During our grounded theory study, the process of concurrent interview data generation and analysis incorporated situational and social world mapping techniques. An outcome of this was our increased awareness of how outside actors influenced participants in their constructions of mentoring. In our attempts to use Clarke's methodological package, however, it became apparent that our constructivist beliefs about human agency could not be reconciled with the postmodern project of discourse analysis. We then turned to the literature on symbolic interactionism and adopted frame analysis as a method to examine the literature on rural nursing and mentoring as a secondary form of data. While we found situational and social world mapping very useful, we were less successful in using positional maps. In retrospect, we would argue that collective action framing provides an alternative to analysing such positions in the literature. This is particularly so for researchers who locate themselves within a constructivist paradigm, and who are therefore unwilling to reject the notion of human agency and the ability of individuals to shape their world in some way. Our example of using this package of situational and social worlds mapping with frame analysis is intended to assist other researchers to locate participants more transparently in the social worlds that they negotiate in their everyday practice.

  1. Combining qualitative and quantitative research within mixed method research designs: a methodological review.

    PubMed

    Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh

    2011-03-01

    It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be a subject of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified, and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers to clarify their theoretical propositions and the basis of their results. This can offer a better understanding of the links between theory and empirical findings, challenge theoretical assumptions and develop new theory.

  2. Fast flux module detection using matroid theory.

    PubMed

    Reimers, Arne C; Bruggeman, Frank J; Olivier, Brett G; Stougie, Leen

    2015-05-01

    Flux balance analysis (FBA) is one of the most often applied methods on genome-scale metabolic networks. Although FBA uniquely determines the optimal yield, the pathway that achieves this is usually not unique. The analysis of the optimal-yield flux space has been an open challenge. Flux variability analysis captures only some properties of the flux space, while elementary mode analysis is intractable due to the enormous number of elementary modes. However, it has been found by Kelk et al. (2012) that the space of optimal-yield fluxes decomposes into flux modules. These decompositions allow a much easier but still comprehensive analysis of the optimal-yield flux space. Using the mathematical definition of module introduced by Müller and Bockmayr (2013b), we discovered useful connections to matroid theory, through which efficient algorithms enable us to compute the decomposition into modules in a few seconds for genome-scale networks. Exploiting the fact that every module can be represented by one reaction that represents its function, we also present a method that uses this decomposition to visualize the interplay of modules. We expect the new method to replace flux variability analysis in the pipelines for metabolic networks.
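    A toy flux balance analysis makes the degeneracy concrete; the three-metabolite network below is invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Reactions: R1 uptake of A; R2: A -> B; R3: A -> C; R4: B -> biomass; R5: C -> biomass.
S = np.array([
    #  R1  R2  R3  R4  R5
    [  1, -1, -1,  0,  0],   # metabolite A
    [  0,  1,  0, -1,  0],   # metabolite B
    [  0,  0,  1,  0, -1],   # metabolite C
])
c = np.array([0, 0, 0, -1, -1])               # maximize R4 + R5, i.e. minimize -(R4+R5)
bounds = [(0, 10), (0, 8), (0, 8), (0, None), (0, None)]

res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
print("optimal yield:", -res.fun, "fluxes:", res.x.round(3))
```

    The optimum here is degenerate: any split of the 10 units of uptake between R2 and R3 (each capped at 8) attains the optimal yield, which is exactly the kind of optimal-yield flux space whose structure the module decomposition exposes.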

  3. The scalar and electromagnetic form factors of the nucleon in dispersively improved Chiral EFT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alarcon, Jose Manuel

    We present a method for calculating the nucleon form factors of G-parity-even operators. This method combines chiral effective field theory (χEFT) and dispersion theory. Through unitarity we factorize the imaginary part of the form factors into a perturbative part, calculable with χEFT, and a non-perturbative part, obtained through other methods. We consider the scalar and electromagnetic (EM) form factors of the nucleon. The results show an important improvement compared to standard chiral calculations, and can be used in analysis of the low-energy properties of the nucleon.

  4. Return Difference Feedback Design for Robust Uncertainty Tolerance in Stochastic Multivariable Control Systems.

    DTIC Science & Technology

    1984-07-01

    "Robustness" analysis for multiloop feedback systems. Reference [55] describes a simple method based on the Perron-Frobenius theory of non-negative... Viewpoint," Operator Theory: Advances and Applications, 12, pp. 277-302, 1984. E. A. Jonckheere, "New Bound on the Sensitivity of the Solution of... Reidel, Dordrecht, Holland, 1984. M. G. Safonov, "Comments on Singular Value Theory in Uncertain Feedback Systems," to appear, IEEE Trans. on Automatic

  5. Basic Research in the Mathematical Foundations of Stability Theory, Control Theory and Numerical Linear Algebra.

    DTIC Science & Technology

    1979-09-01

    without determinantal divisors, Linear and Multilinear Algebra 7 (1979), 107-109. 4. The use of integral operators in number theory (with C. Ryavec and... Gersgorin revisited, to appear in Letters in Linear Algebra. 15. A surprising determinantal inequality for real matrices (with C.R. Johnson), to appear in... Analysis: An Essay Concerning the Limitations of Some Mathematical Methods in the Social, Political and Biological Sciences, David Berlinski, MIT Press

  6. Factors Influencing Achievement in Undergraduate Social Science Research Methods Courses: A Mixed Methods Analysis

    ERIC Educational Resources Information Center

    Markle, Gail

    2017-01-01

    Undergraduate social science research methods courses tend to have higher than average rates of failure and withdrawal. Lack of success in these courses impedes students' progression through their degree programs and negatively impacts institutional retention and graduation rates. Grounded in adult learning theory, this mixed methods study…

  7. The Use of Propensity Scores in Mediation Analysis

    ERIC Educational Resources Information Center

    Jo, Booil; Stuart, Elizabeth A.; MacKinnon, David P.; Vinokur, Amiram D.

    2011-01-01

    Mediation analysis uses measures of hypothesized mediating variables to test theory for how a treatment achieves effects on outcomes and to improve subsequent treatments by identifying the most efficient treatment components. Most current mediation analysis methods rely on untested distributional and functional form assumptions for valid…

  8. When Stepfathers Claim Stepchildren: A Conceptual Analysis

    ERIC Educational Resources Information Center

    Marsiglio, William

    2004-01-01

    Guided by social constructionist and symbolic interactionist perspectives and a grounded theory method, my conceptual analysis explores stepfathers' experiences with claiming stepchildren as their own. Using in-depth interviews with a diverse sample of 36 stepfathers, my analysis focuses on paternal claiming as a core category and generates…

  9. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes; uses a quantitative method for the cost index and integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes; and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents the detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.
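    The information-entropy step admits a compact sketch: entropy measures how little an index discriminates between schemes, and the weights favor the more discriminating indexes. The score matrix is invented for illustration.

```python
import numpy as np

X = np.array([            # rows: candidate schemes; columns: cost, progress, quality, safety
    [0.80, 0.60, 0.90, 0.70],
    [0.70, 0.80, 0.60, 0.90],
    [0.90, 0.70, 0.80, 0.60],
])
m, _ = X.shape
P = X / X.sum(axis=0)                            # normalize each index column
H = -(P * np.log(P)).sum(axis=0) / np.log(m)     # entropy per index, in [0, 1]
w = (1 - H) / (1 - H).sum()                      # entropy weights
score = X @ w                                    # synthesis score per scheme
print("weights:", w.round(3), "best scheme:", int(score.argmax()) + 1)
```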

  10. A steady and oscillatory kernel function method for interfering surfaces in subsonic, transonic and supersonic flow. [prediction analysis techniques for airfoils

    NASA Technical Reports Server (NTRS)

    Cunningham, A. M., Jr.

    1976-01-01

    The theory, results and user instructions for an aerodynamic computer program are presented. The theory is based on linear lifting surface theory, and the method is the kernel function. The program is applicable to multiple interfering surfaces which may be coplanar or noncoplanar. Local linearization was used to treat nonuniform flow problems without shocks. For cases with imbedded shocks, the appropriate boundary conditions were added to account for the flow discontinuities. The data describing nonuniform flow fields must be input from some other source such as an experiment or a finite difference solution. The results are in the form of small linear perturbations about nonlinear flow fields. The method was applied to a wide variety of problems for which it is demonstrated to be significantly superior to the uniform flow method. Program user instructions are given for easy access.

  11. A concept analysis of forensic risk.

    PubMed

    Kettles, A M

    2004-08-01

    Forensic risk is a term used in relation to many forms of clinical practice, such as assessment, intervention and management. Rarely is the term defined in the literature and as a concept it is multifaceted. Concept analysis is a method for exploring and evaluating the meaning of words. It gives precise definitions, both theoretical and operational, for use in theory, clinical practice and research. A concept analysis provides a logical basis for defining terms through providing defining attributes, case examples (model, contrary, borderline, related), antecedents and consequences and the implications for nursing. Concept analysis helps us to refine and define a concept that derives from practice, research or theory. This paper will use the strategy of concept analysis to find a working definition for the concept of forensic risk. In conclusion, the historical background and literature are reviewed using concept analysis to bring the term into focus and to define it more clearly. Forensic risk is found to derive both from forensic practice and from risk theory. A proposed definition of forensic risk is given.

  12. TANTRA YUKTI METHOD OF THEORIZATION IN AYURVEDA

    PubMed Central

    Singh, Anuradha

    2003-01-01

    The method of theorization (the Tantra Yukti-s given in Ayurvedic texts) is analyzed against the backdrop of the scientific method. Thirty-six methodic devices are singled out from the texts for analysis in terms of truth-specific, theory-specific and discourse-specific issues. The paper also points out exact problems in the conception of method in Ayurveda and in science. PMID:22557088

  13. A Guided Tour of Mathematical Methods

    NASA Astrophysics Data System (ADS)

    Snieder, Roel

    2009-04-01

    1. Introduction; 2. Dimensional analysis; 3. Power series; 4. Spherical and cylindrical co-ordinates; 5. The gradient; 6. The divergence of a vector field; 7. The curl of a vector field; 8. The theorem of Gauss; 9. The theorem of Stokes; 10. The Laplacian; 11. Conservation laws; 12. Scale analysis; 13. Linear algebra; 14. The Dirac delta function; 15. Fourier analysis; 16. Analytic functions; 17. Complex integration; 18. Green's functions: principles; 19. Green's functions: examples; 20. Normal modes; 21. Potential theory; 22. Cartesian tensors; 23. Perturbation theory; 24. Asymptotic evaluation of integrals; 25. Variational calculus; 26. Epilogue, on power and knowledge; References.

  14. Quantile Functions, Convergence in Quantile, and Extreme Value Distribution Theory.

    DTIC Science & Technology

    1980-11-01

    Gnanadesikan (1968). Quantile functions are advocated by Parzen (1979) as providing an approach to probability-based data analysis. Quantile functions are... Gnanadesikan, R. (1968). Probability Plotting Methods for the Analysis of Data, Biometrika, 55, 1-17.

  15. Buckling Analysis of Angle-ply Composite and Sandwich Plates by Combination of Geometric Stiffness Matrix

    NASA Astrophysics Data System (ADS)

    Zhen, Wu; Wanji, Chen

    2007-05-01

    The buckling response of angle-ply laminated composite and sandwich plates is analyzed using the global-local higher-order theory in combination with a geometric stiffness matrix. This global-local theory completely fulfills the free surface conditions and the displacement and stress continuity conditions at interfaces. Moreover, the number of unknowns in this theory is independent of the number of layers in the laminate. Based on this global-local theory, a three-noded triangular element satisfying C1 continuity conditions has also been proposed. The bending part of this element is constructed from the concept of the DKT element. In order to improve the accuracy of the analysis, a method of modified geometric stiffness matrix has been introduced. Numerical results show that the present theory not only accurately computes the buckling response of general laminated composite plates but also predicts the critical buckling loads of soft-core sandwiches. However, global higher-order theories as well as first-order theories may encounter difficulties and overestimate the critical buckling loads for soft-core sandwich plates.

  16. Comparison of up-scaling methods in poroelasticity and its generalizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berryman, J G

    2003-12-13

    Four methods of up-scaling coupled equations at the microscale to equations valid at the mesoscale and/or macroscale for fluid-saturated and partially saturated porous media will be discussed, compared, and contrasted. The four methods are: (1) effective medium theory, (2) mixture theory, (3) two-scale and multiscale homogenization, and (4) volume averaging. All these methods have advantages for some applications and disadvantages for others. For example, effective medium theory, mixture theory, and homogenization methods can all give formulas for coefficients in the up-scaled equations, whereas volume averaging methods give the form of the up-scaled equations but generally must be supplemented with physical arguments and/or data in order to determine the coefficients. Homogenization theory requires a great deal of mathematical insight from the user in order to choose appropriate scalings for use in the resulting power-law expansions, while volume averaging requires more physical insight to motivate the steps needed to find coefficients. Homogenization often is performed on periodic models, while volume averaging does not require any assumption of periodicity and can therefore be related very directly to laboratory and/or field measurements. Validity of the homogenization process is often limited to specific ranges of frequency - in order to justify the scaling hypotheses that must be made - and therefore cannot be used easily over wide ranges of frequency. However, volume averaging methods can quite easily be used for wide band data analysis. So, we learn from these comparisons that a researcher in the theory of poroelasticity and its generalizations needs to be conversant with two or more of these methods to solve problems generally.

  17. Analysis of biomolecular solvation sites by 3D-RISM theory.

    PubMed

    Sindhikara, Daniel J; Hirata, Fumio

    2013-06-06

    We derive, implement, and apply equilibrium solvation site analysis for biomolecules. Our method utilizes 3D-RISM calculations to quickly obtain equilibrium solvent distributions without either necessity of simulation or limits of solvent sampling. Our analysis of these distributions extracts highest likelihood poses of solvent as well as localized entropies, enthalpies, and solvation free energies. We demonstrate our method on a structure of HIV-1 protease where excellent structural and thermodynamic data are available for comparison. Our results, obtained within minutes, show systematic agreement with available experimental data. Further, our results are in good agreement with established simulation-based solvent analysis methods. This method can be used not only for visual analysis of active site solvation but also for virtual screening methods and experimental refinement.

  18. Effects of Anchor Item Methods on the Detection of Differential Item Functioning within the Family of Rasch Models

    ERIC Educational Resources Information Center

    Wang, Wen-Chung

    2004-01-01

    Scale indeterminacy in analysis of differential item functioning (DIF) within the framework of item response theory can be resolved by imposing 1 of 3 anchor item methods: the equal-mean-difficulty method, the all-other anchor item method, and the constant anchor item method. In this article, applicability and limitations of these 3 methods are…

  19. The scientific theory profile: A philosophy of science model for science teachers

    NASA Astrophysics Data System (ADS)

    Loving, Cathleen C.

    A model called the Scientific Theory Profile was developed for use with preservice and inservice science teachers or with graduate students interested in the various ways scientific theories are perceived. Early indications - from a survey of institutions with science education programs and a survey of current science methods texts - are that too little emphasis is placed on what contemporary writings reveal about the nature and importance of scientific theories. This prompted the development of the Profile. The Profile consists of a grid, with the x-axis representing methods for judging theories (rational vs. natural), and the y-axis representing views on reigning scientific theories as being the Truth versus models of what works best (realism vs. anti-realism). Three well-known philosophers of science who were selected for detailed analysis and who form the keystone positions on the Profile are Thomas Kuhn, Carl Hempel, and Sir Karl Popper. The hypothesis was that an analysis of the writings of respected individuals in philosophy and history of science who have different perspectives on theories (as well as overarching areas of agreement) could be translated into relative coordinates on a graph; and that this visual model might be helpful to science teachers in developing a balanced philosophy of science and a deeper understanding of the power of reigning theories. Nine other contemporary philosophers, all influenced by the three originals, are included in brief analyses, with their positions on the grid being relative to the keystones. The Scientific Theory Profile then forms the basis for a course, now in the planning stages, in perspectives on the nature of science, primarily for science teachers, with some objectives and activities suggested.

  20. Confronting Analytical Dilemmas for Understanding Complex Human Interactions in Design-Based Research from a Cultural-Historical Activity Theory (CHAT) Framework

    ERIC Educational Resources Information Center

    Yamagata-Lynch, Lisa C.

    2007-01-01

    Understanding human activity in real-world situations often involves complicated data collection, analysis, and presentation methods. This article discusses how Cultural-Historical Activity Theory (CHAT) can inform design-based research practices that focus on understanding activity in real-world situations. I provide a sample data set with…

  1. What Is "Human" in Human Capital Theory? Marking a Transition from Industrial to Postindustrial Education

    ERIC Educational Resources Information Center

    Peers, Chris

    2015-01-01

    This article addresses educational practice as a site for the development of human capital theory. The article considers metaphysical constructions that are broadly typical of educational thought, and shows how they are amenable to economic analysis. Using different Marxist and feminist methods, it discusses pedagogy and the family as kinds of…

  2. Least Squares Distance Method of Cognitive Validation and Analysis for Binary Items Using Their Item Response Theory Parameters

    ERIC Educational Resources Information Center

    Dimitrov, Dimiter M.

    2007-01-01

    The validation of cognitive attributes required for correct answers on binary test items or tasks has been addressed in previous research through the integration of cognitive psychology and psychometric models using parametric or nonparametric item response theory, latent class modeling, and Bayesian modeling. All previous models, each with their…

  3. Translation Fidelity of Psychological Scales: An Item Response Theory Analysis of an Individualism-Collectivism Scale.

    ERIC Educational Resources Information Center

    Bontempo, Robert

    1993-01-01

    Describes a method for assessing the quality of translations based on item response theory (IRT). Results from the IRT technique with French and Chinese versions of a scale measuring individualism-collectivism for samples of 250 U.S., 357 French, and 290 Chinese undergraduates show how several biased items are detected. (SLD)

  4. Videogames, Tools for Change: A Study Based on Activity Theory

    ERIC Educational Resources Information Center

    Méndez, Laura; Lacasa, Pilar

    2015-01-01

    Introduction: The purpose of this study is to provide a framework for analysis from which to interpret the transformations that take place, as perceived by the participants, when commercial video games are used in the classroom. We will show how Activity Theory (AT) is able to explain and interpret these changes. Method: Case studies are…

  5. Latcrit Educational Leadership and Advocacy: Struggling over Whiteness as Property in Texas School Finance

    ERIC Educational Resources Information Center

    Aleman, Enrique, Jr.

    2009-01-01

    In this article, the author seeks to re-imagine the political and policy roles of educational leaders of color, offering an alternative method for educational leadership, advocacy, and policy analysis. The author uses critical race theory (CRT) and Latina/o critical (LatCrit) theory to problematize the way politically-active Mexican American…

  6. Team Performance Pay and Motivation Theory: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Wells, Pamela; Combs, Julie P.; Bustamante, Rebecca M.

    2013-01-01

    This study was conducted to explore teachers' perceptions of a team performance pay program in a large suburban school district through the lens of motivation theories. Mixed data analysis was used to analyze teacher responses from two archival questionnaires (Year 1, n = 368; Year 2, n = 649). Responses from teachers who participated in the team…

  7. An Unsolved Electric Circuit: A Common Misconception

    ERIC Educational Resources Information Center

    Harsha, N. R. Sree; Sreedevi, A.; Prakash, Anupama

    2015-01-01

    Despite a number of theories in circuit analysis, little is known about the behaviour of ideal equal voltage sources in parallel, connected across a resistive load. We have neither a theory that can predict which voltage source provides the load current, nor a method to test it experimentally. In a series of experiments performed…

  8. Model Choice and Sample Size in Item Response Theory Analysis of Aphasia Tests

    ERIC Educational Resources Information Center

    Hula, William D.; Fergadiotis, Gerasimos; Martin, Nadine

    2012-01-01

    Purpose: The purpose of this study was to identify the most appropriate item response theory (IRT) measurement model for aphasia tests requiring 2-choice responses and to determine whether small samples are adequate for estimating such models. Method: Pyramids and Palm Trees (Howard & Patterson, 1992) test data that had been collected from…

  9. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.

  10. Role of regression analysis and variation of rheological data in calculation of pressure drop for sludge pipelines.

    PubMed

    Farno, E; Coventry, K; Slatter, P; Eshtiaghi, N

    2018-06-15

    Sludge pumps in wastewater treatment plants are often oversized due to uncertainty in the calculation of pressure drop. This issue costs industry millions of dollars in purchasing and operating the oversized pumps. Besides cost, the higher electricity consumption is associated with extra CO2 emissions, which carry large environmental impacts. Calculation of pressure drop via current pipe flow theory requires model estimation of flow curve data, which depends on the regression analysis and also varies with the natural variation of rheological data. This study investigates the impact of the variation of rheological data and of the regression analysis on the variation of pressure drop calculated via current pipe flow theories. The results compare the variation of calculated pressure drop between different models and regression methods and indicate the suitability of each method.
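    The dependence the abstract describes is easy to demonstrate: fit a power-law (Ostwald-de Waele) model to flow-curve data, then propagate the fitted parameters into the laminar pressure-drop formula. The rheological data and pipe dimensions below are illustrative only.

```python
import numpy as np

gamma = np.array([10., 20., 50., 100., 200.])   # shear rate, 1/s
tau   = np.array([12., 17., 26., 35., 48.])     # shear stress, Pa

# Least-squares fit of log(tau) = n*log(gamma) + log(K).
n, logK = np.polyfit(np.log(gamma), np.log(tau), 1)
K = np.exp(logK)

def pressure_drop(Q, D, L, K, n):
    """Laminar power-law pipe flow: tau_w = K*(8V/D * (3n+1)/(4n))**n, dP = 4*L*tau_w/D."""
    V = Q / (np.pi * D**2 / 4)                  # mean velocity
    tau_w = K * (8 * V / D * (3 * n + 1) / (4 * n)) ** n
    return 4 * L / D * tau_w

print(f"K = {K:.2f} Pa.s^n, n = {n:.2f}")
print(f"dP = {pressure_drop(Q=0.01, D=0.1, L=100, K=K, n=n) / 1e3:.1f} kPa")
```

    Re-fitting with a different model (e.g. Herschel-Bulkley) or a different regression weighting shifts K and n, and hence the predicted pressure drop, which is the sensitivity the study quantifies.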

  11. Theory and applications of structured light single pixel imaging

    NASA Astrophysics Data System (ADS)

    Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.

    2018-02-01

    Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, they share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework of single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, as well as provide a foundation for developing more advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework for single-pixel imaging results in improved noise robustness and decreased acquisition time, and can take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential associated with single-pixel imaging.

  12. Multivariate Longitudinal Methods for Studying Developmental Relationships between Depression and Academic Achievement

    ERIC Educational Resources Information Center

    Grimm, Kevin J.

    2007-01-01

    Recent advances in methods and computer software for longitudinal data analysis have pushed researchers to more critically examine developmental theories. In turn, researchers have also begun to push longitudinal methods by asking more complex developmental questions. One such question involves the relationships between two developmental…

  13. An aggregate method to calibrate the reference point of cumulative prospect theory-based route choice model for urban transit network

    NASA Astrophysics Data System (ADS)

    Zhang, Yufeng; Long, Man; Luo, Sida; Bao, Yu; Shen, Hanxia

    2015-12-01

    The transit route choice model is a key technology for public transit systems planning and management. Traditional route choice models are mostly based on expected utility theory, which has an evident shortcoming: it cannot accurately portray travelers' subjective route choice behavior because their risk preferences are not taken into consideration. Cumulative prospect theory (CPT), a more recent theory, can be used to describe travelers' decision-making process under the uncertainty of transit supply and the risk preferences of multiple types of travelers. The method used to calibrate the reference point, a key parameter of the CPT-based transit route choice model, determines the precision of the model to a great extent. In this paper, a new method is put forward to obtain the value of the reference point, combining theoretical calculation with field investigation results. Comparing the proposed method with the traditional one shows that the new method improves the quality of the CPT-based model by more accurately simulating travelers' route choice behavior, based on a transit trip investigation from Nanjing, China. The proposed method is of great significance for rational transit planning and management, and to some extent remedies the defect of obtaining the reference point solely through qualitative analysis.

  14. [Analysis and experimental verification of sensitivity and SNR of laser warning receiver].

    PubMed

    Zhang, Ji-Long; Wang, Ming; Tian, Er-Ming; Li, Xiao; Wang, Zhi-Bin; Zhang, Yue

    2009-01-01

    In order to counter the increasingly serious threat from hostile lasers in modern war, research on laser warning technology and systems is urgent; the sensitivity and the signal-to-noise ratio (SNR) are two important performance parameters of a laser warning system. In the present paper, based on signal statistical detection theory, a method for calculating the sensitivity and SNR of a coherent-detection laser warning receiver (LWR) is proposed. First, the probability distributions of the laser signal and the receiver noise are analyzed. Second, based on threshold detection theory and the Neyman-Pearson criterion, the signal current equation is established by introducing the detection probability and false alarm rate factors, and the mathematical expressions for sensitivity and SNR are deduced. Finally, using this method, the sensitivity and SNR of the sinusoidal-grating laser warning receiver developed by our group are analyzed; the theoretical calculations and experimental results indicate that the SNR analysis method is feasible and can be used in the performance analysis of LWRs.
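    The Neyman-Pearson step reduces, for a single Gaussian-statistics channel, to a two-line computation: fix the false alarm rate, derive the threshold, then read off the detection probability at a given SNR. This simplified sketch is ours, not the receiver model of the paper.

```python
from scipy.stats import norm

def detection_probability(snr, pfa):
    """P_D for a known signal in Gaussian noise at false alarm rate P_FA."""
    threshold = norm.isf(pfa)           # threshold in units of the noise sigma
    return norm.sf(threshold - snr)     # P_D = Q(Q^{-1}(P_FA) - SNR)

for snr in (3, 5, 7):                   # SNR expressed in noise-sigma units
    print(snr, round(detection_probability(snr, pfa=1e-6), 4))
```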

  15. Advances and trends in structures and dynamics; Proceedings of the Symposium, Washington, DC, October 22-25, 1984

    NASA Technical Reports Server (NTRS)

    Noor, A. K. (Editor); Hayduk, R. J. (Editor)

    1985-01-01

    Among the topics discussed are developments in structural engineering hardware and software, computation for fracture mechanics, trends in numerical analysis and parallel algorithms, mechanics of materials, advances in finite element methods, composite materials and structures, determinations of random motion and dynamic response, optimization theory, automotive tire modeling methods and contact problems, the damping and control of aircraft structures, and advanced structural applications. Specific topics covered include structural design expert systems, the evaluation of finite element system architectures, systolic arrays for finite element analyses, nonlinear finite element computations, hierarchical boundary elements, adaptive substructuring techniques in elastoplastic finite element analyses, automatic tracking of crack propagation, a theory of rate-dependent plasticity, the torsional stability of nonlinear eccentric structures, a computation method for fluid-structure interaction, the seismic analysis of three-dimensional soil-structure interaction, a stress analysis for a composite sandwich panel, toughness criterion identification for unidirectional composite laminates, the modeling of submerged cable dynamics, and damping synthesis for flexible spacecraft structures.

  16. The development of methods for the prediction of primary creep behavior in metals

    NASA Technical Reports Server (NTRS)

    Zerwekh, R. P.

    1978-01-01

    The applicability of a thermodynamic constitutive theory of deformation to the prediction of primary creep and creep strain relaxation behavior in metals is examined. Constitutive equations derived from the theory are subjected to a parametric analysis in order to determine the influence of several parameters on the curve forms generated by the equations. A computer program is developed which enables the solution of a generalized constitutive equation using experimental data as input. Several metals were tested to form a data base of primary creep and relaxation behavior. The extent to which these materials conformed to the constitutive equation showed wide variability, with the alloy Ti-6Al-4V exhibiting the most consistent results. Accordingly, most of the analysis is concentrated upon data from that alloy, although creep and relaxation data from all the materials tested are presented. Experimental methods are outlined as well as some variations in methods of analysis. Various theoretical and practical implications of the work are discussed.

  17. Analysis of high vacuum systems using SINDA'85

    NASA Technical Reports Server (NTRS)

    Spivey, R. A.; Clanton, S. E.; Moore, J. D.

    1993-01-01

    The theory, algorithms, and test data correlation analysis of a math model developed to predict performance of the Space Station Freedom Vacuum Exhaust System are presented. The theory used to predict the flow characteristics of viscous, transition, and molecular flow is presented in detail. Development of user subroutines which predict the flow characteristics in conjunction with the SINDA'85/FLUINT analysis software are discussed. The resistance-capacitance network approach with application to vacuum system analysis is demonstrated and results from the model are correlated with test data. The model was developed to predict the performance of the Space Station Freedom Vacuum Exhaust System. However, the unique use of the user subroutines developed in this model and written into the SINDA'85/FLUINT thermal analysis model provides a powerful tool that can be used to predict the transient performance of vacuum systems and gas flow in tubes of virtually any geometry. This can be accomplished using a resistance-capacitance (R-C) method very similar to the methods used to perform thermal analyses.
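
    The resistance-capacitance analogy can be sketched in a few lines of Python for a single pumped volume; the long-tube molecular-flow conductance formula for room-temperature air is a standard approximation, and the geometry is hypothetical rather than taken from the Space Station model.

      import numpy as np

      def tube_conductance_molecular(d_cm, length_cm):
          # Long-tube molecular-flow conductance for air at ~20 C, in L/s,
          # using the standard approximation C = 12.1 * d**3 / L (d, L in cm).
          return 12.1 * d_cm**3 / length_cm

      def pumpdown_pressure(p0_torr, volume_l, conductance_ls, t_s):
          # One R-C node: the volume acts as capacitance and 1/conductance as
          # resistance, so pressure decays exponentially with tau = V / C.
          tau = volume_l / conductance_ls
          return p0_torr * np.exp(-t_s / tau)

      c = tube_conductance_molecular(d_cm=2.0, length_cm=100.0)   # ~0.97 L/s
      print(pumpdown_pressure(1e-3, volume_l=50.0, conductance_ls=c, t_s=60.0))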

  18. The evaluation of student-centredness of teaching and learning: a new mixed-methods approach.

    PubMed

    Lemos, Ana R; Sandars, John E; Alves, Palmira; Costa, Manuel J

    2014-08-14

    The aim of the study was to develop and consider the usefulness of a new mixed-methods approach to evaluate the student-centredness of teaching and learning on undergraduate medical courses. An essential paradigm for the evaluation was the coherence between how teachers conceptualise their practice (espoused theories) and their actual practice (theories-in-use). The context was a module within an integrated basic sciences course in an undergraduate medical degree programme. The programme had an explicit intention of providing a student-centred curriculum. A content analysis framework based on Weimer's dimensions of student-centred teaching was used to analyse data collected from individual interviews with seven teachers to identify espoused theories, and from 34 hours of classroom observations and one student focus group to identify theories-in-use. The interviewees were identified by purposeful sampling. The findings from the three methods were triangulated to evaluate the student-centredness of teaching and learning on the course. Different, but complementary, perspectives of the student-centredness of teaching and learning were identified by each method. The triangulation of the findings revealed coherence between the teachers' espoused theories and theories-in-use. A mixed-methods approach that combined classroom observations with interviews from a purposeful sample of teachers and students offered a useful evaluation of the extent of student-centredness of teaching and learning of this basic science course. Our case study suggests that this new approach is applicable to other courses in medical education.

  19. Floquet stability analysis of the longitudinal dynamics of two hovering model insects

    PubMed Central

    Wu, Jiang Hao; Sun, Mao

    2012-01-01

    Because of the periodically varying aerodynamic and inertial forces of the flapping wings, a hovering or constant-speed flying insect is a cyclically forcing system, and, generally, the flight is not in a fixed-point equilibrium, but in a cyclic-motion equilibrium. Current stability theory of insect flight is based on the averaged model and treats the flight as a fixed-point equilibrium. In the present study, we treated the flight as a cyclic-motion equilibrium and used the Floquet theory to analyse the longitudinal stability of insect flight. Two hovering model insects were considered—a dronefly and a hawkmoth. The former had relatively high wingbeat frequency and small wing-mass to body-mass ratio, and hence very small amplitude of body oscillation; while the latter had relatively low wingbeat frequency and large wing-mass to body-mass ratio, and hence relatively large amplitude of body oscillation. For comparison, analysis using the averaged-model theory (fixed-point stability analysis) was also made. Results of both the cyclic-motion stability analysis and the fixed-point stability analysis were tested by numerical simulation using complete equations of motion coupled with the Navier–Stokes equations. The Floquet theory (cyclic-motion stability analysis) agreed well with the simulation for both the model dronefly and the model hawkmoth; but the averaged-model theory gave good results only for the dronefly. Thus, for an insect with relatively large body oscillation at wingbeat frequency, cyclic-motion stability analysis is required, and for their control analysis, the existing well-developed control theories for systems of fixed-point equilibrium are no longer applicable and new methods that take the cyclic variation of the flight dynamics into account are needed. PMID:22491980
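
    The Floquet machinery referred to above can be sketched briefly in Python: integrate the fundamental matrix of a linear periodic system over one period to obtain the monodromy matrix, whose eigenvalues (the Floquet multipliers) decide cyclic-motion stability. A damped Mathieu oscillator stands in here for the linearized flight dynamics; the insect models themselves are not reproduced.

      import numpy as np
      from scipy.integrate import solve_ivp

      delta, eps, c = 1.0, 0.3, 0.1          # illustrative parameters
      T = 2 * np.pi                          # forcing period

      def rhs(t, y):
          # Damped Mathieu equation x'' + c x' + (delta + eps cos t) x = 0.
          x, v = y
          return [v, -c * v - (delta + eps * np.cos(t)) * x]

      # Monodromy matrix: propagate the identity basis over one period.
      cols = [solve_ivp(rhs, (0.0, T), e, rtol=1e-10, atol=1e-12).y[:, -1]
              for e in ([1.0, 0.0], [0.0, 1.0])]
      multipliers = np.linalg.eigvals(np.column_stack(cols))
      print(multipliers, "stable:", bool(np.all(np.abs(multipliers) < 1.0)))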

  20. Weighted Fuzzy Risk Priority Number Evaluation of Turbine and Compressor Blades Considering Failure Mode Correlations

    NASA Astrophysics Data System (ADS)

    Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-06-01

    Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools for evaluating the reliability of systems. Although the single-failure-mode case can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis; in fact, owing to lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for each single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority numbers (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probability of each MCS. Compared with the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates the error of reliability analysis. Furthermore, for the purpose of quantitative analysis, probability importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definitions of probability weight and FRPN and results in a more accurate estimation than that of the traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method. The results provide important insights into fatigue reliability analysis and risk priority assessment of structural systems under failure correlations.
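
    The weighted-geometric-mean core of the method can be sketched in Python for triangular fuzzy ratings; this shows only the FWGM RPN step under stated assumptions, not the full FPWGM method with minimum cut sets and Copula-based correlation.

      import numpy as np

      def fwgm_rpn(tfns, weights):
          # Weighted geometric mean of triangular fuzzy ratings, each given as
          # a (low, mode, high) triple. Since x**w is monotone for x > 0, the
          # mean can be applied component-wise to the three defining points.
          tfns, w = np.asarray(tfns, float), np.asarray(weights, float)
          return np.prod(tfns ** w[:, None], axis=0)

      def centroid(tfn):
          # Simple centroid defuzzification of a triangular fuzzy number.
          return sum(tfn) / 3.0

      # Hypothetical fuzzy severity / occurrence / detection ratings.
      ratings = [(6, 7, 8), (4, 5, 6), (3, 4, 5)]
      rpn = fwgm_rpn(ratings, weights=[1 / 3] * 3)
      print(rpn, centroid(rpn))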

  1. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.

  2. More efficient parameter estimates for factor analysis of ordinal variables by ridge generalized least squares.

    PubMed

    Yuan, Ke-Hai; Jiang, Ge; Cheng, Ying

    2017-11-01

    Data in psychology are often collected using Likert-type scales, and it has been shown that factor analysis of Likert-type data is better performed on the polychoric correlation matrix than on the product-moment covariance matrix, especially when the distributions of the observed variables are skewed. In theory, factor analysis of the polychoric correlation matrix is best conducted using generalized least squares with an asymptotically correct weight matrix (AGLS). However, simulation studies showed that both least squares (LS) and diagonally weighted least squares (DWLS) perform better than AGLS, and thus LS or DWLS is routinely used in practice. In either LS or DWLS, the associations among the polychoric correlation coefficients are completely ignored. To mend such a gap between statistical theory and empirical work, this paper proposes new methods, called ridge GLS, for factor analysis of ordinal data. Monte Carlo results show that, for a wide range of sample sizes, ridge GLS methods yield uniformly more accurate parameter estimates than existing methods (LS, DWLS, AGLS). A real-data example indicates that estimates by ridge GLS are 9-20% more efficient than those by existing methods. Rescaled and adjusted test statistics as well as sandwich-type standard errors following the ridge GLS methods also perform reasonably well. © 2017 The British Psychological Society.

  3. Unified Aeroacoustics Analysis for High Speed Turboprop Aerodynamics and Noise. Volume 1; Development of Theory for Blade Loading, Wakes, and Noise

    NASA Technical Reports Server (NTRS)

    Hanson, D. B.

    1991-01-01

    A unified theory for the aerodynamics and noise of advanced turboprops is presented. Aerodynamic topics include calculation of performance, blade load distribution, and non-uniform wake flow fields. Blade loading can be steady or unsteady due to fixed distortion, counter-rotating wakes, or blade vibration. The aerodynamic theory is based on the pressure potential method and is therefore basically linear. However, nonlinear effects associated with finite axial induction and blade vortex flow are included via approximate methods. Acoustic topics include radiation of noise caused by blade thickness, steady loading (including vortex lift), and unsteady loading. Shielding of the fuselage by its boundary layer and by the wing is treated in separate analyses that are compatible but not integrated with the aeroacoustic theory for rotating blades.

  4. QTest: Quantitative Testing of Theories of Binary Choice

    PubMed Central

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  5. Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory

    PubMed Central

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-01-01

    Objective To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principal Findings We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625

  6. Qualitative data analysis for health services research: developing taxonomy, themes, and theory.

    PubMed

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-08-01

    To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. DATA SOURCES AND DESIGN: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines.

  7. “History, Theory and Ethics of Medicine”: The Last Ten Years. A Survey of Course Content, Methods and Structural Preconditions at Twenty-nine German Medical Faculties

    PubMed Central

    Schildmann, Jan; Bruns, Florian; Hess, Volker; Vollmann, Jochen

    2017-01-01

    Objective: “History, Theory, Ethics of Medicine” (German: “Geschichte, Theorie, Ethik der Medizin”, abbreviation: GTE) has formed part of the obligatory curriculum for medical students in Germany since the winter semester of 2003/2004. This paper presents the results of a national survey on the contents, methods and framework of GTE teaching. Methods: Semi-structured questionnaire dispatched in July 2014 to 38 institutions responsible for GTE teaching. Descriptive analysis of quantitative data and content analysis of free-text answers. Results: It was possible to collect data from 29 institutes responsible for GTE teaching (response: 76%). There is at least one professorial chair for GTE in 19 faculties; two professorial chairs or professorships remained vacant at the time of the survey. The number of students taught per academic year ranges from <100 to >350. Teaching in GTE comprises an average of 2.18 hours per week per semester (min: 1, max: 6). Teaching in GTE is distributed, on average, as follows: history 35.4%, theory 14.7%, and ethics 49.9%. Written learning objectives were formulated for GTE in 24 faculties. The themes in history, theory or ethics that respondents think should be taught cover a broad spectrum and vary. Teaching in ethics (79 of a maximum of 81 possible points) is, when compared to history (61/81) and theory (53/81), attributed the most significance for the training of medical doctors. Conclusion: Ten years after the introduction of GTE, the number of students and the personnel resources available at the institutions vary considerably. In light of the differences regarding content elicited in this study, the pros and cons of heterogeneity in GTE should be discussed. PMID:28584871

  8. "History, Theory and Ethics of Medicine": The Last Ten Years. A Survey of Course Content, Methods and Structural Preconditions at Twenty-nine German Medical Faculties.

    PubMed

    Schildmann, Jan; Bruns, Florian; Hess, Volker; Vollmann, Jochen

    2017-01-01

    Objective: "History, Theory, Ethics of Medicine" (German: "Geschichte, Theorie, Ethik der Medizin", abbreviation: GTE) forms part of the obligatory curriculum for medical students in Germany since the winter semester 2003/2004. This paper presents the results of a national survey on the contents, methods and framework of GTE teaching. Methods: Semi-structured questionnaire dispatched in July 2014 to 38 institutions responsible for GTE teaching. Descriptive analysis of quantitative data and content analysis of free-text answers. Results: It was possible to collect data from 29 institutes responsible for GTE teaching (response: 76%). There is at least one professorial chair for GTE in 19 faculties; two professorial chairs or professorships remained vacant at the time of the survey. The number of students taught per academic year ranges from <100 to >350. Teaching in GTE comprises an average of 2.18 hours per week per semester (min: 1, max: 6). Teaching in GTE is proportionally distributed according to an arithmetic average as follows: history: 35.4%, theory 14.7% and ethics 49.9%. Written learning objectives were formulated for GTE in 24 faculties. The preferred themes of teaching in history, theory or ethics which according to respondents should be taught comprise a broad spectrum and vary. Teaching in ethics (79 from a max. of 81 possible points) is, when compared to history (61/81) and theory (53/81), attributed the most significance for the training of medical doctors. Conclusion: 10 years after the introduction of GTE the number of students and the personnel resources available at the institutions vary considerably. In light of the differences regarding the content elicited in this study the pros and cons of heterogeneity in GTE should be discussed.

  9. Concurrent analysis: towards generalisable qualitative research.

    PubMed

    Snowden, Austyn; Martin, Colin R

    2011-10-01

    This study develops an original method of qualitative analysis coherent with its interpretivist principles. The objective is to increase the likelihood of achieving generalisability and so improve the chance of the findings being translated into practice. Good qualitative research depends on coherent analysis of different types of data. The limitations of existing methodologies are first discussed to justify the need for a novel approach. To illustrate this approach, primary evidence is presented using the new methodology. The primary evidence consists of a constructivist grounded theory of how mental health nurses with prescribing authority integrate prescribing into practice. This theory is built concurrently from interviews, reflective accounts and case study data from the literature. Concurrent analysis. Ten research articles and 13 semi-structured interviews were sampled purposively and then theoretically and analysed concurrently using constructivist grounded theory. A theory of the process of becoming competent in mental health nurse prescribing was generated through this process. This theory was validated by 32 practising mental health nurse prescribers as an accurate representation of their experience. The methodology generated a coherent and generalisable theory. It is therefore claimed that concurrent analysis engenders consistent and iterative treatment of different sources of qualitative data in a manageable manner. This process supports the highest standard of qualitative research. Concurrent analysis removes the artificial delineation of relevant literature from other forms of constructed data. This gives researchers clear direction to treat qualitative data consistently, raising the chances of generalisability of the findings. Raising the generalisability of qualitative research will increase its chances of informing clinical practice. © 2010 Blackwell Publishing Ltd.

  10. JOURNAL SCOPE GUIDELINES: Paper classification scheme

    NASA Astrophysics Data System (ADS)

    2005-06-01

    This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org). 1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory. 2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models. 3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods. 4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics. 5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials. 6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.

  11. Rigour and grounded theory.

    PubMed

    Cooney, Adeline

    2011-01-01

    This paper explores ways to enhance and demonstrate rigour in a grounded theory study. Grounded theory is sometimes criticised for a lack of rigour. Beck (1993) identified credibility, auditability and fittingness as the main standards of rigour for qualitative research methods. These criteria were evaluated for applicability to a Straussian grounded theory study and expanded or refocused where necessary. The author uses a Straussian grounded theory study (Cooney, in press) to examine how the revised criteria can be applied when conducting a grounded theory study. Strauss and Corbin's (1998b) criteria for judging the adequacy of a grounded theory were examined in the context of the wider literature examining rigour in qualitative research studies in general and grounded theory studies in particular. A literature search for 'rigour' and 'grounded theory' was carried out to support this analysis. Criteria are suggested for enhancing and demonstrating the rigour of a Straussian grounded theory study. These include: cross-checking emerging concepts against participants' meanings, asking experts whether the theory 'fits' their experiences, and recording detailed memos outlining all analytical and sampling decisions. IMPLICATIONS FOR RESEARCH PRACTICE: The criteria identified have been expressed as questions to enable novice researchers to audit the extent to which they are demonstrating rigour when writing up their studies. However, it should not be forgotten that rigour is built into the grounded theory method through the inductive-deductive cycle of theory generation. Care in applying the grounded theory methodology correctly is the single most important factor in ensuring rigour.

  12. Application of Petri net theory for modelling and validation of the sucrose breakdown pathway in the potato tuber.

    PubMed

    Koch, Ina; Junker, Björn H; Heiner, Monika

    2005-04-01

    Because of the complexity of metabolic networks and their regulation, formal modelling is a useful method to improve the understanding of these systems. An essential step in network modelling is to validate the network model. Petri net theory provides algorithms and methods, which can be applied directly to metabolic network modelling and analysis in order to validate the model. The metabolism between sucrose and starch in the potato tuber is of great research interest. Even if the metabolism is one of the best studied in sink organs, it is not yet fully understood. We provide an approach for model validation of metabolic networks using Petri net theory, which we demonstrate for the sucrose breakdown pathway in the potato tuber. We start with hierarchical modelling of the metabolic network as a Petri net and continue with the analysis of qualitative properties of the network. The results characterize the net structure and give insights into the complex net behaviour.
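
    One standard validation step of this kind, computing place invariants (P-invariants), reduces to a null-space calculation on the net's incidence matrix. The following Python/SymPy sketch uses a toy two-place net, not the sucrose breakdown model itself.

      import sympy as sp

      # Toy net with two places and two transitions (A -> B, B -> A).
      # P-invariants are vectors y with y^T C = 0, i.e. the left null
      # space of the incidence matrix C (rows = places, cols = transitions).
      C = sp.Matrix([[-1,  1],
                     [ 1, -1]])
      invariants = C.T.nullspace()
      print(invariants)   # [Matrix([[1], [1]])] -> token count A + B is conserved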

  13. Buckling analysis for anisotropic laminated plates under combined inplane loads

    NASA Technical Reports Server (NTRS)

    Viswanathan, A. V.; Tamekuni, M.; Baker, L. L.

    1974-01-01

    The buckling analysis presented considers rectangular flat or curved general laminates subjected to combined inplane normal and shear loads. Linear theory is used in the analysis. All prebuckling deformations and any initial imperfections are ignored. The analysis method can be readily extended to longitudinally stiffened structures subjected to combined inplane normal and shear loads.

  14. Distributed collaborative response surface method for mechanical dynamic assembly reliability design

    NASA Astrophysics Data System (ADS)

    Bai, Guangchen; Fei, Chengwei

    2013-11-01

    Because of the randomness of many impact factors influencing the dynamic assembly relationships of complex machinery, the reliability analysis of dynamic assembly relationships needs to be performed from a probabilistic perspective. To improve the accuracy and efficiency of dynamic assembly relationship reliability analysis, a mechanical dynamic assembly reliability (MDAR) theory and a distributed collaborative response surface method (DCRSM) are proposed. The mathematical model of the DCRSM is established based on the quadratic response surface function, and verified by the assembly relationship reliability analysis of aeroengine high pressure turbine (HPT) blade-tip radial running clearance (BTRRC). Through comparison of the DCRSM, the traditional response surface method (RSM) and the Monte Carlo method (MCM), the results show that the DCRSM is not only able to accomplish computational tasks that are impossible for the other methods when the number of simulations exceeds 100 000, but also that its computational precision is basically consistent with the MCM and improved by 0.40-4.63% relative to the RSM; furthermore, the computational efficiency of the DCRSM is about 188 times that of the MCM and 55 times that of the RSM at 10 000 simulations. The DCRSM is demonstrated to be a feasible and effective approach for markedly improving the computational efficiency and accuracy of MDAR analysis. Thus, the proposed research provides a promising theory and method for MDAR design and optimization, and opens a novel research direction of probabilistic analysis for developing high-performance, high-reliability aeroengines.
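
    The quadratic-response-surface idea at the heart of the method can be sketched as follows: fit a full quadratic surrogate to a small set of limit-state evaluations, then run inexpensive Monte Carlo on the surrogate. The limit-state function below is hypothetical, and the distributed collaborative decomposition of the actual DCRSM is not reproduced.

      import numpy as np

      rng = np.random.default_rng(0)

      def limit_state(x):
          # Hypothetical clearance margin in two random inputs; failure when
          # the margin is negative (a stand-in for the BTRRC model).
          return 1.5 - x[:, 0] ** 2 - 0.5 * x[:, 0] * x[:, 1] - 0.2 * x[:, 1]

      def quad_basis(x):
          # Full quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2.
          return np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                                  x[:, 0] ** 2, x[:, 1] ** 2, x[:, 0] * x[:, 1]])

      X = rng.normal(size=(50, 2))                   # small design of experiments
      coef, *_ = np.linalg.lstsq(quad_basis(X), limit_state(X), rcond=None)

      S = rng.normal(size=(200_000, 2))              # cheap MC on the surrogate
      print("P(failure) ~", np.mean(quad_basis(S) @ coef < 0.0))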

  15. Continuation Methods for Qualitative Analysis of Aircraft Dynamics

    NASA Technical Reports Server (NTRS)

    Cummings, Peter A.

    2004-01-01

    A class of numerical methods for constructing bifurcation curves for systems of coupled, non-linear ordinary differential equations is presented. Foundations are discussed, and several variations are outlined along with their respective capabilities. Appropriate background material from dynamical systems theory is presented.
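
    A minimal sketch of the simplest member of this class, natural-parameter continuation: march the parameter and reuse the previous equilibrium as the Newton starting guess at each step. The scalar example is illustrative; pseudo-arclength variants are needed to round fold points.

      import numpy as np

      def continue_equilibria(f, dfdx, x0, mus, newton_steps=20):
          # Trace the equilibrium branch f(x, mu) = 0 as mu varies.
          branch, x = [], x0
          for mu in mus:
              for _ in range(newton_steps):       # Newton iteration
                  x -= f(x, mu) / dfdx(x, mu)
              branch.append(x)
          return np.array(branch)

      # Upper equilibrium branch of x' = mu - x**2 (a saddle-node normal form).
      f = lambda x, mu: mu - x ** 2
      dfdx = lambda x, mu: -2.0 * x
      mus = np.linspace(1.0, 0.05, 40)
      print(continue_equilibria(f, dfdx, 1.0, mus)[-1])   # ~ sqrt(0.05)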

  16. Interpretive analysis of 85 systematic reviews suggests that narrative syntheses and meta-analyses are incommensurate in argumentation.

    PubMed

    Melendez-Torres, G J; O'Mara-Eves, A; Thomas, J; Brunton, G; Caird, J; Petticrew, M

    2017-03-01

    Using Toulmin's argumentation theory, we analysed the texts of systematic reviews in the area of workplace health promotion to explore differences in the modes of reasoning embedded in reports of narrative synthesis as compared with reports of meta-analysis. We used framework synthesis, grounded theory and cross-case analysis methods to analyse 85 systematic reviews addressing intervention effectiveness in workplace health promotion. Two core categories, or 'modes of reasoning', emerged to frame the contrast between narrative synthesis and meta-analysis: practical-configurational reasoning in narrative synthesis ('what is going on here? What picture emerges?') and inferential-predictive reasoning in meta-analysis ('does it work, and how well? Will it work again?'). Modes of reasoning examined quality and consistency of the included evidence differently. Meta-analyses clearly distinguished between warrant and claim, whereas narrative syntheses often presented joint warrant-claims. Narrative syntheses and meta-analyses represent different modes of reasoning. Systematic reviewers are likely to be addressing research questions in different ways with each method. It is important to consider narrative synthesis in its own right as a method and to develop specific quality criteria and understandings of how it is carried out, not merely as a complement to, or second-best option for, meta-analysis. © 2016 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd. © 2016 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.

  17. Comparison of contact conditions obtained by direct simulation with statistical analysis for normally distributed isotropic surfaces

    NASA Astrophysics Data System (ADS)

    Uchidate, M.

    2018-09-01

    In this study, with the aim of establishing systematic knowledge of the impact of summit extraction methods and stochastic model selection in rough contact analysis, the contact area ratio (A_r/A_a) obtained by statistical contact models with different summit extraction methods was compared with a direct simulation using the boundary element method (BEM). Fifty areal topography datasets with different autocorrelation functions in terms of the power index and correlation length were used for the investigation. The non-causal 2D auto-regressive model, which can generate datasets with specified parameters, was employed in this research. Three summit extraction methods, Nayak's theory, 8-point analysis and watershed segmentation, were examined. With regard to the stochastic model, Bhushan's model and the BGT (Bush-Gibson-Thomas) model were applied. The values of A_r/A_a from the stochastic models tended to be smaller than those from the BEM. The discrepancy between Bhushan's model with the 8-point analysis and the BEM was slightly smaller than with Nayak's theory. The results with the watershed segmentation were similar to those with the 8-point analysis. The impact of Wolf pruning on the discrepancy between the stochastic analysis and the BEM was not very clear. In the case of the BGT model, which employs surface gradients, good quantitative agreement with the BEM was obtained when Nayak's bandwidth parameter was large.
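
    The 8-point analysis referred to above amounts to comparing each interior grid point with its eight neighbours; a minimal Python sketch on a synthetic (not measured) topography:

      import numpy as np

      def summits_8point(z):
          # A point is a summit if it is higher than all eight neighbours.
          c = z[1:-1, 1:-1]
          is_summit = np.ones_like(c, dtype=bool)
          for di in (-1, 0, 1):
              for dj in (-1, 0, 1):
                  if di or dj:
                      is_summit &= c > z[1 + di:z.shape[0] - 1 + di,
                                         1 + dj:z.shape[1] - 1 + dj]
          return is_summit

      rng = np.random.default_rng(1)
      z = rng.normal(size=(256, 256))   # synthetic stand-in for a topography map
      print("summit density:", summits_8point(z).mean())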

  18. Interpretive analysis of 85 systematic reviews suggests that narrative syntheses and meta‐analyses are incommensurate in argumentation

    PubMed Central

    O'Mara‐Eves, A.; Thomas, J.; Brunton, G.; Caird, J.; Petticrew, M.

    2016-01-01

    Using Toulmin's argumentation theory, we analysed the texts of systematic reviews in the area of workplace health promotion to explore differences in the modes of reasoning embedded in reports of narrative synthesis as compared with reports of meta‐analysis. We used framework synthesis, grounded theory and cross‐case analysis methods to analyse 85 systematic reviews addressing intervention effectiveness in workplace health promotion. Two core categories, or ‘modes of reasoning’, emerged to frame the contrast between narrative synthesis and meta‐analysis: practical–configurational reasoning in narrative synthesis (‘what is going on here? What picture emerges?’) and inferential–predictive reasoning in meta‐analysis (‘does it work, and how well? Will it work again?’). Modes of reasoning examined quality and consistency of the included evidence differently. Meta‐analyses clearly distinguished between warrant and claim, whereas narrative syntheses often presented joint warrant–claims. Narrative syntheses and meta‐analyses represent different modes of reasoning. Systematic reviewers are likely to be addressing research questions in different ways with each method. It is important to consider narrative synthesis in its own right as a method and to develop specific quality criteria and understandings of how it is carried out, not merely as a complement to, or second‐best option for, meta‐analysis. © 2016 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd. PMID:27860329

  19. Information theory applications for biological sequence analysis.

    PubMed

    Vinga, Susana

    2014-05-01

    Information theory (IT) addresses the analysis of communication systems and has been widely applied in molecular biology. In particular, alignment-free sequence analysis and comparison greatly benefited from concepts derived from IT, such as entropy and mutual information. This review covers several aspects of IT applications, ranging from genome global analysis and comparison, including block-entropy estimation and resolution-free metrics based on iterative maps, to local analysis, comprising the classification of motifs, prediction of transcription factor binding sites and sequence characterization based on linguistic complexity and entropic profiles. IT has also been applied to high-level correlations that combine DNA, RNA or protein features with sequence-independent properties, such as gene mapping and phenotype analysis, and has also provided models based on communication systems theory to describe information transmission channels at the cell level and also during evolutionary processes. While not exhaustive, this review attempts to categorize existing methods and to indicate their relation with broader transversal topics such as genomic signatures, data compression and complexity, time series analysis and phylogenetic classification, providing a resource for future developments in this promising area.
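
    As a concrete example of the block-entropy estimation mentioned above, the following Python sketch computes the Shannon entropy of the k-mer distribution of a toy DNA fragment (illustrative only):

      from collections import Counter
      from math import log2

      def block_entropy(seq, k):
          # Shannon entropy (bits) of the k-mer (block) distribution.
          counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
          n = sum(counts.values())
          return -sum((c / n) * log2(c / n) for c in counts.values())

      seq = "ATGCGATACGCTTGAGATTACGCGGAGGCT"     # toy DNA fragment
      for k in (1, 2, 3):
          print(k, round(block_entropy(seq, k), 3))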

  20. ON THE THEORY AND PROCEDURE FOR CONSTRUCTING A MINIMAL-LENGTH, AREA-CONSERVING FREQUENCY POLYGON FROM GROUPED DATA.

    ERIC Educational Resources Information Center

    CASE, C. MARSTON

    THIS PAPER IS CONCERNED WITH GRAPHIC PRESENTATION AND ANALYSIS OF GROUPED OBSERVATIONS. IT PRESENTS A METHOD AND SUPPORTING THEORY FOR THE CONSTRUCTION OF AN AREA-CONSERVING, MINIMAL LENGTH FREQUENCY POLYGON CORRESPONDING TO A GIVEN HISTOGRAM. TRADITIONALLY, THE CONCEPT OF A FREQUENCY POLYGON CORRESPONDING TO A GIVEN HISTOGRAM HAS REFERRED TO THAT…

  1. On-Orbit System Identification

    NASA Technical Reports Server (NTRS)

    Mettler, E.; Milman, M. H.; Bayard, D.; Eldred, D. B.

    1987-01-01

    Information derived from accelerometer readings benefits important engineering and control functions. Report discusses methodology for detection, identification, and analysis of motions within space station. Techniques of vibration and rotation analyses, control theory, statistics, filter theory, and transform methods integrated to form system for generating models and model parameters that characterize total motion of complicated space station, with respect to both control-induced and random mechanical disturbances.

  2. Connecting Classroom Practice to Concepts of Culturally Responsive Teaching: Video Analysis in an Online Teacher Education Course

    ERIC Educational Resources Information Center

    Lopez, Leslie Ann

    2013-01-01

    Video has been shown to be an effective tool for synthesizing theory and connecting theory to practice in meaningful ways. This design-based research study examined how localized video of a practicing teacher impacted pre-service teachers' ability to learn culturally responsive teaching (CRT) methods and targeted strategies in an online…

  3. Data on distribution and abundance: Monitoring for research and management [Chapter 6

    Treesearch

    Samuel A. Cushman; Kevin S. McKelvey

    2010-01-01

    In the first chapter of this book we identified the interdependence of method, data and theory as an important influence on the progress of science. The first several chapters focused mostly on progress in theory, in the areas of integrating spatial and temporal complexity into ecological analysis, the emergence of landscape ecology and its transformation into a multi-...

  4. The spectroscopic (FT-IR, UV-vis), Fukui function, NLO, NBO, NPA and tautomerism effect analysis of (E)-2-[(2-hydroxy-6-methoxybenzylidene)amino]benzonitrile.

    PubMed

    Demircioğlu, Zeynep; Kaştaş, Çiğdem Albayrak; Büyükgüngör, Orhan

    2015-03-15

    A new o-hydroxy Schiff base, (E)-2-[(2-hydroxy-6-methoxybenzylidene)amino]benzonitrile, was isolated and investigated by experimental and theoretical methodologies. The solid-state molecular structure was determined by the X-ray diffraction method. The vibrational spectral analysis was carried out using FT-IR spectroscopy in the range of 4000-400 cm(-1). Theoretical calculations were performed by the density functional theory (DFT) method using the 6-31G(d,p) basis set. The results of the calculations were applied to simulated spectra of the title compound, which show excellent agreement with the observed spectra. The UV-vis spectrum of the compound was recorded in the region 200-800 nm in several solvents, and electronic properties such as excitation energies and wavelengths were calculated by the TD-DFT/B3LYP method. The most prominent transitions correspond to π→π*. Hybrid density functional theory (DFT) was used to investigate the enol-imine and keto-amine tautomers of the title compound. The title compound shows a preference for the enol form, as supported by the X-ray and spectroscopic analysis results. The geometric and molecular properties were compared for both the enol-imine and keto-amine forms. Additionally, geometry optimizations in solvent media were performed at the same level of theory with the integral equation formalism polarizable continuum model (IEF-PCM). The stability of the molecule arising from hyperconjugative interactions, charge delocalization and intramolecular hydrogen bonding has been analyzed using natural bond orbital (NBO) analysis. The Mulliken population method and natural population analysis (NPA) have been studied. Condensed Fukui functions and relative nucleophilicity indices were calculated from the charges obtained with orbital charge calculation methods (NPA). Molecular electrostatic potential (MEP) and nonlinear optical (NLO) properties are also examined. Copyright © 2014 Elsevier B.V. All rights reserved.
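
    The condensed Fukui calculation mentioned above follows the standard finite-difference scheme over atomic charges; the Python sketch below uses hypothetical NPA charges, not values from the paper.

      import numpy as np

      def condensed_fukui(q_n, q_plus, q_minus):
          # Finite-difference condensed Fukui functions from atomic charges of
          # the N-, (N+1)- and (N-1)-electron systems:
          #   f+ = q(N) - q(N+1)   (susceptibility to nucleophilic attack)
          #   f- = q(N-1) - q(N)   (susceptibility to electrophilic attack)
          q_n, q_plus, q_minus = map(np.asarray, (q_n, q_plus, q_minus))
          return q_n - q_plus, q_minus - q_n

      # Hypothetical NPA charges for three atoms of interest (illustrative only).
      f_plus, f_minus = condensed_fukui(q_n=[-0.45, 0.32, -0.61],
                                        q_plus=[-0.58, 0.21, -0.70],
                                        q_minus=[-0.33, 0.44, -0.52])
      print(f_plus, f_minus)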

  5. Perturbation-iteration theory for analyzing microwave striplines

    NASA Technical Reports Server (NTRS)

    Kretch, B. E.

    1985-01-01

    A perturbation-iteration technique is presented for determining the propagation constant and characteristic impedance of an unshielded microstrip transmission line. The method converges to the correct solution with a few iterations at each frequency and is equivalent to a full wave analysis. The perturbation-iteration method gives a direct solution for the propagation constant without having to find the roots of a transcendental dispersion equation. The theory is presented in detail along with numerical results for the effective dielectric constant and characteristic impedance for a wide range of substrate dielectric constants, stripline dimensions, and frequencies.

  6. Analysis and control of hourglass instabilities in underintegrated linear and nonlinear elasticity

    NASA Technical Reports Server (NTRS)

    Jacquotte, Olivier P.; Oden, J. Tinsley

    1994-01-01

    Methods are described to identify and correct a bad finite element approximation of the governing operator obtained when under-integration is used in numerical code for several model problems: the Poisson problem, the linear elasticity problem, and for problems in the nonlinear theory of elasticity. For each of these problems, the reason for the occurrence of instabilities is given, a way to control or eliminate them is presented, and theorems of existence, uniqueness, and convergence for the given methods are established. Finally, numerical results are included which illustrate the theory.

  7. Does ethical theory have a future in bioethics?

    PubMed

    Beauchamp, Tom L

    2004-01-01

    Although there has long been a successful and stable marriage between philosophical ethical theory and bioethics, the marriage has become shaky as bioethics has become a more interdisciplinary and practical field. A practical price is paid for theoretical generality in philosophy. It is often unclear whether and, if so, how theory is to be brought to bear on dilemmatic problems, public policy, moral controversies, and moral conflict. Three clearly philosophical problems are used to see how philosophers are doing in handling practical problems: Cultural Relativity and Moral Universality, Moral Justification, and Conceptual Analysis. In each case it is argued that philosophers need to develop theories and methods more closely attuned to practice. The work of philosophers such as Ruth Macklin, Norman Daniels, and Gerald Dworkin is examined. In the writings of each there is a major methodological gap between philosophical theory (or method) and practical conclusions. The future of philosophical ethics in interdisciplinary bioethics may turn on whether such gaps can be closed. If not, bioethics may justifiably conclude that philosophy is of little value.

  8. Sifting, sorting and saturating data in a grounded theory study of information use by practice nurses: a worked example.

    PubMed

    Hoare, Karen J; Mills, Jane; Francis, Karen

    2012-12-01

    The terminology used to analyse data in a grounded theory study can be confusing. Different grounded theorists use a variety of terms which all have similar meanings. In the following study, we use terms adopted by Charmaz including: initial, focused and axial coding. Initial codes are used to analyse data with an emphasis on identifying gerunds, a verb acting as a noun. If initial codes are relevant to the developing theory, they are grouped with similar codes into categories. Categories become saturated when there are no new codes identified in the data. Axial codes are used to link categories together into a grounded theory process. Memo writing accompanies this data sifting and sorting. The following article explains how one initial code became a category providing a worked example of the grounded theory method of constant comparative analysis. The interplay between coding and categorization is facilitated by the constant comparative method. © 2012 Wiley Publishing Asia Pty Ltd.

  9. Methods for the evaluation of alternative disaster warning systems

    NASA Technical Reports Server (NTRS)

    Agnew, C. E.; Anderson, R. J., Jr.; Lanen, W. N.

    1977-01-01

    For each of the methods identified, a theoretical basis is provided and an illustrative example is described. The example includes sufficient realism and detail to enable an analyst to conduct an evaluation of other systems. The methods discussed in the study include equal capability cost analysis, consumers' surplus, and statistical decision theory.

  10. The Integration of Linguistic Theory: Internal Reconstruction and the Comparative Method in Descriptive Linguistics.

    ERIC Educational Resources Information Center

    Bailey, Charles-James N.

    The author aims: (1) to show that generative phonology uses essentially the method of internal reconstruction which has previously been employed only in diachronic studies in setting up synchronic underlying phonological representations; (2) to show why synchronic analysis should add the comparative method to its arsenal, together with whatever…

  11. A performance analysis method for distributed real-time robotic systems: A case study of remote teleoperation

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Sanderson, A. C.

    1994-01-01

    Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems are difficult but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, that can handle concurrent hard real-time response specifications. Use of the method is illustrated by a case study of remote teleoperation which assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
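
    Although the paper's own formal model is not reproduced here, the flavour of scheduling-theory analysis for hard real-time response specifications can be shown with the classical fixed-priority response-time iteration; the task parameters below are hypothetical.

      from math import ceil

      def response_time(tasks, i):
          # Response-time analysis for fixed-priority tasks (C, T) sorted
          # highest priority first: R_i = C_i + sum_j ceil(R_i / T_j) * C_j
          # over all higher-priority tasks j, iterated to a fixed point.
          c_i, t_i = tasks[i]
          r = c_i
          while True:
              r_next = c_i + sum(ceil(r / t_j) * c_j for c_j, t_j in tasks[:i])
              if r_next == r:
                  return r if r <= t_i else None   # None: deadline (= period) missed
              r = r_next

      tasks = [(1, 4), (2, 6), (3, 12)]            # hypothetical control loops
      print([response_time(tasks, i) for i in range(len(tasks))])   # [1, 3, 10]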

  12. Free Wake Techniques for Rotor Aerodynamic Analysis. Volume 1: Summary of Results and Background Theory

    NASA Technical Reports Server (NTRS)

    Miller, R. H.

    1982-01-01

    Results obtained during the development of a consistent aerodynamic theory for rotors in hovering flight are discussed. Methods of aerodynamic analysis were developed which are adequate for general design purposes until more elaborate solutions become available, in particular solutions which include real-fluid effects. Several problems were encountered in the course of this development, and many remain to be solved; however, it is felt that a better understanding of the aerodynamic phenomena involved was obtained. Remaining uncertainties are discussed.

  13. Analysis of some types of intermediate orbits used in the theory of artificial Earth satellite motion for the purposes of geodesy.

    NASA Astrophysics Data System (ADS)

    Kotseva, V. I.

    A survey, analysis and comparison of 15 types of intermediate orbits used in satellite motion theories for the purposes of both geodesy and geodynamics have been made. The paper is a continuation of investigations directed toward the practical realization of both analytical and semi-analytical methods for satellite orbit determination. It is indicated that the intermediate orbit proposed and elaborated by Aksenov, Grebenikov and Demin has qualities and advantages over all the other intermediate orbits.

  14. Nuclear Analysis

    NASA Technical Reports Server (NTRS)

    Clement, J. D.; Kirby, K. D.

    1973-01-01

    Exploratory calculations were performed for several gas core breeder reactor configurations. The computational method involved the use of the MACH-1 one dimensional diffusion theory code and the THERMOS integral transport theory code for thermal cross sections. Computations were performed to analyze thermal breeder concepts and nonbreeder concepts. Analysis of breeders was restricted to the (U-233)-Th breeding cycle, and computations were performed to examine a range of parameters. These parameters include U-233 to hydrogen atom ratio in the gaseous cavity, carbon to thorium atom ratio in the breeding blanket, cavity size, and blanket size.
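
    For orientation, the one-group diffusion-theory criticality estimate for a bare homogeneous sphere can be written in a few lines; this is a textbook illustration with hypothetical constants, not the MACH-1/THERMOS analysis itself.

      import numpy as np

      def k_eff_bare_sphere(nu_sigma_f, sigma_a, diff_coef, radius_cm):
          # One-group diffusion theory: geometric buckling B^2 = (pi / R)^2
          # and k_eff = nu*Sigma_f / (Sigma_a + D * B^2).
          buckling = (np.pi / radius_cm) ** 2
          return nu_sigma_f / (sigma_a + diff_coef * buckling)

      print(k_eff_bare_sphere(nu_sigma_f=0.0070, sigma_a=0.0065,
                              diff_coef=0.9, radius_cm=150.0))   # ~1.02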

  15. Ergodic theory and visualization. II. Fourier mesochronic plots visualize (quasi)periodic sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levnajić, Zoran; Department of Mechanical Engineering, University of California Santa Barbara, Santa Barbara, California 93106; Mezić, Igor

    We present an application and analysis of a visualization method for measure-preserving dynamical systems introduced by I. Mezić and A. Banaszuk [Physica D 197, 101 (2004)], based on frequency analysis and Koopman operator theory. This extends our earlier work on visualization of ergodic partition [Z. Levnajić and I. Mezić, Chaos 20, 033114 (2010)]. Our method employs the concept of Fourier time average [I. Mezić and A. Banaszuk, Physica D 197, 101 (2004)], and is realized as a computational algorithm for visualization of periodic and quasi-periodic sets in the phase space. The complement of the periodic phase space partition contains the chaotic zone, and we show how to identify it. The range of the method's applicability is illustrated using the well-known Chirikov standard map, while its potential in illuminating higher-dimensional dynamics is presented by studying the Froeschlé map and the Extended Standard Map.

  16. Ergodic theory and visualization. II. Fourier mesochronic plots visualize (quasi)periodic sets.

    PubMed

    Levnajić, Zoran; Mezić, Igor

    2015-05-01

    We present an application and analysis of a visualization method for measure-preserving dynamical systems introduced by I. Mezić and A. Banaszuk [Physica D 197, 101 (2004)], based on frequency analysis and Koopman operator theory. This extends our earlier work on visualization of ergodic partition [Z. Levnajić and I. Mezić, Chaos 20, 033114 (2010)]. Our method employs the concept of Fourier time average [I. Mezić and A. Banaszuk, Physica D 197, 101 (2004)], and is realized as a computational algorithm for visualization of periodic and quasi-periodic sets in the phase space. The complement of the periodic phase space partition contains the chaotic zone, and we show how to identify it. The range of the method's applicability is illustrated using the well-known Chirikov standard map, while its potential in illuminating higher-dimensional dynamics is presented by studying the Froeschlé map and the Extended Standard Map.
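
    A minimal Python sketch of the Fourier time average underlying these plots, evaluated along a Chirikov standard-map orbit (the map parameter, observable and frequency are illustrative):

      import numpy as np

      def fourier_time_average(x0, y0, omega, n=10_000, K=0.8):
          # f*_omega = (1/N) * sum_k exp(2*pi*i*omega*k) * f(T^k(x0, y0)),
          # with f(x, y) = cos(2*pi*x) and T the standard map.
          x, y, acc = x0, y0, 0.0 + 0.0j
          for k in range(n):
              acc += np.exp(2j * np.pi * omega * k) * np.cos(2 * np.pi * x)
              y = (y + K / (2 * np.pi) * np.sin(2 * np.pi * x)) % 1.0
              x = (x + y) % 1.0
          return acc / n

      # |f*| is near zero except on sets resonant with omega; scanning a grid of
      # initial conditions and plotting |f*| gives a Fourier mesochronic plot.
      print(abs(fourier_time_average(0.12, 0.34, omega=1 / 3)))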

  17. Therapeutic Jurisprudence in Health Research: Enlisting Legal Theory as a Methodological Guide in an Interdisciplinary Case Study of Mental Health and Criminal Law.

    PubMed

    Ferrazzi, Priscilla; Krupa, Terry

    2015-09-01

    Studies that seek to understand and improve health care systems benefit from qualitative methods that employ theory to add depth, complexity, and context to analysis. Theories used in health research typically emerge from social science, but these can be inadequate for studying complex health systems. Mental health rehabilitation programs for criminal courts are complicated by their integration within the criminal justice system and by their dual health-and-justice objectives. In a qualitative multiple case study exploring the potential for these mental health court programs in Arctic communities, we assess whether a legal theory, known as therapeutic jurisprudence, functions as a useful methodological theory. Therapeutic jurisprudence, recruited across discipline boundaries, succeeds in guiding our qualitative inquiry at the complex intersection of mental health care and criminal law by providing a framework foundation for directing the study's research questions and the related propositions that focus our analysis. © The Author(s) 2014.

  18. Structured Case Analysis: Developing Critical Thinking Skills in a Marketing Case Course

    ERIC Educational Resources Information Center

    Klebba, Joanne M.; Hamilton, Janet G.

    2007-01-01

    Structured case analysis is a hybrid pedagogy that flexibly combines diverse instructional methods with comprehensive case analysis as a mechanism to develop critical thinking skills. An incremental learning framework is proposed that allows instructors to develop and monitor content-specific theory and the corresponding critical thinking skills.…

  19. Structural Analysis of a Consumption-Based Stratification Indicator: Relational Proximity of Household Expenditures

    ERIC Educational Resources Information Center

    Katz-Gerro, Tally; Talmud, Ilan

    2005-01-01

    This paper proposes a new analysis of consumption inequality using relational methods, derived from network images of social structure. We combine structural analysis with theoretical concerns in consumer research to propose a relational theory of consumption space, to construct a stratification indicator, and to demonstrate its analytical…

  20. Psychophysical Models for Signal Detection with Time Varying Uncertainty. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Gai, E.

    1975-01-01

    Psychophysical models for the behavior of the human operator in detection tasks which include changes in detectability, correlation between observations and deferred decisions are developed. Classical Signal Detection Theory (SDT) is discussed and its emphasis on the sensory processes is contrasted with decision strategies. The analysis of decision strategies utilizes detection tasks with time-varying signal strength. The classical theory is modified to include such tasks and several optimal decision strategies are explored. Two methods of classifying strategies are suggested. The first method is similar to the analysis of ROC curves, while the second is based on the relation between the criterion level (CL) and the detectability. Experiments to verify the analysis of tasks with changes of signal strength were designed. The results show that subjects are aware of changes in detectability and tend to use strategies that involve changes in the CLs.
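
    The classical SDT indices referred to above follow directly from hit and false-alarm rates; a minimal Python sketch:

      from scipy.stats import norm

      def dprime_and_criterion(hit_rate, fa_rate):
          # d' = z(H) - z(FA); criterion c = -(z(H) + z(FA)) / 2.
          z_h, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
          return z_h - z_fa, -(z_h + z_fa) / 2.0

      print(dprime_and_criterion(hit_rate=0.84, fa_rate=0.16))   # d' ~ 2, c ~ 0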

  1. A Guided Tour of Mathematical Methods for the Physical Sciences

    NASA Astrophysics Data System (ADS)

    Snieder, Roel; van Wijk, Kasper

    2015-05-01

    1. Introduction; 2. Dimensional analysis; 3. Power series; 4. Spherical and cylindrical coordinates; 5. Gradient; 6. Divergence of a vector field; 7. Curl of a vector field; 8. Theorem of Gauss; 9. Theorem of Stokes; 10. The Laplacian; 11. Scale analysis; 12. Linear algebra; 13. Dirac delta function; 14. Fourier analysis; 15. Analytic functions; 16. Complex integration; 17. Green's functions: principles; 18. Green's functions: examples; 19. Normal modes; 20. Potential-field theory; 21. Probability and statistics; 22. Inverse problems; 23. Perturbation theory; 24. Asymptotic evaluation of integrals; 25. Conservation laws; 26. Cartesian tensors; 27. Variational calculus; 28. Epilogue on power and knowledge.

  2. Amide I vibrational circular dichroism of dipeptide: Conformation dependence and fragment analysis

    NASA Astrophysics Data System (ADS)

    Choi, Jun-Ho; Cho, Minhaeng

    2004-03-01

    The amide I vibrational circular dichroic response of an alanine dipeptide analog (ADA) was theoretically investigated, and the density functional theory calculation and fragment analysis results are presented. A variety of vibrational spectroscopic properties (local and normal mode frequencies, coupling constants, and dipole and rotational strengths) are calculated by varying the two dihedral angles determining the three-dimensional ADA conformation. Considering two monopeptide fragments separately, we show that the amide I vibrational circular dichroism of the ADA can be quantitatively predicted. For several representative conformations of the model ADA, vibrational circular dichroism spectra are calculated by using both the density functional theory calculation and fragment analysis methods.

  3. Gesellschaft fuer angewandte Mathematik und Mechanik, Scientific Annual Meeting, Universitaet Stuttgart, Federal Republic of Germany, Apr. 13-17, 1987, Reports

    NASA Astrophysics Data System (ADS)

    Recent experimental, theoretical, and numerical investigations of problems in applied mechanics are discussed in reviews and reports. The fields covered include vibration and stability; the mechanics of elastic and plastic materials; fluid mechanics; the numerical treatment of differential equations; finite and boundary elements; optimization, decision theory, stochastics, and actuarial analysis; applied analysis and mathematical physics; and numerical analysis. Reviews are presented on mathematical applications of geometric-optics methods, biomechanics and implant technology, vibration theory in engineering, the stiffness and strength of damaged materials, and the existence of slow steady flows of viscoelastic fluids of integral type.

  4. Application of adaptive optics in complicated and integrated spatial multisensor system and its measurement analysis

    NASA Astrophysics Data System (ADS)

    Ding, Quanxin; Guo, Chunjie; Cai, Meng; Liu, Hua

    2007-12-01

    The Adaptive Optics Expand System (AOES) is a new-concept spatial instrument that draws deeply on systems theory, cybernetics, and informatics, and is a key way to improve the capability of advanced sensors. The traditional Zernike phase contrast method is developed further, and an accelerated high-level phase contrast theory is established. Integration theory and mathematical simulation are achieved. The equipment, which is based on several crucial components such as the core optical system and a multi-mode wavefront sensor, is established for the advantageous configuration and global design of the AOES. Studies on the integration of the complicated spatial multisensor system and on measurement analysis, including error analysis, are carried out.

  5. An Emerging New Risk Analysis Science: Foundations and Implications.

    PubMed

    Aven, Terje

    2018-05-01

    To solve real-life problems-such as those related to technology, health, security, or climate change-and make suitable decisions, risk is nearly always a main issue. Different types of sciences are often supporting the work, for example, statistics, natural sciences, and social sciences. Risk analysis approaches and methods are also commonly used, but risk analysis is not broadly accepted as a science in itself. A key problem is the lack of explanatory power and large uncertainties when assessing risk. This article presents an emerging new risk analysis science based on novel ideas and theories on risk analysis developed in recent years by the risk analysis community. It builds on a fundamental change in thinking, from the search for accurate predictions and risk estimates, to knowledge generation related to concepts, theories, frameworks, approaches, principles, methods, and models to understand, assess, characterize, communicate, and (in a broad sense) manage risk. Examples are used to illustrate the importance of this distinct/separate risk analysis science for solving risk problems, supporting science in general and other disciplines in particular. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  6. Reliability analysis of the objective structured clinical examination using generalizability theory.

    PubMed

    Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián

    2016-01-01

    The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. An observational cross-sectional study was conducted at the National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements.
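
    As an illustration of how a G coefficient is assembled from variance components, here is a minimal Python sketch for a persons-by-stations design; the variance components below are hypothetical placeholders, not the study's estimates:

    # Relative G coefficient for a persons-by-stations (p x s) design:
    # true-score variance over itself plus station-averaged error.
    def g_coefficient(var_persons: float, var_residual: float, n_stations: int) -> float:
        relative_error = var_residual / n_stations
        return var_persons / (var_persons + relative_error)

    # Hypothetical components: person variance 0.60, residual (p x s
    # interaction plus error) 0.80, averaged over an 18-station OSCE.
    print(round(g_coefficient(0.60, 0.80, 18), 2))  # ~0.93

    Increasing n_stations shrinks the error term, which is exactly the decision-study lever the abstract describes for choosing the number of stations, versions, and examiners.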

  7. Acoustic waveform logging--Advances in theory and application

    USGS Publications Warehouse

    Paillet, F.L.; Cheng, C.H.; Pennington, W.D.

    1992-01-01

    Full-waveform acoustic logging has made significant advances in both theory and application in recent years, and these advances have greatly increased the capability of log analysts to measure the physical properties of formations. Advances in theory provide the analytical tools required to understand the properties of measured seismic waves, and to relate those properties to such quantities as shear and compressional velocity and attenuation, and primary and fracture porosity and permeability of potential reservoir rocks. The theory demonstrates that all parts of recorded waveforms are related to various modes of propagation, even in the case of dipole and quadrupole source logging. However, the theory also indicates that these mode properties can be used to design velocity and attenuation picking schemes, and shows how source frequency spectra can be selected to optimize results in specific applications. Synthetic microseismogram computations are an effective tool in waveform interpretation theory; they demonstrate how shear arrival picks and mode attenuation can be used to compute shear velocity and intrinsic attenuation, and formation permeability for monopole, dipole and quadrupole sources. Array processing of multi-receiver data offers the opportunity to apply even more sophisticated analysis techniques. Synthetic microseismogram data is used to illustrate the application of the maximum-likelihood method, semblance cross-correlation, and Prony's method analysis techniques to determine seismic velocities and attenuations. The interpretation of acoustic waveform logs is illustrated by reviews of various practical applications, including synthetic seismogram generation, lithology determination, estimation of geomechanical properties in situ, permeability estimation, and design of hydraulic fracture operations.
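
    Of the array-processing techniques named above, semblance is the simplest to sketch: it is the ratio of coherent stack energy to total energy across receivers for a trial arrival. A minimal Python illustration on synthetic traces (not log data):

    import numpy as np

    def semblance(traces: np.ndarray) -> float:
        """traces: (n_receivers, n_samples) window already time-shifted by the
        trial slowness; returns a coherence value in [0, 1]."""
        n = traces.shape[0]
        stack_energy = np.sum(np.sum(traces, axis=0) ** 2)  # energy of the stacked trace
        total_energy = n * np.sum(traces ** 2)              # total energy across receivers
        return stack_energy / total_energy

    rng = np.random.default_rng(0)
    signal = np.sin(np.linspace(0, 8 * np.pi, 200))
    aligned = np.tile(signal, (8, 1)) + 0.1 * rng.standard_normal((8, 200))
    print(round(semblance(aligned), 3))  # near 1 when traces are coherent

    Scanning the trial slowness and picking the semblance peak is how a velocity is extracted from multi-receiver waveforms; maximum-likelihood and Prony's method are more elaborate alternatives for the same task.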

  8. Validating Experimental and Theoretical Langmuir Probe Analyses

    NASA Astrophysics Data System (ADS)

    Pilling, Lawrence Stuart; Carnegie, Dale

    2004-11-01

    Analysis of Langmuir probe characteristics contains a paradox in that it is unknown a priori which theory is applicable before it is applied. Often theories are assumed to be correct when certain criteria are met although they may not validate the approach used. We have analysed the Langmuir probe data from cylindrical double and single probes acquired from a DC discharge plasma over a wide variety of conditions. This discharge contains a dual temperature distribution and hence fitting a theoretically generated curve is impractical. To determine the densities an examination of the current theories was necessary. For the conditions where the probe radius is the same order of magnitude as the Debye length, the gradient expected for orbital motion limited (OML) is approximately the same as the radial motion gradients. An analysis of the gradients from the radial motion theory was able to resolve the differences from the OML gradient value of two. The method was also able to determine whether radial or OML theories applied without knowledge of the electron temperature. Only the position of the space charge potential is necessary to determine the applicable theory.

  9. Qualitative Methods in Mental Health Services Research

    PubMed Central

    Palinkas, Lawrence A.

    2014-01-01

    Qualitative and mixed methods play a prominent role in mental health services research. However, the standards for their use are not always evident, especially for those not trained in such methods. This paper reviews the rationale and common approaches to using qualitative and mixed methods in mental health services and implementation research based on a review of the papers included in this special series along with representative examples from the literature. Qualitative methods are used to provide a “thick description” or depth of understanding to complement breadth of understanding afforded by quantitative methods, elicit the perspective of those being studied, explore issues that have not been well studied, develop conceptual theories or test hypotheses, or evaluate the process of a phenomenon or intervention. Qualitative methods adhere to many of the same principles of scientific rigor as quantitative methods, but often differ with respect to study design, data collection and data analysis strategies. For instance, participants for qualitative studies are usually sampled purposefully rather than at random and the design usually reflects an iterative process alternating between data collection and analysis. The most common techniques for data collection are individual semi-structured interviews, focus groups, document reviews, and participant observation. Strategies for analysis are usually inductive, based on principles of grounded theory or phenomenology. Qualitative methods are also used in combination with quantitative methods in mixed method designs for convergence, complementarity, expansion, development, and sampling. Rigorously applied qualitative methods offer great potential in contributing to the scientific foundation of mental health services research. PMID:25350675

  10. Qualitative and mixed methods in mental health services and implementation research.

    PubMed

    Palinkas, Lawrence A

    2014-01-01

    Qualitative and mixed methods play a prominent role in mental health services research. However, the standards for their use are not always evident, especially for those not trained in such methods. This article reviews the rationale and common approaches to using qualitative and mixed methods in mental health services and implementation research based on a review of the articles included in this special series along with representative examples from the literature. Qualitative methods are used to provide a "thick description" or depth of understanding to complement breadth of understanding afforded by quantitative methods, elicit the perspective of those being studied, explore issues that have not been well studied, develop conceptual theories or test hypotheses, or evaluate the process of a phenomenon or intervention. Qualitative methods adhere to many of the same principles of scientific rigor as quantitative methods but often differ with respect to study design, data collection, and data analysis strategies. For instance, participants for qualitative studies are usually sampled purposefully rather than at random and the design usually reflects an iterative process alternating between data collection and analysis. The most common techniques for data collection are individual semistructured interviews, focus groups, document reviews, and participant observation. Strategies for analysis are usually inductive, based on principles of grounded theory or phenomenology. Qualitative methods are also used in combination with quantitative methods in mixed-method designs for convergence, complementarity, expansion, development, and sampling. Rigorously applied qualitative methods offer great potential in contributing to the scientific foundation of mental health services research.

  11. Self-defense against verbal assault: shame, anger, and the social bond.

    PubMed

    Scheff, T J

    1995-09-01

    With many years of experience and refinement, the arts of self-defense against physical assault are highly developed. Without an effective theory and a useful practice, there is little in the way of self-defense against verbal assault. For THEORY, I draw upon ideas from aikido, family systems theory, and the sociology of emotions. Since unacknowledged shame seems to generate rage and damage social bonds, I emphasize the management of shame, anger, and bonds. To illustrate the meaning of these principles, I offer several episodes as examples, using the METHOD of discourse analysis. I apply this theory and method to the PRACTICE of psychotherapy by describing some rudimentary principles of defense of self against verbal aggression, especially the subtle aggression of innuendo. Psychotherapy is often an arena of verbal aggression by both therapist and client, even though it is usually unintentional and outside of awareness.

  12. Steam Hydrocarbon Cracking and Reforming

    ERIC Educational Resources Information Center

    Golombok, Michael

    2004-01-01

    The related methods of steam hydrocarbon reforming and cracking used in the oil and chemical industries are scrutinized, with special focus on their similarities and differences. The two methods are illustrations of equilibrium-controlled and kinetically-controlled processes, the analysis of which involves theories, which overlap and balance each…

  13. A Program for Automatic Generation of Dimensionless Parameters.

    ERIC Educational Resources Information Center

    Hundal, M. S.

    1982-01-01

    Following a review of the theory of dimensional analysis, presents a method for generating all of the possible sets of nondimensional parameters for a given problem, a digital computer program to implement the method, and a mechanical design problem to illustrate its use. (Author/JN)
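
    The core of such a program is linear algebra: the exponent vectors of the dimensionless groups span the null space of the dimension matrix. A short Python/SymPy sketch for a classic drag-on-a-sphere example (my example, not the paper's problem) follows:

    import sympy as sp

    # Columns: F, v, d, rho, mu; rows: exponents of M, L, T in each quantity.
    dim_matrix = sp.Matrix([
        [1,  0, 0,  1,  1],   # mass
        [1,  1, 1, -3, -1],   # length
        [-2, -1, 0,  0, -1],  # time
    ])

    # Each null-space vector is one set of exponents forming a pi group; here
    # the two recoverable groups are a drag coefficient and a Reynolds number.
    for vec in dim_matrix.nullspace():
        print(vec.T)

    With five variables and three independent dimensions, the Buckingham pi theorem predicts the 5 - 3 = 2 groups such a program would report.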

  14. What Do You Think You Are Measuring? A Mixed-Methods Procedure for Assessing the Content Validity of Test Items and Theory-Based Scaling

    PubMed Central

    Koller, Ingrid; Levenson, Michael R.; Glück, Judith

    2017-01-01

    The valid measurement of latent constructs is crucial for psychological research. Here, we present a mixed-methods procedure for improving the precision of construct definitions, determining the content validity of items, evaluating the representativeness of items for the target construct, generating test items, and analyzing items on a theoretical basis. To illustrate the mixed-methods content-scaling-structure (CSS) procedure, we analyze the Adult Self-Transcendence Inventory, a self-report measure of wisdom (ASTI, Levenson et al., 2005). A content-validity analysis of the ASTI items was used as the basis of psychometric analyses using multidimensional item response models (N = 1215). We found that the new procedure produced important suggestions concerning five subdimensions of the ASTI that were not identifiable using exploratory methods. The study shows that the application of the suggested procedure leads to a deeper understanding of latent constructs. It also demonstrates the advantages of theory-based item analysis. PMID:28270777

  15. Pressure measurements in a low-density nozzle plume for code verification

    NASA Technical Reports Server (NTRS)

    Penko, Paul F.; Boyd, Iain D.; Meissner, Dana L.; Dewitt, Kenneth J.

    1991-01-01

    Measurements of Pitot pressure were made in the exit plane and plume of a low-density, nitrogen nozzle flow. Two numerical computer codes were used to analyze the flow, including one based on continuum theory using the explicit MacCormack method, and the other on kinetic theory using the method of direct-simulation Monte Carlo (DSMC). The continuum analysis was carried to the nozzle exit plane and the results were compared to the measurements. The DSMC analysis was extended into the plume of the nozzle flow and the results were compared with measurements at the exit plane and axial stations 12, 24 and 36 mm into the near-field plume. Two experimental apparatus were used that differed in design and gave slightly different profiles of pressure measurements. The DSMC method compared well with the measurements from each apparatus at all axial stations and provided a more accurate prediction of the flow than the continuum method, verifying the validity of DSMC for such calculations.

  16. Thermal Stress Analysis of a Continuous and Pulsed End-Pumped Nd:YAG Rod Crystal Using Non-Classic Conduction Heat Transfer Theory

    NASA Astrophysics Data System (ADS)

    Mojahedi, Mahdi; Shekoohinejad, Hamidreza

    2018-02-01

    In this paper, the temperature distribution in a continuous and pulsed end-pumped Nd:YAG rod crystal is determined using nonclassical and classical heat conduction theories. To find the temperature distribution in the crystal, heat transfer differential equations with the appropriate boundary conditions are derived based on the non-Fourier model, and the temperature distribution of the crystal is obtained by an analytical method. Then, by transferring the non-Fourier differential equations to matrix equations using the finite element method, the temperature and stress at every point of the crystal are calculated in the time domain. According to the results, a comparison between the classical and nonclassical theories is presented to investigate rupture power values. In continuous end pumping with equal input powers, non-Fourier theory predicts greater temperature and stress compared to Fourier theory. It also shows that with an increase in relaxation time, crystal rupture power decreases. In contrast, in the single rectangular pulsed end-pumping condition with an equal input power, Fourier theory indicates higher temperature and stress than non-Fourier theory. It is also observed that, when the relaxation time increases, the maximum temperature and stress decrease.
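
    For context, a standard non-Fourier (Cattaneo-Vernotte, single-phase-lag) conduction law, consistent with the relaxation-time discussion here though not necessarily the authors' exact model, reads in LaTeX:

    % Heat flux q relaxes toward Fourier's law over a relaxation time tau,
    % turning the classical parabolic heat equation into a hyperbolic one:
    \tau \frac{\partial \mathbf{q}}{\partial t} + \mathbf{q} = -k \nabla T
    \qquad\Longrightarrow\qquad
    \tau \frac{\partial^{2} T}{\partial t^{2}} + \frac{\partial T}{\partial t} = \alpha \nabla^{2} T

    Setting the relaxation time tau to zero recovers the classical Fourier equation, which is why the two theories diverge as tau grows.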

  17. Superintendent Leadership Style: A Gendered Discourse Analysis

    ERIC Educational Resources Information Center

    Wallin, Dawn C.; Crippen, Carolyn

    2007-01-01

    Using a blend of social constructionism, critical feminism, and dialogue theory, the discourse of nine Manitoba superintendents is examined to determine if it illustrates particular gendered assumptions regarding superintendents' leadership style. Qualitative inquiry and analysis methods were utilized to identify emerging themes, or topics of…

  18. Technology selection for ballast water treatment by multi-stakeholders: A multi-attribute decision analysis approach based on the combined weights and extension theory.

    PubMed

    Ren, Jingzheng

    2018-01-01

    The objective of this study is to develop a generic multi-attribute decision analysis framework for ranking the technologies for ballast water treatment and determining their grades. An evaluation criteria system consisting of eight criteria in four categories was used to evaluate the technologies for ballast water treatment. The Best-Worst method, which is a subjective weighting method, and the Criteria Importance Through Inter-criteria Correlation method, which is an objective weighting method, were combined to determine the weights of the evaluation criteria. Extension theory was employed to prioritize the technologies for ballast water treatment and determine their grades. An illustrative case including four technologies for ballast water treatment, i.e. Alfa Laval (T1), Hyde (T2), Unitor (T3), and NaOH (T4), was studied by the proposed method, and Hyde (T2) was recognized as the best technology. Sensitivity analysis was also carried out to investigate the effects of the combined coefficients and the weights of the evaluation criteria on the final priority order of the four technologies for ballast water treatment. The weighted sum method and TOPSIS were also employed to rank the four technologies, and the results determined by these two methods are consistent with those determined by the proposed method in this study. Copyright © 2017 Elsevier Ltd. All rights reserved.
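
    To illustrate the cross-check step, here is a minimal weighted-sum ranking in Python; the criteria scores and combined weights are hypothetical placeholders, not the paper's data:

    import numpy as np

    scores = np.array([  # rows: T1..T4, columns: four hypothetical benefit criteria
        [0.7, 0.6, 0.8, 0.5],
        [0.9, 0.8, 0.7, 0.9],
        [0.6, 0.7, 0.6, 0.7],
        [0.5, 0.9, 0.5, 0.6],
    ])
    weights = np.array([0.35, 0.25, 0.20, 0.20])  # combined subjective/objective weights

    normalized = scores / scores.max(axis=0)  # max-normalize each benefit criterion
    ranking = normalized @ weights
    for name, value in zip(["T1", "T2", "T3", "T4"], ranking):
        print(name, round(value, 3))  # highest value = preferred technology

    The extension-theory step in the paper additionally maps each technology's scores onto predefined grade intervals; the weighted sum above is only the simpler comparator method.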

  19. An enquiry into the method of paired comparison: reliability, scaling, and Thurstone's Law of Comparative Judgment

    Treesearch

    Thomas C. Brown; George L. Peterson

    2009-01-01

    The method of paired comparisons is used to measure individuals' preference orderings of items presented to them as discrete binary choices. This paper reviews the theory and application of the paired comparison method, describes a new computer program available for eliciting the choices, and presents an analysis of methods for scaling paired choice data to...
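
    The scaling step the paper reviews can be sketched compactly: under Thurstone's Case V, scale separations are the z-transforms of choice proportions. A minimal Python illustration with hypothetical proportions:

    import numpy as np
    from statistics import NormalDist

    p = np.array([  # p[i, j] = proportion of respondents preferring item i to j
        [0.50, 0.70, 0.80],
        [0.30, 0.50, 0.65],
        [0.20, 0.35, 0.50],
    ])
    z = np.vectorize(NormalDist().inv_cdf)
    scale = z(p).mean(axis=1)  # Case V scale value per item (row-mean of z-scores)
    scale -= scale.min()       # anchor the lowest item at zero
    print(np.round(scale, 3))  # interval-scale preference values

    This is a sketch of the classical computation only; the program described in the paper also handles issues such as ties, intransitive respondents, and reliability checks.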

  20. Application of probabilistic analysis/design methods in space programs - The approaches, the status, and the needs

    NASA Technical Reports Server (NTRS)

    Ryan, Robert S.; Townsend, John S.

    1993-01-01

    The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.

  1. Aerodynamic shape optimization using control theory

    NASA Technical Reports Server (NTRS)

    Reuther, James

    1996-01-01

    Aerodynamic shape design has long persisted as a difficult scientific challenge due to its highly nonlinear flow physics and daunting geometric complexity. However, with the emergence of Computational Fluid Dynamics (CFD) it has become possible to make accurate predictions of flows which are not dominated by viscous effects. It is thus worthwhile to explore the extension of CFD methods for flow analysis to the treatment of aerodynamic shape design. Two new aerodynamic shape design methods are developed which combine existing CFD technology, optimal control theory, and numerical optimization techniques. Flow analysis methods for the potential flow equation and the Euler equations form the basis of the two respective design methods. In each case, optimal control theory is used to derive the adjoint differential equations, the solution of which provides the necessary gradient information to a numerical optimization method much more efficiently than by conventional finite differencing. Each technique uses a quasi-Newton numerical optimization algorithm to drive an aerodynamic objective function toward a minimum. An analytic grid perturbation method is developed to modify body-fitted meshes to accommodate shape changes during the design process. Both Hicks-Henne perturbation functions and B-spline control points are explored as suitable design variables. The new methods prove to be computationally efficient and robust, and can be used for practical airfoil design including geometric and aerodynamic constraints. Objective functions are chosen to allow both inverse design to a target pressure distribution and wave drag minimization. Several design cases are presented for each method illustrating its practicality and efficiency. These include non-lifting and lifting airfoils operating at both subsonic and transonic conditions.
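
    The efficiency claim can be seen in a toy discrete analogue (a sketch, not the thesis's flow solver): for an objective J(x) = f(u) constrained by a "flow" equation A(x)u = b, a single adjoint solve yields the gradient with respect to every design variable at once:

    import numpy as np

    def solve_state(x):
        A = np.diag(1.0 + x)          # toy state operator parameterized by design x
        b = np.ones_like(x)
        return A, np.linalg.solve(A, b)

    x = np.array([0.5, 1.0, 2.0])
    A, u = solve_state(x)
    dJdu = 2.0 * u                    # J = sum(u^2), so df/du = 2u
    lam = np.linalg.solve(A.T, dJdu)  # one adjoint solve, regardless of len(x)
    # dA/dx_i is a single 1 at entry (i, i), so dJ/dx_i = -lam_i * u_i:
    grad = -lam * u
    print(np.round(grad, 4))          # matches -2/(1+x)^3 analytically

    A finite-difference gradient would instead require one perturbed flow solve per design variable, which is what makes the adjoint approach attractive when a shape has many parameters.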

  2. Qualitative methods: beyond the cookbook.

    PubMed

    Harding, G; Gantley, M

    1998-02-01

    Qualitative methods appear increasingly in vogue in health services research (HSR). Such research, however, has utilized, often uncritically, a 'cookbook' of methods for data collection, and common-sense principles for data analysis. This paper argues that qualitative HSR benefits from recognizing and drawing upon theoretical principles underlying qualitative data collection and analysis. A distinction is drawn between problem-orientated and theory-orientated research, in order to illustrate how problem-orientated research would benefit from the introduction of theoretical perspectives in order to develop the knowledge base of health services research.

  3. Convergence analysis of a monotonic penalty method for American option pricing

    NASA Astrophysics Data System (ADS)

    Zhang, Kai; Yang, Xiaoqi; Teo, Kok Lay

    2008-12-01

    This paper studies the convergence of a monotonic penalty method for pricing American options. A monotonic penalty method is first proposed to solve the complementarity problem arising from the valuation of American options, which produces a nonlinear degenerate parabolic PDE with the Black-Scholes operator. Based on variational theory, the solvability and convergence properties of this penalty approach are established in a proper infinite-dimensional space. Moreover, the convergence rate of the combination of two power penalty functions is obtained.
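
    The penalty idea can be written compactly. A representative power-penalty form (notation illustrative; the paper's exact formulation may differ) enforces the American constraint V >= g, where g is the payoff, by a term that grows as the constraint is violated:

    \frac{\partial V}{\partial t}
      + \frac{1}{2}\sigma^{2} S^{2} \frac{\partial^{2} V}{\partial S^{2}}
      + r S \frac{\partial V}{\partial S} - r V
      + \lambda \left[ \max\{ g(S) - V, 0 \} \right]^{1/k} = 0

    As the penalty parameter lambda tends to infinity, the solution of this single nonlinear PDE converges to the solution of the original complementarity problem, with a rate governed by lambda and the power 1/k.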

  4. Evaluation of Ares-I Control System Robustness to Uncertain Aerodynamics and Flex Dynamics

    NASA Technical Reports Server (NTRS)

    Jang, Jiann-Woei; VanTassel, Chris; Bedrossian, Nazareth; Hall, Charles; Spanos, Pol

    2008-01-01

    This paper discusses the application of robust control theory to evaluate robustness of the Ares-I control systems. Three techniques for estimating upper and lower bounds of uncertain parameters which yield stable closed-loop response are used here: (1) Monte Carlo analysis, (2) mu analysis, and (3) characteristic frequency response analysis. All three methods are used to evaluate stability envelopes of the Ares-I control systems with uncertain aerodynamics and flex dynamics. The results show that characteristic frequency response analysis is the most effective of these methods for assessing robustness.
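
    A schematic of the Monte Carlo technique (1), reduced to a toy second-order mode rather than the Ares-I models, samples uncertain parameters and tallies how often the closed loop stays stable:

    import numpy as np

    rng = np.random.default_rng(1)
    n_samples, stable = 20000, 0
    for _ in range(n_samples):
        wn = rng.uniform(0.8, 1.2)      # uncertain modal frequency (illustrative range)
        zeta = rng.uniform(0.02, 0.30)  # uncertain damping of a flex-like mode
        k = rng.uniform(0.0, 0.5)       # uncertain destabilizing feedback gain
        # toy closed-loop characteristic polynomial: s^2 + (2*zeta*wn - k)*s + wn^2
        poles = np.roots([1.0, 2.0 * zeta * wn - k, wn**2])
        stable += bool(np.all(poles.real < 0))
    print(f"stable fraction: {stable / n_samples:.3f}")

    Shrinking the parameter ranges until the stable fraction reaches one gives an empirical stability envelope of the kind the paper compares against the mu and characteristic frequency response results.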

  5. A method for studying decision-making by guideline development groups.

    PubMed

    Gardner, Benjamin; Davidson, Rosemary; McAteer, John; Michie, Susan

    2009-08-05

    Multidisciplinary guideline development groups (GDGs) have considerable influence on UK healthcare policy and practice, but previous research suggests that research evidence is a variable influence on GDG recommendations. The Evidence into Recommendations (EiR) study has been set up to document social-psychological influences on GDG decision-making. In this paper we aim to evaluate the relevance of existing qualitative methodologies to the EiR study, and to develop a method best suited to capturing influences on GDG decision-making. A research team comprising three postdoctoral research fellows and a multidisciplinary steering group assessed the utility of extant qualitative methodologies for coding verbatim GDG meeting transcripts and semi-structured interviews with GDG members. A unique configuration of techniques was developed to permit data reduction and analysis. Our method incorporates techniques from thematic analysis, grounded theory analysis, content analysis, and framework analysis. Thematic analysis of individual interviews conducted with group members at the start and end of the GDG process defines discrete problem areas to guide data extraction from GDG meeting transcripts. Data excerpts are coded both inductively and deductively, using concepts taken from theories of decision-making, social influence and group processes. These codes inform a framework analysis to describe and explain incidents within GDG meetings. We illustrate the application of the method by discussing some preliminary findings of a study of a National Institute for Health and Clinical Excellence (NICE) acute physical health GDG. This method is currently being applied to study the meetings of three NICE GDGs. These cover topics in acute physical health, mental health and public health, and comprise a total of 45 full-day meetings. The method offers potential for application to other health care and decision-making groups.

  6. Should the model for risk-informed regulation be game theory rather than decision theory?

    PubMed

    Bier, Vicki M; Lin, Shi-Woei

    2013-02-01

    Risk analysts frequently view the regulation of risks as being largely a matter of decision theory. According to this view, risk analysis methods provide information on the likelihood and severity of various possible outcomes; this information should then be assessed using a decision-theoretic approach (such as cost/benefit analysis) to determine whether the risks are acceptable, and whether additional regulation is warranted. However, this view ignores the fact that in many industries (particularly industries that are technologically sophisticated and employ specialized risk and safety experts), risk analyses may be done by regulated firms, not by the regulator. Moreover, those firms may have more knowledge about the levels of safety at their own facilities than the regulator does. This creates a situation in which the regulated firm has both the opportunity-and often also the motive-to provide inaccurate (in particular, favorably biased) risk information to the regulator, and hence the regulator has reason to doubt the accuracy of the risk information provided by regulated parties. Researchers have argued that decision theory is capable of dealing with many such strategic interactions as well as game theory can. This is especially true in two-player, two-stage games in which the follower has a unique best strategy in response to the leader's strategy, as appears to be the case in the situation analyzed in this article. However, even in such cases, we agree with Cox that game-theoretic methods and concepts can still be useful. In particular, the tools of mechanism design, and especially the revelation principle, can simplify the analysis of such games because the revelation principle provides rigorous assurance that it is sufficient to analyze only games in which licensees truthfully report their risk levels, making the problem more manageable. Without that, it would generally be necessary to consider much more complicated forms of strategic behavior (including deception), to identify optimal regulatory strategies. Therefore, we believe that the types of regulatory interactions analyzed in this article are better modeled using game theory rather than decision theory. In particular, the goals of this article are to review the relevant literature in game theory and regulatory economics (to stimulate interest in this area among risk analysts), and to present illustrative results showing how the application of game theory can provide useful insights into the theory and practice of risk-informed regulation. © 2012 Society for Risk Analysis.

  7. The current status of theory evaluation in nursing.

    PubMed

    Im, Eun-Ok

    2015-10-01

    To identify the current status of theory evaluation in nursing and provide directions for theory evaluation for future development of theoretical bases of nursing discipline. Theory evaluation is an essential component in development of nursing knowledge, which is a critical element in development of nursing discipline. Despite earlier significant efforts for theory evaluation in nursing, a recent decline in the number of theory evaluation articles was noted and there have been few updates on theory evaluation in nursing. Discussion paper. A total of 58 articles published from 2003-2014 were retrieved through searches using PubMed, PsycINFO and CINAHL. The articles were sorted by the area of evaluation and analysed to identify themes reflecting the theory evaluation process. Diverse ways of theory evaluation need to be continuously used in future theory evaluation efforts. Six themes reflecting the theory evaluation process were identified: (a) rarely using existing theory evaluation criteria; (b) evaluating specifics; (c) using various statistical analysis methods; (d) developing instruments; (e) adopting in practice and education; and (f) evaluating mainly middle-range theories and situation-specific theories. © 2015 John Wiley & Sons Ltd.

  8. [Uncertainty characterization approaches for ecological risk assessment of polycyclic aromatic hydrocarbon in Taihu Lake].

    PubMed

    Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian

    2012-04-01

    Probabilistic approaches, such as Monte Carlo sampling (MCS) and Latin hypercube sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory, and variance propagation, were used to characterize uncertainties associated with the risk assessment of ΣPAH8 in the surface water of Taihu Lake. The results from MCS and LHS were represented by probability distributions of hazard quotients of ΣPAH8 in the surface waters of Taihu Lake, which indicated that the confidence intervals of the hazard quotient at the 90% confidence level were in the ranges of 0.00018-0.89 and 0.00017-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested that the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of ΣPAH8 to aquatic organisms ranged from 0.00017 to 0.99 using interval analysis. The confidence interval was (0.0015, 0.0163) at the 90% confidence level calculated using fuzzy set theory, and (0.00016, 0.88) at the 90% confidence level based on variance propagation. These results indicated that the ecological risk of ΣPAH8 to aquatic organisms was low. Each method has its own advantages and limitations, being based on a different theory; therefore, the appropriate method should be selected on a case-by-case basis to quantify the effects of uncertainties on the ecological risk assessment. The approach based on probabilistic theory was selected as the most appropriate method to assess the risk of ΣPAH8 in the surface water of Taihu Lake, which provides an important scientific foundation for risk management and control of organic pollutants in water.
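
    The two probabilistic approaches differ only in how the input distributions are sampled. A schematic Python comparison of a hazard quotient HQ = concentration / toxicity threshold (with hypothetical lognormal parameters, not the Taihu Lake data):

    import numpy as np
    from scipy.stats import qmc, lognorm

    n = 10_000
    rng = np.random.default_rng(42)

    # Monte Carlo: independent random draws from both distributions
    hq_mcs = (lognorm.rvs(1.2, scale=0.05, size=n, random_state=rng)
              / lognorm.rvs(0.8, scale=0.5, size=n, random_state=rng))

    # Latin hypercube: stratified uniforms pushed through the inverse CDFs
    u = qmc.LatinHypercube(d=2, seed=42).random(n)
    hq_lhs = lognorm.ppf(u[:, 0], 1.2, scale=0.05) / lognorm.ppf(u[:, 1], 0.8, scale=0.5)

    for name, hq in [("MCS", hq_mcs), ("LHS", hq_lhs)]:
        print(name, f"P(HQ > 1) = {np.mean(hq > 1):.4f}",
              f"90% interval = ({np.quantile(hq, 0.05):.5f}, {np.quantile(hq, 0.95):.5f})")

    The stratification in LHS covers the input space more evenly, which is why the two methods give nearly identical exceedance probabilities here, as they did in the study.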

  9. Entropy in sound and vibration: towards a new paradigm.

    PubMed

    Le Bot, A

    2017-01-01

    This paper describes a discussion of the method and status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of the information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart.

  10. The "Chaos Theory" and nonlinear dynamics in heart rate variability analysis: does it work in short-time series in patients with coronary heart disease?

    PubMed

    Krstacic, Goran; Krstacic, Antonija; Smalcelj, Anton; Milicic, Davor; Jembrek-Gostovic, Mirjana

    2007-04-01

    Dynamic analysis techniques may quantify abnormalities in heart rate variability (HRV) based on nonlinear and fractal analysis (chaos theory). The article emphasizes the clinical and prognostic significance of dynamic changes in short-time series from patients with coronary heart disease (CHD) during the exercise electrocardiogram (ECG) test. The subjects were included in the series after complete cardiovascular diagnostic workup. Series of R-R and ST-T intervals were obtained from digitally sampled exercise ECG data. The rescaled range analysis method was used to determine the fractal dimension of the intervals. To quantify the fractal long-range correlation properties of heart rate variability, the detrended fluctuation analysis technique was used. Approximate entropy (ApEn) was applied to quantify the regularity and complexity of the time series, as well as the unpredictability of fluctuations in the time series. It was found that the short-term fractal scaling exponent (alpha(1)) is significantly lower in patients with CHD (0.93 +/- 0.07 vs 1.09 +/- 0.04; P < 0.001). The patients with CHD had a higher fractal dimension in each exercise test program separately, as well as in the exercise program as a whole. ApEn was significantly lower in the CHD group for both R-R and ST-T ECG intervals (P < 0.001). The nonlinear dynamic methods could have clinical and prognostic applicability also in short-time ECG series. Dynamic analysis based on chaos theory during the exercise ECG test points to multifractal time series in CHD patients, who lose normal fractal characteristics and regularity in HRV. Nonlinear analysis techniques may complement traditional ECG analysis.
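
    The short-term scaling exponent alpha(1) comes from detrended fluctuation analysis over window sizes of roughly 4-16 beats. A compact Python sketch (run here on synthetic white noise, for which alpha is expected near 0.5, not on patient data):

    import numpy as np

    def dfa_alpha(x, scales=range(4, 17)):
        y = np.cumsum(x - np.mean(x))     # integrate the (centered) series
        fluctuations = []
        for n in scales:
            n_windows = len(y) // n
            f2 = 0.0
            for w in range(n_windows):
                seg = y[w * n:(w + 1) * n]
                t = np.arange(n)
                trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrend
                f2 += np.mean((seg - trend) ** 2)
            fluctuations.append(np.sqrt(f2 / n_windows))
        # slope of log F(n) vs log n is the scaling exponent
        slope, _ = np.polyfit(np.log(list(scales)), np.log(fluctuations), 1)
        return slope

    rr = np.random.default_rng(0).standard_normal(1000)
    print(round(dfa_alpha(rr), 2))        # white noise gives ~0.5

    Healthy R-R series typically scale near alpha(1) = 1 (1/f-like), so the lower CHD values reported above indicate a loss of that fractal organization.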

  11. Stability analysis of nonlinear autonomous systems - General theory and application to flutter

    NASA Technical Reports Server (NTRS)

    Smith, L. L.; Morino, L.

    1975-01-01

    The analysis makes use of a singular perturbation method, multiple time scaling. Concepts of stable and unstable limit cycles are introduced. The solution is obtained in the form of an asymptotic expansion. Numerical results are presented for the nonlinear flutter of panels and airfoils in supersonic flow. The approach used is an extension of a method for analyzing nonlinear panel flutter reported by Morino (1969).

  12. Theory of viscous transonic flow over airfoils at high Reynolds number

    NASA Technical Reports Server (NTRS)

    Melnik, R. E.; Chow, R.; Mead, H. R.

    1977-01-01

    This paper considers viscous flows with unseparated turbulent boundary layers over two-dimensional airfoils at transonic speeds. Conventional theoretical methods are based on boundary layer formulations which do not account for the effect of the curved wake and static pressure variations across the boundary layer in the trailing edge region. In this investigation an extended viscous theory is developed that accounts for both effects. The theory is based on a rational analysis of the strong turbulent interaction at airfoil trailing edges. The method of matched asymptotic expansions is employed to develop formal series solutions of the full Reynolds equations in the limit of Reynolds numbers tending to infinity. Procedures are developed for combining the local trailing edge solution with numerical methods for solving the full potential flow and boundary layer equations. Theoretical results indicate that conventional boundary layer methods account for only about 50% of the viscous effect on lift, the remaining contribution arising from wake curvature and normal pressure gradient effects.

  13. Molecular – genetic variance of RH blood group system within human population of Bosnia and Herzegovina

    PubMed Central

    Lasić, Lejla; Lojo-Kadrić, Naida; Silajdžić, Elma; Pojskić, Lejla; Hadžiselimović, Rifat; Pojskić, Naris

    2013-01-01

    There are two major theories for the inheritance of the Rh blood group system: the Fisher-Race theory and the Wiener theory. The aim of this study was to identify the frequency of RHDCE alleles in the Bosnian-Herzegovinian population and to introduce this method for screening Rh phenotype in B&H, since this type of analysis had not previously been used for blood typing in B&H. The Rh blood group was typed by polymerase chain reaction, using protocols and primers previously established by other authors, followed by electrophoresis in 2-3% agarose gel. The percentage of Rh-positive individuals in our sample is 84.48%, while the percentage of Rh-negative individuals is 15.52%. The inter-rater agreement statistic showed perfect agreement (K=1) between the results of Rh blood system detection based on serological and molecular-genetic methods. In conclusion, molecular-genetic methods are suitable for prenatal genotyping and specific cases, while the standard serological method is suitable for high-throughput processing of samples. PMID:23448604

  14. An Appraisal of Social Network Theory and Analysis as Applied to Public Health: Challenges and Opportunities.

    PubMed

    Valente, Thomas W; Pitts, Stephanie R

    2017-03-20

    The use of social network theory and analysis methods as applied to public health has expanded greatly in the past decade, yielding a significant academic literature that spans almost every conceivable health issue. This review identifies several important theoretical challenges that confront the field but also provides opportunities for new research. These challenges include (a) measuring network influences, (b) identifying appropriate influence mechanisms, (c) the impact of social media and computerized communications, (d) the role of networks in evaluating public health interventions, and (e) ethics. Next steps for the field are outlined and the need for funding is emphasized. Recently developed network analysis techniques, technological innovations in communication, and changes in theoretical perspectives to include a focus on social and environmental behavioral influences have created opportunities for new theory and ever broader application of social networks to public health topics.

  15. Finite-strain large-deflection elastic-viscoplastic finite-element transient response analysis of structures

    NASA Technical Reports Server (NTRS)

    Rodal, J. J. A.; Witmer, E. A.

    1979-01-01

    A method of analysis for thin structures is developed that incorporates finite-strain, elastic-plastic, strain-hardening, time-dependent material behavior, is implemented with respect to a fixed configuration, and is consistently valid for finite strains and finite rotations. The theory is formulated systematically in a body-fixed system of convected coordinates with materially embedded vectors that deform in common with the continuum. Tensors are considered as linear vector functions, and use is made of the dyadic representation. The kinematics of a deformable continuum is treated in detail, carefully and precisely defining all quantities necessary for the analysis. The finite strain theory developed gives much better predictions and agreement with experiment than does the traditional small strain theory, and at practically no additional cost. This represents a very significant advance in the capability for the reliable prediction of nonlinear transient structural responses, including the reliable prediction of strains large enough to produce ductile metal rupture.

  16. Traffic Flow Density Distribution Based on FEM

    NASA Astrophysics Data System (ADS)

    Ma, Jing; Cui, Jianming

    In the analysis of normal traffic flow, static or dynamic models based on fluid mechanics are usually used for numerical analysis. In such a process, however, problems of massive modeling and data handling arise, and the accuracy is not high. The Finite Element Method (FEM) is a product of the combination of modern mathematics, mechanics, and computer technology, and it has been widely applied in various domains such as engineering. Based on the existing theory of traffic flow, ITS, and the development of FEM, a simulation theory of FEM that solves the problems existing in traffic flow analysis is put forward. Based on this theory, and using existing Finite Element Analysis (FEA) software, traffic flow is simulated and analyzed with fluid mechanics and dynamics. The problem of massive data processing in manual modeling and numerical analysis is solved, and the authenticity of the simulation is enhanced.

  17. Examining the Fieldwork Experience from the Site Supervisor Perspective: A Mixed-Methods Study Using Vygotsky's Zone of Proximal Development Theory

    ERIC Educational Resources Information Center

    Brannon, Sian

    2013-01-01

    The purpose of this study was to identify feelings and behaviors of fieldwork supervisors in public libraries using Lev Vygotsky's Zone of Proximal Development theory as a background for design, analysis, and discussion of results. This research sought to find out how fieldwork supervisors perform initial assessments of their fieldwork students,…

  18. Capabilities-Based Planning for Energy Security at Department of Defense Installations

    DTIC Science & Technology

    2013-01-01

    Support Services—The ability to provide assistance for payload and launch vehicles including safety, reception, staging, integration, movement to the...pubs/technical_reports/TR1249.html Davis, Paul K., and Paul Dreyer, RAND's Portfolio Analysis Tool (PAT): Theory, Methods, and Reference Manual, Santa...Steven C. Bankes, and Michael Egner, Enhancing Strategic Planning with Massive Scenario Generation: Theory and Experiments, Santa Monica, Calif

  19. Translations on Eastern Europe Political, Sociological, and Military Affairs No. 1361

    DTIC Science & Technology

    1977-03-07

    focusing on theories and methods of late bourgeois, logical positivistic trends of a structuralistic mold reducing the complex and multi...and articles on military and civil defense, organization, theory, budgets, and hardware...unanimously emphasized its support for consolidation of administration for non-proliferation of nuclear arms. The political advisory committee, however

  20. MAC/GMC 4.0 User's Manual: Keywords Manual. Volume 2

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2002-01-01

    This document is the second volume in the three volume set of User's Manuals for the Micromechanics Analysis Code with Generalized Method of Cells Version 4.0 (MAC/GMC 4.0). Volume 1 is the Theory Manual, this document is the Keywords Manual, and Volume 3 is the Example Problem Manual. MAC/GMC 4.0 is a composite material and laminate analysis software program developed at the NASA Glenn Research Center. It is based on the generalized method of cells (GMC) micromechanics theory, which provides access to the local stress and strain fields in the composite material. This access grants GMC the ability to accommodate arbitrary local models for inelastic material behavior and various types of damage and failure analysis. MAC/GMC 4.0 has been built around GMC to provide the theory with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, applications of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated in MAC/GMC 4.0. Finally, classical lamination theory has been implemented within MAC/GMC 4.0 wherein GMC is used to model the composite material response of each ply. Consequently, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. This volume describes the basic information required to use the MAC/GMC 4.0 software, including a 'Getting Started' section, and an in-depth description of each of the 22 keywords used in the input file to control the execution of the code.

  1. Analysis of three-dimensional-cavity-backed aperture antennas using a Combined Finite Element Method/Method of Moments/Geometrical Theory of Diffraction technique

    NASA Technical Reports Server (NTRS)

    Reddy, C. J.; Deshpande, M. D.; Cockrell, C. R.; Beck, F. B.

    1995-01-01

    A combined finite element method (FEM) and method of moments (MoM) technique is presented to analyze the radiation characteristics of a cavity-fed aperture in three dimensions. Generalized feed modeling has been done using the modal expansion of fields in the feed structure. Numerical results for some feeding structures such as a rectangular waveguide, circular waveguide, and coaxial line are presented. The method also uses the geometrical theory of diffraction (GTD) to predict the effect of a finite ground plane on radiation characteristics. Input admittance calculations for open radiating structures such as a rectangular waveguide, a circular waveguide, and a coaxial line are shown. Numerical data for a coaxial-fed cavity with finite ground plane are verified with experimental data.

  2. Geoid Recovery using Geophysical Inverse Theory Applied to Satellite to Satellite Tracking Data

    NASA Technical Reports Server (NTRS)

    Gaposchkin, E. M.; Frey, H. (Technical Monitor)

    2000-01-01

    This report describes a new method for determination of the geopotential. The analysis is aimed at the GRACE mission. This Satellite-to-Satellite Tracking (SST) mission is viewed as a mapping mission; the result will be maps of the geoid. The elements of potential theory, celestial mechanics, and geophysical inverse theory are integrated into a computation architecture, and the results of several simulations are presented. Centimeter-accuracy geoids with 50 to 100 km resolution can be recovered with a 30 to 60 day mission.

  3. On the complex interplay between learning and dynamics in life sciences. Comment on the paper "Collective learning modeling based on the kinetic theory of active particles" by Burini et al.

    NASA Astrophysics Data System (ADS)

    Bellomo, Nicola; Elaiw, Ahmed; Alghamdi, Mohamed Ali

    2016-03-01

    The paper by Burini, De Lillo, and Gibelli [8] presents an overview and critical analysis of the literature on the modeling of learning dynamics. The first reference is the celebrated paper by Cucker and Smale [9]. Then, the authors also propose their own approach, based on suitable development of methods of the kinetic theory [6] and theoretical tools of evolutionary game theory [12,13], recently developed on graphs [2].

  4. Application of a transonic potential flow code to the static aeroelastic analysis of three-dimensional wings

    NASA Technical Reports Server (NTRS)

    Whitlow, W., Jr.; Bennett, R. M.

    1982-01-01

    Since the aerodynamic theory is nonlinear, the method requires the coupling of two iterative processes - an aerodynamic analysis and a structural analysis. A full potential analysis code, FLO22, is combined with a linear structural analysis to yield aerodynamic load distributions on and deflections of elastic wings. This method was used to analyze an aeroelastically-scaled wind tunnel model of a proposed executive-jet transport wing and an aeroelastic research wing. The results are compared with the corresponding rigid-wing analyses, and some effects of elasticity on the aerodynamic loading are noted.

  5. Channel flow analysis. [velocity distribution throughout blade flow field

    NASA Technical Reports Server (NTRS)

    Katsanis, T.

    1973-01-01

    The design of a proper blade profile requires calculation of the blade row flow field in order to determine the velocities on the blade surfaces. The theory underlying several methods used for this calculation is presented, and the associated computer programs that were developed are discussed.

  6. Feasibility study of shell buckling analysis using the modified structure method

    NASA Technical Reports Server (NTRS)

    Cohen, G. A.; Haftka, R. T.

    1972-01-01

    The modified structure method, which is based on Koiter's theory of imperfections, was used to calculate approximate buckling loads of several shells of revolution. The method does not appear to be practical for shells because, in many cases, the prebuckling nonlinearity may be too large to be treated accurately as a small imperfection.

  7. Explicating Metatheory for Mixed Methods Research in Educational Leadership: An Application of Habermas's "Theory of Communicative Action"

    ERIC Educational Resources Information Center

    Whiteman, Rodney S.

    2015-01-01

    Purpose: Mixed methods research can provide a fruitful line of inquiry for educational leadership, program evaluation, and policy analysis; however, mixed methods research requires a metatheory that allows for mixing what have traditionally been considered incompatible qualitative and quantitative inquiry. The purpose of this paper is to apply…

  8. The decade 1989-1998 in Spanish psychology: an analysis of research in statistics, methodology, and psychometric theory.

    PubMed

    García-Pérez, M A

    2001-11-01

    This paper presents an analysis of research published in the decade 1989-1998 by Spanish faculty members in the areas of statistical methods, research methodology, and psychometric theory. Database search and direct correspondence with faculty members in Departments of Methodology across Spain rendered a list of 193 papers published in these broad areas by 82 faculty members. These and other faculty members had actually published 931 papers over the decade of analysis, but 738 of them addressed topics not appropriate for description in this report. Classification and analysis of these 193 papers revealed topics that have attracted the most interest (psychophysics, item response theory, analysis of variance, sequential analysis, and meta-analysis) as well as other topics that have received less attention (scaling, factor analysis, time series, and structural models). A significant number of papers also dealt with various methodological issues (software, algorithms, instrumentation, and techniques). A substantial part of this report is devoted to describing the issues addressed across these 193 papers--most of which are written in the Spanish language and published in Spanish journals--and some representative references are given.

  9. Parent-Based Adolescent Sexual Health Interventions And Effect on Communication Outcomes: A Systematic Review and Meta-Analyses

    PubMed Central

    Maria, Diane Santa; Markham, Christine; Mullen, Patricia Dolan; Bluethmann, Shirley

    2016-01-01

    Context Parent-based adolescent sexual health interventions aim to reduce sexual risk behaviors by bolstering parental protective behaviors. Few studies of theory use, methods, applications, delivery and outcomes of parent-based interventions have been conducted. Methods A systematic search of databases for the period 1998–2013 identified 28 published trials of U.S. parent-based interventions to examine theory use, setting, reach, delivery mode, dose and effects on parent-child communication. Established coding schemes were used to assess use of theory and describe methods employed to achieve behavioral change; intervention effects were explored in meta-analyses. Results Most interventions were conducted with minority parents in group sessions or via self-paced activities; interventions averaged seven hours, and most used theory extensively. Meta-analyses found improvements in sexual health communication: Analysis of 11 controlled trials indicated a medium effect on increasing communication (Cohen's d, 0.5), and analysis of nine trials found a large effect on increasing parental comfort with communication (0.7); effects were positive regardless of delivery mode or intervention dose. Intervention participants were 68% more likely than controls to report increased communication and 75% more likely to report increased comfort. Conclusions These findings point to gaps in the range of programs examined in published trials—for example, interventions for parents of sexual minority youth, programs for custodial grandparents and faith-based services. Yet they provide support for the effectiveness of parent-based interventions in improving communication. Innovative delivery approaches could extend programs' reach, and further research on sexual health outcomes would facilitate the meta-analysis of intervention effectiveness in improving adolescent sexual health behaviors. PMID:25639664

  10. Multiple Interacting Risk Factors: On Methods for Allocating Risk Factor Interactions.

    PubMed

    Price, Bertram; MacNicoll, Michael

    2015-05-01

    A persistent problem in health risk analysis, where a disease may occur as a consequence of multiple interacting risk factors, is allocating the total risk of the disease among the individual risk factors. This problem, referred to here as risk apportionment, arises in various venues, including: (i) public health management, (ii) government programs for compensating injured individuals, and (iii) litigation. Two methods have been described in the risk analysis and epidemiology literature for allocating total risk among individual risk factors. One method uses weights to allocate interactions among the individual risk factors. The other method is based on risk accounting axioms and finding an optimal and unique allocation that satisfies the axioms using a procedure borrowed from game theory. Where relative risk or attributable risk is the risk measure, we find that the game-theory-determined allocation is the same as the allocation where risk factor interactions are apportioned to individual risk factors using equal weights. Therefore, the apportionment problem becomes one of selecting a meaningful set of weights for allocating interactions among the individual risk factors. Equal weights and weights proportional to the risks of the individual risk factors are discussed. © 2015 Society for Risk Analysis.
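
    The game-theoretic procedure alluded to is presumably the Shapley value, which averages each factor's marginal contribution over all orderings. A minimal sketch with two hypothetical risk factors (illustrative numbers only) reproduces the abstract's observation that the interaction is split with equal weights:

    ```python
    from itertools import permutations
    from math import factorial

    # Hypothetical excess risks for each subset of two risk factors A and B
    # (e.g., attributable risks relative to baseline) -- illustrative only.
    risk = {
        frozenset(): 0.0,
        frozenset({"A"}): 0.10,
        frozenset({"B"}): 0.20,
        frozenset({"A", "B"}): 0.45,  # interaction of 0.15 beyond additivity
    }

    def shapley(risk, factors):
        """Average each factor's marginal contribution over all orderings."""
        phi = {f: 0.0 for f in factors}
        for order in permutations(factors):
            coalition = frozenset()
            for f in order:
                phi[f] += risk[coalition | {f}] - risk[coalition]
                coalition = coalition | {f}
        n_orders = factorial(len(factors))
        return {f: val / n_orders for f, val in phi.items()}

    alloc = shapley(risk, ["A", "B"])
    print(alloc)  # {'A': 0.175, 'B': 0.275} -- the 0.15 interaction split equally
    ```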

  11. Design optimization of natural laminar flow bodies in compressible flow

    NASA Technical Reports Server (NTRS)

    Dodbele, Simha S.

    1992-01-01

    An optimization method has been developed to design axisymmetric body shapes such as fuselages, nacelles, and external fuel tanks with increased transition Reynolds numbers in subsonic compressible flow. The new design method involves a constraint minimization procedure coupled with analysis of the inviscid and viscous flow regions and linear stability analysis of the compressible boundary layer. In order to reduce the computer time, Granville's transition criterion is used to predict boundary-layer transition and to calculate the gradients of the objective function, and linear stability theory coupled with the e^N method is used to calculate the objective function at the end of each design iteration. Use of the method to design an axisymmetric body with extensive natural laminar flow is illustrated through the design of a tiptank of a business jet. For the original tiptank, boundary-layer transition is predicted to occur at a transition Reynolds number of 6.04 × 10^6. For the designed body shape, a transition Reynolds number of 7.22 × 10^6 is predicted using compressible linear stability theory coupled with the e^N method.
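
    The e^N method predicts transition where the integrated amplification of the most unstable disturbance reaches a critical N-factor. A minimal numerical sketch with a hypothetical amplification-rate curve and the commonly quoted threshold N = 9 (not the paper's data or stability code):

    ```python
    import numpy as np

    # Hypothetical streamwise stations x and local spatial amplification
    # rates -alpha_i(x) for the most-amplified disturbance -- illustrative.
    x = np.linspace(0.0, 1.0, 200)                 # arc length, arbitrary units
    growth_rate = 20.0 * np.sin(np.pi * x) ** 2    # illustrative -alpha_i(x)

    # N-factor: integrated amplification ln(A/A0) from the neutral point,
    # accumulated with the trapezoidal rule.
    N = np.concatenate(([0.0],
                        np.cumsum(0.5 * (growth_rate[1:] + growth_rate[:-1])
                                  * np.diff(x))))

    N_crit = 9.0  # a commonly quoted transition threshold for the e^N method
    idx = np.argmax(N >= N_crit)
    if N[idx] >= N_crit:
        print(f"transition predicted near x = {x[idx]:.3f} (N = {N[idx]:.1f})")
    else:
        print(f"flow predicted laminar over the body (max N = {N.max():.1f})")
    ```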

  12. Improved full analytical polygon-based method using Fourier analysis of the three-dimensional affine transformation.

    PubMed

    Pan, Yijie; Wang, Yongtian; Liu, Juan; Li, Xin; Jia, Jia

    2014-03-01

    Previous research [Appl. Opt.52, A290 (2013)] has revealed that Fourier analysis of three-dimensional affine transformation theory can be used to improve the computation speed of the traditional polygon-based method. In this paper, we continue our research and propose an improved full analytical polygon-based method developed upon this theory. Vertex vectors of primitive and arbitrary triangles and the pseudo-inverse matrix were used to obtain an affine transformation matrix representing the spatial relationship between the two triangles. With this relationship and the primitive spectrum, we analytically obtained the spectrum of the arbitrary triangle. This algorithm discards low-level angular dependent computations. In order to add diffusive reflection to each arbitrary surface, we also propose a whole matrix computation approach that takes advantage of the affine transformation matrix and uses matrix multiplication to calculate shifting parameters of similar sub-polygons. The proposed method improves hologram computation speed for the conventional full analytical approach. Optical experimental results are demonstrated which prove that the proposed method can effectively reconstruct three-dimensional scenes.
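
    The pseudo-inverse step can be sketched directly: three vertex pairs underdetermine the 12 parameters of a 3D affine map, so the Moore-Penrose pseudo-inverse supplies the minimum-norm solution. A minimal NumPy illustration with hypothetical coordinates (not the authors' implementation):

    ```python
    import numpy as np

    # Vertices of the primitive triangle and an arbitrary target triangle
    # (rows are vertices; illustrative coordinates only).
    primitive = np.array([[0.0, 0.0, 0.0],
                          [1.0, 0.0, 0.0],
                          [0.0, 1.0, 0.0]])
    target = np.array([[0.2, 0.1, 0.5],
                       [1.1, 0.3, 0.4],
                       [0.3, 1.2, 0.8]])

    # Homogeneous coordinates: target^T ~= A @ P^T with A a 3x4 affine
    # matrix. Three vertex pairs underdetermine A, so the Moore-Penrose
    # pseudo-inverse gives the minimum-norm solution.
    P = np.hstack([primitive, np.ones((3, 1))])   # 3x4
    A = target.T @ np.linalg.pinv(P.T)            # 3x4 affine matrix

    # The recovered map reproduces the vertices exactly:
    assert np.allclose(A @ P.T, target.T)
    print(A)
    ```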

  13. An Actor-Network Theory Analysis of Policy Innovation for Smoke-Free Places: Understanding Change in Complex Systems

    PubMed Central

    Borland, Ron; Coghill, Ken

    2010-01-01

    Complex, transnational issues like the tobacco epidemic are major challenges that defy analysis and management by conventional methods, as are other public health issues, such as those associated with global food distribution and climate change. We examined the evolution of indoor smoke-free regulations, a tobacco control policy innovation, and identified the key attributes of those jurisdictions that successfully pursued this innovation and those that to date have not. In doing so, we employed the actor-network theory, a comprehensive framework for the analysis of fundamental system change. Through our analysis, we identified approaches to help overcome some systemic barriers to the solution of the tobacco problem and comment on other complex transnational problems. PMID:20466949

  14. An actor-network theory analysis of policy innovation for smoke-free places: understanding change in complex systems.

    PubMed

    Young, David; Borland, Ron; Coghill, Ken

    2010-07-01

    Complex, transnational issues like the tobacco epidemic are major challenges that defy analysis and management by conventional methods, as are other public health issues, such as those associated with global food distribution and climate change. We examined the evolution of indoor smoke-free regulations, a tobacco control policy innovation, and identified the key attributes of those jurisdictions that successfully pursued this innovation and those that to date have not. In doing so, we employed the actor-network theory, a comprehensive framework for the analysis of fundamental system change. Through our analysis, we identified approaches to help overcome some systemic barriers to the solution of the tobacco problem and comment on other complex transnational problems.

  15. Modeling of composite beams and plates for static and dynamic analysis

    NASA Technical Reports Server (NTRS)

    Hodges, Dewey H.

    1992-01-01

    A rigorous theory and the corresponding computational algorithms were developed for through-the-thickness analysis of composite plates. This type of analysis is needed in order to find the elastic stiffness constants of a plate. Additionally, the analysis is used to post-process the resulting plate solution in order to find approximate three-dimensional displacement, strain, and stress distributions throughout the plate. It was decided that the variational-asymptotical method (VAM) would serve as a suitable framework in which to solve these types of problems. Work during this reporting period has progressed along two lines: (1) further evaluation of neo-classical plate theory (NCPT) as applied to shear-coupled laminates; and (2) continued modeling of plates with nonuniform thickness.

  16. Analysis of metal-matrix composite structures. I - Micromechanics constitutive theory. II - Laminate analyses

    NASA Technical Reports Server (NTRS)

    Arenburg, R. T.; Reddy, J. N.

    1991-01-01

    The micromechanical constitutive theory is used to examine the nonlinear behavior of continuous-fiber-reinforced metal-matrix composite structures. Effective lamina constitutive relations based on the Aboudi micromechanics theory are presented. The inelastic matrix behavior is modeled by the unified viscoplasticity theory of Bodner and Partom. The laminate constitutive relations are incorporated into a first-order shear deformation plate theory. The resulting boundary value problem is solved by utilizing the finite element method. Attention is also given to computational aspects of the numerical solution, including the temporal integration of the inelastic strains and the spatial integration of bending moments. Numerical results for the nonlinear response of metal-matrix composites subjected to extensional and bending loads are presented.

  17. Understanding nursing units with data and theory.

    PubMed

    Diers, Donna; Hendrickson, Karrie; Rimar, Joan; Donovan, Donna

    2013-01-01

    Nursing units are social systems whose function depends on many variables. Available nursing data, combined with a theory of organizational diagnosis, can be used to understand nursing unit performance. One troubled unit served as a case study in organizational diagnosis and treatment using modern methods of data mining and performance improvement. Systems theory did not prescribe how to fix an underbounded system. The theory did suggest, however, that addressing the characteristics of overbounded and underbounded systems can provide some order and structure and identify helpful resources. In this instance, the data analysis served to help define the unit's problems in conjunction with information gained from talking with the nurses and touring the unit, but it was the theory that gave hints for direction for change.

  18. A Discursive Formation that Undermined Integration at a Historically Advantaged School in South Africa

    ERIC Educational Resources Information Center

    Naidoo, Devika

    2010-01-01

    This paper provides an analysis of the extent of integration at a historically advantaged school. A qualitative multi-method case study allowed for in-depth analysis of integration in the school. Bernstein's theory of code, classification, boundary and power framed the study. Data analysis showed that: racial desegregation was achieved at student…

  19. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen

    2015-11-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.
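
    Rather than guess pyunicorn's exact API, the construction behind one of its core objects, a recurrence network, can be sketched directly in NumPy: embed the series, link states closer than a threshold, and read network measures off the adjacency matrix. Parameter values here are illustrative:

    ```python
    import numpy as np

    def recurrence_network(x, dim=3, tau=2, eps=0.3):
        """Adjacency matrix of a recurrence network: nodes are time-delay
        embedded state vectors, linked when their distance falls below eps."""
        n = len(x) - (dim - 1) * tau
        states = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
        dists = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
        A = (dists < eps).astype(int)
        np.fill_diagonal(A, 0)   # no self-loops
        return A

    rng = np.random.default_rng(0)
    x = np.sin(np.linspace(0, 20 * np.pi, 500)) + 0.1 * rng.standard_normal(500)
    A = recurrence_network(x)
    degree = A.sum(axis=0)
    print("mean degree:", degree.mean(), "edge density:", A.mean())
    ```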

  20. Variational Identification of Markovian Transition States

    NASA Astrophysics Data System (ADS)

    Martini, Linda; Kells, Adam; Covino, Roberto; Hummer, Gerhard; Buchete, Nicolae-Viorel; Rosta, Edina

    2017-07-01

    We present a method that enables the identification and analysis of conformational Markovian transition states from atomistic or coarse-grained molecular dynamics (MD) trajectories. Our algorithm is demonstrated using both analytical models and examples from MD simulations of the benchmark system helix-forming peptide Ala5, and of larger, biomedically important systems: the 15-lipoxygenase-2 enzyme (15-LOX-2), the epidermal growth factor receptor (EGFR) protein, and the Mga2 fungal transcription factor. The analysis of 15-LOX-2 uses data generated exclusively from biased umbrella sampling simulations carried out at the hybrid ab initio density functional theory (DFT) quantum mechanics/molecular mechanics (QM/MM) level of theory. In all cases, our method automatically identifies the corresponding transition states and metastable conformations in a variationally optimal way, with the input of a set of relevant coordinates, by accurately reproducing the intrinsic slowest relaxation rate of each system. Our approach offers a general yet easy-to-implement analysis method that provides unique insight into the molecular mechanism and the rare but crucial (i.e., rate-limiting) transition states occurring along conformational transition paths in complex dynamical systems such as molecular trajectories.
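
    The benchmark quantity here, the slowest relaxation rate, is conventionally obtained from the second eigenvalue of a Markov transition matrix estimated from the trajectory. A minimal sketch of that baseline calculation (not the authors' variational algorithm), on a hypothetical two-state trajectory:

    ```python
    import numpy as np

    def slowest_relaxation_time(dtraj, n_states, lag=1):
        """Estimate the slowest relaxation time from a discrete trajectory:
        count lagged transitions, row-normalize, and convert the second
        eigenvalue into an implied timescale t2 = -lag / ln(lambda_2)."""
        C = np.zeros((n_states, n_states))
        for i, j in zip(dtraj[:-lag], dtraj[lag:]):
            C[i, j] += 1.0
        C += C.T                      # crude detailed-balance symmetrization
        T = C / C.sum(axis=1, keepdims=True)
        evals = np.sort(np.linalg.eigvals(T).real)[::-1]
        return -lag / np.log(evals[1])

    # Two metastable states with rare hops -- a hypothetical trajectory.
    rng = np.random.default_rng(1)
    dtraj, state = [], 0
    for _ in range(20000):
        if rng.random() < 0.005:      # rare transition between states
            state = 1 - state
        dtraj.append(state)
    # Exact answer for this chain: -1/ln(1 - 2*0.005) ~ 99.5 steps.
    print("implied timescale:", slowest_relaxation_time(np.array(dtraj), 2))
    ```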

  1. Evaluation of adding item-response theory analysis for evaluation of the European Board of Ophthalmology Diploma examination.

    PubMed

    Mathysen, Danny G P; Aclimandos, Wagih; Roelant, Ella; Wouters, Kristien; Creuzot-Garcher, Catherine; Ringens, Peter J; Hawlina, Marko; Tassignon, Marie-José

    2013-11-01

    To investigate whether the introduction of item-response theory (IRT) analysis, in parallel to the 'traditional' statistical analysis methods available for performance evaluation of multiple T/F items as used in the European Board of Ophthalmology Diploma (EBOD) examination, has proved beneficial, and secondly, to study whether the overall assessment performance of the current written part of EBOD is sufficiently high (KR-20 ≥ 0.90) to be kept as the examination format in future EBOD editions. 'Traditional' analysis methods for individual MCQ item performance comprise P-statistics, Rit-statistics and item discrimination, while overall reliability is evaluated through KR-20 for multiple T/F items. The additional set of statistical analysis methods for the evaluation of EBOD comprises mainly IRT analysis. These analysis techniques are used to monitor whether the introduction of negative marking for incorrect answers (since EBOD 2010) has a positive influence on the statistical performance of EBOD as a whole and of its individual test items in particular. Item-response theory analysis demonstrated that item performance parameters should not be evaluated individually but should be related to one another. Before the introduction of negative marking, the overall EBOD reliability (KR-20) was good, though with room for improvement (EBOD 2008: 0.81; EBOD 2009: 0.78). After the introduction of negative marking, the overall reliability of EBOD improved significantly (EBOD 2010: 0.92; EBOD 2011: 0.91; EBOD 2012: 0.91). Although many statistical performance parameters are available to evaluate individual items, our study demonstrates that overall reliability remains the crucial parameter for comparison across examinations. While individual item performance analysis is worthwhile as a secondary analysis, drawing final conclusions from it is more difficult, since performance parameters need to be related to one another, as shown by IRT analysis. Therefore, IRT analysis has proved beneficial for the statistical analysis of EBOD. The introduction of negative marking has led to a significant increase in reliability (KR-20 > 0.90), indicating that the current examination format can be kept for future EBOD examinations. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
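
    KR-20, the reliability statistic quoted throughout, is straightforward to compute from a persons-by-items matrix of dichotomous scores. A minimal sketch with hypothetical responses (the sample-variance convention ddof=1 is an assumption; implementations differ):

    ```python
    import numpy as np

    def kr20(scores):
        """Kuder-Richardson formula 20 for a persons x items 0/1 matrix."""
        n_items = scores.shape[1]
        p = scores.mean(axis=0)                      # proportion correct per item
        q = 1.0 - p
        total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
        return (n_items / (n_items - 1)) * (1.0 - np.sum(p * q) / total_var)

    # Hypothetical responses: 6 candidates x 5 true/false items.
    scores = np.array([[1, 1, 1, 0, 1],
                       [1, 0, 1, 1, 1],
                       [0, 0, 1, 0, 0],
                       [1, 1, 1, 1, 1],
                       [0, 1, 0, 0, 1],
                       [1, 1, 1, 1, 0]])
    print(f"KR-20 = {kr20(scores):.2f}")
    ```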

  2. Considering methodological options for reviews of theory: illustrated by a review of theories linking income and health

    PubMed Central

    2014-01-01

    Background Review of theory is an area of growing methodological advancement. Theoretical reviews are particularly useful where the literature is complex, multi-discipline, or contested. It has been suggested that adopting methods from systematic reviews may help address these challenges. However, the methodological approaches to reviews of theory, including the degree to which systematic review methods can be incorporated, have received little discussion in the literature. We recently employed systematic review methods in a review of theories about the causal relationship between income and health. Methods This article discusses some of the methodological issues we considered in developing the review and offers lessons learnt from our experiences. It examines the stages of a systematic review in relation to how they could be adapted for a review of theory. The issues arising and the approaches taken in the review of theories in income and health are considered, drawing on the approaches of other reviews of theory. Results Different approaches to searching were required, including electronic and manual searches, and electronic citation tracking to follow the development of theories. Determining inclusion criteria was an iterative process to ensure that inclusion criteria were specific enough to make the review practical and focused, but not so narrow that key literature was excluded. Involving subject specialists was valuable in the literature searches to ensure principal papers were identified and during the inductive approaches used in synthesis of theories to provide detailed understanding of how theories related to one another. Reviews of theory are likely to involve iterations and inductive processes throughout, and some of the concepts and techniques that have been developed for qualitative evidence synthesis can be usefully translated to theoretical reviews of this kind. Conclusions It may be useful at the outset of a review of theory to consider whether the key aim of the review is to scope out theories relating to a particular issue; to conduct in-depth analysis of key theoretical works with the aim of developing new, overarching theories and interpretations; or to combine both these processes in the review. This can help decide the most appropriate methodological approach to take at particular stages of the review. PMID:25312937

  3. CINAHL: an exploratory analysis of the current status of nursing theory construction as reflected by the electronic domain.

    PubMed

    Riddlesperger, K L; Beard, M; Flowers, D L; Hisley, S M; Pfeifer, K A; Stiller, J J

    1996-09-01

    Since the 1980s the electronic domain has become the primary method for academic and professional communication of research and information. Papers relating to theory construction in nursing are a frequently occurring phenomenon within the electronic domain. Theory construction provides the underpinning for the advancement of professional nursing, facilitating the conceptualization of nursing actions leading to theory-based practice and research. The purpose of this study was to address the research question, 'What are the similarities and differences among theory construction papers that are accessible electronically in nursing literature today?' The Cumulative Index to Nursing and Allied Health Literature (CINAHL) was accessed to obtain a listing of papers from which an overall description of the type of theory construction papers being published in the nursing literature today could be determined. A literature search was conducted using the description 'theory construction'. Papers were limited to publication years from 1990 onwards. A total of 125 papers were obtained and read by one of the six authors. Using grounded theory, categories emerged by identification of similarities and differences among the papers. The findings are discussed here along with suggestions for further study. A second purpose of this paper was to present both traditional and non-traditional methods of tapping into the electronic domain when searching for assistance with theory construction.

  4. Absolute mass scale calibration in the inverse problem of the physical theory of fireballs.

    NASA Astrophysics Data System (ADS)

    Kalenichenko, V. V.

    A method of absolute mass scale calibration is suggested for solving the inverse problem of the physical theory of fireballs. The method is based on data on the masses of fallen meteorites whose fireballs have been photographed in flight. The method may be applied to those fireballs whose bodies have not experienced considerable fragmentation during their destruction in the atmosphere and have kept their form well enough. Statistical analysis of the inverse problem solution for a sufficiently representative sample makes it possible to separate a subsample of such fireballs. The data on the Lost City and Innisfree meteorites are used to obtain calibration coefficients.

  5. Plasticity - Theory and finite element applications.

    NASA Technical Reports Server (NTRS)

    Armen, H., Jr.; Levine, H. S.

    1972-01-01

    A unified presentation is given of the development and distinctions associated with various incremental solution procedures used to solve the equations governing the nonlinear behavior of structures, and this is discussed within the framework of the finite-element method. Although the primary emphasis here is on material nonlinearities, consideration is also given to geometric nonlinearities acting separately or in combination with nonlinear material behavior. The methods discussed here are applicable to a broad spectrum of structures, ranging from simple beams to general three-dimensional bodies. The finite-element analysis methods for material nonlinearity are general in the sense that any of the available plasticity theories can be incorporated to treat strain hardening or ideally plastic behavior.
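
    The incremental procedures described are typically organized around a local stress update at each integration point. A minimal one-dimensional elastic-predictor/plastic-corrector sketch with linear isotropic hardening (a textbook scheme under assumed material constants, not the paper's formulation):

    ```python
    def return_map_1d(strain_inc, state, E=200e3, H=10e3, sigma_y=250.0):
        """One elastic-predictor / plastic-corrector step in 1D.
        state = (stress, plastic_strain); linear isotropic hardening H."""
        stress, ep = state
        trial = stress + E * strain_inc                 # elastic predictor
        f = abs(trial) - (sigma_y + H * ep)             # yield function
        if f <= 0.0:
            return trial, ep                            # step stays elastic
        dgamma = f / (E + H)                            # plastic multiplier
        sign = 1.0 if trial > 0 else -1.0
        return trial - E * dgamma * sign, ep + dgamma   # plastic corrector

    # Drive a bar through loading past yield (MPa; strains dimensionless).
    state = (0.0, 0.0)
    for _ in range(10):
        state = return_map_1d(2.0e-4, state)
        print(f"stress = {state[0]:7.1f} MPa, plastic strain = {state[1]:.5f}")
    ```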

  6. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  7. Presenting the Iterative Curriculum Discourse Analysis (ICDA) Approach

    ERIC Educational Resources Information Center

    Iversen, Lars Laird

    2014-01-01

    The article presents a method for analysing recurring curriculum documents using discourse theory inspired by Ernesto Laclau and Chantal Mouffe. The article includes a presentation of the method in seven practical steps, and is illustrated and discussed throughout using the author's recent case study on religion, identity and values in Norwegian…

  8. Class Jumping into Academia: Multiple Identities for Counseling Academics

    ERIC Educational Resources Information Center

    Nelson, Mary Lee; Englar-Carlson, Matt; Tierney, Sandra C.; Hau, Julie M.

    2006-01-01

    Eleven counseling psychology and counselor education academics were interviewed regarding their experiences of progressing from lower-or lower-middle-class backgrounds to college and, further, to academic positions. Grounded theory method was used for data analysis, and consensual qualitative research methods were used for triangulation and data…

  9. A Frame-Reflective Discourse Analysis of Serious Games

    ERIC Educational Resources Information Center

    Mayer, Igor; Warmelink, Harald; Zhou, Qiqi

    2016-01-01

    The authors explore how framing theory and the method of frame-reflective discourse analysis provide foundations for the emerging discipline of serious games (SGs) research. Starting with Wittgenstein's language game and Berger and Luckmann's social constructivist view on science, the authors demonstrate why a definitional or taxonomic approach to…

  10. Techniques for forced response involving discrete nonlinearities. I - Theory. II - Applications

    NASA Astrophysics Data System (ADS)

    Avitabile, Peter; Callahan, John O.

    Several new techniques developed for the forced response analysis of systems containing discrete nonlinear connection elements are presented and compared to the traditional methods. In particular, the techniques examined are the Equivalent Reduced Model Technique (ERMT), Modal Modification Response Technique (MMRT), and Component Element Method (CEM). The general theory of the techniques is presented, and applications are discussed with particular reference to the beam nonlinear system model using ERMT, MMRT, and CEM; frame nonlinear response using the three techniques; and comparison of the results obtained by using the ERMT, MMRT, and CEM models.

  11. Analysis of high aspect ratio jet flap wings of arbitrary geometry.

    NASA Technical Reports Server (NTRS)

    Lissaman, P. B. S.

    1973-01-01

    Paper presents a design technique for rapidly computing lift, induced drag, and spanwise loading of unswept jet flap wings of arbitrary thickness, chord, twist, blowing, and jet angle, including discontinuities. Linear theory is used, extending Spence's method for elliptically loaded jet flap wings. Curves for uniformly blown rectangular wings are presented for direct performance estimation. Arbitrary planforms require a simple computer program. Method of reducing wing to equivalent stretched, twisted, unblown planform for hand calculation is also given. Results correlate with limited existing data, and show lifting line theory is reasonable down to aspect ratios of 5.

  12. Strings from massive higher spins: the asymptotic uniqueness of the Veneziano amplitude

    NASA Astrophysics Data System (ADS)

    Caron-Huot, Simon; Komargodski, Zohar; Sever, Amit; Zhiboedov, Alexander

    2017-10-01

    We consider weakly coupled theories of massive higher-spin particles. This class of models includes, for instance, tree-level String Theory and Large-N Yang-Mills theory. The S-matrix in such theories is a meromorphic function obeying unitarity and crossing symmetry. We discuss the (unphysical) regime s, t ≫ 1, in which we expect the amplitude to be universal and exponentially large. We develop methods to study this regime and show that the amplitude necessarily coincides with the Veneziano amplitude there. In particular, this implies that the leading Regge trajectory, j(t), is asymptotically linear in Yang-Mills theory. Further, our analysis shows that any such theory of higher-spin particles has stringy excitations and infinitely many asymptotically parallel subleading trajectories. More generally, we argue that, under some assumptions, any theory with at least one higher-spin particle must have strings.

  13. Determining the optimal forensic DNA analysis procedure following investigation of sample quality.

    PubMed

    Hedell, Ronny; Hedman, Johannes; Mostad, Petter

    2018-07-01

    Crime scene traces of various types are routinely sent to forensic laboratories for analysis, generally with the aim of addressing questions about the source of the trace. The laboratory may choose to analyse the samples in different ways depending on the type and quality of the sample, the importance of the case, and the cost and performance of the available analysis methods. Theoretically well-founded guidelines for the choice of analysis method are, however, lacking in most situations. In this paper, it is shown how such guidelines can be created using Bayesian decision theory. The theory is applied to forensic DNA analysis, showing how the information from the initial qPCR analysis can be utilized. The alternatives considered are to use a standard short tandem repeat (STR) DNA analysis assay, to use the standard assay together with a complementary assay, or to cancel the analysis following quantification. The decision is based on information about the DNA amount and level of DNA degradation of the forensic sample, as well as case circumstances and the cost of analysis. Semi-continuous electropherogram models are used for simulation of DNA profiles and for computation of likelihood ratios. It is shown how tables and graphs, prepared beforehand, can be used to quickly find the optimal decision in forensic casework.
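
    The decision-theoretic core can be sketched compactly: given a posterior over sample-quality states inferred from the qPCR result, choose the analysis option with the highest expected utility. All states, options, and numbers below are hypothetical placeholders, not the paper's tables:

    ```python
    # Hypothetical posterior over sample-quality states, inferred from the
    # initial qPCR result (DNA amount and degradation) -- illustrative only.
    p_state = {"good": 0.55, "degraded": 0.35, "insufficient": 0.10}

    # Utility of each analysis option in each state: probative value gained
    # minus analysis cost (arbitrary units, illustrative values).
    utility = {
        "standard_STR":        {"good": 9.0, "degraded": 3.0, "insufficient": -2.0},
        "standard_plus_extra": {"good": 8.5, "degraded": 6.0, "insufficient": -4.0},
        "cancel":              {"good": 0.0, "degraded": 0.0, "insufficient": 0.0},
    }

    def best_action(p_state, utility):
        """Choose the action with the highest posterior expected utility."""
        eu = {a: sum(p_state[s] * u[s] for s in p_state)
              for a, u in utility.items()}
        return max(eu, key=eu.get), eu

    action, eu = best_action(p_state, utility)
    print(action, eu)   # tables like this, precomputed, guide casework
    ```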

  14. Longer term consequences of the Short Take-Off and Landing (STOL) aircraft system

    NASA Technical Reports Server (NTRS)

    Laporte, T. R.

    1972-01-01

    An assessment of the STOL aircraft and the various means of employing it are discussed in the light of a research study to evaluate the efficacy of such analyses. It was determined that current approaches to assessment are generally inadequate for investigating the full social consequences of implementing a new technology. It is stated that a meaningful methodology of technology assessment must reflect the mechanisms underlying the relationship of technology to social change. The interrelated methods discussed are: (1) gaming and simulation as heuristic approaches in analysis and inquiry, (2) long-range planning and questions of the future, (3) planning theory as a background for critical analysis of policy planning, and (4) social theory, with particular emphasis on social change and systems theories.

  15. Geometry program for aerodynamic lifting surface theory

    NASA Technical Reports Server (NTRS)

    Medan, R. T.

    1973-01-01

    A computer program that provides the geometry and boundary conditions appropriate for an analysis of a lifting, thin wing with control surfaces in linearized, subsonic, steady flow is presented. The kernel function method lifting surface theory is applied. The data which is generated by the program is stored on disk files or tapes for later use by programs which calculate an influence matrix, plot the wing planform, and evaluate the loads on the wing. In addition to processing data for subsequent use in a lifting surface analysis, the program is useful for computing area and mean geometric chords of the wing and control surfaces.

  16. Rigorous Electromagnetic Analysis of the Focusing Action of Refractive Cylindrical Microlens

    NASA Astrophysics Data System (ADS)

    Liu, Juan; Gu, Ben-Yuan; Dong, Bi-Zhen; Yang, Guo-Zhen

    The focusing action of a refractive cylindrical microlens is investigated based on rigorous electromagnetic theory with the use of the boundary element method. The focusing behaviors of refractive microlenses with continuous and multilevel surface envelopes are characterized in terms of total electric-field patterns, the electric-field intensity distributions on the focal plane, and their diffractive efficiencies at the focal spots. The obtained results are also compared with those obtained by Kirchhoff's scalar diffraction theory. The present numerical and graphical results may provide useful information for the analysis and design of refractive elements in micro-optics.

  17. Gesellschaft fuer angewandte Mathematik und Mechanik, Annual Scientific Meeting, Universitaet Regensburg, Regensburg, West Germany, April 16-19, 1984, Proceedings

    NASA Astrophysics Data System (ADS)

    Problems in applied mathematics and mechanics are addressed in reviews and reports. Areas covered are vibration and stability, elastic and plastic mechanics, fluid mechanics, the numerical treatment of differential equations (general theory and finite-element methods in particular), optimization, decision theory, stochastics, actuarial mathematics, applied analysis and mathematical physics, and numerical analysis. Included are major lectures on separated flows, the transition regime of rarefied-gas dynamics, recent results in nonlinear elasticity, fluid-elastic vibration, the new computer arithmetic, and unsteady wave propagation in layered elastic bodies.

  18. Statistical mechanical theory for steady state systems. II. Reciprocal relations and the second entropy.

    PubMed

    Attard, Phil

    2005-04-15

    The concept of second entropy is introduced for the dynamic transitions between macrostates. It is used to develop a theory for fluctuations in velocity, and is exemplified by deriving Onsager reciprocal relations for Brownian motion. The cases of free, driven, and pinned Brownian particles are treated in turn, and Stokes' law is derived. The second entropy analysis is applied to the general case of thermodynamic fluctuations, and the Onsager reciprocal relations for these are derived using the method. The Green-Kubo formulas for the transport coefficients emerge from the analysis, as do Langevin dynamics.
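
    For reference, the Green-Kubo formulas mentioned express transport coefficients as time integrals of equilibrium correlation functions; for the self-diffusion coefficient the standard form (a textbook expression, not quoted from this paper) is:

    ```latex
    D \;=\; \frac{1}{3} \int_0^{\infty} \langle \mathbf{v}(0) \cdot \mathbf{v}(t) \rangle \,\mathrm{d}t
    ```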

  19. Investigating the application of Rasch theory in measuring change in middle school student performance in physical science

    NASA Astrophysics Data System (ADS)

    Cunningham, Jessica D.

    Newton's Universe (NU), an innovative teacher training program, strives to obtain measures from rural, middle school science teachers and their students to determine the impact of its distance learning course on understanding of temperature. No consensus exists on the most appropriate and useful method of analysis to measure change in psychological constructs over time. Several item response theory (IRT) models have been deemed useful in measuring change, which makes the choice of an IRT model not obvious. The appropriateness and utility of each model, including a comparison to a traditional analysis of variance approach, was investigated using middle school science student performance on an assessment over an instructional period. Predetermined criteria were outlined to guide model selection based on several factors including research questions, data properties, and meaningful interpretations to determine the most appropriate model for this study. All methods employed in this study reiterated one common interpretation of the data -- specifically, that the students of teachers with any NU course experience had significantly greater gains in performance over the instructional period. However, clear distinctions were made between an analysis of variance and the racked and stacked analysis using the Rasch model. Although limited research exists examining the usefulness of the Rasch model in measuring change in understanding over time, this study applied these methods and detailed plausible implications for data-driven decisions based upon results for NU and others. Being mindful of the advantages and usefulness of each method of analysis may help others make informed decisions about choosing an appropriate model to depict changes to evaluate other programs. Results may encourage other researchers to consider the meaningfulness of using IRT for this purpose. Results have implications for data-driven decisions for future professional development courses, in science education and other disciplines. KEYWORDS: Item Response Theory, Rasch Model, Racking and Stacking, Measuring Change in Student Performance, Newton's Universe teacher training
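
    The 'racked and stacked' Rasch analyses refer to two arrangements of the pre/post response matrices: racking appends the second occasion as extra items (so item difficulties may change while persons stay fixed), whereas stacking appends it as extra persons (so person measures may change while items stay fixed). A small sketch of the two layouts with hypothetical responses:

    ```python
    import numpy as np

    # Hypothetical 0/1 responses: 4 students x 3 items, before and after
    # instruction (illustrative data only).
    pre = np.array([[0, 1, 0],
                    [1, 0, 0],
                    [0, 0, 1],
                    [1, 1, 0]])
    post = np.array([[1, 1, 0],
                     [1, 1, 1],
                     [0, 1, 1],
                     [1, 1, 1]])

    # "Racking": one row per student, pre and post items as separate columns.
    racked = np.hstack([pre, post])    # shape (4, 6)

    # "Stacking": one column per item, each occasion a separate set of rows.
    stacked = np.vstack([pre, post])   # shape (8, 3)

    print(racked.shape, stacked.shape)
    ```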

  20. An exploratory analysis of the nature of informal knowledge underlying theories of planned action used for public health oriented knowledge translation.

    PubMed

    Kothari, Anita; Boyko, Jennifer A; Campbell-Davison, Andrea

    2015-09-09

    Informal knowledge is used in public health practice to make sense of research findings. Although knowledge translation theories highlight the importance of informal knowledge, it is not clear to what extent the same literature provides guidance in terms of how to use it in practice. The objective of this study was to address this gap by exploring what planned action theories suggest in terms of using three types of informal knowledge: local, experiential and expert. We carried out an exploratory secondary analysis of the planned action theories that informed the development of a popular knowledge translation theory. Our sample included twenty-nine (n = 29) papers. We extracted information from these papers about sources of and guidance for using informal knowledge, and then carried out a thematic analysis. We found that theories of planned action provide guidance (including sources of, methods for identifying, and suggestions for use) for using local, experiential and expert knowledge. This study builds on previous knowledge translation related work to provide insight into the practical use of informal knowledge. Public health practitioners can refer to the guidance summarized in this paper to inform their decision-making. Further research about how to use informal knowledge in public health practice is needed given the value being accorded to using informal knowledge in public health decision-making processes.

  1. Dynamical Systems in Circuit Designer's Eyes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Odyniec, M.

    Examples of nonlinear circuit design are given. The focus of the design process is on theory and engineering methods (as opposed to numerical analysis), and modeling is related to measurements. It is seen that the phase plane is still very useful with proper models, and that harmonic balance and describing-function analysis offer powerful insight (via the combination of simulation with circuit and ODE theory). Measurement and simulation capabilities have increased, especially for harmonics measurements (since sinusoids are easy to generate).

  2. Nucleon emission via electromagnetic excitation in relativistic nucleus-nucleus collisions: Re-analysis of the Weizsacker-Williams method

    NASA Technical Reports Server (NTRS)

    Norbury, John W.

    1989-01-01

    Previous analyses of the comparison of Weizsacker-Williams (WW) theory to experiment for nucleon emission via electromagnetic (EM) excitations in nucleus-nucleus collisions were not definitive because of different assumptions concerning the value of the minimum impact parameter. This situation is corrected by providing criteria that allow definitive statements to be made concerning agreement or disagreement between WW theory and experiment.

  3. Conductometry of electrolyte solutions

    NASA Astrophysics Data System (ADS)

    Safonova, Lyubov P.; Kolker, Arkadii M.

    1992-09-01

    A review is given of the theories of the electrical conductance of electrolyte solutions of different ionic strengths and concentrations, and of the models of ion association. An analysis is made of the methods for mathematical processing of experimental conductometric data. An account is provided of various theories describing the dependence of the limiting value of the ionic electrical conductance on the properties of the solute and solvent. The bibliography includes 115 references.

  4. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping.

    PubMed

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-03-04

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster-Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster-Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC operation yielded poor results.
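
    The error-propagation stage can be sketched as follows: sample criterion weights around their AHP-derived means, renormalize, and apply weighted linear combination (WLC) to each sample, so the spread of scores expresses weight-induced uncertainty. Layers, weights, and dispersion below are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical standardized criterion layers for 5 map cells x 4 criteria
    # (e.g., slope, lithology, land cover, distance to faults) -- illustrative.
    criteria = rng.random((5, 4))

    # AHP-style mean weights with uncertainty, propagated by Monte Carlo:
    # sample weight vectors around the means and renormalize to sum to 1.
    mean_w = np.array([0.40, 0.25, 0.20, 0.15])
    samples = rng.normal(mean_w, 0.05, size=(1000, 4)).clip(min=1e-6)
    samples /= samples.sum(axis=1, keepdims=True)

    # Weighted linear combination per sample; the spread of the scores
    # expresses the weight-induced uncertainty for each cell.
    scores = samples @ criteria.T                  # shape (1000, 5)
    print("mean susceptibility:", scores.mean(axis=0))
    print("std (uncertainty):  ", scores.std(axis=0))
    ```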

  5. Modeling the Severity of Drinking Consequences in First-Year College Women: An Item Response Theory Analysis of the Rutgers Alcohol Problem Index*

    PubMed Central

    Cohn, Amy M.; Hagman, Brett T.; Graff, Fiona S.; Noel, Nora E.

    2011-01-01

    Objective: The present study examined the latent continuum of alcohol-related negative consequences among first-year college women using methods from item response theory and classical test theory. Method: Participants (N = 315) were college women in their freshman year who reported consuming any alcohol in the past 90 days and who completed assessments of alcohol consumption and alcohol-related negative consequences using the Rutgers Alcohol Problem Index. Results: Item response theory analyses showed poor model fit for five items identified in the Rutgers Alcohol Problem Index. Two-parameter item response theory logistic models were applied to the remaining 18 items to examine estimates of item difficulty (i.e., severity) and discrimination parameters. The item difficulty parameters ranged from 0.591 to 2.031, and the discrimination parameters ranged from 0.321 to 2.371. Classical test theory analyses indicated that the omission of the five misfit items did not significantly alter the psychometric properties of the construct. Conclusions: Findings suggest that those consequences that had greater severity and discrimination parameters may be used as screening items to identify female problem drinkers at risk for an alcohol use disorder. PMID:22051212
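
    In the two-parameter logistic model used here, the probability of endorsing a consequence item rises with severity theta according to discrimination a and difficulty b. A minimal sketch with parameter values inside the reported ranges (the item/parameter pairings are hypothetical):

    ```python
    import numpy as np

    def p_endorse(theta, a, b):
        """Two-parameter logistic IRT model: probability of endorsing an
        item given severity theta, discrimination a, and difficulty b."""
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    # (a, b) values inside the ranges the study reports (0.321-2.371 and
    # 0.591-2.031, respectively); the pairings are hypothetical.
    items = {"mild consequence":   (1.2, 0.6),
             "severe consequence": (2.1, 2.0)}

    for theta in (-1.0, 0.0, 1.0, 2.0):
        probs = {name: round(p_endorse(theta, a, b), 3)
                 for name, (a, b) in items.items()}
        print(theta, probs)
    ```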

  6. Why does MP2 work?

    PubMed

    Fink, Reinhold F

    2016-11-14

    We show analytically and numerically that the performance of second-order Møller-Plesset (MP) perturbation theory (PT), coupled-cluster (CC) theory, and other perturbation theory approaches can be rationalized by analyzing the wavefunctions of these methods. While rather large deviations for the individual contributions of configurations to the electron correlation energy are found for MP wavefunctions, they profit from an advantageous and robust error cancellation: the absolute contribution to the correlation energy is generally underestimated for the critical excitations with small energy denominators and all other doubly excited configurations where the two excited electrons are coupled to a singlet. This is balanced by an overestimation of the contribution of triplet-coupled double excitations to the correlation energy. The even better performance of spin-component-scaled MP2 theory is explained by a similar error-compensation effect. The wavefunction analysis for the lowest singlet states of H2O, CH2, CO, and Cu+ shows the predicted trends for MP methods, rapid but biased convergence of CC theory, as well as the substantial potential of linearized CC, or retaining-the-excitation-degree (RE) PT.
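
    The 'small energy denominators' referred to are those of the standard second-order Møller-Plesset correlation energy (spin-orbital form; a textbook expression, not quoted from the paper):

    ```latex
    E^{(2)} \;=\; -\,\frac{1}{4} \sum_{ij}^{\mathrm{occ}} \sum_{ab}^{\mathrm{virt}}
    \frac{\lvert \langle ij \,\|\, ab \rangle \rvert^{2}}
         {\varepsilon_a + \varepsilon_b - \varepsilon_i - \varepsilon_j}
    ```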

  7. Improving access to high-quality primary care for socioeconomically disadvantaged older people in rural areas: a mixed method study protocol

    PubMed Central

    Ford, John A; Jones, Andrew P; Wong, Geoff; Clark, Allan B; Porter, Tom; Shakespeare, Tom; Swart, Ann Marie; Steel, Nicholas

    2015-01-01

    Introduction The UK has an ageing population, especially in rural areas, where deprivation is high among older people. Previous research has identified this group as at high risk of poor access to healthcare. The aim of this study is to generate a theory of how socioeconomically disadvantaged older people from rural areas access primary care, to develop an intervention based on this theory and test it in a feasibility trial. Methods and analysis On the basis of the MRC Framework for Developing and Evaluating Complex Interventions, three methods will be used to generate the theory. First, a realist review will elucidate the patient pathway based on existing literature. Second, an analysis of the English Longitudinal Study of Ageing will be completed using structural equation modelling. Third, 15 semistructured interviews will be undertaken with patients and four focus groups with health professionals. A triangulation protocol will be used to allow each of these methods to inform and be informed by each other, and to integrate data into one overall realist theory. Based on this theory, an intervention will be developed in discussion with stakeholders to ensure that the intervention is feasible and practical. The intervention will be tested within a feasibility trial, the design of which will depend on the intervention. Lessons from the feasibility trial will be used to refine the intervention and gather the information needed for a definitive trial. Ethics and dissemination Ethics approval from the regional ethics committee has been granted for the focus groups with health professionals and interviews with patients. Ethics approval will be sought for the feasibility trial after the intervention has been designed. Findings will be disseminated to the key stakeholders involved in intervention development, to researchers, clinicians and health planners through peer-reviewed journal articles and conference publications, and locally through a dissemination event. PMID:26384728

  8. Validating experimental and theoretical Langmuir probe analyses

    NASA Astrophysics Data System (ADS)

    Pilling, L. S.; Carnegie, D. A.

    2007-08-01

    Analysis of Langmuir probe characteristics contains a paradox in that it is unknown a priori which theory is applicable before it is applied. Often theories are assumed to be correct when certain criteria are met although they may not validate the approach used. We have analysed the Langmuir probe data from cylindrical double and single probes acquired from a dc discharge plasma over a wide variety of conditions. This discharge contains a dual-temperature distribution and hence fitting a theoretically generated curve is impractical. To determine the densities, an examination of the current theories was necessary. For the conditions where the probe radius is the same order of magnitude as the Debye length, the gradient expected for orbital-motion limited (OML) is approximately the same as the radial-motion gradients. An analysis of the 'gradients' from the radial-motion theory was able to resolve the differences from the OML gradient value of two. The method was also able to determine whether radial or OML theories applied without knowledge of the electron temperature, or separation of the ion and electron contributions. Only the value of the space potential is necessary to determine the applicable theory.

  9. Generalization of the Kohn-Sham system that can represent arbitrary one-electron density matrices

    DOE PAGES

    Hubertus J. J. van Dam

    2016-04-27

    Density functional theory is currently the most widely applied method in electronic structure theory. The Kohn-Sham method, based on a fictitious system of noninteracting particles, is the workhorse of the theory. The particular form of the Kohn-Sham wave function admits only idempotent one-electron density matrices, whereas wave functions of correlated electrons in post-Hartree-Fock methods invariably have fractional occupation numbers. Here we show that by generalizing the orbital concept and introducing a suitable dot product as well as a probability density, a noninteracting system can be chosen that can represent the one-electron density matrix of any system, even one with fractional occupation numbers. This fictitious system ensures that the exact electron density is accessible within density functional theory. It can also serve as the basis for reduced density matrix functional theory. Moreover, to aid the analysis of the results, the orbitals may be assigned energies from a mean-field Hamiltonian. This produces energy levels that are akin to Hartree-Fock orbital energies, such that conventional analyses based on Koopmans' theorem are available. Lastly, this system is convenient in formalisms that depend on creation and annihilation operators, as they are trivially applied to single-determinant wave functions.

  10. From bed to bench: bridging from informatics practice to theory: an exploratory analysis.

    PubMed

    Haux, R; Lehmann, C U

    2014-01-01

    In 2009, Applied Clinical Informatics (ACI)--focused on applications in clinical informatics--was launched as a companion journal to Methods of Information in Medicine (MIM). Both journals are official journals of the International Medical Informatics Association. To explore which congruencies and interdependencies exist in publications from theory to practice and from practice to theory, and to determine existing gaps. Major topics discussed in ACI and MIM were analyzed. We explored whether the intention of publishing companion journals--to provide an information bridge from informatics theory to informatics practice and vice versa--could be supported by this model. In this manuscript we report on congruencies and interdependences from practice to theory and on major topics in MIM. Retrospective, prolective observational study on recent publications of ACI and MIM. All publications of the years 2012 and 2013 were indexed and analyzed. One hundred and ninety-six publications were analyzed (ACI 87, MIM 109). In MIM publications, modelling aspects as well as methodological and evaluation approaches for the analysis of data, information, and knowledge in biomedicine and health care were frequently raised--and often discussed from an interdisciplinary point of view. Important themes were ambient-assisted living, anatomic spatial relations, biomedical informatics as a scientific discipline, boosting, coding, computerized physician order entry, data analysis, grid and cloud computing, health care systems and services, health-enabling technologies, health information search, health information systems, imaging, knowledge-based decision support, patient records, signal analysis, and web science. Congruencies between the journals could be found in themes, but with a different focus on content. Interdependencies from practice to theory found in these publications were limited. Bridging from informatics theory to practice and vice versa remains a major component of successful research and practice as well as a major challenge.

  11. Local control theory using trajectory surface hopping and linear-response time-dependent density functional theory.

    PubMed

    Curchod, Basile F E; Penfold, Thomas J; Rothlisberger, Ursula; Tavernelli, Ivano

    2013-01-01

    The implementation of local control theory using nonadiabatic molecular dynamics within the framework of linear-response time-dependent density functional theory is discussed. The method is applied to study the photoexcitation of lithium fluoride, for which we demonstrate that this approach can efficiently generate, on the fly, a pulse able to control the population transfer between two selected electronic states. Analysis of the computed control pulse yields insights into the photophysics of the process, identifying the relevant frequencies associated with the curvature of the initial and final state potential energy curves and with their energy differences. The limitations inherent in the use of the trajectory surface hopping approach are also discussed.

  12. Comments on the variational modified-hypernetted-chain theory for simple fluids

    NASA Astrophysics Data System (ADS)

    Rosenfeld, Yaakov

    1986-02-01

    The variational modified-hypernetted-chain (VMHNC) theory, based on the approximation of universality of the bridge functions, is reformulated. The new formulation includes recent calculations by Lado and by Lado, Foiles, and Ashcroft as two stages in a systematic approach, which is analyzed. A variational iterative procedure for solving the exact (diagrammatic) equations for the fluid structure, formally identical to the VMHNC, is described, casting the theory of simple classical fluids as a one-iteration theory. An accurate method for calculating the pair structure for a given potential, and for inverting structure-factor data to obtain the potential and the thermodynamic functions, follows from our analysis.

  13. Application to rotary wings of a simplified aerodynamic lifting surface theory for unsteady compressible flow

    NASA Technical Reports Server (NTRS)

    Rao, B. M.; Jones, W. P.

    1974-01-01

    A general method of predicting airloads is applied to helicopter rotor blades on a full three-dimensional basis using the general theory developed for a rotor blade at the psi = pi/2 position where flutter is most likely to occur. Calculations of aerodynamic coefficients for use in flutter analysis are made for forward and hovering flight with low inflow. The results are compared with values given by two-dimensional strip theory for a rigid rotor hinged at its root. The comparisons indicate the inadequacies of strip theory for airload prediction. One important conclusion drawn from this study is that the curved wake has a substantial effect on the chordwise load distribution.

  14. Entropy in sound and vibration: towards a new paradigm

    PubMed Central

    2017-01-01

    This paper describes a discussion of the method and status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of the information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart. PMID:28265190
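
    The Clausius-like exchange law underlying SEA is conventionally written as a coupling power proportional to the difference in modal energies (a standard form, not quoted from this paper), where omega is the band centre frequency, eta_ij the coupling loss factor, and n_i, E_i the modal density and energy of subsystem i:

    ```latex
    P_{ij} \;=\; \omega\, \eta_{ij}\, n_i \left( \frac{E_i}{n_i} - \frac{E_j}{n_j} \right)
    ```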

  15. Considering methodological options for reviews of theory: illustrated by a review of theories linking income and health.

    PubMed

    Campbell, Mhairi; Egan, Matt; Lorenc, Theo; Bond, Lyndal; Popham, Frank; Fenton, Candida; Benzeval, Michaela

    2014-10-13

    Review of theory is an area of growing methodological advancement. Theoretical reviews are particularly useful where the literature is complex, multi-discipline, or contested. It has been suggested that adopting methods from systematic reviews may help address these challenges. However, the methodological approaches to reviews of theory, including the degree to which systematic review methods can be incorporated, have received little discussion in the literature. We recently employed systematic review methods in a review of theories about the causal relationship between income and health. This article discusses some of the methodological issues we considered in developing the review and offers lessons learnt from our experiences. It examines the stages of a systematic review in relation to how they could be adapted for a review of theory. The issues arising and the approaches taken in the review of theories in income and health are considered, drawing on the approaches of other reviews of theory. Different approaches to searching were required, including electronic and manual searches, and electronic citation tracking to follow the development of theories. Determining inclusion criteria was an iterative process to ensure that inclusion criteria were specific enough to make the review practical and focused, but not so narrow that key literature was excluded. Involving subject specialists was valuable in the literature searches to ensure principal papers were identified and during the inductive approaches used in synthesis of theories to provide detailed understanding of how theories related to one another. Reviews of theory are likely to involve iterations and inductive processes throughout, and some of the concepts and techniques that have been developed for qualitative evidence synthesis can be usefully translated to theoretical reviews of this kind. It may be useful at the outset of a review of theory to consider whether the key aim of the review is to scope out theories relating to a particular issue; to conduct in-depth analysis of key theoretical works with the aim of developing new, overarching theories and interpretations; or to combine both these processes in the review. This can help decide the most appropriate methodological approach to take at particular stages of the review.

  16. Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.

    PubMed

    Saccenti, Edoardo; Timmerman, Marieke E

    2017-03-01

    Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
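
    As an aside for readers unfamiliar with the procedure, the following is a minimal sketch of Horn's parallel analysis in its common correlation-matrix form, on simulated data; the covariance-based variant the authors analyze differs only in which matrix's eigenvalues are compared.

    ```python
    import numpy as np

    def parallel_analysis(X, n_sim=200, percentile=95, seed=0):
        """Retain components whose observed eigenvalue exceeds the chosen
        percentile of eigenvalues from uncorrelated reference data."""
        rng = np.random.default_rng(seed)
        n, p = X.shape
        obs = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
        sims = np.empty((n_sim, p))
        for i in range(n_sim):
            Z = rng.standard_normal((n, p))      # uncorrelated reference data
            sims[i] = np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False))[::-1]
        thresh = np.percentile(sims, percentile, axis=0)
        retained = 0
        for o, t in zip(obs, thresh):
            if o <= t:
                break
            retained += 1
        return retained

    # Two correlated blocks of three variables each -> expect 2 components
    rng = np.random.default_rng(1)
    f = rng.standard_normal((500, 2))
    X = np.hstack([f[:, [0]] + 0.3 * rng.standard_normal((500, 3)),
                   f[:, [1]] + 0.3 * rng.standard_normal((500, 3))])
    print(parallel_analysis(X))  # typically prints 2
    ```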

  17. Biomass accessibility analysis using electron tomography

    DOE PAGES

    Hinkle, Jacob D.; Ciesielski, Peter N.; Gruchalla, Kenny; ...

    2015-12-25

    Substrate accessibility to catalysts has been a dominant theme in theories of biomass deconstruction. However, current methods of quantifying accessibility do not elucidate mechanisms for increased accessibility due to changes in microstructure following pretreatment.

  18. Finite element analysis of elasto-plastic soils. Report no. 4: Finite element analysis of elasto-plastic frictional materials for application to lunar earth sciences

    NASA Technical Reports Server (NTRS)

    Marr, W. A., Jr.

    1972-01-01

    The behavior of finite element models employing different constitutive relations to describe the stress-strain behavior of soils is investigated. Three models, which assume small strain theory is applicable, include a nondilatant, a dilatant and a strain hardening constitutive relation. Two models are formulated using large strain theory and include a hyperbolic and a Tresca elastic perfectly plastic constitutive relation. These finite element models are used to analyze retaining walls and footings. Methods of improving the finite element solutions are investigated. For nonlinear problems better solutions can be obtained by using smaller load increment sizes and more iterations per load increment than by increasing the number of elements. Suitable methods of treating tension stresses and stresses which exceed the yield criteria are discussed.

  19. Using Goffman's theories of social interaction to reflect first-time mothers' experiences with the social norms of infant feeding.

    PubMed

    Brouwer, Marissa A; Drummond, Claire; Willis, Eileen

    2012-10-01

    Infant feeding, particularly breastfeeding, is an important public health issue because early feeding methods have been shown to influence health throughout childhood. We investigated how social norms influence first-time mothers' decisions around feeding methods. We conducted two in-depth interviews with 11 first-time mothers, the first 3 weeks after birth and the second 3 months following birth. We analyzed interview data using a third-level, thematic analysis, using Goffman's theories of social interaction to guide our analysis. Our results highlighted several issues surrounding breastfeeding in modern society. We propose that nursing mothers are conscious of adhering to social norms of being a good mother, but must also cope with societal views about presenting normal appearances when they need to feed their babies in public.

  20. Theoretical damping in roll and rolling moment due to differential wing incidence for slender cruciform wings and wing-body combinations

    NASA Technical Reports Server (NTRS)

    Adams, Gaynor J.; Dugan, Duane W.

    1952-01-01

    A method of analysis based on slender-wing theory is developed to investigate the characteristics in roll of slender cruciform wings and wing-body combinations. The method makes use of the conformal mapping processes of classical hydrodynamics, which transform the region outside a circle into the region outside an arbitrary arrangement of line segments intersecting at the origin. The method of analysis may be utilized to solve other slender cruciform wing-body problems involving arbitrarily assigned boundary conditions. (author)

  1. Methods for Neutron Spectrometry

    DOE R&D Accomplishments Database

    Brockhouse, Bertram N.

    1961-01-09

    The appropriate theories and the general philosophy of methods of measurement and treatment of data in neutron spectrometry are discussed. Methods of analysis of results for liquids using the Van Hove formulation, and for crystals using the Born-von Karman theory, are reviewed. The most useful of the available methods of measurement are considered to be the crystal spectrometer methods and the pulsed monoenergetic beam/time-of-flight method. Pulsed-beam spectrometers have the advantage of higher counting rates than crystal spectrometers, especially in view of the fact that simultaneous measurements in several counters at different angles of scattering are possible in pulsed-beam spectrometers. The crystal spectrometer permits several valuable new types of specialized experiments to be performed, especially energy distribution measurements at constant momentum transfer. The Chalk River triple-axis crystal spectrometer is discussed, with reference to its use in making these specialized experiments. The Chalk River rotating-crystal (pulsed-beam) spectrometer is described, and a comparison of this type of instrument with other pulsed-beam spectrometers is made. A partial outline of the theory of operation of rotating-crystal spectrometers is presented. The use of quartz-crystal filters for fast neutron elimination and for order elimination is discussed. (auth)
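
    The pulsed-beam/time-of-flight measurement described above infers neutron energy from flight time over a known path, E = (1/2)m(L/t)^2. A minimal non-relativistic sketch follows; the 5 m path and flight time are illustrative values, not parameters of the Chalk River instruments.

    ```python
    # Non-relativistic time-of-flight energy: E = (1/2) m (L / t)^2.
    NEUTRON_MASS = 1.674927e-27   # kg
    EV = 1.602177e-19             # J per eV

    def tof_energy_meV(path_m, time_s):
        v = path_m / time_s                        # neutron speed, m/s
        return 0.5 * NEUTRON_MASS * v**2 / EV * 1e3

    # A thermal neutron (~2200 m/s) over a 5 m path arrives after ~2.27 ms:
    print(round(tof_energy_meV(5.0, 2.27e-3), 1))  # ~25 meV (thermal)
    ```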

  2. Bending, longitudinal and torsional wave transmission on Euler-Bernoulli and Timoshenko beams with high propagation losses.

    PubMed

    Wang, X; Hopkins, C

    2016-10-01

    Advanced Statistical Energy Analysis (ASEA) is used to predict vibration transmission across coupled beams which support multiple wave types up to high frequencies where Timoshenko theory is valid. Bending-longitudinal and bending-torsional models are considered for an L-junction and rectangular beam frame. Comparisons are made with measurements, Finite Element Methods (FEM) and Statistical Energy Analysis (SEA). When beams support at least two local modes for each wave type in a frequency band and the modal overlap factor is at least 0.1, measurements and FEM have relatively smooth curves. Agreement between measurements, FEM, and ASEA demonstrates that ASEA is able to predict high propagation losses which are not accounted for with SEA. These propagation losses tend to become more important at high frequencies with relatively high internal loss factors and can occur when there is more than one wave type. At such high frequencies, Timoshenko theory, rather than Euler-Bernoulli theory, is often required. Timoshenko theory is incorporated in ASEA and SEA using wave theory transmission coefficients derived assuming Euler-Bernoulli theory, but using Timoshenko group velocity when calculating coupling loss factors. The changeover between theories is appropriate above the frequency where there is a 26% difference between Euler-Bernoulli and Timoshenko group velocities.
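
    The changeover criterion can be made concrete by comparing the two group velocities directly. The sketch below solves the Timoshenko dispersion relation for the propagating bending wavenumber and differentiates it numerically; the beam properties (steel, a 5 cm square section, shear coefficient κ = 5/6) are illustrative assumptions, not the beams used in the paper.

    ```python
    import numpy as np

    # Illustrative beam: steel, 5 cm square cross-section (assumed values)
    E, nu, rho = 2.0e11, 0.3, 7850.0
    G, kappa = E / (2 * (1 + nu)), 5.0 / 6.0
    h = 0.05
    A, I = h * h, h**4 / 12.0

    def k_bending(omega):
        """Propagating bending wavenumber from the Timoshenko dispersion relation,
        EI k^4 - rho*I*(1 + E/(kappa*G))*omega^2 k^2
              + rho^2*I*omega^4/(kappa*G) - rho*A*omega^2 = 0 (quadratic in k^2)."""
        a = E * I
        b = -rho * I * omega**2 * (1 + E / (kappa * G))
        c = rho**2 * I * omega**4 / (kappa * G) - rho * A * omega**2
        k2 = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)  # bending branch (larger k^2)
        return np.sqrt(k2)

    def cg_timoshenko(omega, d=1e-3):
        # group velocity d(omega)/dk by central finite difference
        return 2 * d * omega / (k_bending(omega * (1 + d)) - k_bending(omega * (1 - d)))

    def cg_euler_bernoulli(omega):
        return 2 * np.sqrt(omega) * (E * I / (rho * A)) ** 0.25

    for freq in [1e2, 1e3, 1e4, 5e4]:
        w = 2 * np.pi * freq
        diff = (cg_euler_bernoulli(w) - cg_timoshenko(w)) / cg_euler_bernoulli(w)
        print(f"{freq:8.0f} Hz: relative group-velocity difference = {diff:6.1%}")
    ```

    Scanning for the frequency at which the relative difference first exceeds 26% gives the changeover frequency the abstract refers to, for whatever beam properties are actually in use.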

  3. QUAGOL: a guide for qualitative data analysis.

    PubMed

    Dierckx de Casterlé, Bernadette; Gastmans, Chris; Bryon, Els; Denier, Yvonne

    2012-03-01

    Data analysis is a complex and contested part of the qualitative research process, which has received limited theoretical attention. Researchers are often in need of useful instructions or guidelines on how to analyze the mass of qualitative data, but face a lack of clear guidance for using particular analytic methods. The aim of this paper is to propose and discuss the Qualitative Analysis Guide of Leuven (QUAGOL), a guide developed to truly capture the rich insights of qualitative interview data. The article describes six major problems researchers often struggle with during the process of qualitative data analysis. Consequently, the QUAGOL is proposed as a guide to facilitate the process of analysis. Challenges emerged and lessons learned from our own extensive experiences with qualitative data analysis within the Grounded Theory Approach, as well as from those of other researchers (as described in the literature), were discussed and recommendations were presented. Strengths and pitfalls of the proposed method were discussed in detail. The Qualitative Analysis Guide of Leuven (QUAGOL) offers a comprehensive method to guide the process of qualitative data analysis. The process consists of two parts, each consisting of five stages. The method is systematic but not rigid. It is characterized by iterative processes of digging deeper, constantly moving between the various stages of the process. As such, it aims to stimulate the researcher's intuition and creativity as much as possible. The QUAGOL guide is a theory- and practice-based guide that supports and facilitates the process of analysis of qualitative interview data. Although the method can facilitate the process of analysis, it cannot guarantee automatic quality. The skills of the researcher and the quality of the research team remain the most crucial components of a successful process of analysis. Additionally, the importance of constantly moving between the various stages throughout the research process cannot be overstated. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Effects of measurement errors on psychometric measurements in ergonomics studies: Implications for correlations, ANOVA, linear regression, factor analysis, and linear discriminant analysis.

    PubMed

    Liu, Yan; Salvendy, Gavriel

    2009-05-01

    This paper aims to demonstrate the effects of measurement errors on psychometric measurements in ergonomics studies. A variety of sources can cause random measurement errors in ergonomics studies, and these errors can distort virtually every statistic computed and lead investigators to erroneous conclusions. The effects of measurement errors on the five most widely used statistical analysis tools are discussed and illustrated: correlation; ANOVA; linear regression; factor analysis; linear discriminant analysis. It is shown that measurement errors can greatly attenuate correlations between variables, reduce the statistical power of ANOVA, distort (overestimate, underestimate or even change the sign of) regression coefficients, underrate the explanatory contributions of the most important factors in factor analysis, and depreciate the significance of the discriminant function and the discrimination abilities of individual variables in discriminant analysis. The discussion is restricted to subjective scales and survey methods and their reliability estimates. Other methods applied in ergonomics research, such as physical and electrophysiological measurements and chemical and biomedical analysis methods, also have issues of measurement errors, but they are beyond the scope of this paper. As there has been increasing interest in the development and testing of theories in ergonomics research, it has become very important for ergonomics researchers to understand the effects of measurement errors on their experimental results, which the authors believe is critical to research progress in theory development and cumulative knowledge in the ergonomics field.
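
    The attenuation effect on correlation is easy to reproduce. Under the classical result r_observed ≈ r_true · sqrt(rel_x · rel_y), adding independent error noise to both variables shrinks the observed correlation; the reliabilities and sample size below are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, r_true = 100_000, 0.6
    cov = [[1.0, r_true], [r_true, 1.0]]
    x_true, y_true = rng.multivariate_normal([0, 0], cov, size=n).T

    rel_x, rel_y = 0.7, 0.8   # assumed reliabilities (true variance / total variance)
    x_obs = x_true + rng.standard_normal(n) * np.sqrt(1 / rel_x - 1)
    y_obs = y_true + rng.standard_normal(n) * np.sqrt(1 / rel_y - 1)

    r_obs = np.corrcoef(x_obs, y_obs)[0, 1]
    print(f"observed r = {r_obs:.3f}, "
          f"predicted = {r_true * np.sqrt(rel_x * rel_y):.3f}")  # ~0.449 vs 0.6 true
    ```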

  5. Clinical overview: a framework for analysis.

    PubMed

    Bossen, Claus; Jensen, Lotte G

    2013-01-01

    In this presentation, we investigate concepts and theories for analysing how healthcare professionals achieve overview of patient cases. By 'overview' we mean the situation in which a healthcare professional, with sufficient certainty and in concrete situations, knows how to proceed based on the available information about a patient. Achieving overview is central for the efficient and safe use of healthcare IT systems, and for the realization of the potential improvements of healthcare that are behind investments in such systems. We focus on the theories of decision-making, sensemaking, narratives, ethnomethodology and distributed cognition. Whereas decision-making theory tends to be sequential and normative, we find the concept of 'functional deployment' in sensemaking theory, 'emplotment' in narrative theory, the focus on 'members' methods' in ethnomethodology and the inclusion of 'computational artifacts' in distributed cognition helpful.

  6. Sensitivity analysis of discrete structural systems: A survey

    NASA Technical Reports Server (NTRS)

    Adelman, H. M.; Haftka, R. T.

    1984-01-01

    Methods for calculating sensitivity derivatives for discrete structural systems are surveyed, primarily covering literature published during the past two decades. Methods are described for calculating derivatives of static displacements and stresses, eigenvalues and eigenvectors, transient structural response, and derivatives of optimum structural designs with respect to problem parameters. The survey is focused on publications addressed to structural analysis, but also includes a number of methods developed in nonstructural fields such as electronics, controls, and physical chemistry which are directly applicable to structural problems. Most notable among the nonstructural-based methods are the adjoint variable technique from control theory, and the Green's function and FAST methods from physical chemistry.
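
    For static problems K(p)u = f, the direct-differentiation route follows from differentiating the equilibrium equation: K (du/dp) = -(dK/dp) u. A minimal sketch on a hypothetical two-spring chain, checked against a finite difference:

    ```python
    import numpy as np

    def stiffness(k1, k2):
        # 2-DOF spring chain: ground -k1- node1 -k2- node2
        return np.array([[k1 + k2, -k2],
                         [-k2,      k2]])

    f = np.array([0.0, 1.0])       # unit load at the free end
    k1, k2 = 100.0, 50.0

    u = np.linalg.solve(stiffness(k1, k2), f)

    # Direct differentiation: K du/dk2 = -(dK/dk2) u
    dK_dk2 = np.array([[1.0, -1.0], [-1.0, 1.0]])
    du_exact = np.linalg.solve(stiffness(k1, k2), -dK_dk2 @ u)

    # Finite-difference check
    h = 1e-6
    du_fd = (np.linalg.solve(stiffness(k1, k2 + h), f) - u) / h
    print(du_exact, du_fd)         # both ~ [0, -4e-4]
    ```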

  7. Adding results to a meta-analysis: Theory and example

    NASA Astrophysics Data System (ADS)

    Willson, Victor L.

    Meta-analysis has been used as a research method to describe bodies of research data. It promotes hypothesis formation and the development of science education laws. A function overlooked, however, is the role it plays in updating research. Methods to integrate new research with meta-analysis results need explication. A procedure is presented using Bayesian analysis. Research on the correlation between science attitudes and achievement has been published since a recent meta-analysis of the topic. The results show how new findings complement the previous meta-analysis and extend its conclusions. Additional methodological questions addressed are how studies are to be weighted, which variables are to be examined, and how often meta-analyses are to be updated.
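
    The core of such Bayesian updating is conjugate normal (precision-weighted) pooling of the existing meta-analytic estimate with a new study. A minimal sketch with illustrative numbers; this is the general mechanism, not Willson's exact procedure.

    ```python
    def bayes_update(prior_mean, prior_var, study_mean, study_var):
        """Conjugate normal update: combine estimates by inverse-variance weights."""
        w_prior, w_study = 1 / prior_var, 1 / study_var
        post_var = 1 / (w_prior + w_study)
        post_mean = post_var * (w_prior * prior_mean + w_study * study_mean)
        return post_mean, post_var

    # Prior: meta-analytic mean effect 0.30 (var 0.004); new study: 0.40 (var 0.010)
    print(bayes_update(0.30, 0.004, 0.40, 0.010))  # -> (~0.329, ~0.0029)
    ```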

  8. A Lyapunov and Sacker–Sell spectral stability theory for one-step methods

    DOE PAGES

    Steyer, Andrew J.; Van Vleck, Erik S.

    2018-04-13

    Approximation theory for Lyapunov and Sacker–Sell spectra based upon QR techniques is used to analyze the stability of a one-step method solving a time-dependent (nonautonomous) linear ordinary differential equation (ODE) initial value problem in terms of the local error. Integral separation is used to characterize the conditioning of stability spectra calculations. The stability of the numerical solution by a one-step method of a nonautonomous linear ODE using real-valued, scalar, nonautonomous linear test equations is justified. This analysis is used to approximate exponential growth/decay rates on finite and infinite time intervals and establish global error bounds for one-step methods approximating uniformly exponentially stable trajectories of nonautonomous and nonlinear ODEs. A time-dependent stiffness indicator and a one-step method that switches between explicit and implicit Runge–Kutta methods based upon time-dependent stiffness are developed based upon the theoretical results.

  10. Application Research of Fault Tree Analysis in Grid Communication System Corrective Maintenance

    NASA Astrophysics Data System (ADS)

    Wang, Jian; Yang, Zhenwei; Kang, Mei

    2018-01-01

    This paper applies the fault tree analysis method to corrective maintenance of grid communication systems. Through the establishment of a fault tree model of a typical system, combined with engineering experience, fault tree analysis theory is used to analyze the model, covering structural function, probability importance, and related measures. The results show that fault tree analysis enables fast fault location and effective repair of the system. It also finds that the fault tree analysis method offers guidance for reliability research and upgrading of the system.
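
    The quantitative core of fault tree analysis is propagating basic-event probabilities through AND/OR gates to the top event. A minimal sketch assuming independent basic events; the tree and probabilities are hypothetical.

    ```python
    # Fault tree evaluation: top-event probability from independent basic events.
    def evaluate(node):
        kind = node[0]
        if kind == "basic":
            return node[1]                     # basic-event probability
        probs = [evaluate(child) for child in node[1:]]
        if kind == "and":                      # all inputs must fail
            p = 1.0
            for q in probs:
                p *= q
            return p
        if kind == "or":                       # any single input failing suffices
            p = 1.0
            for q in probs:
                p *= (1.0 - q)
            return 1.0 - p
        raise ValueError(kind)

    # Top = (power supply fails) OR (both redundant links fail)
    tree = ("or",
            ("basic", 0.01),
            ("and", ("basic", 0.05), ("basic", 0.05)))
    print(evaluate(tree))  # 1 - (1-0.01)*(1-0.0025) = 0.012475
    ```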

  11. MAC/GMC 4.0 User's Manual: Example Problem Manual. Volume 3

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2002-01-01

    This document is the third volume in the three volume set of User's Manuals for the Micromechanics Analysis Code with Generalized Method of Cells Version 4.0 (MAC/GMC 4.0). Volume 1 is the Theory Manual, Volume 2 is the Keywords Manual, and this document is the Example Problems Manual. MAC/GMC 4.0 is a composite material and laminate analysis software program developed at the NASA Glenn Research Center. It is based on the generalized method of cells (GMC) micromechanics theory, which provides access to the local stress and strain fields in the composite material. This access grants GMC the ability to accommodate arbitrary local models for inelastic material behavior and various types of damage and failure analysis. MAC/GMC 4.0 has been built around GMC to provide the theory with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material, have been automated in MAC/GMC 4.0. Finally, classical lamination theory has been implemented within MAC/GMC 4.0 wherein GMC is used to model the composite material response of each ply. Consequently, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. This volume provides in-depth descriptions of 43 example problems, which were specially designed to highlight many of the most important capabilities of the code. The actual input files associated with each example problem are distributed with the MAC/GMC 4.0 software; thus providing the user with a convenient starting point for their own specialized problems of interest.

  12. Implementation of newly adopted technology in acute care settings: a qualitative analysis of clinical staff

    PubMed Central

    Langhan, Melissa L.; Riera, Antonio; Kurtz, Jordan C.; Schaeffer, Paula; Asnes, Andrea G.

    2015-01-01

    Objective Technologies are not always successfully implemented in practice. We elicited experiences of acute care providers with the introduction of technology and identified barriers and facilitators in the implementation process. Methods A qualitative study using one-on-one interviews among a purposeful sample of 19 physicians and nurses within ten emergency departments and intensive care units was performed. Grounded theory, iterative data analysis and the constant comparative method were used to inductively generate ideas and build theories. Results Five major categories emerged: decision-making factors, the impact on practice, technology's perceived value, facilitators and barriers to implementation. Barriers included negative experiences, age, infrequent use, and access difficulties. A positive outlook, sufficient training, support staff, and user friendliness were facilitators. Conclusions This study describes strategies implicated in the successful implementation of newly adopted technology in acute care settings. Improved implementation methods and evaluation of implementation processes are necessary for successful adoption of new technology. PMID:25367721

  13. Photoacoustic signal and noise analysis for Si thin plate: signal correction in frequency domain.

    PubMed

    Markushev, D D; Rabasović, M D; Todorović, D M; Galović, S; Bialkowski, S E

    2015-03-01

    Methods for photoacoustic signal measurement, rectification, and analysis for 85 μm thin Si samples in the 20-20 000 Hz modulation frequency range are presented. Methods for frequency-dependent amplitude and phase signal rectification in the presence of coherent and incoherent noise, as well as distortion due to microphone characteristics, are presented. Signal correction is accomplished using inverse system response functions deduced by comparing real to ideal signals for a sample with well-known bulk parameters and dimensions. The system response is a piece-wise construction, each component being due to a particular effect of the measurement system. Heat transfer and elastic effects are modeled using standard Rosencwaig-Gersho and elastic-bending theories. Thermal diffusion, thermoelastic, and plasmaelastic signal components are calculated and compared to measurements. The differences between theory and experiment are used to detect and correct signal distortion and to determine detector and sound-card characteristics. Corrected signal analysis is found to faithfully reflect known sample parameters.
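
    The rectification step described above amounts to estimating the measurement chain's frequency response from a reference sample with known behavior and dividing it out of later measurements. A minimal sketch on synthetic spectra; the response model and signal shapes are illustrative, not the authors' measured system functions.

    ```python
    import numpy as np

    f = np.logspace(np.log10(20), np.log10(20_000), 200)   # modulation frequency, Hz

    ideal_ref = 1.0 / np.sqrt(f)                # assumed known reference behavior
    system = 1.0 / (1 + 1j * f / 5_000)         # unknown instrument response (synthetic)
    measured_ref = ideal_ref * system           # what the instrument actually records

    H = measured_ref / ideal_ref                # estimated system response function

    measured_sample = (2.0 / f) * system        # a new, distorted measurement
    corrected = measured_sample / H             # rectified amplitude/phase spectrum
    print(np.allclose(corrected, 2.0 / f))      # True: distortion removed
    ```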

  14. Examining the Potential of Combining the Methods of Grounded Theory and Narrative Inquiry: A Comparative Analysis

    ERIC Educational Resources Information Center

    Lal, Shalini; Suto, Melinda; Ungar, Michael

    2012-01-01

    Increasingly, qualitative researchers are combining methods, processes, and principles from two or more methodologies over the course of a research study. Critics charge that researchers adopting combined approaches place too little attention on the historical, epistemological, and theoretical aspects of the research design. Rather than…

  15. Development of a Computer-Based Visualised Quantitative Learning System for Playing Violin Vibrato

    ERIC Educational Resources Information Center

    Ho, Tracy Kwei-Liang; Lin, Huann-shyang; Chen, Ching-Kong; Tsai, Jih-Long

    2015-01-01

    Traditional methods of teaching music are largely subjective, with the lack of objectivity being particularly challenging for violin students learning vibrato because of the existence of conflicting theories. By using a computer-based analysis method, this study found that maintaining temporal coincidence between the intensity peak and the target…

  16. Risky Business: An Ecological Analysis of Intimate Partner Violence Disclosure

    ERIC Educational Resources Information Center

    Alaggia, Ramona; Regehr, Cheryl; Jenney, Angelique

    2012-01-01

    Objective: A multistage, mixed-methods study using grounded theory with descriptive data was conducted to examine factors in disclosure of intimate partner violence (IPV). Method: In-depth interviews with individuals and focus groups were undertaken to collect data from 98 IPV survivors and service providers to identify influential factors.…

  17. Is Aggression the Same for Boys and Girls? Assessing Measurement Invariance with Confirmatory Factor Analysis and Item Response Theory

    ERIC Educational Resources Information Center

    Kim, Sangwon; Kim, Seock-Ho; Kamphaus, Randy W.

    2010-01-01

    Gender differences in aggression have typically been based on studies utilizing a mean difference method. From a measurement perspective, this method is inherently problematic unless an aggression measure possesses comparable validity across gender. Stated differently, establishing measurement invariance on the measure of aggression is…

  18. The Shock and Vibration Digest. Volume 14, Number 12

    DTIC Science & Technology

    1982-12-01

    Only fragments of this digest issue survive in the record: the use of statistical energy analysis to evaluate sound transmission performance, with coupling loss factors measured and compared; measurements of artificial cracks in mild-steel test pieces; and an improvement of the method of statistical energy analysis using a large number of free-response time histories simultaneously in one analysis.

  19. Self-consistent projection operator theory in nonlinear quantum optical systems: A case study on degenerate optical parametric oscillators

    NASA Astrophysics Data System (ADS)

    Degenfeld-Schonburg, Peter; Navarrete-Benlloch, Carlos; Hartmann, Michael J.

    2015-05-01

    Nonlinear quantum optical systems are of paramount relevance for modern quantum technologies, as well as for the study of dissipative phase transitions. Their nonlinear nature makes their theoretical study very challenging, and hence they have always served as great motivation to develop new techniques for the analysis of open quantum systems. We apply the recently developed self-consistent projection operator theory to the degenerate optical parametric oscillator to exemplify its general applicability to quantum optical systems. We show that this theory provides an efficient method to calculate the full quantum state of each mode with a high degree of accuracy, even at the critical point. It is equally successful in describing both the stationary limit and the dynamics, including regions of the parameter space where the numerical integration of the full problem is significantly less efficient. We further develop a Gaussian approach consistent with our theory, which yields appreciably better results than the previous Gaussian methods developed for this system, most notably standard linearization techniques.

  20. Compatibility Conditions of Structural Mechanics

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.

    1999-01-01

    The theory of elasticity has camouflaged a deficiency in the compatibility formulation since 1860. In structures, the ad hoc compatibility conditions through virtual "cuts" and closing "gaps" are not parallel to the strain formulation in elasticity. This deficiency in the compatibility conditions has prevented the development of a direct stress determination method in structures and in elasticity. We have addressed this deficiency and attempted to unify the theory of compatibility. This work has led to the development of the integrated force method for structures and the completed Beltrami-Michell formulation for elasticity. The improved accuracy observed in the solution of numerical examples by the integrated force method can be attributed to compliance with the compatibility conditions. Using the compatibility conditions allows mapping of variables and facile movement among different structural analysis formulations. This paper reviews and illustrates the requirement of compatibility in structures and in elasticity. It also describes the generation of the conditions and quantifies the benefits of their use. The traditional analysis methods and available solutions (which were obtained by bypassing the missing conditions) should be verified for compliance with the compatibility conditions.

  1. Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2009-04-01

    The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
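
    Of the sampling-based approaches, Monte Carlo simulation is the most common: sample each uncertain input from its assumed distribution and propagate it through the risk model. A minimal sketch for a generic chronic-exposure dose estimate; the model form and all distributions are hypothetical, not taken from the DBP literature.

    ```python
    import numpy as np

    # Generic chronic-exposure form: dose = C * IR * EF / BW.
    # All parameter distributions and values below are hypothetical.
    rng = np.random.default_rng(0)
    n = 100_000

    C  = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n)  # concentration, mg/L
    IR = rng.normal(2.0, 0.4, size=n).clip(min=0.5)           # intake rate, L/day
    EF = rng.uniform(0.8, 1.0, size=n)                        # exposure fraction
    BW = rng.normal(70, 10, size=n).clip(min=40)              # body weight, kg

    dose = C * IR * EF / BW                                   # mg/(kg*day)
    print("median:", np.median(dose))
    print("95th percentile:", np.percentile(dose, 95))
    ```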

  2. Integrated analysis and design of thick composite structures for optimal passive damping characteristics

    NASA Technical Reports Server (NTRS)

    Saravanos, D. A.

    1993-01-01

    The development of novel composite mechanics for the analysis of damping in composite laminates and structures, and the more significant results of this effort, are summarized. Laminate mechanics based on piecewise continuous in-plane displacement fields are described that can represent both intralaminar stresses and interlaminar shear stresses and the associated effects on the stiffness and damping characteristics of a composite laminate. Among other features, the mechanics can accurately model the static and damped dynamic response of either thin or thick composite laminates, as well as specialty laminates with embedded compliant damping layers. The discrete laminate damping theory is further incorporated into structural analysis methods. In this context, an exact semi-analytical method for the simulation of the damped dynamic response of composite plates was developed. A finite element based method and a specialty four-node plate element were also developed for the analysis of composite structures of variable shape and boundary conditions. Numerous evaluations and applications demonstrate the quality and superiority of the mechanics in predicting the damped dynamic characteristics of composite structures. Finally, additional effort focused on the development of optimal tailoring methods for the design of thick composite structures based on the developed analytical capability. Applications on composite plates illustrated the influence of composite mechanics in the optimal design of composites and the potential for significant deviations in the resultant designs when more simplified (classical) laminate theories are used.

  3. The analysis of harmonic generation coefficients in the ablative Rayleigh-Taylor instability

    NASA Astrophysics Data System (ADS)

    Lu, Yan; Fan, Zhengfeng; Lu, Xinpei; Ye, Wenhua; Zou, Changlin; Zhang, Ziyun; Zhang, Wen

    2017-10-01

    In this research, we use the numerical simulation method to investigate the generation coefficients of the first three harmonics and the zeroth harmonic in the ablative Rayleigh-Taylor instability. It is shown that the interface shifts to the low-temperature side during the ablation process. Within third-order perturbation theory, the first three harmonic amplitudes of the weakly nonlinear regime are calculated, and the harmonic generation coefficients are then obtained by curve fitting. The simulation results show that the harmonic generation coefficients change with time and wavelength. Using higher-order perturbation theory, we find that more and more harmonics are generated in the later weakly nonlinear stage, which is caused by the negative feedback of the later higher harmonics. Furthermore, extending the third-order theory to fifth order, we find that the second and third harmonic coefficients depend linearly on the wavelength, while the feedback coefficients are almost constant. Further analysis also shows that when the fifth-order theory is considered, the normalized effective amplitudes of the second and third harmonics can reach about 25%-40%, compared with only 15%-25% in the previous third-order theory. Therefore, third-order perturbation theory needs to be corrected by the higher-order theory when ηL reaches about 20% of the perturbation wavelength.

  4. Decisions and Reasons: Examining Preservice Teacher Decision-Making through Video Self-Analysis

    ERIC Educational Resources Information Center

    Rich, Peter J.; Hannafin, Michael J.

    2008-01-01

    Methods used to study teacher thinking have both provided insight into the cognitive aspects of teaching and resulted in new, as yet unresolved, relationships between practice and theory. Recent developments in video-analysis tools have allowed preservice teachers to analyze both their practices and thinking, providing important feedback for…

  5. Rasch Analysis of the Geriatric Depression Scale--Short Form

    ERIC Educational Resources Information Center

    Chiang, Karl S.; Green, Kathy E.; Cox, Enid O.

    2009-01-01

    Purpose: The purpose of this study was to examine scale dimensionality, reliability, invariance, targeting, continuity, cutoff scores, and diagnostic use of the Geriatric Depression Scale-Short Form (GDS-SF) over time with a sample of 177 English-speaking U.S. elders. Design and Methods: An item response theory, Rasch analysis, was conducted with…

  6. Small-Group Instruction: Theory and Practice.

    ERIC Educational Resources Information Center

    Olmstead, Joseph A.

    The volume is an analysis of the state of the art of small-group methods of instruction. It describes some of the more commonly used small-group techniques and the rationale behind them, and provides an analysis of their potential use for various types and conditions of instructional environments. Explicit guidelines are provided to assist…

  7. Demonstrating the Financial Benefit of Human Resource Development: Status and Update on the Theory and Practice.

    ERIC Educational Resources Information Center

    Swanson, Richard A.

    1998-01-01

    A research review identified findings about the financial analysis method, forecasting of the financial benefits of human resource development (HRD), and recent financial analysis research: (1) HRD embedded in a performance improvement framework yielded high return on investment; and (2) HRD interventions focused on performance variables forecast…

  8. Analysis test of understanding of vectors with the three-parameter logistic model of item response theory and item response curves technique

    NASA Astrophysics Data System (ADS)

    Rakkapao, Suttida; Prasitpong, Singha; Arayathanitkul, Kwan

    2016-12-01

    This study investigated the multiple-choice test of understanding of vectors (TUV) by applying item response theory (IRT). The difficulty, discrimination, and guessing parameters of the TUV items were fit with the three-parameter logistic model of IRT, using the PARSCALE program. The TUV ability is an ability parameter, here estimated assuming unidimensionality and local independence. Moreover, all distractors of the TUV were analyzed from item response curves (IRC), which represent a simplified form of IRT. Data were gathered on 2392 science and engineering freshmen from three universities in Thailand. The results revealed IRT analysis to be useful in assessing the test, since its item parameters are independent of the ability parameters. The IRT framework reveals item-level information and indicates appropriate ability ranges for the test. Moreover, the IRC analysis can be used to assess the effectiveness of the test's distractors. Both IRT and IRC approaches reveal test characteristics beyond those revealed by classical test analysis methods. Test developers can apply these methods to diagnose and evaluate the features of items at various ability levels of test takers.
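
    For reference, the three-parameter logistic model gives the probability of a correct response as P(θ) = c + (1 - c)/(1 + e^{-a(θ - b)}). A minimal sketch with a hypothetical item:

    ```python
    import numpy as np

    def p_3pl(theta, a, b, c):
        """Three-parameter logistic item response function:
        P(correct | theta) = c + (1 - c) / (1 + exp(-a * (theta - b)))."""
        return c + (1 - c) / (1 + np.exp(-a * (theta - b)))

    # Hypothetical item: discrimination a=1.2, difficulty b=0.5, guessing c=0.2
    theta = np.linspace(-3, 3, 7)
    print(np.round(p_3pl(theta, a=1.2, b=0.5, c=0.2), 3))
    # Low-ability examinees approach the guessing floor c; high-ability approach 1.
    ```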

  9. Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.

    PubMed

    Xin, Cao; Chongshi, Gu

    2016-01-01

    Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. Stability failure risk described jointly by probability and possibility is deficient in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. Stability of a gravity dam is viewed as a hybrid event, considering both the fuzziness and the randomness of the failure criterion, design parameters and measured data. A credibility distribution function is constructed as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combining this with Monte Carlo simulation, the corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to risk calculation for both the dam foundation and double sliding surfaces is provided. The results show that the present method is feasible for analyzing the stability failure risk of gravity dams. The risk assessment obtained reflects the influence of both sorts of uncertainty and is suitable as an index value.
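
    The credibility measure at the heart of this approach averages possibility and necessity, Cr{ξ ≤ x} = (Pos{ξ ≤ x} + Nec{ξ ≤ x})/2. A minimal sketch for a triangular fuzzy variable; the fuzzy safety factor below is illustrative, not taken from the dam study.

    ```python
    import numpy as np

    def tri_membership(t, a, m, b):
        """Triangular fuzzy number (a, m, b): membership rises a->m, falls m->b."""
        return np.clip(np.minimum((t - a) / (m - a), (b - t) / (b - m)), 0, 1)

    def credibility_leq(x, a, m, b, grid=10_001):
        t = np.linspace(a - 1, b + 1, grid)
        mu = tri_membership(t, a, m, b)
        pos = mu[t <= x].max(initial=0.0)        # Pos{xi <= x}
        nec = 1.0 - mu[t > x].max(initial=0.0)   # Nec{xi <= x}
        return 0.5 * (pos + nec)

    # Fuzzy safety factor (1.0, 1.3, 1.6); credibility that it falls below 1.15:
    print(credibility_leq(1.15, 1.0, 1.3, 1.6))  # = 0.25
    ```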

  10. Seismic wavefield propagation in 2D anisotropic media: Ray theory versus wave-equation simulation

    NASA Astrophysics Data System (ADS)

    Bai, Chao-ying; Hu, Guang-yi; Zhang, Yan-teng; Li, Zhong-sheng

    2014-05-01

    Although ray theory is based on the high-frequency approximation to the elastic wave equation, ray theory and wave-equation simulation methods should corroborate each other and hence be developed jointly; in practice, however, they have progressed independently in parallel. For this reason, in this paper we try an alternative way to mutually verify and test the computational accuracy and solution correctness of both the ray theory (the multistage irregular shortest-path method) and the wave-equation simulation methods (both the staggered finite difference method and the pseudo-spectral method) in anisotropic VTI and TTI media. Through the analysis and comparison of wavefield snapshots, common source gather profiles and synthetic seismograms, we are able not only to verify the accuracy and correctness of each of the methods, at least for kinematic features, but also to thoroughly understand the kinematic and dynamic features of wave propagation in anisotropic media. The results show that both the staggered finite difference method and the pseudo-spectral method are able to yield the same results even for complex anisotropic media (such as a fault model); the multistage irregular shortest-path method is capable of predicting similar kinematic features as the wave-equation simulation methods, so the two can be used to test each other for methodological accuracy and solution correctness. In addition, with the aid of the ray tracing results, it is easy to identify the multi-phases (or multiples) in the wavefield snapshot, common source point gather seismic section and synthetic seismogram predicted by the wave-equation simulation method, which is a key issue for later seismic applications.
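
    The shortest-path family of ray tracers works by running a shortest-path search on a grid graph whose edge weights are inter-node travel times, so first arrivals come out as graph distances. A minimal Dijkstra-based sketch on a hypothetical two-layer velocity model; this illustrates the principle, not the multistage irregular algorithm itself.

    ```python
    import heapq

    def first_arrivals(vel, src, h=1.0):
        """First-arrival traveltimes on a grid by Dijkstra's algorithm.
        Edge weight = distance * average slowness of the two nodes."""
        ny, nx = len(vel), len(vel[0])
        INF = float("inf")
        t = [[INF] * nx for _ in range(ny)]
        t[src[0]][src[1]] = 0.0
        pq = [(0.0, src)]
        nbrs = [(-1, 0), (1, 0), (0, -1), (0, 1),
                (-1, -1), (-1, 1), (1, -1), (1, 1)]   # include diagonals
        while pq:
            d, (i, j) = heapq.heappop(pq)
            if d > t[i][j]:
                continue
            for di, dj in nbrs:
                ni, nj = i + di, j + dj
                if 0 <= ni < ny and 0 <= nj < nx:
                    dist = h * (2 ** 0.5 if di and dj else 1.0)
                    slow = 0.5 * (1 / vel[i][j] + 1 / vel[ni][nj])
                    nd = d + dist * slow
                    if nd < t[ni][nj]:
                        t[ni][nj] = nd
                        heapq.heappush(pq, (nd, (ni, nj)))
        return t

    # Two-layer model: faster lower half; source at the top-left corner
    vel = [[2.0] * 50] * 25 + [[4.0] * 50] * 25
    times = first_arrivals(vel, (0, 0))
    print(round(times[49][49], 2))
    ```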

  11. Storage Capacity of the Linear Associator: Beginnings of a Theory of Computational Memory

    DTIC Science & Technology

    1988-04-27

    Only fragments of this report survive in the record: it addresses issues valuable to future efforts and provides methods for the analysis of perceptual/cognitive systems, and it argues that the identification of symbols not only enables a system to vastly simplify its representation of the environment, but could subsequently provide a parsimonious theory of cognition.

  12. Improving Symptom Control, QOL, and Quality of Care for Women with Breast Cancer: Developing a Research Program on Neurological Effects via Doctoral Education

    DTIC Science & Technology

    2006-06-01

    Fragments of keyword-matched references from this report: phenomenological study. Nursing Research, 41, 166-170. Beck, C. (1993). Teetering on the edge: A substantive theory... grounded theory: Strategies for qualitative research. Chicago: Aldine. Goldstein, D., Lu, Y., Detke, M., Lee, T., & Iyengar, S. (2005). Duloxetine vs... Sandelowski, M. (2000a). Combining qualitative and quantitative sampling, data collection, and analysis techniques in mixed-method studies. Research...

  13. A Review of Trend of Nursing Theories related Caregivers in Korea

    PubMed Central

    Hae Kim, Sung; Choi, Yoona; Lee, Ji-Hye; Jang, Da-El; Kim, Sanghee

    2018-01-01

    Background: The prevalence of chronic diseases has rapidly increased due to population aging. As the duration of care needs increases, caregivers' socioeconomic burdens have also increased. Objective: This review examines the attributes of the caregiving experience and the quality of life of caregivers in Korea, with a focus on the application of nursing theory. Method: We reviewed studies on caregivers caring for adult patients published up to 2016 in four biomedical research portals or databases. A total of 1,939 studies were identified through the keyword search. One hundred forty-five studies were selected through a screening process, of which 17 applied a theory. Selected studies were analyzed in accordance with a structured analysis format. Results: Quantitative studies accounted for 76.6%, while 22.1% were qualitative studies and 1.3% were triangulation studies. Caregiver-related studies increased after 2000. Most frequently, the caregivers were spouses (28.4%), and most frequently, care was provided to a recipient affected by stroke (22.5%). The 17 theory-based studies described 20 theories (70% psychology theories, 30% nursing theories). The most frequent nursing theory was the theory of stress, appraisal and coping. Conclusion: This study sought to better understand caregiving through the analysis of Korean studies on the caregiving experience and caregivers' QOL; identifying the nursing theories applied to the caregiving experience and caregivers' QOL provides empirical grounding for nursing. The results suggest the need for further expansion of nursing theories and their greater utilization in studies of caregiving. PMID:29515682

  14. Inverse Scattering and Local Observable Algebras in Integrable Quantum Field Theories

    NASA Astrophysics Data System (ADS)

    Alazzawi, Sabina; Lechner, Gandalf

    2017-09-01

    We present a solution method for the inverse scattering problem for integrable two-dimensional relativistic quantum field theories, specified in terms of a given massive single particle spectrum and a factorizing S-matrix. An arbitrary number of massive particles transforming under an arbitrary compact global gauge group is allowed, thereby generalizing previous constructions of scalar theories. The two-particle S-matrix S is assumed to be an analytic solution of the Yang-Baxter equation with standard properties, including unitarity, TCP invariance, and crossing symmetry. Using methods from operator algebras and complex analysis, we identify sufficient criteria on S that imply the solution of the inverse scattering problem. These conditions are shown to be satisfied in particular by so-called diagonal S-matrices, but presumably also in other cases such as the O(N)-invariant nonlinear σ-models.

  15. Toward a methodology for moral decision making in medicine.

    PubMed

    Kushner, T; Belliotti, R A; Buckner, D

    1991-12-01

    The failure of medical codes to provide adequate guidance for physicians' moral dilemmas points to the fact that some rules of analysis, informed by moral theory, are needed to assist in resolving perplexing ethical problems occurring with increasing frequency as medical technology advances. Initially, deontological and teleological theories appear more helpful, but criticisms can be lodged against both, and neither proves to be sufficient in itself. This paper suggests that to elude the limitations of previous approaches, a method of moral decision making must be developed incorporating both coherence methodology and some independently supported theoretical foundations. Wide Reflective Equilibrium is offered, and its process described along with a theory of the person which is used to animate the process. Steps are outlined to be used in the process, leading to the application of the method to an actual case.

  16. Comprehensive risk assessment method of catastrophic accident based on complex network properties

    NASA Astrophysics Data System (ADS)

    Cui, Zhen; Pang, Jun; Shen, Xiaohong

    2017-09-01

    On the macro level, the structural properties of the network and, on the micro level, the electrical characteristics of components determine the risk of cascading failures. Because cascading failure is a dynamically developing process, not only the direct risk but also the potential risk should be considered. In this paper, we comprehensively consider the direct and potential risks of failures based on uncertain risk analysis theory and connection number theory, quantify uncertain correlation by node degree and node clustering coefficient, and then establish a comprehensive risk indicator of failure. The proposed method is validated by simulation on an actual power grid: a network is modeled according to the actual grid, verifying the rationality of the proposed method.
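
    The two structural quantities used to weight the uncertainty, node degree and node clustering coefficient, are standard graph metrics. A minimal sketch on a small hypothetical graph using networkx:

    ```python
    import networkx as nx

    # Hypothetical 5-node network; edges stand in for communication links.
    G = nx.Graph([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")])

    degree = dict(G.degree())        # number of incident links per node
    clustering = nx.clustering(G)    # fraction of a node's neighbors that interconnect

    for n in G.nodes:
        print(n, degree[n], round(clustering[n], 2))
    # Node C has the highest degree but only partly interconnected neighbors,
    # so failures there can propagate along poorly clustered paths.
    ```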

  17. An Automated Method to Identify Mesoscale Convective Complexes (MCCs) Implementing Graph Theory

    NASA Astrophysics Data System (ADS)

    Whitehall, K. D.; Mattmann, C. A.; Jenkins, G. S.; Waliser, D. E.; Rwebangira, R.; Demoz, B.; Kim, J.; Goodale, C. E.; Hart, A. F.; Ramirez, P.; Joyce, M. J.; Loikith, P.; Lee, H.; Khudikyan, S.; Boustani, M.; Goodman, A.; Zimdars, P. A.; Whittell, J.

    2013-12-01

    Mesoscale convective complexes (MCCs) are convectively-driven weather systems with a duration of ~10-12 hours that contribute large amounts to daily and monthly rainfall totals. More than 400 MCCs occur annually over various locations on the globe. In West Africa, ~170 MCCs occur annually during the 180 days representing the summer months (June - November), and contribute ~75% of the annual wet season rainfall. The main objective of this study is to improve automatic identification of MCCs over West Africa. The spatial expanse of MCCs and the spatio-temporal variability in their convective characteristics make them difficult to characterize even in dense networks of radars and/or surface gauges. As such, there exist criteria for identifying MCCs with satellite images - mostly using infrared (IR) data. Automated MCC identification methods are based on forward and/or backward-in-time spatial-temporal analysis of the IR satellite data and characteristically incorporate a manual component, as these algorithms routinely falter with merging and splitting cloud systems between satellite images. However, these algorithms are not readily transferable to voluminous data or other satellite-derived datasets (e.g. TRMM), thus hindering comprehensive studies of these features at both weather and climate timescales. Recognizing the existing limitations of automated methods, this study explores the applicability of graph theory to creating a fully automated method for deriving a West African MCC dataset from hourly infrared satellite images between 2001-2012. Graph theory, though not heavily implemented in the atmospheric sciences, has been used for predicting (nowcasting) thunderstorms from radar and satellite data by considering the relationship between atmospheric variables at a given time, or for the spatial-temporal analysis of cloud volumes. From these few studies, graph theory appears to be innately applicable to the complexity, non-linearity and inherent chaos of the atmospheric system. Our preliminary results show that the use of graph theory improves data management, thus allowing longer periods to be studied, and creates a transferable method that allows other data to be utilized.
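
    The graph formulation can be sketched simply: cloud elements detected in successive satellite frames become nodes, and edges link elements whose pixel footprints overlap, so tracks, merges and splits emerge as graph structure. The masks and frames below are tiny hypothetical examples, not the algorithm of the study itself.

    ```python
    import networkx as nx

    # Hypothetical cloud-element pixel masks for three hourly frames.
    frames = [
        {"e1": {(0, 0), (0, 1)}, "e2": {(5, 5)}},            # hour 0
        {"e3": {(0, 1), (1, 1)}, "e4": {(5, 5), (5, 6)}},    # hour 1
        {"e5": {(1, 1), (1, 2), (5, 6)}},                    # hour 2 (a merge)
    ]

    G = nx.DiGraph()
    for t in range(len(frames) - 1):
        for a, mask_a in frames[t].items():
            for b, mask_b in frames[t + 1].items():
                if mask_a & mask_b:              # nonzero spatial overlap
                    G.add_edge(a, b)

    # Each weakly connected component is one candidate system through time;
    # duration criteria (e.g., >= 10 h for an MCC) apply along its paths.
    for comp in nx.weakly_connected_components(G):
        print(sorted(comp))
    ```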

  18. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis

    PubMed Central

    Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Background Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. Objectives This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. Methods We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. Results There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multivariate analysis of variance (MANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System) software are only 55% and 15%, respectively. Conclusion The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent. PMID:26053876

  19. Multilayer theory for delamination analysis of a composite curved bar subjected to end forces and end moments

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Jackson, Raymond H.

    1989-01-01

    A composite test specimen in the shape of a semicircular curved bar subjected to bending offers an excellent stress field for studying the open-mode delamination behavior of laminated composite materials. This is because the open-mode delamination nucleates at the midspan of the curved bar. The classical anisotropic elasticity theory was used to construct a 'multilayer' theory for the calculations of the stress and deformation fields induced in the multilayered composite semicircular curved bar subjected to end forces and end moments. The radial location and intensity of the open-mode delamination stress were calculated and were compared with the results obtained from the anisotropic continuum theory and from the finite element method. The multilayer theory gave more accurate predictions of the location and the intensity of the open-mode delamination stress than those calculated from the anisotropic continuum theory.

  1. Analysis of senior high school student understanding on gas kinetic theory material

    NASA Astrophysics Data System (ADS)

    Anri, Y.; Maknun, J.; Chandra, D. T.

    2018-05-01

    The purpose of this research was to profile student understanding of the kinetic theory of gases, particularly the ideal gas law, the ideal gas equation, and the kinetic energy of an ideal gas. The research was conducted with students of class XII in one of the schools in Bandung. This is a descriptive study. The data were collected using an essay test instrument developed by the researcher based on the revised Bloom's Taxonomy. Analysis of the students' answers revealed that students as a whole have a low understanding of the kinetic theory of gases. This low understanding stems from student misconceptions, student attitudes toward the physics subject, and teaching methods that are less helpful in conveying a clear picture of the material being taught.

  2. Theory development for situational awareness in multi-casualty incidents.

    PubMed

    Busby, Steven; Witucki-Brown, Janet

    2011-09-01

    Nurses and other field-level providers will be increasingly called on to respond to both natural and manmade situations that involve multiple casualties. Situational Awareness (SA) is necessary for managing these complicated incidents. The purpose of the study was to create new knowledge by discovering the process of SA in multi-casualty incidents (MCI) and develop substantive theory with regard to field-level SA for use by emergency response nurses and other providers. A qualitative, grounded theory approach was used to develop the first substantive theory of SA for MCI. The sample included 15 emergency response providers from the Southeastern United States. One pilot interview was conducted to trial and refine the semi-structured interview questions. Following Institutional Review Board approval, data collection and analysis occurred from September 2008 through January 2009. The grounded theory methods of Corbin and Strauss (2008) and Charmaz (2006) informed this study. Transcribed participant interviews constituted the bulk of the data with additional data provided by field notes and extensive memos. Multiple levels of coding, theoretical sampling, and theoretical sensitivity were used to develop and relate concepts resulting in emerging theory. Multiple methods were used for maintaining the rigor of the study. The process of SA in MCI involves emergency responders establishing and maintaining control of dynamic, contextually-based situations. Against the backdrop of experience and other preparatory interval actions, responders handle various types of information and manage resources, roles, relationships and human emotion. The goal is to provide an environment of relative safety in which patient care is provided. SA in MCI is an on-going and iterative process with each piece of information informing new actions. Analysis culminated in the development of the Busby Theory of Situational Awareness in Multi-casualty Incidents. SA in MCI is a growing need at local, national and international levels. The newly developed theory provides a useful model for appreciating SA in the context of MCI thereby improving practice and providing a tool for education. The theory also provides a catalyst for further research refining and testing of the theory and for studying larger-scale incidents. Copyright © 2011 Emergency Nurses Association. Published by Mosby, Inc. All rights reserved.

  3. Theoretical Coalescence: A Method to Develop Qualitative Theory

    PubMed Central

    Morse, Janice M.

    2018-01-01

    Background Qualitative research is frequently context bound, lacks generalizability, and is limited in scope. Objectives The purpose of this article was to describe a method, theoretical coalescence, that provides a strategy for analyzing complex, high-level concepts and for developing generalizable theory. Theoretical coalescence is a method of theoretical expansion and inductive inquiry for theory development that uses data (rather than themes, categories, and published extracts of data) as the primary source for analysis. Here, using the development of the lay concept of enduring as an example, I explore the scientific development of the concept in multiple settings over many projects and link it within the Praxis Theory of Suffering. Methods As comprehension emerges when conducting theoretical coalescence, it is essential that raw data from various different situations be available for reinterpretation/reanalysis and comparison to identify the essential features of the concept. The concept is then reconstructed with additional inquiry that builds description; evidence is collected and conceptualized to create a more expansive concept and theory. Results By utilizing apparently diverse data sets from different contexts that are linked by certain characteristics, the essential features of the concept emerge. Such inquiry is divergent and less bound by context, yet purposeful, logical, and with significant pragmatic implications for practice in nursing and beyond our discipline. Conclusion Theoretical coalescence is a means by which qualitative inquiry is broadened to make an impact, to accommodate new theoretical shifts and concepts, and to make qualitative research applied and accessible in new ways. PMID:29360688

  4. An analysis of hypercritical states in elastic and inelastic systems

    NASA Astrophysics Data System (ADS)

    Kowalczk, Maciej

    The author raises a wide range of problems whose common characteristic is an analysis of hypercritical states in elastic and inelastic systems. The article consists of two basic parts. The first part primarily discusses problems of modelling hypercritical states, while the second analyzes numerical methods (so-called continuation methods) used to solve non-linear problems. The original approaches for modelling hypercritical states found in this article include the combination of plasticity theory and an energy condition for cracking, accounting for the variability and cyclical nature of the forms of fracture of a brittle material under a die, and the combination of plasticity theory and a simplified description of the phenomenon of localization along a discontinuity line. The author presents analytical solutions of three non-linear problems for systems made of elastic/brittle/plastic and elastic/ideally plastic materials. The author proceeds to discuss the analytical basics of continuation methods and analyzes the significance of the parameterization of non-linear problems, provides a method for selecting control parameters based on an analysis of the rank of a rectangular matrix of a uniform system of increment equations, and also provides a new method for selecting an equilibrium path originating from a bifurcation point. The author provides a general outline of continuation methods based on an analysis of the rank of a matrix of a corrective system of equations. The author supplements his theoretical solutions with numerical solutions of non-linear problems for rod systems and problems of the plastic disintegration of a notched rectangular plastic plate.
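
    To illustrate the kind of continuation method discussed, here is a minimal pseudo-arclength predictor-corrector sketch in Python on the toy equilibrium curve x^2 + lambda^2 = 1; the test function, step size, and tolerances are illustrative assumptions, not the author's formulation:

        import numpy as np

        def f(x, lam):
            # Toy equilibrium equation with folds at lam = +-1: x^2 + lam^2 - 1 = 0.
            return x**2 + lam**2 - 1.0

        def jac(x, lam):
            return np.array([2.0 * x, 2.0 * lam])  # [df/dx, df/dlam]

        def continuation(x, lam, ds=0.05, steps=100):
            path = [(x, lam)]
            tangent = np.array([1.0, 0.0])
            for _ in range(steps):
                # Predictor: step along the tangent of the solution curve,
                # which is orthogonal to the gradient of f.
                g = jac(x, lam)
                t = np.array([-g[1], g[0]])
                t /= np.linalg.norm(t)
                if np.dot(t, tangent) < 0:   # keep a consistent orientation
                    t = -t
                tangent = t
                xp, lp = x + ds * t[0], lam + ds * t[1]
                # Corrector: Newton on [f = 0, arclength constraint = 0].
                for _ in range(20):
                    r = np.array([f(xp, lp),
                                  t[0] * (xp - x) + t[1] * (lp - lam) - ds])
                    J = np.array([jac(xp, lp), t])
                    d = np.linalg.solve(J, -r)
                    xp, lp = xp + d[0], lp + d[1]
                    if np.linalg.norm(r) < 1e-12:
                        break
                x, lam = xp, lp
                path.append((x, lam))
            return np.array(path)

        # Traces through the fold points without failing, unlike naive
        # natural-parameter continuation in lambda alone.
        print(continuation(0.0, -1.0)[:5])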

  5. Analysis and modification of theory for impact of seaplanes on water

    NASA Technical Reports Server (NTRS)

    Mayo, Wilbur L

    1945-01-01

    An analysis of available theory on seaplane impact and a proposed modification thereto are presented. In previous methods the overall momentum of the float and virtual mass has been assumed to remain constant during the impact but the present analysis shows that this assumption is rigorously correct only when the resultant velocity of the float is normal to the keel. The proposed modification chiefly involves consideration of the fact that forward velocity of the seaplane float causes momentum to be passed into the hydrodynamic downwash (an action that is the entire consideration in the case of the planing float) and consideration of the fact that, for an impact with trim, the rate of penetration is determined not only by the velocity component normal to the keel but also by the velocity component parallel to the keel, which tends to reduce the penetration. Experimental data for planing, oblique impact, and vertical drop are used to show that the accuracy of the proposed theory is good.

  6. Taub-NUT Spacetime in the (A)dS/CFT and M-Theory

    NASA Astrophysics Data System (ADS)

    Clarkson, Richard

    In the following thesis, I will conduct a thermodynamic analysis of the Taub-NUT spacetime in various dimensions, as well as show uses for Taub-NUT and other Hyper-Kahler spacetimes. Thermodynamic analysis (by which I mean the calculation of the entropy and other thermodynamic quantities, and the analysis of these quantities) has in the past been done by use of background subtraction. The recent derivation of the (A)dS/CFT correspondences from String theory has allowed for easier and quicker analysis. I will use Taub-NUT space as a template to test these correspondences against the standard thermodynamic calculations (via the Noether method), with (in the Taub-NUT-dS case especially) some very interesting results. There is also interest in obtaining metrics in eleven dimensions that can be reduced down to ten dimensional string theory metrics. Taub-NUT and other Hyper-Kahler metrics already possess the form to easily facilitate the Kaluza-Klein reduction, and embedding such metrics into eleven dimensional metrics containing M2 or M5 branes produces metrics with interesting Dp-brane results.

  7. Necessary and sufficient condition for the realization of the complex wavelet

    NASA Astrophysics Data System (ADS)

    Keita, Alpha; Qing, Qianqin; Wang, Nengchao

    1997-04-01

    Wavelet theory is a new signal-analysis theory developed in recent years, and its appearance has attracted experts in many different fields to study it in depth. The wavelet transform is a new kind of time-frequency analysis method with localization that can be realized in either the time domain or the frequency domain. It has many desirable characteristics that other time-frequency analysis methods, such as the Gabor transform or the Wigner-Ville distribution, do not share: orthogonality, direction selectivity, variable time-frequency resolution, adjustable local support, sparse representation of data, and so on. All of the above make the wavelet transform a very important new tool and method in the field of signal analysis. Because the computation of complex wavelets is difficult, real wavelet functions are used in applications. In this paper, we present a necessary and sufficient condition under which a real wavelet function can be obtained from a complex wavelet function. This theorem has significant theoretical value. The paper builds its technique on the Hartley transform. Hartley was a signal engineering expert, but his transformation was overlooked for about 40 years, because the state of technology at the time could not demonstrate its superiority. Only at the end of the 1970s and the early 1980s, after the development of fast algorithms for the Fourier transform and their hardware implementations, did this real-to-real transform begin to be taken seriously. The W transformation introduced by Zhongde Wang pushed forward the study of the Hartley transformation and its fast algorithms. The kernel function of the Hartley transformation is the cas function, cas(x) = cos(x) + sin(x).
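
    The cas kernel and its relation to the Fourier transform are easy to state in code; a minimal sketch (standard signal processing, not code from the paper) of the discrete Hartley transform computed via the FFT:

        import numpy as np

        def dht(x):
            """Discrete Hartley transform via the FFT.

            H[k] = sum_n x[n] * cas(2*pi*k*n/N), with cas(t) = cos(t) + sin(t).
            Since the FFT kernel is exp(-i*t) = cos(t) - i*sin(t),
            H = Re(FFT) - Im(FFT).
            """
            X = np.fft.fft(x)
            return X.real - X.imag

        # Verify against a direct evaluation of the cas-kernel sum.
        x = np.random.rand(8)
        n = np.arange(8)
        arg = 2 * np.pi * np.outer(n, n) / 8
        kernel = np.cos(arg) + np.sin(arg)
        assert np.allclose(dht(x), kernel @ x)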

  8. On Boundaries of the Language of Physics

    NASA Astrophysics Data System (ADS)

    Kvasz, Ladislav

    The aim of the present paper is to outline a method of reconstruction of the historical development of the language of physical theories. We will apply the theory presented in Patterns of Change, Linguistic Innovations in the Development of Classical Mathematics to the analysis of linguistic innovations in physics. Our method is based on a reconstruction of the following potentialities of language: analytical power, expressive power, integrative power, and explanatory power, as well as analytical boundaries and expressive boundaries. One of the results of our reconstruction is a new interpretation of Kant's antinomies of pure reason. If we relate Kant's antinomies to the language, they retain validity.

  9. Quantum chemical calculations of Cr2O3/SnO2 using density functional theory method

    NASA Astrophysics Data System (ADS)

    Jawaher, K. Rackesh; Indirajith, R.; Krishnan, S.; Robert, R.; Das, S. Jerome

    2018-03-01

    Quantum chemical calculations have been employed to study the molecular effects produced by the Cr2O3/SnO2 optimised structure. The theoretical parameters of the transparent conducting metal oxides were calculated using the DFT/B3LYP/LANL2DZ method. The optimised bond parameters such as bond lengths, bond angles, and dihedral angles were calculated at the same level of theory. The non-linear optical property of the title compound was calculated using a first-order hyperpolarisability calculation. The calculated HOMO-LUMO analysis explains the charge-transfer interaction within the molecule. In addition, MEP and Mulliken atomic charges were also calculated and analysed.
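
    As a hedged illustration of the kind of DFT/B3LYP/LANL2DZ workflow described, the following PySCF sketch computes a HOMO-LUMO gap and Mulliken charges for a toy linear O-Sn-O fragment; the software choice, geometry, and basis/ECP assignment are assumptions, since the paper does not state them here:

        from pyscf import gto, dft

        # Toy linear O-Sn-O fragment; the bond length (Angstrom) is a guess.
        mol = gto.M(atom="O 0 0 -1.95; Sn 0 0 0; O 0 0 1.95",
                    basis="lanl2dz", ecp={"Sn": "lanl2dz"})

        mf = dft.RKS(mol)
        mf.xc = "b3lyp"          # hybrid functional named in the abstract
        mf.kernel()

        # HOMO-LUMO gap from orbital energies and occupations.
        homo = max(e for e, occ in zip(mf.mo_energy, mf.mo_occ) if occ > 0)
        lumo = min(e for e, occ in zip(mf.mo_energy, mf.mo_occ) if occ == 0)
        print("HOMO-LUMO gap (Hartree):", lumo - homo)

        # Mulliken atomic charges, one of the analyses mentioned above.
        mf.mulliken_pop()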

  10. Historical evolution of vortex-lattice methods

    NASA Technical Reports Server (NTRS)

    Deyoung, J.

    1976-01-01

    A review of the origins and general orientation of the vortex-lattice method was given. The historical course of this method was followed in conjunction with its field of computational fluid dynamics, spanning the period from L. F. Richardson's paper in 1910 to 1975. The following landmarks were pointed out: numerical analysis of partial differential equations, lifting-line theory, the finite-difference method, the 1/4-3/4 rule, the block relaxation technique, the application of electronic computers, and advanced panel methods.

  11. Trees, B-series and G-symplectic methods

    NASA Astrophysics Data System (ADS)

    Butcher, J. C.

    2017-07-01

    The order conditions for Runge-Kutta methods are intimately connected with the graphs known as rooted trees. The conditions can be expressed in terms of Taylor expansions written as weighted sums of elementary differentials, that is as B-series. Polish notation provides a unifying structure for representing many of the quantities appearing in this theory. Applications include the analysis of general linear methods with special reference to G-symplectic methods. A new order 6 method has recently been constructed.
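
    The first few tree-indexed order conditions are easy to verify numerically; a sketch checking the classical RK4 tableau against the eight conditions for order 4 (standard theory, not code from the paper):

        import numpy as np

        # Butcher tableau of the classical RK4 method.
        A = np.array([[0.0, 0.0, 0.0, 0.0],
                      [0.5, 0.0, 0.0, 0.0],
                      [0.0, 0.5, 0.0, 0.0],
                      [0.0, 0.0, 1.0, 0.0]])
        b = np.array([1/6, 1/3, 1/3, 1/6])
        c = A.sum(axis=1)

        # One order condition per rooted tree, up to order 4.
        conditions = {
            "order 1: sum b_i              = 1":    b.sum() - 1,
            "order 2: sum b_i c_i          = 1/2":  b @ c - 1/2,
            "order 3: sum b_i c_i^2        = 1/3":  b @ c**2 - 1/3,
            "order 3: sum b_i a_ij c_j     = 1/6":  b @ A @ c - 1/6,
            "order 4: sum b_i c_i^3        = 1/4":  b @ c**3 - 1/4,
            "order 4: sum b_i c_i a_ij c_j = 1/8":  b @ (c * (A @ c)) - 1/8,
            "order 4: sum b_i a_ij c_j^2   = 1/12": b @ A @ c**2 - 1/12,
            "order 4: sum b_i a_ij a_jk c_k = 1/24": b @ A @ A @ c - 1/24,
        }
        for name, residual in conditions.items():
            print(f"{name}: residual {residual:.1e}")   # all ~0 for RK4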

  12. Task Design for Students' Work with Basic Theory in Analysis: The Cases of Multidimensional Differentiability and Curve Integrals

    ERIC Educational Resources Information Center

    Gravesen, Katrine Frovin; Grønbaek, Niels; Winsløw, Carl

    2017-01-01

    We investigate the challenges students face in the transition from calculus courses, focusing on methods related to the analysis of real valued functions given in closed form, to more advanced courses on analysis where focus is on theoretical structure, including proof. We do so based on task design aiming for a number of generic potentials for…

  13. Thermal Desorption Analysis of Effective Specific Soil Surface Area

    NASA Astrophysics Data System (ADS)

    Smagin, A. V.; Bashina, A. S.; Klyueva, V. V.; Kubareva, A. V.

    2017-12-01

    A new method of assessing the effective specific surface area, based on the successive thermal desorption of water vapor at different temperature stages of sample drying, is analyzed in comparison with the conventional static adsorption method using a representative set of soil samples of different genesis and degree of dispersion. The theory of the method uses the fundamental relationship between the thermodynamic water potential (Ψ) and the absolute temperature of drying (T): Ψ = Q - aT, where Q is the specific heat of vaporization and a is a physically based parameter related to the initial temperature and relative humidity of the air in the external thermodynamic reservoir (laboratory). From gravimetric data on the mass fraction of water (W) and the Ψ value, Polanyi potential curves (W(Ψ)) for the studied samples are plotted. Water sorption isotherms are then calculated, from which the monolayer capacity and the target effective specific surface area are determined using the BET theory. Comparative analysis shows that the new method agrees well with the conventional estimation of the degree of dispersion by the BET and Kutilek methods in a wide range of specific surface area values between 10 and 250 m2/g.
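
    A minimal sketch of the BET step described above, fitting the linear BET form to a made-up water sorption isotherm and converting the monolayer capacity to a specific surface area; all numbers are illustrative assumptions:

        import numpy as np

        # Relative pressures p/p0 and sorbed water contents W (g water / g soil);
        # the data points are made up for illustration.
        x = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])
        W = np.array([0.012, 0.018, 0.022, 0.026, 0.029, 0.033])

        # Linear BET form: x / (W*(1-x)) = 1/(Wm*C) + (C-1)/(Wm*C) * x
        y = x / (W * (1.0 - x))
        slope, intercept = np.polyfit(x, y, 1)
        Wm = 1.0 / (slope + intercept)      # monolayer capacity, g/g
        C = slope / intercept + 1.0

        # Specific surface from the monolayer: S = (Wm/M) * N_A * sigma,
        # sigma ~ 1.06e-19 m^2 per adsorbed water molecule.
        N_A, M_water, sigma = 6.022e23, 18.015, 1.06e-19
        S = Wm / M_water * N_A * sigma      # m^2 per g soil
        print(f"Wm = {Wm:.4f} g/g, C = {C:.1f}, S = {S:.1f} m2/g")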

  14. Integrated Force Method Solution to Indeterminate Structural Mechanics Problems

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Hopkins, Dale A.; Halford, Gary R.

    2004-01-01

    Strength-of-materials problems have been classified into determinate and indeterminate problems. Determinate analysis, based primarily on the equilibrium concept, is well understood. Solutions of indeterminate problems require additional compatibility conditions, and their comprehension has not been complete. A solution to an indeterminate problem is generated by manipulating the equilibrium concept, either by rewriting it in the displacement variables or through the cutting and closing-gap technique of the redundant force method. Compatibility improvisation has made analysis cumbersome. The authors have researched and understood the compatibility theory, so that solutions can be generated with equal emphasis on the equilibrium and compatibility concepts. This technique is called the Integrated Force Method (IFM). Forces are the primary unknowns of IFM; displacements are back-calculated from forces. IFM equations are manipulated to obtain the Dual Integrated Force Method (IFMD). Displacement is the primary variable of IFMD and force is back-calculated. The subject is introduced through the response variables (force, deformation, displacement) and the underlying concepts (equilibrium equation, force-deformation relation, deformation-displacement relation, and compatibility condition). Mechanical load, temperature variation, and support settling are equally emphasized. The basic theory is discussed, and a set of examples illustrates the new concepts. IFM- and IFMD-based finite element methods are introduced for simple problems.
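
    A minimal force-method illustration in the spirit of IFM, with forces as primary unknowns, one equilibrium row and one explicit compatibility row, for two springs in series fixed at both walls and loaded at the shared node; the system and numbers are assumptions, not from the paper:

        import numpy as np

        k1, k2, P = 200.0, 100.0, 30.0   # spring stiffnesses and load (assumed units)

        # IFM-style system with spring forces F1, F2 as the primary unknowns:
        #   equilibrium of the loaded node:  F1 - F2 = P
        #   compatibility (net elongation between the fixed walls is zero):
        #                                    F1/k1 + F2/k2 = 0
        S = np.array([[1.0, -1.0],
                      [1.0 / k1, 1.0 / k2]])
        F = np.linalg.solve(S, np.array([P, 0.0]))

        # Displacement is back-calculated from the forces, as in IFM.
        u = F[0] / k1
        print("spring forces:", F, " node displacement:", u)

        # Cross-check against the displacement method: u = P/(k1 + k2).
        assert np.isclose(u, P / (k1 + k2))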

  15. A Monte-Carlo game theoretic approach for Multi-Criteria Decision Making under uncertainty

    NASA Astrophysics Data System (ADS)

    Madani, Kaveh; Lund, Jay R.

    2011-05-01

    Game theory provides a useful framework for studying Multi-Criteria Decision Making problems. This paper suggests modeling Multi-Criteria Decision Making problems as strategic games and solving them using non-cooperative game theory concepts. The suggested method can be used to prescribe non-dominated solutions and also to predict the outcome of a decision making problem. Non-cooperative stability definitions for solving the games allow consideration of non-cooperative behaviors, often neglected by other methods which assume perfect cooperation among decision makers. To deal with the uncertainty in input variables, a Monte-Carlo Game Theory (MCGT) approach is suggested which maps the stochastic problem into many deterministic strategic games. The games are solved using non-cooperative stability definitions, and the results include possible effects of uncertainty in input variables on outcomes. The method can handle multi-criteria multi-decision-maker problems with uncertainty. The suggested method does not require criteria weighting, development of a compound decision objective, or accurate quantitative (cardinal) information, as it simplifies the decision analysis by solving problems based on qualitative (ordinal) information, reducing the computational burden substantially. The MCGT method is applied to analyze California's Sacramento-San Joaquin Delta problem. The suggested method provides insights, identifies non-dominated alternatives, and predicts likely decision outcomes.
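
    A schematic sketch of the MCGT idea under stated assumptions: sample uncertain payoffs, solve each sampled two-player game for pure-strategy Nash equilibria, and tally how often each outcome is stable; the payoffs and noise model are made up, and this is not the authors' code:

        import numpy as np

        rng = np.random.default_rng(0)

        def pure_nash(u1, u2):
            """Pure-strategy Nash equilibria of a bimatrix game (row, col payoffs)."""
            eqs = []
            for i in range(u1.shape[0]):
                for j in range(u1.shape[1]):
                    if u1[i, j] >= u1[:, j].max() and u2[i, j] >= u2[i, :].max():
                        eqs.append((i, j))
            return eqs

        # Two decision makers, three alternatives each; each Monte-Carlo draw
        # perturbs nominal ordinal payoffs with noise to model uncertainty.
        nominal1 = np.array([[3, 1, 0], [2, 2, 1], [0, 3, 2]], dtype=float)
        nominal2 = nominal1.T.copy()

        counts = {}
        for _ in range(5000):
            u1 = nominal1 + rng.normal(0, 0.5, nominal1.shape)
            u2 = nominal2 + rng.normal(0, 0.5, nominal2.shape)
            for eq in pure_nash(u1, u2):
                counts[eq] = counts.get(eq, 0) + 1

        # Frequency with which each outcome is stable across the sampled games.
        for eq, n in sorted(counts.items(), key=lambda kv: -kv[1]):
            print(f"outcome {eq}: stable in {n / 5000:.1%} of draws")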

  16. [Development of a program theory as a basis for the evaluation of a dementia special care unit].

    PubMed

    Adlbrecht, Laura; Bartholomeyczik, Sabine; Mayer, Hanna

    2018-06-01

    Background: An existing dementia special care unit was to be evaluated. In order to build a sound foundation for the evaluation, a deep theoretical understanding of the implemented intervention is needed, which had not yet been explicated. One possibility to achieve this is the development of a program theory. Aim: The aim is to present a method for developing a program theory for the existing living and care concept of the dementia special care unit, which is used in a larger project to evaluate the concept in a theory-driven manner. Method: The evaluation is embedded in the framework of van Belle et al. (2010), and an action model and a change model (Chen, 2015) are created. For the specification of the change model, contribution analysis (Mayne, 2011) is applied. Data were collected in workshops with the developers and the nurses of the dementia special care unit, and a literature search concerning interventions and outcomes was carried out. The results were synthesized in a consensus workshop. Results: The action model describes the interventions of the dementia special care unit, the implementers, the organization, and the context. The change model comprises the mechanisms through which interventions achieve outcomes. Conclusions: The results of the program theory can be employed to choose data collection methods and instruments for the evaluation. On the basis of the results of the evaluation, the program theory can be refined and adapted.

  17. Biological embedding: evaluation and analysis of an emerging concept for nursing scholarship.

    PubMed

    Nist, Marliese Dion

    2017-02-01

    The purpose of this paper was to report the analysis of the concept of biological embedding. Research that incorporates a life course perspective is becoming increasingly prominent in the health sciences. Biological embedding is a central concept in life course theory and may be important for nursing theories to enhance our understanding of health states in individuals and populations. Before the concept of biological embedding can be used in nursing theory and research, an analysis of the concept is required to advance it towards full maturity. Concept analysis. PubMed, CINAHL and PsycINFO were searched for publications using the term 'biological embedding' or 'biological programming' and published through 2015. An evaluation of the concept was first conducted to determine the concept's level of maturity and was followed by a concept comparison, using the methods for concept evaluation and comparison described by Morse. A consistent definition of biological embedding - the process by which early life experience alters biological processes to affect adult health outcomes - was found throughout the literature. The concept has been used in several theories that describe the mechanisms through which biological embedding might occur and highlight its role in the development of health trajectories. Biological embedding is a partially mature concept, requiring concept comparison with an overlapping concept - biological programming - to more clearly establish the boundaries of biological embedding. Biological embedding has significant potential for theory development and application in multiple academic disciplines, including nursing. © 2016 John Wiley & Sons Ltd.

  18. Dependence in probabilistic modeling Dempster-Shafer theory and probability bounds analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferson, Scott; Nelsen, Roger B.; Hajagos, Janos

    2015-05-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the inter-variable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
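
    One standard technique of the kind reviewed is simulating correlated variates with a Gaussian copula for a given rank correlation; a minimal sketch with illustrative marginals (not code from the report):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        rho_s = 0.7   # target Spearman rank correlation (illustrative)

        # Convert Spearman rho to the Pearson correlation of the latent normals.
        rho = 2.0 * np.sin(np.pi * rho_s / 6.0)

        # Correlated standard normals -> uniforms -> target marginals.
        L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
        z = rng.standard_normal((100_000, 2)) @ L.T
        u = stats.norm.cdf(z)
        x = stats.lognorm(s=0.5).ppf(u[:, 0])   # first marginal: lognormal
        y = stats.gamma(a=2.0).ppf(u[:, 1])     # second marginal: gamma

        print("achieved Spearman rho:", stats.spearmanr(x, y)[0])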

  19. Symplectic analysis of three-dimensional Abelian topological gravity

    NASA Astrophysics Data System (ADS)

    Cartas-Fuentevilla, R.; Escalante, Alberto; Herrera-Aguilar, Alfredo

    2017-02-01

    A detailed Faddeev-Jackiw quantization of an Abelian topological gravity is performed; we show that this formalism is equivalent and more economical than Dirac's method. In particular, we identify the complete set of constraints of the theory, from which the number of physical degrees of freedom is explicitly computed. We prove that the generalized Faddeev-Jackiw brackets and the Dirac ones coincide with each other. Moreover, we perform the Faddeev-Jackiw analysis of the theory at the chiral point, and the full set of constraints and the generalized Faddeev-Jackiw brackets are constructed. Finally we compare our results with those found in the literature and we discuss some remarks and prospects.

  20. [Progress in industrial bioprocess engineering in China].

    PubMed

    Zhuang, Yingping; Chen, Hongzhang; Xia, Jianye; Tang, Wenjun; Zhao, Zhimin

    2015-06-01

    The advances of industrial biotechnology depend strongly on the development of industrial bioprocess research. In China, we are facing several challenges because of a huge national industrial fermentation capacity. Industrial bioprocess development has gone through several main stages. This work reviews the development of industrial bioprocesses in China during the past 30 to 40 years, including the early-stage kinetic model studies derived from classical chemical engineering, research methods based on control theory, multi-parameter analysis based on on-line measuring instruments and techniques, and multi-scale analysis theory, as well as solid-state fermentation techniques and fermenters. In addition, the cutting edge of bioprocess engineering is also addressed.

  1. Applications of Ergodic Theory to Coverage Analysis

    NASA Technical Reports Server (NTRS)

    Lo, Martin W.

    2003-01-01

    The study of differential equations, or dynamical systems in general, has two fundamentally different approaches. We are most familiar with the construction of solutions to differential equations. Another approach is to study the statistical behavior of the solutions. Ergodic Theory is one of the most developed methods to study the statistical behavior of the solutions of differential equations. In the theory of satellite orbits, the statistical behavior of the orbits is used to produce 'Coverage Analysis', or how often a spacecraft is in view of a site on the ground. In this paper, we consider the use of Ergodic Theory for Coverage Analysis. This allows us to greatly simplify the computation of quantities such as the total time for which a ground station can see a satellite without ever integrating the trajectory; see Lo 1,2. Moreover, for any quantity which is an integrable function of the ground track, its average may be computed similarly without the integration of the trajectory. For example, the data rate for a simple telecom system is a function of the distance between the satellite and the ground station. We show that such a function may be averaged using the Ergodic Theorem.
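
    The computational payoff can be illustrated on a toy model: for an ergodic rotation of the circle (a stand-in for a non-repeating ground track), the time average of a visibility indicator equals its space average, so no trajectory propagation is needed. The window width and rotation number below are assumptions:

        import numpy as np

        # Toy model: the ground-track longitude advances by an irrational
        # fraction of the circle each orbit (an ergodic rotation).
        alpha = np.sqrt(2) - 1   # irrational rotation number
        in_view = lambda lon: np.abs((lon + 0.5) % 1.0 - 0.5) < 0.05  # station mask

        # Time average: follow the trajectory orbit by orbit.
        lons = (alpha * np.arange(200_000)) % 1.0
        time_avg = in_view(lons).mean()

        # Space average: integrate the mask over the invariant (uniform) measure;
        # here the visibility window has width 0.10 on the unit circle.
        space_avg = 0.10

        print(time_avg, "vs space average", space_avg)  # agree by the Ergodic Theorem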

  2. Principle-based concept analysis: intentionality in holistic nursing theories.

    PubMed

    Aghebati, Nahid; Mohammadi, Eesa; Ahmadi, Fazlollah; Noaparast, Khosrow Bagheri

    2015-03-01

    This is a report of a principle-based concept analysis of intentionality in holistic nursing theories. A principle-based concept analysis method was used to analyze seven holistic theories. The data included eight books and 31 articles (1998-2011), which were retrieved through MEDLINE and CINAHL. Erickson, Kriger, Parse, Watson, and Zahourek define intentionality as a capacity, a focused consciousness, and a pattern of human being. Rogers and Newman do not explicitly mention intentionality; however, they do explain pattern and consciousness (epistemology). Intentionality has been operationalized as a core concept of nurse-client relationships (pragmatic). The theories are consistent on intentionality as a noun and as an attribute of the person-intentionality is different from intent and intention (linguistic). There is ambiguity concerning the boundaries between intentionality and consciousness (logic). Theoretically, intentionality is an evolutionary capacity to integrate human awareness and experience. Because intentionality is an individualized concept, we introduced it as "a matrix of continuous known changes" that emerges in two forms: as a capacity of human being and as a capacity of transpersonal caring. This study has produced a theoretical definition of intentionality and provides a foundation for future research to further investigate intentionality to better delineate its boundaries. © The Author(s) 2014.

  3. The Stability Analysis Method of the Cohesive Granular Slope on the Basis of Graph Theory.

    PubMed

    Guan, Yanpeng; Liu, Xiaoli; Wang, Enzhi; Wang, Sijing

    2017-02-27

    This paper attempts to provide a method to calculate the progressive failure of cohesive-frictional granular geomaterial and the spatial distribution of the stability of a cohesive granular slope. The methodology can be divided into two parts: the characterization method of macro-contacts and the analysis of slope stability. Based on graph theory, vertexes, edges, and edge sequences are abstracted to characterize the voids, the particle contacts, and the macro-contacts, respectively, bridging the gap between the mesoscopic and macro scales of granular materials. This paper adopts this characterization method to extract a graph from a granular slope and characterize the macro sliding surface; the weighted graph is then analyzed to calculate the slope safety factor. Each edge has three weights representing the sliding moment, the anti-sliding moment, and the braking index of the contact bond, respectively. The safety factor of the slope is calculated by presupposing a certain number of sliding routes, reducing the weights repeatedly, and counting the mesoscopic failures of the edges. It is a slope analysis method from a mesoscopic perspective, so it can present more detail of the mesoscopic properties of the granular slope. At the macro scale, the spatial distribution of the stability of the granular slope is in agreement with the theoretical solution.
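
    A schematic sketch of the weighted-graph idea, with edges carrying sliding and anti-sliding moments and the safety factor taken as the minimum resisting-to-driving ratio over candidate sliding routes; the toy graph is made up, and the authors' full algorithm (weight reduction, contact-bond breaking) is not reproduced:

        import networkx as nx

        # Toy macro-contact graph: nodes abstract voids, edges abstract contacts,
        # weighted by sliding and anti-sliding moments (made-up numbers).
        G = nx.Graph()
        edges = [("A", "B", 2.0, 5.0), ("B", "C", 3.0, 4.0), ("A", "D", 1.0, 6.0),
                 ("D", "C", 2.5, 3.0), ("B", "D", 1.5, 2.0)]
        for u, v, sliding, anti in edges:
            G.add_edge(u, v, sliding=sliding, anti=anti)

        # Candidate sliding routes run from the crest side ("A") to the toe ("C");
        # each route's factor of safety is resisting moment over driving moment.
        def fos(path):
            s = sum(G[u][v]["sliding"] for u, v in zip(path, path[1:]))
            a = sum(G[u][v]["anti"] for u, v in zip(path, path[1:]))
            return a / s

        routes = list(nx.all_simple_paths(G, "A", "C"))
        critical = min(routes, key=fos)
        print("critical route:", critical, " FoS:", round(fos(critical), 2))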

  4. A GIS-based extended fuzzy multi-criteria evaluation for landslide susceptibility mapping

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Shadman Roodposhti, Majid; Jankowski, Piotr; Blaschke, Thomas

    2014-12-01

    Landslide susceptibility mapping (LSM) is making increasing use of GIS-based spatial analysis in combination with multi-criteria evaluation (MCE) methods. We have developed a new multi-criteria decision analysis (MCDA) method for LSM and applied it to the Izeh River basin in south-western Iran. Our method is based on fuzzy membership functions (FMFs) derived from GIS analysis. It makes use of nine causal landslide factors identified by local landslide experts. Fuzzy set theory was first integrated with an analytical hierarchy process (AHP) in order to use pairwise comparisons to compare LSM criteria for ranking purposes. FMFs were then applied in order to determine the criteria weights to be used in the development of a landslide susceptibility map. Finally, a landslide inventory database was used to validate the LSM map by comparing it with known landslides within the study area. Results indicated that the integration of fuzzy set theory with AHP produced significantly improved accuracies and a high level of reliability in the resulting landslide susceptibility map. Approximately 53% of known landslides within our study area fell within zones classified as having "very high susceptibility", with the further 31% falling into zones classified as having "high susceptibility".
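
    The AHP step derives criterion weights from a pairwise comparison matrix via its principal eigenvector, with Saaty's consistency check; a minimal sketch with an illustrative 3x3 matrix (not the paper's actual comparisons):

        import numpy as np

        # Pairwise comparison matrix for three illustrative criteria (Saaty scale);
        # entry [i, j] says how much more important criterion i is than j.
        A = np.array([[1.0,   3.0, 5.0],
                      [1/3.0, 1.0, 2.0],
                      [1/5.0, 1/2.0, 1.0]])

        # Criterion weights = normalized principal eigenvector.
        vals, vecs = np.linalg.eig(A)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()

        # Saaty consistency ratio; RI = 0.58 is the random index for n = 3.
        n = A.shape[0]
        CI = (vals.real[k] - n) / (n - 1)
        CR = CI / 0.58
        print("weights:", np.round(w, 3), " consistency ratio:", round(CR, 3))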

  5. A GIS-based extended fuzzy multi-criteria evaluation for landslide susceptibility mapping

    PubMed Central

    Feizizadeh, Bakhtiar; Shadman Roodposhti, Majid; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    Landslide susceptibility mapping (LSM) is making increasing use of GIS-based spatial analysis in combination with multi-criteria evaluation (MCE) methods. We have developed a new multi-criteria decision analysis (MCDA) method for LSM and applied it to the Izeh River basin in south-western Iran. Our method is based on fuzzy membership functions (FMFs) derived from GIS analysis. It makes use of nine causal landslide factors identified by local landslide experts. Fuzzy set theory was first integrated with an analytical hierarchy process (AHP) in order to use pairwise comparisons to compare LSM criteria for ranking purposes. FMFs were then applied in order to determine the criteria weights to be used in the development of a landslide susceptibility map. Finally, a landslide inventory database was used to validate the LSM map by comparing it with known landslides within the study area. Results indicated that the integration of fuzzy set theory with AHP produced significantly improved accuracies and a high level of reliability in the resulting landslide susceptibility map. Approximately 53% of known landslides within our study area fell within zones classified as having “very high susceptibility”, with the further 31% falling into zones classified as having “high susceptibility”. PMID:26089577

  6. A GIS-based extended fuzzy multi-criteria evaluation for landslide susceptibility mapping.

    PubMed

    Feizizadeh, Bakhtiar; Shadman Roodposhti, Majid; Jankowski, Piotr; Blaschke, Thomas

    2014-12-01

    Landslide susceptibility mapping (LSM) is making increasing use of GIS-based spatial analysis in combination with multi-criteria evaluation (MCE) methods. We have developed a new multi-criteria decision analysis (MCDA) method for LSM and applied it to the Izeh River basin in south-western Iran. Our method is based on fuzzy membership functions (FMFs) derived from GIS analysis. It makes use of nine causal landslide factors identified by local landslide experts. Fuzzy set theory was first integrated with an analytical hierarchy process (AHP) in order to use pairwise comparisons to compare LSM criteria for ranking purposes. FMFs were then applied in order to determine the criteria weights to be used in the development of a landslide susceptibility map. Finally, a landslide inventory database was used to validate the LSM map by comparing it with known landslides within the study area. Results indicated that the integration of fuzzy set theory with AHP produced significantly improved accuracies and a high level of reliability in the resulting landslide susceptibility map. Approximately 53% of known landslides within our study area fell within zones classified as having "very high susceptibility", with the further 31% falling into zones classified as having "high susceptibility".

  7. Accurate determination of the valence band edge in hard x-ray photoemission spectra using GW theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lischner, Johannes; Nemšák, Slavomír

    We introduce a new method for determining accurate values of the valence-band maximum in x-ray photoemission spectra. Specifically, we align the sharpest peak in the valence-band region of the experimental spectrum with the corresponding feature of a theoretical valence-band density of states curve from ab initio GW theory calculations. This method is particularly useful for soft and hard x-ray photoemission studies of materials with a mixture of valence-band characters, where strong matrix element effects can render standard methods for extracting the valence-band maximum unreliable. We apply our method to hydrogen-terminated boron-doped diamond, which is a promising substrate material for novel solar cell devices. By carrying out photoemission experiments with variable light polarizations, we verify the accuracy of our analysis and the general validity of the method.
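
    A schematic version of the alignment idea: rigidly shift the theoretical DOS to best overlap the sharpest valence-band feature of the measured spectrum, then read the valence-band maximum off the aligned theory axis. The curves and the overlap criterion below are illustrative assumptions, not the paper's data:

        import numpy as np

        # Common energy grid (eV) and two curves: measured spectrum and GW DOS.
        E = np.linspace(-10, 2, 1201)
        dos_theory = np.exp(-((E + 4.0) / 0.6) ** 2)          # sharp theory feature
        spectrum = (0.8 * np.exp(-((E + 3.4) / 0.6) ** 2)
                    + 0.05 * np.random.default_rng(2).normal(size=E.size))

        # Find the rigid shift that maximizes the overlap of the two curves.
        shifts = np.arange(-2.0, 2.0, E[1] - E[0])
        score = [np.sum(spectrum * np.interp(E - s, E, dos_theory)) for s in shifts]
        best = shifts[int(np.argmax(score))]

        vbm_theory = -1.0   # VBM on the theory energy axis (illustrative)
        print(f"shift = {best:+.2f} eV -> experimental VBM at {vbm_theory + best:+.2f} eV")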

  8. Multi-scale modelling of elastic moduli of trabecular bone

    PubMed Central

    Hamed, Elham; Jasiuk, Iwona; Yoo, Andrew; Lee, YikHan; Liszka, Tadeusz

    2012-01-01

    We model trabecular bone as a nanocomposite material with hierarchical structure and predict its elastic properties at different structural scales. The analysis involves a bottom-up multi-scale approach, starting with nanoscale (mineralized collagen fibril) and moving up the scales to sub-microscale (single lamella), microscale (single trabecula) and mesoscale (trabecular bone) levels. Continuum micromechanics methods, composite materials laminate theory and finite-element methods are used in the analysis. Good agreement is found between theoretical and experimental results. PMID:22279160
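
    At the nanoscale, the homogenization can be bracketed by elementary mixture rules before any refined micromechanics; a sketch with representative stiffness values for collagen and mineral (the numbers are assumptions, not the paper's):

        # Voigt (iso-strain) and Reuss (iso-stress) bounds on the elastic modulus
        # of a mineralized collagen fibril treated as a two-phase composite.
        E_collagen, E_mineral = 1.5, 114.0   # GPa, representative literature values
        vf_mineral = 0.42                    # mineral volume fraction (assumed)

        E_voigt = vf_mineral * E_mineral + (1 - vf_mineral) * E_collagen
        E_reuss = 1.0 / (vf_mineral / E_mineral + (1 - vf_mineral) / E_collagen)

        print(f"Reuss lower bound: {E_reuss:.1f} GPa, Voigt upper bound: {E_voigt:.1f} GPa")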

  9. [Structural analysis of the functional status of the brain as affected by bemethyl using pattern recognition theory].

    PubMed

    Bobkov, Iu G; Machula, A I; Morozov, Iu I; Dvalishvili, E G

    1987-11-01

    Evoked visual potentials in the associative, parietal, and second somatosensory zones of the neocortex were analysed in trained cats using implanted electrodes. The influence of bemethyl on the structure of behavioral reactions was analysed using methods from pattern recognition theory, in particular the method of cluster analysis. Bemethyl was shown to increase the level of interaction between the functional elements of the system, leading to a more stable resolution of problems facing the system, as compared to the initial state.

  10. Lagrangian methods in the analysis of nonlinear wave interactions in plasma

    NASA Technical Reports Server (NTRS)

    Galloway, J. J.

    1972-01-01

    An averaged-Lagrangian method is developed for obtaining the equations which describe the nonlinear interactions of the wave (oscillatory) and background (nonoscillatory) components which comprise a continuous medium. The method applies to monochromatic waves in any continuous medium that can be described by a Lagrangian density, but is demonstrated in the context of plasma physics. The theory is presented in a more general and unified form by way of a new averaged-Lagrangian formalism which simplifies the perturbation ordering procedure. Earlier theory is extended to deal with a medium distributed in velocity space and to account for the interaction of the background with the waves. The analytic steps are systematized, so as to maximize calculational efficiency. An assessment of the applicability and limitations of the method shows that it has some definite advantages over other approaches in efficiency and versatility.
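
    The core of any averaged-Lagrangian (Whitham-type) treatment can be stated compactly: average the Lagrangian over the wave phase, then vary with respect to amplitude and phase (standard theory, stated here for orientation, not a formula quoted from the paper):

        \mathcal{L}(\omega, k, a) = \frac{1}{2\pi}\oint L\, d\theta,
        \qquad \omega = -\partial_t \theta, \quad k = \partial_x \theta,

        \frac{\partial \mathcal{L}}{\partial a} = 0 \quad\text{(dispersion relation)},
        \qquad
        \frac{\partial}{\partial t}\!\left(\frac{\partial \mathcal{L}}{\partial \omega}\right)
        - \frac{\partial}{\partial x}\!\left(\frac{\partial \mathcal{L}}{\partial k}\right) = 0
        \quad\text{(conservation of wave action)}.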

  11. Methods of training the graduate level and professional geologist in remote sensing technology

    NASA Technical Reports Server (NTRS)

    Kolm, K. E.

    1981-01-01

    Requirements for a basic course in remote sensing to accommodate the needs of the graduate level and professional geologist are described. The course should stress the general topics of basic remote sensing theory, the theory and data types relating to different remote sensing systems, an introduction to the basic concepts of computer image processing and analysis, the characteristics of different data types, the development of methods for geological interpretations, the integration of all scales and data types of remote sensing in a given study, the integration of other data bases (geophysical and geochemical) into a remote sensing study, and geological remote sensing applications. The laboratories should stress hands-on experience to reinforce the concepts and procedures presented in the lecture. The geologist should then be encouraged to pursue a second course in computer image processing and analysis of remotely sensed data.

  12. Non-homogeneous harmonic analysis: 16 years of development

    NASA Astrophysics Data System (ADS)

    Volberg, A. L.; Èiderman, V. Ya

    2013-12-01

    This survey contains results and methods in the theory of singular integrals, a theory which has been developing dramatically in the last 15-20 years. The central (although not the only) topic of the paper is the connection between the analytic properties of integrals and operators with Calderón-Zygmund kernels and the geometric properties of the measures. The history is traced of the classical Painlevé problem of describing removable singularities of bounded analytic functions, which has provided a strong incentive for the development of this branch of harmonic analysis. The progress of recent decades has largely been based on the creation of an apparatus for dealing with non-homogeneous measures, and much attention is devoted to this apparatus here. Several open questions are stated, first and foremost in the multidimensional case, where the method of curvature of a measure is not available. Bibliography: 128 titles.

  13. Recent advances in combination of capillary electrophoresis with mass spectrometry: methodology and theory.

    PubMed

    Klepárník, Karel

    2015-01-01

    This review focuses on the latest development of microseparation electromigration methods in capillaries and microfluidic devices with MS detection and identification. A wide selection of 183 relevant articles covers the literature published from June 2012 till May 2014 as a continuation of the review article on the same topic by Kleparnik [Electrophoresis 2013, 34, 70-86]. Special attention is paid to the new improvements in the theory of instrumentation and methodology of MS interfacing with capillary versions of zone electrophoresis, ITP, and IEF. Ionization methods in MS include ESI, MALDI, and ICP. Although the main attention is paid to the development of instrumentation and methodology, representative examples illustrate also applications in the proteomics, glycomics, metabolomics, biomarker research, forensics, pharmacology, food analysis, and single-cell analysis. The combinations of MS with capillary versions of electrochromatography and micellar electrokinetic chromatography are not included. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Measuring uncertainty by extracting fuzzy rules using rough sets

    NASA Technical Reports Server (NTRS)

    Worm, Jeffrey A.

    1991-01-01

    Despite the advancements in the computer industry in the past 30 years, there is still one major deficiency: computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined to solve this problem. The fundamentals of these theories are combined to possibly provide the optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules a corresponding measure of how much the rules are believed is constructed. From this, the idea of how much a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.
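
    The certain/possible rule split mirrors the rough-set lower and upper approximations; a minimal sketch on a made-up decision table (illustrative, not code from the paper):

        from collections import defaultdict

        # Toy decision table: objects described by condition attributes and a label.
        objects = {
            1: (("temp", "high"), ("pulse", "fast")),
            2: (("temp", "high"), ("pulse", "fast")),
            3: (("temp", "low"),  ("pulse", "fast")),
            4: (("temp", "low"),  ("pulse", "slow")),
        }
        sick = {1, 3}   # target set X: objects diagnosed as sick

        # Indiscernibility classes: objects with identical attribute values.
        classes = defaultdict(set)
        for obj, attrs in objects.items():
            classes[attrs].add(obj)

        # Lower approximation -> certain rules; upper approximation -> possible rules.
        lower = {o for c in classes.values() if c <= sick for o in c}
        upper = {o for c in classes.values() if c & sick for o in c}

        print("certainly sick:", lower)   # {3}: its class lies wholly inside X
        print("possibly sick:", upper)    # {1, 2, 3}: classes overlapping X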

  15. Exploitation of SAR data for measurement of ocean currents and wave velocities

    NASA Technical Reports Server (NTRS)

    Shuchman, R. A.; Lyzenga, D. R.; Klooster, A., Jr.

    1981-01-01

    Methods of extracting information on ocean currents and wave orbital velocities from SAR data by an analysis of the Doppler frequency content of the data are discussed. The theory and data analysis methods are discussed, and results are presented for both aircraft and satellite (SEASAT) data sets. A method of measuring the phase velocity of a gravity wave field is also described. This method uses the shift in position of the wave crests on two images generated from the same data set using two separate Doppler bands. Results of the current measurements are presented for 11 aircraft data sets and 4 SEASAT data sets.
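
    The basic quantity extracted is the Doppler centroid of the azimuth spectrum, which maps to a radial surface velocity via v_r = lambda * f_dc / 2; a toy sketch with synthetic data (the wavelength, PRF, and noise level are illustrative assumptions):

        import numpy as np

        lam = 0.235    # radar wavelength, m (L-band, as on SEASAT)
        prf = 1500.0   # pulse repetition frequency, Hz
        v_true = 1.2   # radial surface velocity, m/s

        # Synthetic azimuth signal whose Doppler shift encodes the velocity.
        rng = np.random.default_rng(3)
        t = np.arange(4096) / prf
        f_dc_true = 2.0 * v_true / lam
        sig = (np.exp(2j * np.pi * f_dc_true * t)
               + 0.1 * (rng.standard_normal(t.size) + 1j * rng.standard_normal(t.size)))

        # Doppler centroid = power-weighted mean frequency of the azimuth spectrum.
        f = np.fft.fftfreq(t.size, d=1.0 / prf)
        P = np.abs(np.fft.fft(sig)) ** 2
        f_dc = np.sum(f * P) / np.sum(P)

        print(f"estimated v_r = {lam * f_dc / 2.0:.2f} m/s (true {v_true} m/s)")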

  16. Cancer diagnosis marker extraction for soft tissue sarcomas based on gene expression profiling data by using projective adaptive resonance theory (PART) filtering method

    PubMed Central

    Takahashi, Hiro; Nemoto, Takeshi; Yoshida, Teruhiko; Honda, Hiroyuki; Hasegawa, Tadashi

    2006-01-01

    Background Recent advances in genome technologies have provided an excellent opportunity to determine the complete biological characteristics of neoplastic tissues, resulting in improved diagnosis and selection of treatment. To accomplish this objective, it is important to establish a sophisticated algorithm that can deal with large quantities of data such as gene expression profiles obtained by DNA microarray analysis. Results Previously, we developed the projective adaptive resonance theory (PART) filtering method as a gene filtering method. This is one of the clustering methods that can select specific genes for each subtype. In this study, we applied the PART filtering method to analyze microarray data that were obtained from soft tissue sarcoma (STS) patients for the extraction of subtype-specific genes. The performance of the filtering method was evaluated by comparison with other widely used methods, such as signal-to-noise, significance analysis of microarrays, and nearest shrunken centroids. In addition, various combinations of filtering and modeling methods were used to extract essential subtype-specific genes. The combination of the PART filtering method and boosting – the PART-BFCS method – showed the highest accuracy. Seven genes among the 15 genes that are frequently selected by this method – MIF, CYFIP2, HSPCB, TIMP3, LDHA, ABR, and RGS3 – are known prognostic marker genes for other tumors. These genes are candidate marker genes for the diagnosis of STS. Correlation analysis was performed to extract marker genes that were not selected by PART-BFCS. Sixteen genes among those extracted are also known prognostic marker genes for other tumors, and they could be candidate marker genes for the diagnosis of STS. Conclusion The procedure that consisted of two steps, such as the PART-BFCS and the correlation analysis, was proposed. The results suggest that novel diagnostic and therapeutic targets for STS can be extracted by a procedure that includes the PART filtering method. PMID:16948864
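
    The signal-to-noise baseline the authors compare against is easy to state; a sketch of a filter-then-classify pipeline on synthetic data (the PART filter itself is not reproduced, and an honest accuracy estimate would refit the filter inside each CV fold):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(5)
        X = rng.normal(size=(60, 2000))   # 60 samples x 2000 genes (synthetic)
        y = np.repeat([0, 1], 30)         # two tumor subtypes
        X[y == 1, :25] += 1.0             # 25 genes carry subtype signal

        # Signal-to-noise ratio per gene: (mu1 - mu0) / (sd1 + sd0).
        mu0, mu1 = X[y == 0].mean(0), X[y == 1].mean(0)
        sd0, sd1 = X[y == 0].std(0), X[y == 1].std(0)
        snr = np.abs(mu1 - mu0) / (sd1 + sd0)

        top = np.argsort(snr)[-15:]       # keep the 15 top-ranked marker genes
        score = cross_val_score(LogisticRegression(max_iter=1000), X[:, top], y, cv=5)
        print("selected genes:", np.sort(top), " CV accuracy:", score.mean())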

  17. Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.

    1999-01-01

    A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C(exp 1) shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms and several options are available to degrade the material properties after failures. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data except in structural applications where interlaminar stresses are important which may cause failure mechanisms such as debonding or delaminations.
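
    A deliberately simplified progressive-failure loop showing the structure (criterion check, stiffness degradation, reanalysis) for parallel plies under a ramped axial load; the real method uses shell elements, lamination theory, and the cited criteria, so this is only a sketch with assumed numbers:

        import numpy as np

        # Parallel plies sharing an axial load: modulus (GPa), thickness (mm),
        # and failure strain per ply. Values are illustrative, not from the paper.
        E = np.array([140.0, 10.0, 140.0, 10.0])
        t = np.array([0.125, 0.125, 0.125, 0.125])
        eps_fail = np.array([0.010, 0.004, 0.010, 0.004])
        alive = np.ones(4, dtype=bool)

        for load in np.linspace(10.0, 400.0, 200):   # N per mm width, ramped up
            while True:
                stiff = np.sum(E[alive] * t[alive])  # kN/mm from the intact plies
                if stiff == 0.0:
                    print(f"final failure at load {load:.0f} N/mm")
                    break
                eps = load / (stiff * 1e3)           # uniform axial strain
                failed = alive & (eps > eps_fail)    # maximum strain criterion
                if not failed.any():
                    break                            # equilibrium at this load level
                alive &= ~failed                     # degrade: failed plies drop out
            if not alive.any():
                break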

  18. Markov Chain Monte Carlo Methods for Bayesian Data Analysis in Astronomy

    NASA Astrophysics Data System (ADS)

    Sharma, Sanjib

    2017-08-01

    Markov Chain Monte Carlo based Bayesian data analysis has now become the method of choice for analyzing and interpreting data in almost all disciplines of science. In astronomy, over the last decade, we have also seen a steady increase in the number of papers that employ Monte Carlo based Bayesian analysis. New, efficient Monte Carlo based methods are continuously being developed and explored. In this review, we first explain the basics of Bayesian theory and discuss how to set up data analysis problems within this framework. Next, we provide an overview of various Monte Carlo based methods for performing Bayesian data analysis. Finally, we discuss advanced ideas that enable us to tackle complex problems and thus hold great promise for the future. We also distribute downloadable computer software (available at https://github.com/sanjibs/bmcmc/ ) that implements some of the algorithms and examples discussed here.
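
    The workhorse of such analyses is easy to sketch; a minimal random-walk Metropolis sampler for a toy two-parameter Gaussian posterior (the bmcmc package linked above provides production implementations; this sketch is independent of it):

        import numpy as np

        rng = np.random.default_rng(6)

        def log_post(theta):
            # Toy log-posterior: independent Gaussians with sd 1 and 2.
            return -0.5 * (theta[0] ** 2 + (theta[1] / 2.0) ** 2)

        theta = np.zeros(2)
        lp = log_post(theta)
        step, chain, accept = 0.8, [], 0
        for _ in range(20000):
            prop = theta + step * rng.standard_normal(2)   # random-walk proposal
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:       # Metropolis accept test
                theta, lp = prop, lp_prop
                accept += 1
            chain.append(theta.copy())

        chain = np.array(chain[2000:])                     # drop burn-in
        print("acceptance:", accept / 20000)
        print("posterior sd:", chain.std(axis=0))          # should approach [1, 2]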

  19. Multi-level Discourse Analysis in a Physics Teaching Methods Course from the Psychological Perspective of Activity Theory

    NASA Astrophysics Data System (ADS)

    Vieira, Rodrigo Drumond; Kelly, Gregory J.

    2014-11-01

    In this paper, we present and apply a multi-level method for discourse analysis in science classrooms. This method is based on the structure of human activity (activity, actions, and operations) and it was applied to study a pre-service physics teacher methods course. We argue that such an approach, based on a cultural psychological perspective, affords opportunities for analysts to perform a theoretically based detailed analysis of discourse events. Along with the presentation of analysis, we show and discuss how the articulation of different levels offers interpretative criteria for analyzing instructional conversations. We synthesize the results into a model for a teacher's practice and discuss the implications and possibilities of this approach for the field of discourse analysis in science classrooms. Finally, we reflect on how the development of teachers' understanding of their activity structures can contribute to forms of progressive discourse of science education.

  20. Embedding of multidimensional time-dependent observations.

    PubMed

    Barnard, J P; Aldrich, C; Gerber, M

    2001-10-01

    A method is proposed to reconstruct dynamic attractors by embedding of multivariate observations of dynamic nonlinear processes. The Takens embedding theory is combined with independent component analysis to transform the embedding into a vector space of linearly independent vectors (phase variables). The method is successfully tested against prediction of the unembedded state vector in two case studies of simulated chaotic processes.
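
    A compact sketch of the proposed pipeline under stated assumptions: delay-embed each observed channel (Takens), concatenate, and rotate to linearly independent components with ICA; the data, embedding dimension, and lag are illustrative:

        import numpy as np
        from sklearn.decomposition import FastICA

        def delay_embed(x, dim, tau):
            """Stack delayed copies of one observed channel (Takens embedding)."""
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

        # Two noisy observables of a toy nonlinear oscillation.
        rng = np.random.default_rng(7)
        t = np.linspace(0, 60, 3000)
        obs = np.column_stack([np.sin(t) + 0.02 * rng.normal(size=t.size),
                               np.sin(2 * t + 0.3)])

        # Embed each channel, concatenate, then extract independent phase variables.
        emb = np.hstack([delay_embed(obs[:, j], dim=3, tau=10) for j in range(2)])
        phase_vars = FastICA(n_components=3, random_state=0).fit_transform(emb)
        print(phase_vars.shape)   # reconstructed attractor coordinates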
